
Qwen2.5-Coder is the latest series of Code-Specific Qwen large language models (formerly known as CodeQwen)

87098ba7390d · 4.7GB · qwen2 · 7.62B parameters · Q4_K_M quantization · tools

Template (fill-in-the-middle):

{{- if .Suffix }}<|fim_prefix|>{{ .Prompt }}<|fim_suffix|>{{ .Suffix }}<|fim_middle|> {{- else if .M

License: Apache License Version 2.0, January 2004 (http://www.apache.org/licenses/)

Parameters: { "stop": [ "<|endoftext|>" ] }

System prompt: You are a helpful assistant.
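The template above drives fill-in-the-middle (FIM) completion: when a suffix is supplied, the prompt and suffix are wrapped in special tokens and the model generates the code that belongs between them. A minimal sketch of how such a prompt is assembled (the helper function is ours; the token names come from the template):

```python
def build_fim_prompt(prompt: str, suffix: str = "") -> str:
    # With a suffix, wrap both pieces in Qwen2.5-Coder's FIM tokens so the
    # model fills in the code between them; otherwise pass the prompt through.
    if suffix:
        return (
            "<|fim_prefix|>" + prompt
            + "<|fim_suffix|>" + suffix
            + "<|fim_middle|>"
        )
    return prompt

# Example: ask the model to complete a function body between the two pieces.
fim = build_fim_prompt("def add(a, b):\n    return ", suffix="\n")
```

This mirrors the template's branching: plain chat prompts skip the FIM tokens entirely.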

Readme

Introduction

Quantized version of Qwen2.5-Coder, the latest series of code-specific Qwen large language models (formerly known as CodeQwen). For Qwen2.5-Coder, we release base and instruction-tuned language models at three sizes: 1.5, 7, and 32 (coming soon) billion parameters. Qwen2.5-Coder brings the following improvements over CodeQwen1.5:

Significant improvements in code generation, code reasoning, and code fixing. Building on the strong Qwen2.5, we scale the training tokens up to 5.5 trillion, including source code, text-code grounding data, synthetic data, and more. This provides a more comprehensive foundation for real-world applications such as code agents, enhancing coding capabilities while maintaining strengths in mathematics and general competencies.

Long-context support of up to 128K tokens.

More info: https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct
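The stop parameter and system prompt listed above can also be set in a custom Ollama Modelfile; a minimal sketch, assuming the 7B tag of this model:

```
FROM qwen2.5-coder:7b
PARAMETER stop <|endoftext|>
SYSTEM You are a helpful assistant.
```

Build and run it with `ollama create my-coder -f Modelfile` followed by `ollama run my-coder`.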