CodeQwen1.5 is a large language model pretrained on a large amount of code data.
model (f076b41b0d2e · 7.7GB)
- arch: qwen2
- parameters: 7.25B
- quantization: Q8_0
license
- Tongyi Qianwen LICENSE AGREEMENT (release date: August 3, 2023)
Readme
CodeQwen1.5 is based on Qwen1.5. It is trained on 3 trillion tokens of code data. Its major features include:
- Strong code generation capabilities and competitive performance across a series of benchmarks
- Support for long context understanding and generation with a maximum context length of 64K tokens
- Support for 92 coding languages
- Excellent performance on Text-to-SQL, bug fixing, and other coding use cases
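
As a quick way to try these capabilities locally, here is a minimal sketch that sends a Text-to-SQL prompt to the model through the official Ollama Python client. It assumes the model has been pulled (`ollama pull codeqwen`), the Ollama server is running, and the client is installed (`pip install ollama`); the table schema in the prompt is made up for illustration.

```python
# Minimal sketch: ask CodeQwen1.5 (served by a local Ollama instance) to turn a
# natural-language request into SQL. The users(...) schema below is hypothetical.
import ollama

prompt = (
    "Given a table users(id, name, signup_date), write a SQL query that "
    "returns the ten most recently signed-up users."
)

response = ollama.generate(model="codeqwen", prompt=prompt)
print(response["response"])
```

The same model can be queried interactively from the command line with `ollama run codeqwen`, or through Ollama's REST API for integration into editors and other tools.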