This is a modified model that adds support for autonomous coding agents like Cline
tools
1.5b
7b
8b
14b
32b
70b
544.8K Pulls Updated 7 weeks ago
model · 5ca6ed1a6404 · 66GB
arch: qwen2 · parameters: 32.8B · quantization: F16
params · 119B
{
  "num_ctx": 32768,
  "stop": [
    "<|im_start|>",
    "<|im_end|>",
    "<|endof…
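For reference, parameters like the ones above are normally baked into an Ollama Modelfile. A hypothetical sketch follows; the `FROM` target is a placeholder, not the actual base image, and the truncated third stop token from the listing is left out rather than guessed:

```
# Hypothetical Modelfile sketch -- the FROM line is illustrative only.
FROM some-base-model:32b

# Settings matching the params blob shown above.
PARAMETER num_ctx 32768
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>
```

A custom model with these settings would then be built with `ollama create <name> -f Modelfile`.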
license · 1.1kB
MIT License
Copyright (c) 2023 DeepSeek
Permission is hereby granted, free of charge, to any perso…
template · 1.6kB
{{- if .Suffix }}<|fim_prefix|>{{ .Prompt }}<|fim_suffix|>{{ .Suffix }}<|fim_middle|>
{{- else if .M…
Readme
This enhanced version has been modified specifically to provide better support for autonomous coding agents, offering improved code understanding, generation, and interaction capabilities. The model ships with tuned parameter settings for consistent and reliable performance with Cline, and it supports the new Plan and Act modes.
Technical Details
Context Window
- Extended context window: 32k tokens
- Enables processing of larger codebases and complex prompting required by coding agents
- Maintains longer conversation history for better context retention
- Improved project-wide understanding and consistency
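To make the 32k budget concrete, here is a minimal sketch for sanity-checking whether a prompt is likely to fit before sending it. The 4-characters-per-token ratio is a crude heuristic of my own, not this model's actual tokenizer, so treat the result as a ballpark only:

```python
NUM_CTX = 32768  # the extended context window described above

def estimated_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token for English/code)."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_for_output: int = 4096) -> bool:
    """True if the prompt leaves `reserved_for_output` tokens for generation."""
    return estimated_tokens(prompt) + reserved_for_output <= NUM_CTX

# A small snippet easily fits; a ~200k-character dump of a codebase does not.
print(fits_in_context("def add(a, b):\n    return a + b\n"))
print(fits_in_context("x" * 200_000))
```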
Temperature Settings
- Default temperature: 0.6
- Optimized for:
- Consistent code generation
- Reduced randomness in outputs, which also helps prevent endless chains of thought
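If you call the model over Ollama's REST API rather than through Cline, these settings can be passed per request in the `options` field of `/api/generate`. A sketch follows; the model tag is a placeholder for whichever size of this model you pulled:

```python
import json

# Request body for Ollama's /api/generate endpoint pinning the settings
# described above. "your-model-tag" is a placeholder.
payload = {
    "model": "your-model-tag",
    "prompt": "Write a function that reverses a linked list.",
    "stream": False,
    "options": {
        "num_ctx": 32768,     # extended context window
        "temperature": 0.6,   # default recommended above
        "stop": ["<|im_start|>", "<|im_end|>"],
    },
}

body = json.dumps(payload)
print(body)
```

The serialized body can then be POSTed to `http://localhost:11434/api/generate` with any HTTP client.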
We have tested performance on models with 32b and 70b parameters. Models with 1.5b, 7b, 8b and 14b parameters are provided for reference only.