403 pulls · Updated 1 year ago

Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset, encompassing up to 18 trillion tokens. This variant supports a context window of up to 128K tokens and has multilingual capabilities. It is fine-tuned for use with Cline (previously Claude Dev).

Capabilities: tools

ollama run mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0
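Since the model is tagged with tool support, it can be called through a locally running Ollama server with function definitions attached. Below is a minimal sketch of a request body for Ollama's `/api/chat` endpoint (default `http://localhost:11434/api/chat`); the `get_file_contents` tool and the file path are hypothetical examples, not something shipped with the model.

```python
import json

# Sketch of an /api/chat request body for a local Ollama server.
# The tool schema below ("get_file_contents") is an invented example
# to illustrate the tools field; it is not part of the model itself.
payload = {
    "model": "mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0",
    "messages": [
        {"role": "user", "content": "Read src/main.py and summarize it."}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_file_contents",
                "description": "Return the contents of a file in the workspace",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "path": {"type": "string", "description": "File path"}
                    },
                    "required": ["path"],
                },
            },
        }
    ],
    "stream": False,
}

# POST this JSON to http://localhost:11434/api/chat to get a response
# that may contain tool_calls for the client to execute.
print(json.dumps(payload, indent=2))
```

With `stream` set to `False`, the server returns a single JSON response; a tool-aware client then inspects it for `tool_calls`, runs the requested function, and sends the result back as a `tool` message.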

Applications

Claude Code
ollama launch claude --model mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0

Codex
ollama launch codex --model mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0

OpenCode
ollama launch opencode --model mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0

OpenClaw
ollama launch openclaw --model mbenhamd/qwen2.5-14b-instruct-cline-128k-q8_0
