deepseek-v2:236b-chat-q5_1
166.7K Downloads · Updated 1 year ago
A strong, economical, and efficient Mixture-of-Experts language model.
Tags: 16b, 236b
bd9e0b0c8aa1 · 177GB
Template:
{{ if .System }}{{ .System }}
{{ end }}{{ if .Prompt }}User: {{ .Prompt }}
{{ end }}Assistant:{{ .Response }}
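For illustration, here is a minimal Python sketch of how this Go template expands into the final prompt string. Ollama performs this substitution internally; the function name and sample strings below are hypothetical.

```python
# A minimal sketch (not Ollama's actual code) of how the template above
# expands for a single turn.
def render_prompt(system=None, prompt=None):
    out = ""
    if system:                      # {{ if .System }}{{ .System }}\n{{ end }}
        out += system + "\n"
    if prompt:                      # {{ if .Prompt }}User: {{ .Prompt }}\n{{ end }}
        out += "User: " + prompt + "\n"
    out += "Assistant:"             # the model's completion fills {{ .Response }}
    return out

print(render_prompt("You are a helpful assistant.",
                    "What is a Mixture-of-Experts model?"))
# You are a helpful assistant.
# User: What is a Mixture-of-Experts model?
# Assistant:
```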
DEEPSEEK LICENSE AGREEMENT
Version 1.0, 23 October 2023
Copyright (c) 2023 DeepSeek
Section I: PREAMBLE
Readme
Note: this model requires Ollama 0.1.40.
DeepSeek-V2 is a strong Mixture-of-Experts (MoE) language model characterized by economical training and efficient inference.
Note: this model is bilingual in English and Chinese.
The model comes in two sizes:
- 16B Lite:
ollama run deepseek-v2:16b
- 236B:
ollama run deepseek-v2:236b
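Beyond the CLI, either tag can be queried through Ollama's local HTTP API. A minimal sketch, assuming Ollama is serving on its default port (11434) and the 16b tag has already been pulled; the prompt text is illustrative.

```python
# Query deepseek-v2:16b via Ollama's /api/generate endpoint.
# Uses only the standard library, so no extra dependencies are needed.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "deepseek-v2:16b",
        "prompt": "Explain Mixture-of-Experts in one sentence.",
        "stream": False,   # return one JSON object instead of a token stream
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```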