deepseek-v2:16b-lite-chat-q5_K_S

159.8K pulls · Updated 12 months ago

A strong, economical, and efficient Mixture-of-Experts language model.


d904df1d3b59 · 11GB

model: deepseek2 · 15.7B · Q5_K_S
template: {{ if .System }}{{ .System }} {{ end }}{{ if .Prompt }}User: {{ .Prompt }} {{ end }}Assistant:{{ .
license: DEEPSEEK LICENSE AGREEMENT Version 1.0, 23 October 2023 Copyright (c) 2023 DeepSeek Section I: PR
params: { "stop": [ "User:", "Assistant:" ] }

Readme

Note: this model requires Ollama 0.1.40.

DeepSeek-V2 is a strong Mixture-of-Experts (MoE) language model characterized by economical training and efficient inference.

Note: this model is bilingual in English and Chinese.

The model comes in two sizes (a minimal API sketch follows the list):

  • 16B Lite: ollama run deepseek-v2:16b
  • 236B: ollama run deepseek-v2:236b
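
Either tag can also be queried through Ollama's local REST API. The following is a minimal sketch, assuming an Ollama server running on the default port 11434 and the 16B Lite tag already pulled; the prompt text is illustrative, and the stop list simply mirrors the params shown above.

    # Minimal sketch: query deepseek-v2:16b via the local Ollama REST API.
    # Assumes `ollama serve` is running on port 11434 and the model has been
    # pulled with `ollama pull deepseek-v2:16b`.
    import json
    import urllib.request

    payload = {
        "model": "deepseek-v2:16b",
        "prompt": "Why is the sky blue?",
        "stream": False,                                # return one JSON object
        "options": {"stop": ["User:", "Assistant:"]},   # mirrors the params above
    }

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])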

References

GitHub