A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
2,385 Pulls · Updated 2 months ago
12b8c735ab7d · 244GB

model
arch deepseek2 · parameters 671B · quantization Q2_K
244GB
params
{ "stop": [ "<|begin▁of▁sentence|>", "<|end▁of▁sentence|>", …
148B
template
{{- range $i, $_ := .Messages }}{{- if eq .Role "user" }}<|User|>{{- else if eq .Role "assista…
359B
license
DEEPSEEK LICENSE AGREEMENT · Version 1.0, 23 October 2023 · Copyright (c) 2023 DeepSeek · Section I: PR…
14kB
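
The params and template layers above correspond to Ollama Modelfile directives. Below is a minimal, illustrative sketch of how such a configuration is expressed; the FROM path is hypothetical and the TEMPLATE body is abbreviated, not the model's actual 359B template layer.

```
# Hypothetical base: a local GGUF file or an existing model tag
FROM ./deepseek-v3-q2_k.gguf

# Stop sequences, as shown in the params layer above
PARAMETER stop "<|begin▁of▁sentence|>"
PARAMETER stop "<|end▁of▁sentence|>"

# Abbreviated chat template in Go text/template syntax; the real template
# also handles system messages and other roles
TEMPLATE """{{- range $i, $_ := .Messages }}
{{- if eq .Role "user" }}<|User|>{{ .Content }}
{{- else if eq .Role "assistant" }}<|Assistant|>{{ .Content }}<|end▁of▁sentence|>
{{- end }}
{{- end }}<|Assistant|>"""
```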
Readme
Note: this model requires Ollama 0.5.5 or later.
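
With Ollama 0.5.5 or later installed, the model can be pulled and queried from the command line or through the local REST API. A minimal sketch, assuming the model is published under the tag deepseek-v3:

```shell
# Pull the Q2_K weights (~244GB on disk) and start an interactive chat
ollama run deepseek-v3

# Or send a single chat request to the local Ollama server (default port 11434)
curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-v3",
  "messages": [{"role": "user", "content": "Summarize mixture-of-experts routing in two sentences."}],
  "stream": false
}'
```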