huihui_ai/deepseek-v3:671b-q2_K

7,146 pulls · Updated 1 year ago

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

ollama run huihui_ai/deepseek-v3:671b-q2_K
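
For programmatic use, the same model can be reached through Ollama's local REST API. Below is a minimal sketch using only the Python standard library; it assumes a default Ollama server on localhost:11434 and that the model has already been pulled with the command above.

```python
import json
import urllib.request

# Minimal sketch: send one chat turn to a locally running Ollama server.
# Assumes the default endpoint (localhost:11434) and that the
# huihui_ai/deepseek-v3:671b-q2_K model has already been pulled.
payload = {
    "model": "huihui_ai/deepseek-v3:671b-q2_K",
    "messages": [
        {"role": "user", "content": "Explain Mixture-of-Experts in two sentences."}
    ],
    "stream": False,  # return a single JSON object instead of a token stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
print(reply["message"]["content"])
```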

Details

12b8c735ab7d · 244GB

Architecture: deepseek2 · Parameters: 671B · Quantization: Q2_K

License: DEEPSEEK LICENSE AGREEMENT Version 1.0, 23 October 2023, Copyright (c) 2023 DeepSeek, Section I: PREAM… (preview truncated in source)

Params (preview truncated in source): { "stop": [ "<|begin▁of▁sentence|>", "<|end▁of▁sentence|>", …

Template (preview truncated in source): {{- range $i, $_ := .Messages }} {{- if eq .Role "user" }}<|User|> {{- else if eq .Role "assista…
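
The stop sequences visible in the truncated params preview can also be passed explicitly per request. A sketch using the official Ollama Python client (pip install ollama); only the two stop tokens shown above are known, since any further entries in the list are cut off in the source.

```python
# Sketch using the official Ollama Python client (pip install ollama).
# The two stop sequences below are the ones visible in the truncated
# params preview; any additional entries are cut off in the source.
import ollama

response = ollama.generate(
    model="huihui_ai/deepseek-v3:671b-q2_K",
    prompt="Summarize the Mixture-of-Experts idea in one sentence.",
    options={"stop": ["<|begin▁of▁sentence|>", "<|end▁of▁sentence|>"]},
)
print(response["response"])
```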

Readme

Note: this model requires Ollama 0.5.5 or later.
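
To confirm the installed server meets this requirement, you can query Ollama's version endpoint; a minimal sketch, assuming the default local port:

```python
# Sketch: check the local Ollama server's version against the 0.5.5
# requirement noted above. Assumes the default port 11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/version") as resp:
    version = json.loads(resp.read())["version"]
print("Ollama server version:", version)
```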