
A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

ollama run huihui_ai/deepseek-v3:671b-q3_K
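
Once pulled, the model can also be queried through Ollama's local REST API (served on port 11434 by default). A minimal sketch, using the same model tag as the run command above and an example prompt:

curl http://localhost:11434/api/generate -d '{"model": "huihui_ai/deepseek-v3:671b-q3_K", "prompt": "Why is the sky blue?", "stream": false}'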

Readme

Note: this model requires Ollama 0.5.5 or later.
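
To confirm your installation meets this requirement, check the installed version:

ollama --version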