
Single-file version with Dynamic Quants: a strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
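To make the "activated parameters" idea concrete, below is a minimal sketch of top-k expert routing, the mechanism that lets an MoE model run only a fraction of its weights per token. The shapes, weights, and value of k are illustrative toys, not DeepSeek-V3's actual configuration.

```python
import numpy as np

def top_k_routing(x, gate_w, expert_ws, k=2):
    """Route a token vector x to the k highest-scoring experts.

    Only the selected experts run, so most parameters stay inactive
    for this token -- the idea behind "37B activated" out of 671B total.
    """
    scores = x @ gate_w                      # (num_experts,) gate logits
    top = np.argsort(scores)[-k:]            # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                 # softmax over the chosen experts
    # Weighted sum of only the selected experts' outputs.
    return sum(w * (x @ expert_ws[i]) for w, i in zip(weights, top))

# Toy usage: 8 experts, hidden size 16; only 2 experts run per token.
rng = np.random.default_rng(0)
x = rng.standard_normal(16)
gate_w = rng.standard_normal((16, 8))
expert_ws = rng.standard_normal((8, 16, 16))
y = top_k_routing(x, gate_w, expert_ws, k=2)
print(y.shape)  # (16,)
```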

```shell
ollama run org/deepseek-v3-fast
```
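Beyond the CLI, the model can also be queried programmatically. A minimal sketch, assuming a local Ollama server on its default port (11434) and that the model tag above has already been pulled; the prompt text is just an example.

```python
import json
import urllib.request

# Call Ollama's local REST API (POST /api/generate). With "stream": False,
# the server returns a single JSON object whose "response" field holds
# the generated text.
payload = json.dumps({
    "model": "org/deepseek-v3-fast",
    "prompt": "Explain Mixture-of-Experts in one sentence.",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```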
