org/deepseek-v3-fast:Q4_K_M
85 Downloads · Updated 7 months ago
Single-file version with dynamic quants. A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
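The activation ratio implied by the description can be checked with quick arithmetic; the 671B total and 37B active figures come from this listing, and everything else below is a hypothetical sketch.

```python
# Active-parameter fraction for the MoE model described above.
# Figures (671B total, 37B active per token) are taken from the listing.
total_params = 671e9
active_params = 37e9

active_fraction = active_params / total_params
print(f"{active_fraction:.1%} of parameters are active per token")
# → roughly 5.5%: each token routes through only a small subset of experts
```

This is why an MoE model of this size can run with far less compute per token than a dense 671B model would need.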
1150b30b4e73 · 404GB
model: arch deepseek2 · parameters 671B · quantization Q4_K_M (404GB)
system: You are a friendly assistant. (29B)
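As a sanity check on the quantization label, the file size and parameter count listed above imply an average bit-width per weight. The sketch below assumes the 404GB figure is in decimal gigabytes (1 GB = 1e9 bytes); with binary gibibytes the result shifts slightly but stays in the same range.

```python
# Average bits per weight implied by the listing's file size and parameter count.
# Assumption: 404GB means decimal gigabytes (1 GB = 1e9 bytes).
file_bytes = 404e9
total_params = 671e9

bits_per_weight = file_bytes * 8 / total_params
print(f"{bits_per_weight:.2f} bits per weight")
# ≈ 4.82, consistent with a roughly 4-5 bits-per-weight Q4_K_M quantization
```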
Readme
No readme