deepseek-v3
2.2M Downloads · Updated 7 months ago
A strong Mixture-of-Experts (MoE) language model with 671B total parameters with 37B activated for each token.
5 models

Name                       Digest        Size   Context  Input  Updated
deepseek-v3:latest         5da0e2d4a9e0  404GB  160K     Text   7 months ago
deepseek-v3:671b (latest)  5da0e2d4a9e0  404GB  160K     Text   7 months ago
deepseek-v3:671b-q4_K_M    5da0e2d4a9e0  404GB  160K     Text   7 months ago
deepseek-v3:671b-q8_0      96061c74c1a5  713GB  4K       Text   7 months ago
deepseek-v3:671b-fp16      7770bf5a5ed8  1.3TB  4K       Text   7 months ago
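
As a minimal sketch of how one of the tags above might be used, assuming a local Ollama server is running, the official ollama Python client is installed (pip install ollama), and the chosen tag has already been pulled (the q4_K_M variant alone is about 404GB on disk); the prompt text here is only a placeholder:

import ollama

# Chat with a locally available deepseek-v3 tag via the Ollama Python client.
# Swap in any other tag from the table above (e.g. 671b-q8_0 or 671b-fp16).
response = ollama.chat(
    model="deepseek-v3:671b-q4_K_M",
    messages=[
        {"role": "user", "content": "Explain Mixture-of-Experts in one sentence."},
    ],
)

# The assistant's reply text is nested under the message field of the response.
print(response["message"]["content"])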