lucataco / deepseek-v3-64k
18 downloads · Updated 7 months ago
A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
1 model

Name                    Digest        Size   Context  Input  Updated
deepseek-v3-64k:latest  74b583cd0ec8  404GB  160K     Text   7 months ago
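
The tag above can be pulled and queried like any other model served by Ollama. A minimal sketch, assuming the Ollama Python client (pip install ollama) and a running local Ollama server; note the 404GB download requires commensurate disk space and hardware, so this is illustrative rather than something most machines can run:

    import ollama

    # Pull the tag listed in the table above (404GB download).
    ollama.pull('lucataco/deepseek-v3-64k:latest')

    # Send a single text prompt; the model accepts text input only,
    # with a 160K-token context window.
    response = ollama.chat(
        model='lucataco/deepseek-v3-64k:latest',
        messages=[{'role': 'user',
                   'content': 'Summarize the MoE architecture in one sentence.'}],
    )
    print(response['message']['content'])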