mannix/mixtral_7bx2_moe
A high-quality Mixture of Experts (MoE) model with open weights by Mistral AI.
197 Pulls · Updated 7 months ago
q3_K_M · 6.2GB (11 tags available)
mixtral_7bx2_moe:q3_K_M
template · b967f49e09bf · 44B
"[INST] {{ .System }} {{ .Prompt }} [/INST]"