mixtral:instruct
1.4M Downloads · Updated 9 months ago
A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.
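As a quick sketch of how the model can be used once downloaded (assuming a local Ollama server and the official ollama Python package; the prompt text is illustrative):

import ollama

# Pull the 8x7b tag and send it a single chat message.
# Assumes an Ollama server is running locally on the default port.
ollama.pull('mixtral:8x7b')

response = ollama.chat(
    model='mixtral:8x7b',
    messages=[{'role': 'user', 'content': 'Summarize what a Mixture of Experts model is.'}],
)
print(response['message']['content'])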
Capabilities: tools (see the tool-calling sketch below)
Sizes: 8x7b, 8x22b
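The tools capability means the model can be offered functions to call. A minimal sketch, assuming a recent ollama Python package that accepts a tools argument for this model; get_weather and its schema are hypothetical:

import ollama

# Offer the model one hypothetical tool and inspect its reply.
response = ollama.chat(
    model='mixtral:8x7b',
    messages=[{'role': 'user', 'content': 'What is the weather in Paris right now?'}],
    tools=[{
        'type': 'function',
        'function': {
            'name': 'get_weather',  # hypothetical tool name
            'description': 'Get the current weather for a city',
            'parameters': {
                'type': 'object',
                'properties': {'city': {'type': 'string'}},
                'required': ['city'],
            },
        },
    }],
)
print(response['message'])  # any tool_calls the model decides to make appear here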
mixtral:instruct / params (ed11eda7790d · 30B)
{
"stop": [
"[INST]",
"[/INST]"
]
}
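The params file above sets the model's default stop sequences to Mistral's [INST] / [/INST] instruction markers. These defaults can be overridden per request; a minimal sketch, assuming the ollama Python package and a running local server (the prompt and temperature value are illustrative):

import ollama

response = ollama.generate(
    model='mixtral:instruct',
    prompt='Explain mixture-of-experts routing in two sentences.',
    options={
        'stop': ['[INST]', '[/INST]'],  # same stop sequences as the params file above
        'temperature': 0.7,             # illustrative extra option
    },
)
print(response['response'])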