A set of Mixture of Experts (MoE) models with open weights from Mistral AI, available in 8x7b and 8x22b parameter sizes.

Capabilities: tools · Sizes: 8x7b, 8x22b

492.8K pulls · Updated 4 months ago

params · ed11eda7790d · 30B

{
  "stop": [
    "[INST]",
    "[/INST]"
  ]
}
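The params blob above bakes the "[INST]" and "[/INST]" instruction delimiters in as default stop sequences, so generation halts when the model emits either token. Below is a minimal sketch of passing the same stop values explicitly through Ollama's local REST API; the model tag "mixtral:8x7b", the endpoint URL, and the prompt are illustrative assumptions, not taken from this page.

# Sketch: query a locally running Ollama server with the stop sequences
# shown in the params blob above. Assumes Ollama is serving on its
# default port and that the model was pulled under the tag mixtral:8x7b.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

payload = {
    "model": "mixtral:8x7b",  # assumed tag; an 8x22b variant is also listed
    "prompt": "[INST] Explain what a Mixture of Experts model is. [/INST]",
    "stream": False,
    # Mirrors the params blob: stop generation at either instruction delimiter.
    "options": {
        "stop": ["[INST]", "[/INST]"],
    },
}

response = requests.post(OLLAMA_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["response"])

Because these stop sequences are already part of the model's default parameters, the explicit "options" field is redundant here; it is shown only to make the effect of the params blob visible in an API call.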