-
mixtral
A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.
tools · 8x7b · 8x22b · 503.7K Pulls · 70 Tags · Updated yesterday
-
dolphin-mixtral
Uncensored 8x7b and 8x22b fine-tuned models based on the Mixtral mixture-of-experts models that excel at coding tasks. Created by Eric Hartford.
8x7b · 8x22b · 456.2K Pulls · 70 Tags · Updated 2 days ago
-
nous-hermes2-mixtral
The Nous Hermes 2 model from Nous Research, now trained over Mixtral.
8x7b · 36.2K Pulls · 18 Tags · Updated 2 days ago
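
As a quick orientation, the sketch below shows one way to query any of the models listed above through a locally running Ollama server's REST API. The model tag `mixtral:8x7b`, the default port 11434, and the prompt are illustrative assumptions; the model must already be pulled (for example with `ollama pull mixtral:8x7b`).

```python
# Minimal sketch: send a prompt to a local Ollama server and print the reply.
# Assumes Ollama is running on its default port and the model is already pulled.
import requests

def generate(model: str, prompt: str) -> str:
    # /api/generate returns the full completion in one JSON object when streaming is off.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # Swap in any tag from the listing, e.g. "dolphin-mixtral:8x22b" or "nous-hermes2-mixtral:8x7b".
    print(generate("mixtral:8x7b", "Explain mixture-of-experts models in one sentence."))
```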