
- mixtral
A set of Mixture of Experts (MoE) models with open weights by Mistral AI, in 8x7b and 8x22b parameter sizes.
tools · 8x7b · 8x22b · 614.4K Pulls · 70 Tags · Updated 3 months ago
- dolphin-mixtral
Uncensored 8x7b and 8x22b fine-tuned models based on the Mixtral mixture-of-experts models that excel at coding tasks. Created by Eric Hartford.
8x7b · 8x22b · 531.4K Pulls · 70 Tags · Updated 3 months ago
- nous-hermes2-mixtral
The Nous Hermes 2 model from Nous Research, now trained over Mixtral.
8x7b · 39K Pulls · 18 Tags · Updated 3 months ago