A top-performing mixture of experts model, fine-tuned with high-quality data.

8x7B



This model is a fine-tuned version of Mixtral trained on a high-quality, curated dataset. As of December 26, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.

References

Hugging Face

Argilla