37.6K Downloads Updated 1 year ago
f5d5ac5b61ef · 35GB
This model is a fine-tuned version of Mixtral trained on a high-quality, curated dataset. As of December 26, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.
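
Below is a minimal sketch of querying a locally pulled copy of the model through the official `ollama` Python library. The tag `this-model` is a placeholder, not the tag this page publishes; substitute the actual model name before running.

```python
# Minimal sketch: chat with a locally pulled Ollama model via the official
# `ollama` Python library. "this-model" is a placeholder tag; replace it with
# the tag this fine-tuned Mixtral model is published under, after pulling it.
import ollama

response = ollama.chat(
    model="this-model",  # placeholder; assumed to be already pulled locally
    messages=[
        {"role": "user", "content": "Explain what a Mixture of Experts model is."},
    ],
)

print(response["message"]["content"])
```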