MixTAO-7Bx2-MoE
MixTAO-7Bx2-MoE is a Mixture of Experts (MoE) model.
It is mainly intended for experiments with large-model techniques; successive iterations are meant to gradually refine it into a high-quality large language model.
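As a quick orientation, here is a minimal sketch of loading the model with the Hugging Face transformers library. The repo id zhengr/MixTAO-7Bx2-MoE-v8.1 is taken from the GGUF link below; the dtype, device placement, and prompt are assumptions for illustration, not a documented recipe from this model card.

```python
# Minimal sketch: load MixTAO-7Bx2-MoE-v8.1 with transformers (assumed standard usage).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zhengr/MixTAO-7Bx2-MoE-v8.1"  # repo id inferred from the GGUF link below
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 fits the target GPU
    device_map="auto",
)

prompt = "Explain what a mixture-of-experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```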
🦒 Colab
Model Name | Info |
---|---|
MixTAO-7Bx2-MoE-v8.1 | |
mixtao-7bx2-moe-v8.1.Q4_K_M.gguf | GGUF of MixTAO-7Bx2-MoE-v8.1 (Q4_K_M only): https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1-GGUF |
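The Q4_K_M GGUF from the table can be run locally; below is a minimal sketch using llama-cpp-python. The file name matches the table above, but the download step, context size, and generation settings are assumptions, not instructions from this model card.

```python
# Minimal sketch: run the Q4_K_M GGUF with llama-cpp-python (assumed setup).
from llama_cpp import Llama

llm = Llama(
    model_path="./mixtao-7bx2-moe-v8.1.Q4_K_M.gguf",  # downloaded from the GGUF repo linked above
    n_ctx=4096,       # assumption: context window to allocate
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

result = llm("Q: What is a mixture-of-experts model?\nA:", max_tokens=128)
print(result["choices"][0]["text"])
```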
Open LLM Leaderboard Evaluation Results
Detailed results can be found here
Metric | Value |
---|---|
Avg. | 77.50 |
AI2 Reasoning Challenge (25-Shot) | 73.81 |
HellaSwag (10-Shot) | 89.22 |
MMLU (5-Shot) | 64.92 |
TruthfulQA (0-shot) | 78.57 |
Winogrande (5-shot) | 87.37 |
GSM8k (5-shot) | 71.11 |