
Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai

Capabilities: tools, thinking. Sizes: 1.2b, 23b.
ollama run huihui_ai/huihui-moe:1.2b

Applications

Claude Code: ollama launch claude --model huihui_ai/huihui-moe:1.2b
Codex: ollama launch codex --model huihui_ai/huihui-moe:1.2b
OpenCode: ollama launch opencode --model huihui_ai/huihui-moe:1.2b
OpenClaw: ollama launch openclaw --model huihui_ai/huihui-moe:1.2b


Readme

Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai, built upon the Qwen/qwen3 base model. It enhances the standard Transformer architecture by replacing MLP layers with MoE layers, each containing 2-8 experts, to achieve high performance with efficient inference. The model is designed for natural language processing tasks, including text generation, question answering, and conversational applications.
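As a sketch of how the model can be used for the conversational and question-answering tasks described above, the example below calls it through the Ollama Python client. The client package, a locally running Ollama server, and the prompt text are assumptions for illustration, not part of this release.

# Minimal usage sketch, assuming the `ollama` Python client is installed
# (pip install ollama), an Ollama server is running locally, and the model
# has been pulled with `ollama run huihui_ai/huihui-moe:1.2b`.
import ollama

response = ollama.chat(
    model="huihui_ai/huihui-moe:1.2b",
    messages=[
        {"role": "user", "content": "In one paragraph, what does a Mixture of Experts layer do?"},
    ],
)

# The assistant's reply is returned under message.content.
print(response["message"]["content"])

The same chat call works for the larger 23b tag by changing the model name.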

References

HuggingFace

Donation

If you like it, please click ‘like’ and follow us for more updates.
You can follow x.com/support_huihui to get the latest model information from huihui.ai.

Your donation helps us continue further development and improvement; even the price of a cup of coffee makes a difference.
  • bitcoin:
  bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge