Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai, built upon the Qwen/qwen3 base model. It extends the standard Transformer architecture by replacing each MLP layer with an MoE layer containing 2-8 experts, so that only a subset of experts is active per token; this increases model capacity while keeping inference efficient. The model is designed for natural language processing tasks, including text generation, question answering, and conversational applications.
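As a rough illustration of the idea, here is a minimal sketch of a top-k routed MoE layer standing in for a dense MLP. It is not Huihui-MoE's actual implementation or configuration: the expert count, `top_k`, and dimensions below are placeholder values, and the class names (`ExpertMLP`, `MoELayer`) are illustrative only.

```python
# Minimal sketch of a top-k routed MoE layer replacing a dense MLP.
# Expert count, top_k, and dimensions are illustrative, not Huihui-MoE's
# actual configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExpertMLP(nn.Module):
    """A single feed-forward expert, shaped like a standard Transformer MLP."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.silu(self.up(x)))


class MoELayer(nn.Module):
    """A router scores the experts per token, keeps the top-k, and mixes
    their outputs with the normalized routing weights."""
    def __init__(self, d_model: int, d_ff: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(ExpertMLP(d_model, d_ff) for _ in range(num_experts))
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = x.reshape(-1, x.shape[-1])                   # (batch*seq, d_model)
        logits = self.router(tokens)                          # (tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)    # per-token expert choice
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = indices == e                               # tokens routed to expert e
            if mask.any():
                token_ids, slot = mask.nonzero(as_tuple=True)
                out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape(x.shape)
```

Because only `top_k` experts run for each token, total parameter count can grow with the number of experts without a proportional increase in per-token compute, which is the trade-off the description above refers to.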
If you like it, please click ‘like’ and follow us for more updates.
You can follow x.com/support_huihui for the latest model information from huihui.ai.
Bitcoin (BTC) donation address: bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge