
Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai

Capabilities: tools, thinking · Sizes: 1.2b, 23b

4eda1edfeb65 · 14GB · qwen3moe · 23.2B parameters · Q4_K_M quantization
License: Apache License, Version 2.0 (http://www.apache.org/licenses/)
Default parameters: "repeat_penalty": 1 · stop: "<|im_start|>", "<|im_end|>"

Readme

Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai, built upon the Qwen/Qwen3 base model. It enhances the standard Transformer architecture by replacing the MLP layers with MoE layers, each containing 2 to 8 experts, to achieve high performance with efficient inference. The model is designed for natural language processing tasks, including text generation, question answering, and conversational applications.
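To make the idea concrete, here is a minimal sketch (in PyTorch) of a feed-forward block where the dense MLP is replaced by routed experts. The dimensions, expert count, and top-k routing shown are illustrative assumptions, not the model's actual configuration.

```python
# Minimal sketch of an MoE feed-forward block: a router sends each token to
# its top-k experts, and the expert outputs are combined with the routing
# weights. Sizes and k are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                            # x: (batch, seq, d_model)
        scores = self.router(x)                      # (batch, seq, num_experts)
        weights, idx = scores.topk(self.top_k, -1)   # keep the top-k experts per token
        weights = F.softmax(weights, dim=-1)         # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e           # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: process a dummy batch of 2 sequences of 16 tokens.
x = torch.randn(2, 16, 512)
y = MoEFeedForward()(x)        # y has the same shape as x
```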

References

HuggingFace

Donation

If you like it, please click ‘like’ and follow us for more updates.
You can follow x.com/support_huihui to get the latest model information from huihui.ai.

Your donation helps us continue further development and improvement; a cup of coffee can do it.
  • bitcoin:
  bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge