
Huihui-MoE is a Mixture of Experts (MoE) language model developed by huihui.ai.

Capabilities: tools, thinking · Sizes: 1.2b, 23b
Parameters (cff3f395ef37 · 120B):

{
  "repeat_penalty": 1,
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ],
  "temperature": 0.6,
  "top_k": 20,
  "top_p": 0.95
}
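These defaults can be overridden per request through Ollama's REST API by passing them under the `options` key. The sketch below builds such a request body in Python; the model tag `huihui_ai/huihui-moe` and the example prompt are assumptions, not taken from this page — substitute whatever tag you actually pulled.

```python
import json

# Sampling defaults copied from the model's params blob (cff3f395ef37).
options = {
    "repeat_penalty": 1,
    "stop": ["<|im_start|>", "<|im_end|>"],
    "temperature": 0.6,
    "top_k": 20,
    "top_p": 0.95,
}

# Request body for Ollama's /api/generate endpoint.
# NOTE: the model tag below is an assumed placeholder.
payload = {
    "model": "huihui_ai/huihui-moe",
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": options,
}

print(json.dumps(payload, indent=2))
```

Sending this payload to a local Ollama server (default `http://localhost:11434/api/generate`) runs the model with these sampling settings; omitting `options` falls back to the defaults baked into the model above.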