
🐳 Aurora is a Chinese MoE model refined from the Mixtral-8x7B architecture. It unlocks the model's potential for bilingual Chinese-English dialogue across a wide range of open-domain topics.

f02dd72bb242 · 59B
{
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ]
}
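
The stop strings above are ChatML-style delimiters, so generation requests should terminate when either marker appears. Below is a minimal sketch, not an official example, of passing these stop tokens to a locally pulled copy of this model through the Ollama HTTP API; the model tag "aurora", the sample prompt, and the default localhost endpoint are assumptions.

# Minimal sketch: query a locally pulled Aurora model via the Ollama HTTP API,
# reusing the stop tokens shown in the params layer above.
# Assumes Ollama is running on the default port 11434 and the model tag is "aurora".
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "aurora",                      # assumed local model tag
        "prompt": "用中文介绍一下北京。",          # open-domain Chinese prompt
        "stream": False,                        # return a single JSON object
        "options": {
            "stop": ["<|im_start|>", "<|im_end|>"],  # stop tokens from the params layer
        },
    },
    timeout=120,
)
print(response.json()["response"])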