Aurora is the Chinese-oriented version of the MoE model, fine-tuned from the Mixtral-8x7B architecture. It unlocks the model's potential for bilingual Chinese-English dialogue across a wide range of open-domain topics.
{
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ]
}
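The stop parameters above mark the ChatML-style turn delimiters used by the model. As a sketch, they could be set in an Ollama Modelfile like this (the model name `aurora` is an assumption; substitute the tag you actually pulled):

```
# Hypothetical Modelfile sketch — "aurora" is an assumed model name
FROM aurora
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
```

Each `PARAMETER stop` line adds one stop sequence, which tells the runtime to cut generation when the model begins emitting the next turn marker.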