šŸ³ Aurora represents the Chinese version of the MoE model, refined from the Mixtral-8x7B architecture. It adeptly unlocks the modelā€™s potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics.

8x7B


f02dd72bb242 · 59B
{ "stop": [ "<|im_start|>", "<|im_end|>" ] }