🐳 Aurora is a Chinese-enhanced MoE model fine-tuned from Mixtral-8x7B. It unlocks the model's potential for bilingual Chinese-English dialogue across a wide range of open-domain topics.
66 Pulls · Updated 9 months ago

1 Tag

0a77bc646fee · 26GB · 9 months ago