🐳 Aurora is a Chinese MoE model fine-tuned from the Mixtral-8x7B architecture. It unlocks the base model's potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics.
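Since the model is distributed through Ollama, one quick way to try its bilingual dialogue is the local REST API. The sketch below assumes Ollama is running on its default port (11434) and that the model was pulled under the tag `aurora`; that tag name is an assumption, so substitute whichever tag you actually pulled.

```python
# Minimal sketch: query the model through Ollama's local REST API.
# Assumes Ollama is running on localhost:11434 and the model tag is
# "aurora" (hypothetical; replace with the tag you pulled).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "aurora"  # assumed tag name

def ask(prompt: str) -> str:
    """Send one prompt and return the full, non-streamed response."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # Bilingual check: one English and one Chinese open-domain question.
    print(ask("Introduce yourself in one sentence."))
    print(ask("用一句话介绍一下你自己。"))
```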
1 tag: 0a77bc646fee (26GB) · 73 pulls · updated 13 months ago