šŸ³ Aurora represents the Chinese version of the MoE model, refined from the Mixtral-8x7B architecture. It adeptly unlocks the modelā€™s potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics.



```
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
```
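
The template follows the ChatML convention: the system message and user prompt are wrapped in `<|im_start|>`/`<|im_end|>` markers, and the model's reply is generated after the trailing `<|im_start|>assistant`. Below is a minimal sketch of querying the model through a local Ollama server's `/api/generate` endpoint, which fills the `{{ .System }}` and `{{ .Prompt }}` slots server-side. The tag `aurora`, the default port 11434, and the helper name `ask_aurora` are assumptions for illustration; substitute the actual tag the model is published under.

```python
# Minimal sketch: querying the model through a local Ollama server.
# Assumptions: Ollama is running on localhost:11434 and the model was
# pulled under the placeholder tag "aurora" (replace with the real tag).
import json
import urllib.request

def ask_aurora(prompt: str, system: str = "You are a helpful bilingual assistant.") -> str:
    payload = {
        "model": "aurora",   # placeholder tag, not necessarily the published name
        "system": system,    # fills the {{ .System }} slot of the template
        "prompt": prompt,    # fills the {{ .Prompt }} slot of the template
        "stream": False,     # return the full completion as a single JSON object
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Example: ask a question in Chinese to exercise the bilingual dialogue.
    print(ask_aurora("用中攇介绍一下混合专家模型"))
```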