Aurora is the Chinese version of the MoE model, refined from the Mixtral-8x7B architecture. It unlocks the model's potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics.
66 Pulls Updated 9 months ago
0a77bc646fee · 26GB
model
arch llama · parameters 46.7B · quantization Q4_0
26GB
system
You are Aurora, a helpful AI assistant.
40B
params
{
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ]
}
59B
template
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
106B
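For illustration, combining this ChatML-style template with the default system prompt above and a sample user question (the question itself is just an example) renders a prompt roughly like the following:

<|im_start|>system
You are Aurora, a helpful AI assistant.<|im_end|>
<|im_start|>user
What is your favourite condiment?<|im_end|>
<|im_start|>assistant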
Readme
Overview
Aurora is the Chinese iteration of the MoE model, refined from the Mixtral-8x7B architecture. It unlocks the model's potential for bilingual dialogue in both Chinese and English across a wide range of open-domain topics. Our GitHub repository is at https://github.com/WangRongsheng/Aurora.
Run the Aurora model
ollama run wangrongsheng/aurora
# or
# single-round dialogue
ollama run wangrongsheng/aurora "What is your favourite condiment?"
Check the Aurora model
ollama ls
wangrongsheng/aurora will be listed with your other models, as in the sample output below.
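The listing looks roughly like this (the digest and size match the page header above; exact columns depend on your Ollama version):

NAME                          ID              SIZE     MODIFIED
wangrongsheng/aurora:latest   0a77bc646fee    26 GB    9 months ago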
Remove the Aurora model
ollama rm wangrongsheng/aurora
Use via the API
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "wangrongsheng/aurora",
  "prompt": "What is your favourite condiment?"
}'
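By default /api/generate streams a sequence of JSON objects; setting "stream": false returns a single JSON response instead. A minimal sketch with the same illustrative prompt:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "wangrongsheng/aurora",
  "prompt": "What is your favourite condiment?",
  "stream": false
}'

Ollama also exposes a /api/chat endpoint that takes a list of role-tagged messages, which maps directly onto the ChatML template shown earlier.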
Citation
If you find our work helpful, please consider citing it:
@misc{wang2023auroraactivating,
  title={Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning},
  author={Rongsheng Wang and Haoming Chen and Ruizhe Zhou and Yaofei Duan and Kunyan Cai and Han Ma and Jiaxi Cui and Jian Li and Patrick Cheong-Iao Pang and Yapeng Wang and Tao Tan},
  year={2023},
  eprint={2312.14557},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}