A top-performing mixture of experts model, fine-tuned with high-quality data.
8x7b
23.1K Pulls · Updated 12 months ago
20bdbdb43962 · 20GB
model
arch llama · parameters 46.7B · quantization Q3_K_M
20GB
params
{
  "stop": [
    "[INST]",
    "[/INST]"
  ]
}
30B
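These stop sequences end generation as soon as the model emits a Mistral-style instruction marker. As a rough sketch (not taken from this page), the same values could also be passed per request through Ollama's local HTTP API; the model tag below is a placeholder, not this model's actual name.

import json
import urllib.request

# Sketch: send a generate request to a local Ollama server, reusing the
# stop sequences from the params layer above. "example-8x7b" is a
# placeholder tag for illustration only.
payload = {
    "model": "example-8x7b",
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {"stop": ["[INST]", "[/INST]"]},
}
request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read())["response"])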
template
[INST] {{ .System }} {{ .Prompt }} [/INST]
43B
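The template wraps the system message and the user prompt in the [INST] … [/INST] format that Mixtral's instruction tuning expects. A minimal illustration of the substitution (the helper below is illustrative, not Ollama's own renderer):

def render_prompt(system: str, prompt: str) -> str:
    # Mirrors the template above: [INST] {{ .System }} {{ .Prompt }} [/INST]
    return f"[INST] {system} {prompt} [/INST]"

print(render_prompt("You are a concise assistant.",
                    "Explain expert routing in one sentence."))
# [INST] You are a concise assistant. Explain expert routing in one sentence. [/INST]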
license
MIT License
Copyright (c) [year] [fullname]
Permission is hereby granted, free of charge, to any person obtaining a copy …
1.1kB
Readme
This model is a fine-tuned version of Mixtral trained on a high-quality, curated dataset. As of December 26, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.