from TheBloke/Mixtral_11Bx2_MoE_19B-GGUF
154 Pulls · Updated 14 months ago
02707069aa46 · 11GB
model
arch llama · parameters 19.2B · quantization Q4_K_M · 11GB

system · 45B
You are YiMoeMixtral, a helpful AI assistant.

params · 53B
{
  "num_ctx": 8192,
  "stop": [
    "### Input",
    "### Response"
  ]
}

template · 108B
{{ if and .First .System }}### Instruction:
{{ .System }}
{{ end }}
### Input:
{{ .Prompt }}

### Response:
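Taken together, the layers above amount to an Ollama Modelfile along these lines. This is a reconstruction for reference, not the published file; the FROM path is illustrative and would point at the actual Q4_K_M GGUF file.

# Reconstructed Modelfile (sketch); the GGUF path below is illustrative
FROM ./mixtral_11bx2_moe_19b.Q4_K_M.gguf
SYSTEM """You are YiMoeMixtral, a helpful AI assistant."""
PARAMETER num_ctx 8192
PARAMETER stop "### Input"
PARAMETER stop "### Response"
TEMPLATE """{{ if and .First .System }}### Instruction:
{{ .System }}
{{ end }}
### Input:
{{ .Prompt }}

### Response:
"""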
Readme
MoE of kyujinpy/Sakura-SOLAR-Instruct and jeonsworld/CarbonVillain-en-10.7B-v1,
from https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B.
Reportedly performs well, cf. https://www.reddit.com/r/LocalLLaMA/comments/1916896/llm_comparisontest_confirm_leaderboard_big_news/
(In my first experiments it did not perform well, though; perhaps a different prompt template is needed? A sketch for trying one follows below.)
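If the Alpaca-style Input/Response template is the problem, one way to experiment is a custom Modelfile that swaps in a plain instruction/response template. A minimal sketch, assuming the model is tagged yimoemixtral locally; the template shown is just one candidate, not a known-good fit:

# Sketch: try an alternative prompt template; "yimoemixtral" is an assumed local tag
FROM yimoemixtral
TEMPLATE """{{ if and .First .System }}{{ .System }}

{{ end }}### Instruction:
{{ .Prompt }}

### Response:
"""
PARAMETER stop "### Instruction"
PARAMETER stop "### Response"

Build and run it with ollama create yimoemixtral-alt -f Modelfile, then ollama run yimoemixtral-alt.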