Quantization of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE, a 2x7B mixture-of-experts build of Mistral-7B-Instruct-v0.2.
348 Pulls · Updated 14 months ago
Digest ad95f022607a · 7.3GB
model · arch llama · parameters 12.9B · quantization Q4_K_M · 7.3GB
template · 42B
[INST] {{ .System }} {{ .Prompt }} [/INST]
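For illustration, the template substitutes the system message for {{ .System }} and the user message for {{ .Prompt }}. With a hypothetical system message "You are a helpful assistant." and prompt "What is the capital of France?" (placeholder values, not part of the model), the rendered prompt would be:

[INST] You are a helpful assistant. What is the capital of France? [/INST]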
params · 46B
{
  "num_ctx": 32000,
  "stop": [
    "[INST]",
    "[/INST]"
  ]
}
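A minimal Modelfile sketch that would reproduce the template and params layers above is shown below. The FROM path (a local Q4_K_M GGUF file) and its filename are assumptions for illustration; the TEMPLATE and PARAMETER values mirror the layers listed on this page.

# Hypothetical Modelfile; the GGUF path below is an assumption.
FROM ./mistral-7b-instruct-v0.2-2x7b-moe.Q4_K_M.gguf

# Prompt template, as shown in the template layer.
TEMPLATE """[INST] {{ .System }} {{ .Prompt }} [/INST]"""

# Runtime defaults, as shown in the params layer.
PARAMETER num_ctx 32000
PARAMETER stop "[INST]"
PARAMETER stop "[/INST]"

A model defined this way could be built with "ollama create mistral-moe -f Modelfile", where "mistral-moe" is a placeholder name.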
Readme: no readme provided.