q3_K_M and q4_K_M configured with 10k context
156 Pulls · Updated 12 months ago
e94962e27fc2 · 21GB
model · 21GB
arch llama · parameters 34.4B · quantization Q4_K_M
params · 43B
{
  "num_ctx": 5125,
  "stop": ["</s>"]
}
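These parameters are baked in via the model's Modelfile. A minimal sketch of the corresponding directives, assuming a hypothetical GGUF filename on the FROM line (the actual source file is not shown on this page):

# Sketch of a Modelfile matching the published params.
# FROM target is hypothetical; substitute the real GGUF or base model.
FROM ./capybara-tess-yi-34b-200k.Q4_K_M.gguf
PARAMETER num_ctx 5125
PARAMETER stop "</s>"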
template · 53B
SYSTEM: {{ .System }}
USER: {{ .Prompt }}
ASSISTANT:
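For illustration, a request with the system message "You are a helpful assistant." and the prompt "Hello" would render through this template as:

SYSTEM: You are a helpful assistant.
USER: Hello
ASSISTANT:

The model then completes the text after "ASSISTANT:", stopping when it emits the "</s>" token configured above.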
Readme
Merge of NousResearch/Nous-Capybara-34B, migtissera/Tess-M-v1.3, and bhenrym14/airoboros-3_1-yi-34b-200k, because why not?
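A minimal usage sketch with the Ollama CLI; the model name below is a placeholder for whatever name this merge is published under, and the tag matches the q4_K_M quantization mentioned above:

# Hypothetical model name; replace with the actual repo/tag.
ollama run someuser/capybara-tess-airoboros-34b:q4_K_M "Write a haiku about model merging."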