Q3_K_M and Q4_K_M, configured with 10k context
156 Pulls · Updated 11 months ago
ae7070da3937 · 17GB
model
arch llama · parameters 34.4B · quantization Q3_K_M · 17GB
params
{
  "num_ctx": 5125,
  "stop": ["</s>"]
}
43B
template
SYSTEM: {{ .System }}
USER: {{ .Prompt }}
ASSISTANT:
53B
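
The params and template blobs above map directly onto an Ollama Modelfile, so the configuration can be reproduced locally. Below is a minimal sketch; the model name and local GGUF path are assumptions for illustration, not taken from this page, while the PARAMETER and TEMPLATE values mirror the blobs shown above.

# Minimal Modelfile sketch. The GGUF path is a hypothetical placeholder;
# num_ctx, stop, and the template mirror the values listed above.
FROM ./yi-34b-merge.Q3_K_M.gguf
PARAMETER num_ctx 5125
PARAMETER stop "</s>"
TEMPLATE """SYSTEM: {{ .System }}
USER: {{ .Prompt }}
ASSISTANT:"""

With a Modelfile like this saved locally, ollama create yi-34b-merge -f Modelfile builds the model and ollama run yi-34b-merge starts a session (the name yi-34b-merge is likewise hypothetical).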
Readme
A merge of NousResearch/Nous-Capybara-34B, migtissera/Tess-M-v1.3, and bhenrym14/airoboros-3_1-yi-34b-200k, because why not?
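
The readme does not say how the merge was produced. A common tool for merging models like these is mergekit; the config below is a hypothetical sketch of one way such a merge could be expressed, with the merge method and weights being assumptions rather than the author's actual recipe.

# Hypothetical mergekit config. The linear method and the weights are
# assumptions; the readme does not state how the merge was made.
models:
  - model: NousResearch/Nous-Capybara-34B
    parameters:
      weight: 0.33
  - model: migtissera/Tess-M-v1.3
    parameters:
      weight: 0.33
  - model: bhenrym14/airoboros-3_1-yi-34b-200k
    parameters:
      weight: 0.34
merge_method: linear
dtype: float16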