Quantized versions of a model merge between nous-capybara and tess-yi.
205 Pulls · Updated 11 months ago
12e4ab84e404 · 17GB
model
  arch llama · parameters 34.4B · quantization Q3_K_M · 17GB

params (43B)
  {"num_ctx": 5125, "stop": ["</s>"]}

template (53B)
  SYSTEM: {{ .System }}
  USER: {{ .Prompt }}
  ASSISTANT:
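To illustrate how the template above shapes a request, here is a minimal Python sketch. It mimics the substitution Ollama performs for {{ .System }} and {{ .Prompt }} using plain string formatting; the variable names are illustrative, not Ollama's internals.

```python
# Illustrative rendering of the model's prompt template.
# Ollama itself uses Go text/template; this only mirrors the result.
template = "SYSTEM: {system}\nUSER: {prompt}\nASSISTANT:"

rendered = template.format(
    system="You are a helpful assistant.",  # fills {{ .System }}
    prompt="Hello!",                        # fills {{ .Prompt }}
)
print(rendered)
```

The model generates its reply after the trailing "ASSISTANT:" line, and the stop token `</s>` (from the params block) ends generation.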
Readme
I’m providing q4_0, q3_K_M, and q2_K quantizations of brucethemoose/Capybara-Tess-Yi-34B-200K-DARE-Ties (HF).
This model was created by brucethemoose by merging Nous-Capybara and Tess-Yi with the new, experimental DARE TIES merge technique.
The base model supports up to 200k context, but the quantized models I’ve pushed are configured with 4-5k context sizes.
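If you want a larger context than the pushed default, you can override num_ctx with your own Modelfile. This is a minimal sketch; the FROM tag is a placeholder for whatever name you pulled the model under, and larger contexts cost substantially more memory.

```
# Hypothetical Modelfile raising the context window.
# Replace the FROM tag with the actual model name on your machine.
FROM capybara-tess-yi:q3_K_M
PARAMETER num_ctx 16384
```

Then build it with `ollama create` and run the new name as usual.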