Quantized versions of a model merge between nous-capybara and tess-yi.

1 year ago

12e4ab84e404 · 17GB

llama · 34.4B · Q3_K_M

Template: SYSTEM: {{ .System }} USER: {{ .Prompt }} ASSISTANT:

Parameters: { "num_ctx": 5125, "stop": [ "</s>" ] }

Readme

I’m providing q4_0, q3_K_M, and q2_K quantizations of brucethemoose/Capybara-Tess-Yi-34B-200K-DARE-Ties (HF).

This model was created by brucethemoose by merging Nous-Capybara and Tess-Yi using DARE-Ties, a new, experimental merge technique.

The base model supports up to 200K context, but the quantizations I’ve pushed are configured with a ~5K context window (num_ctx 5125).
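
If you want to take advantage of more of the base model’s 200K context, Ollama lets you override `num_ctx` with a custom Modelfile. A minimal sketch, assuming the model is available locally under a placeholder tag (`capybara-tess-yi:q3_K_M` is hypothetical, not the actual registry name), and noting that memory use grows with the context size:

```
# Hypothetical Modelfile: derive a variant with a larger context window.
# The FROM tag below is a placeholder; substitute the real local tag.
FROM capybara-tess-yi:q3_K_M

# The 200K base can go much higher, but RAM/VRAM use grows with num_ctx.
PARAMETER num_ctx 32768
PARAMETER stop "</s>"

TEMPLATE "SYSTEM: {{ .System }} USER: {{ .Prompt }} ASSISTANT:"
```

Then build and run the variant with `ollama create capybara-tess-yi-32k -f Modelfile` followed by `ollama run capybara-tess-yi-32k`. Alternatively, inside an `ollama run` session you can adjust the window interactively with `/set parameter num_ctx 32768`.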