Quantized versions of a model merge between nous-capybara and tess-yi.

a5bc1bf581f7 · 15GB

llama · 34.4B · Q2_K
{ "num_ctx": 5125, "stop": [ "</s>" ] }
SYSTEM: {{ .System }} USER: {{ .Prompt }} ASSISTANT:

Readme

I’m providing q4_0, q3_K_M, and q2_K quantizations of brucethemoose/Capybara-Tess-Yi-34B-200K-DARE-Ties (HF).

This model was created by brucethemoose by merging Nous-Capybara and Tess-Yi with the new, experimental DARE-Ties merge technique.

The base model supports up to 200k tokens of context, but the quantizations I've pushed are configured with 4-5k context windows.
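
If you want a larger context window than the pushed defaults, you can rebuild the model locally with a custom Modelfile. A minimal sketch, assuming a hypothetical tag `capybara-tess-yi:q4_0` (substitute whichever quant you actually pulled) and an example `num_ctx` value; the template and stop token mirror the ones shown above:

```
# Hypothetical Modelfile — the FROM tag and num_ctx value are examples,
# not the published defaults
FROM capybara-tess-yi:q4_0

# Raise the context window; the base merge was trained for up to 200k tokens,
# but larger values cost proportionally more RAM/VRAM
PARAMETER num_ctx 16384
PARAMETER stop "</s>"

# Same prompt format as the pushed models
TEMPLATE """SYSTEM: {{ .System }} USER: {{ .Prompt }} ASSISTANT:"""
```

You would then build and run it with `ollama create my-capybara-16k -f Modelfile` followed by `ollama run my-capybara-16k`.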