Nous-Capybara-34B V1.9
This model is a fine-tune of Yi-34B (200K context length), trained for 3 epochs on the Capybara dataset.
I’m providing q4_K_M and q3_K_M quantizations. The models are set up for a 10k context window. You can try larger contexts by running `/set parameter num_ctx XXXXXXX` at the `>>>` prompt in the CLI, or by creating a custom model.
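As a minimal sketch of the custom-model route: you can put a larger `num_ctx` in a Modelfile and build a new model from it. The base tag `nous-capybara:34b` and the 32768 context value below are assumptions — substitute the tag of the quantization you actually pulled and a context size your hardware can hold.

```
# Modelfile — minimal sketch; the base tag is an assumption,
# use the tag of the quantization you pulled
FROM nous-capybara:34b

# Raise the context window from the default 10k
# (example value; adjust to fit your VRAM/RAM)
PARAMETER num_ctx 32768
```

Then build and run the custom model:

```
ollama create capybara-32k -f Modelfile
ollama run capybara-32k
```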