2,252 Downloads Updated 1 year ago
6 models:
beyonder-4x7b-v2:latest · 15GB · 8K context window · Text · 1 year ago
beyonder-4x7b-v2:q8 · 26GB · 8K context window · Text · 1 year ago
beyonder-4x7b-v2:q4_k_s · 14GB · 8K context window · Text · 1 year ago
beyonder-4x7b-v2:q4_k_m · 15GB · 8K context window · Text · 1 year ago
beyonder-4x7b-v2:q5_k_s · 17GB · 8K context window · Text · 1 year ago
beyonder-4x7b-v2:q6_k · 20GB · 8K context window · Text · 1 year ago
This is a medium-sized MoE model that combines openchat/openchat-3.5-1210, beowolx/CodeNinja-1.0-OpenChat-7B, maywell/PiVoT-0.1-Starling-LM-RP, and WizardLM/WizardMath-7B-V1.1.
I find this model to be very good: lighter weight than Mixtral and, in my experience, better.
This is V2 of the model and features quants of the most recent updates (TheBloke's quants are outdated).
Feel free to contact me if you have problems.
Reddit: /u/spooknik | Discord: .spooknik