MoE model combining 4 of the best 7B models
This is a medium-sized MoE model that combines openchat/openchat-3.5-1210, beowolx/CodeNinja-1.0-OpenChat-7B, maywell/PiVoT-0.1-Starling-LM-RP, and WizardLM/WizardMath-7B-V1.1.
I find this model to be very good: in my testing it is lighter weight than Mixtral while giving better results.
This is V2 of the model and features quants built from the most recent upstream updates (TheBloke's quants are outdated).
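If you want to call the model from code rather than the CLI, here is a minimal sketch using the Ollama Python client. The model tag below is a placeholder assumption; replace it with the actual tag shown on this page.

```python
# Minimal sketch of querying the model via the Ollama Python client
# (pip install ollama). "spooknik/model:latest" is a placeholder tag,
# not the real one; substitute the tag from this page.
import ollama

response = ollama.chat(
    model="spooknik/model:latest",
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)

# Print the assistant's reply text.
print(response["message"]["content"])
```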
Feel free to contact me if you have any problems.
Reddit: /u/spooknik | Discord: .spooknik