# beyonder
This model is a Mixture of Experts (MoE) made with [mergekit](https://github.com/arcee-ai/mergekit) (mixtral branch). It uses the following base models:
- [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210)
- [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)
- [maywell/PiVoT-0.1-Starling-LM-RP](https://huggingface.co/maywell/PiVoT-0.1-Starling-LM-RP)
- [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1)
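A merge like this is driven by a `mergekit-moe` YAML config that lists the experts and the prompts used to initialize each expert's router gate. The sketch below is illustrative, not the exact config used for this model: the `positive_prompts` values are assumptions chosen to match each expert's specialty (chat, code, roleplay, math).

```yaml
# Hypothetical mergekit-moe config (mixtral branch) for this merge.
# The positive_prompts shown here are illustrative examples, not the
# actual prompts used to build this model.
base_model: openchat/openchat-3.5-1210
experts:
  - source_model: openchat/openchat-3.5-1210
    positive_prompts:
      - "chat"
      - "assistant"
  - source_model: beowolx/CodeNinja-1.0-OpenChat-7B
    positive_prompts:
      - "code"
      - "programming"
  - source_model: maywell/PiVoT-0.1-Starling-LM-RP
    positive_prompts:
      - "storywriting"
      - "roleplay"
  - source_model: WizardLM/WizardMath-7B-V1.1
    positive_prompts:
      - "math"
      - "solve"
```

With a config like this saved as `config.yaml`, the merge would be produced with something like `mergekit-moe config.yaml ./output-model`.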