Source: https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b
NeuralStar_FusionWriter_4x7b
NeuralStar_FusionWriter_4x7b is a Mixture of Experts (MoE) made with the following models using LazyMergekit:
* mlabonne/AlphaMonarch-7B
* OmnicromsBrain/Eros_Scribe-7b
* SanjiWatsuki/Kunoichi-DPO-v2-7B
* OmnicromsBrain/NeuralStar_Fusion-7B
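The merged model can be loaded like any Hugging Face causal language model. The snippet below is a minimal sketch using the transformers text-generation pipeline; the prompt and sampling parameters are illustrative placeholders, not settings recommended by the model author.

```python
# Minimal sketch: load the merged MoE with transformers (sampling values are illustrative).
import torch
from transformers import AutoTokenizer, pipeline

model_id = "OmnicromsBrain/NeuralStar_FusionWriter_4x7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Build a chat-style prompt with the model's chat template.
messages = [{"role": "user", "content": "Write the opening paragraph of a noir short story."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(outputs[0]["generated_text"])
```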
⚡ Quantized Models
Special thanks to mradermacher for the static and imatrix quantized models.
* .GGUF (static quants): https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-GGUF
* iMatrix quants: https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-i1-GGUF
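A quantized GGUF file can be run locally with llama.cpp-compatible tooling. The sketch below assumes llama-cpp-python and a file already downloaded from one of the repositories above; the filename is a placeholder, so check the repo for the exact quant you want.

```python
# Minimal sketch: run a downloaded GGUF quant with llama-cpp-python.
# The filename below is a placeholder; pick an actual file from the GGUF repo.
from llama_cpp import Llama

llm = Llama(
    model_path="NeuralStar_FusionWriter_4x7b.Q4_K_M.gguf",  # assumed filename
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Outline a three-act fantasy novella."}]
)
print(result["choices"][0]["message"]["content"])
```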