
Source: https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b

![FusionWriter-7b.png](FusionWriter-7b.png)

# NeuralStar_FusionWriter_4x7b

NeuralStar_FusionWriter_4x7b is a Mixture of Experts (MoE) model built from the following models using LazyMergekit (a sketch of the kind of merge config involved follows the list):
* mlabonne/AlphaMonarch-7B
* OmnicromsBrain/Eros_Scribe-7b
* SanjiWatsuki/Kunoichi-DPO-v2-7B
* OmnicromsBrain/NeuralStar_Fusion-7B
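
The exact merge configuration is not included in this card, but a LazyMergekit / mergekit-moe merge of four experts like these is typically described by a YAML file along the lines of the sketch below. The `base_model`, `gate_mode`, `dtype`, and every `positive_prompts` entry here are illustrative assumptions, not the settings actually used for this model.

```yaml
# Hypothetical mergekit-moe config for a 4x7b MoE of this shape.
# base_model, gate_mode, dtype, and all positive_prompts are assumptions.
base_model: mlabonne/AlphaMonarch-7B
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: mlabonne/AlphaMonarch-7B
    positive_prompts: ["chat", "reasoning", "instruction following"]
  - source_model: OmnicromsBrain/Eros_Scribe-7b
    positive_prompts: ["creative writing", "prose", "romance"]
  - source_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    positive_prompts: ["roleplay", "dialogue"]
  - source_model: OmnicromsBrain/NeuralStar_Fusion-7B
    positive_prompts: ["storytelling", "worldbuilding"]
```

Running `mergekit-moe config.yaml ./merge` over such a file produces the combined checkpoint; each expert's `positive_prompts` seed the router weights that decide which experts handle a given token.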

## ⚡ Quantized Models

Special thanks to mradermacher for the static and imatrix quantized models.

* **Static GGUF:** https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-GGUF
* **imatrix GGUF:** https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-i1-GGUF
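
The GGUF files can be run locally with llama.cpp or its Python bindings. Below is a minimal sketch assuming `llama-cpp-python` and `huggingface-hub` are installed; the quant filename is an assumption, so check the repo's file list for the actual names.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quantized file from the static GGUF repo.
# The filename below is an assumed example; verify it against the repo.
model_path = hf_hub_download(
    repo_id="mradermacher/NeuralStar_FusionWriter_4x7b-GGUF",
    filename="NeuralStar_FusionWriter_4x7b.Q4_K_M.gguf",  # assumed quant name
)

# Load the model and generate a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Write the opening paragraph of a mystery novel.", max_tokens=256)
print(out["choices"][0]["text"])
```

Q4_K_M is a common balance of size and quality for models of this size; the i1 (imatrix) repo offers the same quant types computed with an importance matrix, which generally improves quality at low bit widths.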