4d89ee3fd09e · 17GB · llama · 24.2B parameters · Q5_K_M
Readme

Source: https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b

FusionWriter-7b.png

NeuralStar_FusionWriter_4x7b

NeuralStar_FusionWriter_4x7b is a Mixture of Experts (MoE) made with the following models using LazyMergekit:

* mlabonne/AlphaMonarch-7B
* OmnicromsBrain/Eros_Scribe-7b
* SanjiWatsuki/Kunoichi-DPO-v2-7B
* OmnicromsBrain/NeuralStar_Fusion-7B
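
As a rough usage sketch (not from the model card; the prompt and generation settings below are assumptions, and only the repo ID comes from the source link above), the merged model can be loaded through Hugging Face transformers like this:

```python
# Hedged sketch: loading NeuralStar_FusionWriter_4x7b with transformers.
# Repo ID is taken from the source link; prompt and sampling settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OmnicromsBrain/NeuralStar_FusionWriter_4x7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Build a chat prompt with the tokenizer's chat template and generate a completion.
messages = [{"role": "user", "content": "Write the opening paragraph of a noir short story."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```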

⚡ Quantized Models

Special thanks to mradermacher for the static and imatrix quantized models.

* .GGUF: https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-GGUF
* IMatrix: https://huggingface.co/mradermacher/NeuralStar_FusionWriter_4x7b-i1-GGUF
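
For the GGUF quants linked above, a minimal local-inference sketch with llama-cpp-python might look like the following (the local filename and settings are assumptions, not part of the card):

```python
# Hedged sketch: running a downloaded Q5_K_M GGUF quant with llama-cpp-python.
# The model_path filename is assumed; point it at whichever quant file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="NeuralStar_FusionWriter_4x7b.Q5_K_M.gguf",  # assumed local file
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Outline a three-act fantasy novella."}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```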