dolphin-mixtral:8x7b-v2.7-q4_K_M
570.2K Downloads · Updated 4 months ago
Uncensored 8x7b and 8x22b fine-tunes of the Mixtral mixture-of-experts models that excel at coding tasks. Created by Eric Hartford.
9edda92db764 · 28GB
Template (ChatML prompt format):
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
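The template binds a request's system and prompt fields to `.System` and `.Prompt` before the text is sent to the model. A minimal sketch of such a request against a local Ollama server, assuming the default port 11434, an already pulled 8x7b tag, and illustrative prompts:

```python
# Minimal sketch: query dolphin-mixtral through Ollama's REST API.
# Assumes `ollama serve` is listening on the default port 11434 and the
# dolphin-mixtral:8x7b tag has already been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "dolphin-mixtral:8x7b",
        # These fields are substituted for .System and .Prompt in the template above.
        "system": "You are an expert programming assistant.",
        "prompt": "Write a Python function that reverses a linked list.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```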
Readme
The Dolphin model by Eric Hartford, based on Mixtral and trained with additional datasets:
- Synthia, OpenHermes and PureDove
- New Dolphin-Coder
- MagiCoder
Sizes
dolphin-mixtral:8x22b
dolphin-mixtral:8x7b
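Either tag can be pulled and run from the CLI with `ollama run dolphin-mixtral:8x7b` (or `:8x22b`), or selected programmatically. A sketch using the ollama Python client, assuming it is installed (`pip install ollama`) and a local server is running:

```python
# Sketch: pick a size tag and query it through the ollama Python client.
import ollama

MODEL = "dolphin-mixtral:8x7b"  # or "dolphin-mixtral:8x22b" if you have the memory for it

# Download the tag if it is not already present locally.
ollama.pull(MODEL)

# Ask a coding question; the chat API formats messages with the ChatML template shown above.
reply = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Write a Rust function that parses a CSV line."}],
)
print(reply["message"]["content"])
```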