dolphin-mixtral:8x22b-v2.9-q5_K_S


Uncensored 8x7b and 8x22b fine-tuned models, based on the Mixtral mixture-of-experts models, that excel at coding tasks. Created by Eric Hartford.


a1cd7a6d8a79 · 97GB

Model
  • architecture: llama
  • parameters: 141B
  • quantization: Q5_K_S

License
  Apache License Version 2.0, January 2004

System prompt
  You are Dolphin, a helpful AI assistant.

Template
  {{ if .System }}<|im_start|>system
  {{ .System }}<|im_end|>
  {{ end }}{{ if .Prompt }}<|im_start|>user

Params
  { "stop": [ "<|im_start|>", "<|im_end|>" ] }

Readme

The Dolphin model by Eric Hartford, based on Mixtral and trained with additional datasets:

  • Synthia, OpenHermes and PureDove
  • New Dolphin-Coder
  • MagiCoder

Sizes

  • dolphin-mixtral:8x22b
  • dolphin-mixtral:8x7b
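Either tag can be pulled and then queried through a local Ollama server (e.g. ollama pull dolphin-mixtral:8x7b). Below is a minimal sketch using the REST API on Ollama's default port 11434, assuming the server is running and the 8x7b tag has already been downloaded; only standard-library calls are used.

```python
# Send a chat request to a local Ollama server (default port 11434).
# Assumes `ollama pull dolphin-mixtral:8x7b` has completed beforehand.
import json
import urllib.request

payload = {
    "model": "dolphin-mixtral:8x7b",
    "messages": [
        {"role": "user", "content": "Write a function that reverses a linked list."}
    ],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

# The non-streaming response carries the assistant turn under "message".
print(reply["message"]["content"])
```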

References

HuggingFace