Uncensored 8x7b and 8x22b fine-tuned models based on the Mixtral mixture-of-experts models, which excel at coding tasks. Created by Eric Hartford.




Dolphin is a model by Eric Hartford, based on Mixtral, that is trained with additional datasets:

  • Synthia, OpenHermes and PureDove
  • New Dolphin-Coder
  • MagiCoder

Sizes

  • dolphin-mixtral:8x22b
  • dolphin-mixtral:8x7b
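Either size tag above can be pulled and run with the Ollama CLI; a minimal example, assuming Ollama is installed and running locally (the prompt text is illustrative):

```shell
# Download the 8x7b variant (the smaller of the two) if not already present
ollama pull dolphin-mixtral:8x7b

# Start an interactive session, or pass a one-shot prompt directly
ollama run dolphin-mixtral:8x7b "Write a Python function that reverses a linked list."
```

The 8x22b tag works the same way but requires substantially more memory to run.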

References

HuggingFace