stablelm2

Stable LM 2 1.6B is a state-of-the-art 1.6 billion parameter small language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

The model is trained on a mix of publicly available and synthetic datasets, utilizing Direct Preference Optimization (DPO).

The zephyr version of this model is an instruction-tuned language model inspired by HuggingFaceH4’s Zephyr 7B training pipeline.
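
As a minimal usage sketch, the model can be queried through Ollama's local REST API once it has been pulled. This assumes an Ollama server running on the default port 11434; the `:zephyr` tag used for the instruction-tuned variant is an assumption here, so check the model's tags page for the exact tag names.

```python
import requests

# Minimal sketch: request a completion from a locally running Ollama
# server (default port 11434). Pull the model first, e.g.:
#   ollama pull stablelm2
# The ":zephyr" tag for the instruction-tuned variant is an assumption;
# verify available tags with `ollama list` or on the tags page.

def generate(prompt: str, model: str = "stablelm2") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # With stream=False, Ollama returns one JSON object whose
    # "response" field holds the full generated text.
    return resp.json()["response"]

if __name__ == "__main__":
    # Exercises the model's multilingual training data.
    print(generate("Translate to German: The weather is nice today."))
```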

References

Announcement

HuggingFace