stablelm2:12b-q6_K

112.9K Pulls · Updated 1 year ago

Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

Tags: 1.6b · 12b

b289916cd903 · 10.0GB · Updated 1 year ago

Architecture: stablelm · Parameters: 12.1B · Quantization: Q6_K

License: Stability AI Non-Commercial Research Community License Agreement (dated December 06, 2023)

Readme

Stable LM 2 is a state-of-the-art small language model available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

The model is trained on a mix of publicly available datasets and synthetic datasets, utilizing Direct Preference Optimization (DPO).
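To try this tag locally, a minimal sketch using the Ollama Python client is shown below. It assumes the `ollama` Python package is installed (`pip install ollama`) and that a local Ollama server is running with the model already pulled (e.g. via `ollama pull stablelm2:12b-q6_K`); the prompt text is purely illustrative.

```python
# Minimal sketch: querying stablelm2:12b-q6_K through the Ollama Python client.
# Assumes a local Ollama server is running and the model has been pulled.
import ollama

response = ollama.chat(
    model="stablelm2:12b-q6_K",
    messages=[
        {"role": "user", "content": "Briefly explain Direct Preference Optimization."},
    ],
)

# Print the assistant's reply text from the chat response.
print(response["message"]["content"])
```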

References

Announcement

HuggingFace