stablelm2:12b-chat

112.6K Pulls · Updated 1 year ago

Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

Tags: 1.6b · 12b

34b434945650 · 7.0GB · Updated 1 year ago

model: architecture stablelm · parameters 12.1B · quantization Q4_0
system: You are a helpful assistant.
params: { "stop": [ "<|im_start|>", "<|im_end|>" ] }
template: {{ if .System }}<|im_start|>system {{ .System }}<|im_end|> {{ end }}{{ if .Prompt }}<|im_start|>user …
license: STABILITY AI NON-COMMERCIAL RESEARCH COMMUNITY LICENSE AGREEMENT Dated: December 06, 2023 By using …
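This tag can be run interactively with ollama run stablelm2:12b-chat, or called programmatically once a local Ollama server is up and the tag has been pulled. Below is a minimal sketch using the official ollama Python client; the prompt text is made up for illustration, and the stop sequences simply echo the defaults listed in params above, so passing them explicitly is optional.

    import ollama  # official Ollama Python client: pip install ollama

    # Assumes a local Ollama server with the stablelm2:12b-chat tag already pulled.
    response = ollama.chat(
        model="stablelm2:12b-chat",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize what Stable LM 2 is in one sentence."},
        ],
        # Mirrors the default stop sequences shown in params above.
        options={"stop": ["<|im_start|>", "<|im_end|>"]},
    )
    print(response["message"]["content"])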

Readme

Stable LM 2 is a family of state-of-the-art small language models, available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

The model is trained on a mix of publicly available datasets and synthetic datasets, utilizing Direct Preference Optimization (DPO).
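Because the chat tuning is multilingual, prompts in any of the listed languages can be sent the same way. A small streaming sketch follows, under the same local-server and Python-client assumptions as above; the German prompt is just an illustrative example.

    import ollama  # same client and local-server assumptions as the sketch above

    # Stream a reply to a German prompt
    # ("Explain in two sentences what a language model is.")
    stream = ollama.chat(
        model="stablelm2:12b-chat",
        messages=[{"role": "user", "content": "Erkläre in zwei Sätzen, was ein Sprachmodell ist."}],
        stream=True,
    )
    for chunk in stream:
        print(chunk["message"]["content"], end="", flush=True)
    print()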

References

Announcement

HuggingFace