stablelm2:12b-chat-q5_K_M

112.9K pulls · Updated 1 year ago

Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

Sizes: 1.6b · 12b


f44dc73dc6cd · 8.6GB
stablelm · 12.1B · Q5_K_M
License: STABILITY AI NON-COMMERCIAL RESEARCH COMMUNITY LICENSE AGREEMENT Dated: December 06, 2023 By using …
System prompt: You are a helpful assistant.
Parameters: { "stop": [ "<|im_start|>", "<|im_end|>" ] }
Template: {{ if .System }}<|im_start|>system {{ .System }}<|im_end|> {{ end }}{{ if .Prompt }}<|im_start|>user …
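
For reference, a minimal usage sketch with the Ollama Python client is shown below. It assumes `pip install ollama`, a running local Ollama server, and that this model has already been pulled; the ChatML template and `<|im_start|>`/`<|im_end|>` stop tokens shown above are applied server-side, so messages only need plain role/content fields.

```python
# Minimal sketch (assumption: `pip install ollama` and a local Ollama server
# with the model pulled via `ollama pull stablelm2:12b-chat-q5_K_M`).
import ollama

response = ollama.chat(
    model="stablelm2:12b-chat-q5_K_M",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Give a one-paragraph summary of Stable LM 2."},
    ],
)

# The ChatML template and stop tokens from the Modelfile are applied by the
# server, so no manual prompt formatting is needed here.
print(response["message"]["content"])
```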

Readme

Stable LM 2 is a state-of-the-art small language model available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

The model is trained on a mix of publicly available and synthetic datasets, utilizing Direct Preference Optimization (DPO).

References

Announcement

HuggingFace