Stable LM 2 is a state-of-the-art language model available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
Tags: 1.6b · 12b
98.7K Pulls · Updated 6 months ago
7047024d0f2e · 5.4GB

model (5.4GB)
arch stablelm · parameters 12.1B · quantization Q3_K_S

system (29B)
You are a helpful assistant.

params (59B)
{"stop":["<|im_start|>","<|im_end|>"]}

template (182B)
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
…

license (7.4kB)
STABILITY AI NON-COMMERCIAL RESEARCH COMMUNITY LICENSE AGREEMENT
Dated: December 06, 2023
By using …
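The params and template entries above describe a ChatML-style prompt format: system and user turns are wrapped in <|im_start|>/<|im_end|> markers, and those same markers double as the stop sequences. Below is a minimal sketch of the prompt such a template renders, assuming the conventional ChatML layout (the template preview above is truncated, so the closing assistant turn is an assumption):

```python
# Sketch of the ChatML-style prompt implied by the params/template layers above.
# Assumes the truncated template follows the conventional ChatML layout and
# ends by opening an assistant turn.
def render_chatml(prompt: str, system: str = "You are a helpful assistant.") -> str:
    parts = []
    if system:
        parts.append(f"<|im_start|>system\n{system}<|im_end|>\n")
    if prompt:
        parts.append(f"<|im_start|>user\n{prompt}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # assumed final turn (not shown above)
    return "".join(parts)

print(render_chatml("Why is the sky blue?"))
```

Generation stops once the model emits one of the stop tokens listed in the params entry (<|im_start|> or <|im_end|>).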
Readme
Stable LM 2 is a state-of-the-art small language model available in 1.6B and 12B parameter versions, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
The model is trained on a mix of publicly available datasets and synthetic datasets, utilizing Direct Preference Optimization (DPO).
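To try the model locally, `ollama run stablelm2` (or `ollama run stablelm2:12b` for the larger tag) starts an interactive session. The snippet below is a minimal sketch of the same thing done programmatically with the Ollama Python client, assuming the `ollama` package is installed and a local Ollama server has already pulled the model:

```python
# Minimal sketch: chat with Stable LM 2 through a local Ollama server.
# Assumes `pip install ollama` and that the model tag has been pulled,
# e.g. `ollama pull stablelm2` or `ollama pull stablelm2:12b`.
import ollama

response = ollama.chat(
    model="stablelm2",  # use "stablelm2:12b" for the 12B variant
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
)
print(response["message"]["content"])
```

The chat endpoint applies the template and stop parameters shown above, so no manual prompt formatting is needed.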