Stable LM 2 is a state-of-the-art 1.6B and 12B parameter language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
1.6b
12b
98.7K Pulls Updated 6 months ago
7e806e51ba84 · 1.1GB
model · arch stablelm · parameters 1.64B · quantization Q4_1 · 1.1GB
system · You are a helpful assistant. · 29B
params · {"stop": ["<|im_start|>", "<|im_end|>"]} · 59B
template · 182B
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
license · 7.4kB
STABILITY AI NON-COMMERCIAL RESEARCH COMMUNITY LICENSE AGREEMENT
Dated: December 06, 2023
By using
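The params and template layers above describe a ChatML-style prompt format: system and user turns are wrapped in `<|im_start|>`/`<|im_end|>` markers, and those same markers serve as stop tokens. Below is a minimal Python sketch of how such a prompt could be assembled. It is not Ollama's template engine (Ollama renders the Go template itself), and the trailing assistant turn is assumed from the usual ChatML layout, since the template display above is truncated.

```python
def render_prompt(system: str | None, prompt: str) -> str:
    """Rough Python equivalent of the (truncated) Go template above."""
    parts = []
    if system:  # mirrors {{ if .System }} ... {{ end }}
        parts.append(f"<|im_start|>system\n{system}<|im_end|>\n")
    if prompt:  # mirrors {{ if .Prompt }} ...
        parts.append(f"<|im_start|>user\n{prompt}<|im_end|>\n")
    # Assumed continuation of the template: the model replies in an
    # assistant turn, and generation halts on the stop tokens listed
    # in the params layer (<|im_start|>, <|im_end|>).
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

print(render_prompt("You are a helpful assistant.", "Why is the sky blue?"))
```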
Readme
Stable LM 2 is a state-of-the-art small language model available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
The model is trained on a mix of publicly available datasets and synthetic datasets, utilizing Direct Preference Optimization (DPO).
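For reference, a minimal way to chat with the model once it has been pulled, using the official `ollama` Python client (this assumes `pip install ollama` and a local Ollama server; the prompt text is only an example):

```python
import ollama

# Chat with the 1.6B tag; use "stablelm2:12b" for the larger variant.
response = ollama.chat(
    model="stablelm2:1.6b",
    messages=[{"role": "user", "content": "Summarize what Stable LM 2 is."}],
)
print(response["message"]["content"])
```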