🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset.
Tags: 135m · 360m · 1.7b
81.1K Pulls · Updated 2 months ago
f5ef53463545 · 386MB
model (386MB): arch llama · parameters 362M · quantization Q8_0
params (89B): {"stop":["<|im_start|>","<|im_end|>"],"temperature":0.2,"top_p":0.9}
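These defaults come from the model's parameter layer: generation stops at the ChatML markers, with a conservative temperature of 0.2 and top_p of 0.9. They can be overridden per request. Below is a minimal sketch against Ollama's local REST API, assuming a server on the default port 11434 and that the model has been pulled as smollm; the requests library is used purely for illustration.

```python
# Sketch: override the default sampling parameters shown above
# (stop tokens, temperature, top_p) for a single request via the
# "options" field of the Ollama REST API.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "smollm",           # assumed model name; adjust to your tag
        "prompt": "Write a haiku about small language models.",
        "stream": False,             # return one JSON object instead of a stream
        "options": {
            "temperature": 0.8,      # more creative than the 0.2 default
            "top_p": 0.95,
            "stop": ["<|im_end|>"],  # keep the ChatML end marker as a stop
        },
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```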
template (182B):
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
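The template wraps conversations in ChatML-style <|im_start|>/<|im_end|> markers, matching the stop sequences above, and Ollama renders it server-side. A hedged sketch of a chat request under the same assumptions as above (local server on port 11434, model pulled as smollm):

```python
# Because the template is applied by the Ollama server itself, a chat request
# only needs role-tagged messages; the server renders them into the
# ChatML-style prompt defined by the template above.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "smollm",   # assumed model name
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": "Summarize what SmolLM is."},
        ],
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```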
license (11kB): Apache License, Version 2.0 (January 2004)
Readme
SmolLM is a series of small language models available in three sizes: 135M, 360M, and 1.7B parameters.
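Under the same assumptions as the sketches above (a local Ollama server on the default port 11434, with the tags named as listed), choosing a size is just a matter of the tag passed in the model field:

```python
# Sketch: run a specific SmolLM size by selecting its tag.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "smollm:360m",  # or "smollm:135m" / "smollm:1.7b"
        "prompt": "Explain what a small language model is in one sentence.",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```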