🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset.
135m
360m
1.7b
107.4K Pulls · Updated 3 months ago
95f6557a0f0f · 991MB

model
    arch llama · parameters 1.71B · quantization Q4_0
    991MB

params
    {"stop": ["<|im_start|>", "<|im_end|>"], "temperature": 0.2, "top_p": 0.9}
    89B

template
    {{ if .System }}<|im_start|>system
    {{ .System }}<|im_end|>
    {{ end }}{{ if .Prompt }}<|im_start|>user
    182B

license
    Apache License
    Version 2.0, January 2004
    11kB
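The template layer above is ChatML-style Go template text (shown truncated in the listing). A minimal Python sketch of how that visible portion expands for a system message and a user prompt; the closing of the user turn and the assistant header are assumptions following the standard ChatML pattern, not part of the truncated template shown:

```python
def render_chatml(prompt: str, system: str = "") -> str:
    """Mimic the visible part of the model's ChatML template."""
    parts = []
    if system:  # corresponds to the {{ if .System }} branch
        parts.append(f"<|im_start|>system\n{system}<|im_end|>\n")
    # {{ if .Prompt }} branch; the <|im_end|> and assistant header
    # below are assumed (the listing cuts the template off here).
    parts.append(f"<|im_start|>user\n{prompt}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

text = render_chatml("Hi there", system="You are helpful.")
print(text)
```

The `stop` strings in the params blob (`<|im_start|>`, `<|im_end|>`) match these turn delimiters, so generation halts when the model starts a new turn.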
Readme
SmolLM is a series of small language models available in three sizes: 135M, 360M, and 1.7B parameters.
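A sketch of calling the 1.7B variant through Ollama's REST API. The tag `smollm:1.7b` is an assumption based on the tag list above, and a local Ollama server on the default port is assumed; only the request payload is built here, with the actual POST left commented so the sketch runs standalone. The `options` values mirror the defaults in the model's params blob:

```python
import json

payload = {
    "model": "smollm:1.7b",          # assumed tag, per the tag list above
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {                      # defaults from the params layer
        "temperature": 0.2,
        "top_p": 0.9,
        "stop": ["<|im_start|>", "<|im_end|>"],
    },
}
body = json.dumps(payload)
print(body)

# With a local server running (default http://localhost:11434):
# import urllib.request
# req = urllib.request.Request("http://localhost:11434/api/generate",
#                              data=body.encode(), method="POST")
# print(urllib.request.urlopen(req).read().decode())
```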