🪐 A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset.
135m · 360m · 1.7b
157.6K Pulls · Updated 4 months ago
4f0e5ec42d5b · 1.2GB
model · arch llama · parameters 1.71B · quantization Q5_K_S · 1.2GB
license · Apache License, Version 2.0, January 2004 · 11kB
Readme
SmolLM is a series of small language models available in three sizes: 135M, 360M, and 1.7B parameters.
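As a quick way to try one of the models locally, the sketch below sends a prompt to a running Ollama server over its REST API. It assumes the model has already been pulled (e.g. the 1.7b tag) and that the server is listening on the default port 11434; the prompt text is only illustrative.

```python
# Minimal sketch: query a locally running Ollama server for a SmolLM model.
# Assumes the smollm:1.7b tag has been pulled and the server is listening on
# the default http://localhost:11434; the prompt is illustrative.
import json
import urllib.request

payload = {
    "model": "smollm:1.7b",  # any of the tags works: 135m, 360m, 1.7b
    "prompt": "Explain what a small language model is in one sentence.",
    "stream": False,         # return the full completion as a single JSON object
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])      # the generated text
```

Swapping the tag in the `model` field (for example `smollm:135m` or `smollm:360m`) selects one of the smaller variants without any other changes.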