The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.
1.1b · 378.4K Pulls · Updated 10 months ago
1bbb25f9a630 · 551MB
model (551MB)
arch llama · parameters 1.1B · quantization Q3_K_M
system (31B)
You are a helpful AI assistant.
template (70B)
<|system|>
{{ .System }}</s>
<|user|>
{{ .Prompt }}</s>
<|assistant|>
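As a sketch of how this template expands, the following Python mirrors the substitution of `.System` and `.Prompt`. It is an illustrative re-implementation for clarity, not Ollama's own Go-template renderer, which runs server-side.

```python
# Illustrative re-implementation of the chat template above; Ollama renders
# the real template server-side, so this only sketches the expansion.
def render_tinyllama_prompt(system: str, prompt: str) -> str:
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{prompt}</s>\n"
        f"<|assistant|>\n"
    )

print(render_tinyllama_prompt("You are a helpful AI assistant.",
                              "Why is the sky blue?"))
```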
params (98B)
{"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]}
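The stop strings above are baked-in defaults. Assuming a standard local Ollama install listening on its default port, they can be overridden per request through the `options.stop` field of the REST API, as in this sketch:

```python
import json
import urllib.request

# Sketch: override the default stop tokens for a single request via
# Ollama's local REST API (assumes a default install on localhost:11434).
body = json.dumps({
    "model": "tinyllama",
    "prompt": "List three uses of a compact LLM.",
    "stream": False,
    "options": {"stop": ["<|user|>", "</s>"]},  # subset of the defaults above
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=body,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```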
Readme
TinyLlama is a compact model with only 1.1B parameters. Its small size makes it suitable for the many applications that demand a limited computation and memory footprint.
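As a minimal usage sketch, assuming the model has been pulled with `ollama pull tinyllama` and the server is running locally, the chat endpoint applies the template shown above to a list of messages:

```python
import json
import urllib.request

# Minimal sketch: chat with tinyllama through Ollama's /api/chat endpoint.
# The server applies the chat template shown above to these messages.
body = json.dumps({
    "model": "tinyllama",
    "messages": [
        {"role": "system", "content": "You are a helpful AI assistant."},
        {"role": "user", "content": "Summarize the TinyLlama project in one sentence."},
    ],
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=body,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["message"]["content"])
```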