The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.
44.7K Pulls · Updated 4 months ago
2644915ede35 · 638MB
model (638MB): arch llama · parameters 1B · quantization 4-bit
system (31B): You are a helpful AI assistant.
template (70B):
<|system|>
{{ .System }}</s>
<|user|>
{{ .Prompt }}</s>
<|assistant|>
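For illustration, here is a minimal sketch in plain Python of how this template expands one system message and user prompt into the string the model sees; the function name `render_prompt` is hypothetical, not part of Ollama's API.

```python
def render_prompt(system: str, prompt: str) -> str:
    """Expand TinyLlama's chat template for a single user turn.

    Mirrors the Go template above: the system message and the user
    prompt are each wrapped in role tags and terminated with </s>,
    and the string ends with the assistant tag so the model
    continues generating from there.
    """
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{prompt}</s>\n"
        f"<|assistant|>\n"
    )


if __name__ == "__main__":
    print(render_prompt(
        "You are a helpful AI assistant.",
        "Why is the sky blue?",
    ))
```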
params (98B): {"stop":["<|system|>","<|user|>","<|assistant|>","</s>"]}
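These stop sequences are baked into the model, so they apply by default. As a sketch, assuming a local Ollama server on its default port (11434) with this model pulled, a request to the `/api/generate` endpoint could pass the same list explicitly via `options.stop`:

```python
import json
import urllib.request

# Minimal sketch of generating with this model through Ollama's
# local REST API. The stop list mirrors the model's built-in
# params, so passing it is optional; it is shown for illustration.
request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "tinyllama",
        "prompt": "Why is the sky blue?",
        "stream": False,
        "options": {
            "stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"],
        },
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read())["response"])
```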
Readme
TinyLlama is a compact model with only 1.1B parameters. Its small size makes it suitable for a wide range of applications that demand a limited computation and memory footprint.