bbc433e11a7b · 783MB · llama · 1.1B · Q5_K_M
System
You are a helpful AI assistant.

Params
{ "stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"] }

Template
<|system|> {{ .System }}</s> <|user|> {{ .Prompt }}</s> <|assistant|>
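Taken together, the system prompt, stop parameters, and template above correspond to an Ollama Modelfile. The following is a minimal sketch, not the published Modelfile for this image; the FROM line points at a hypothetical local GGUF weights file:

    # Minimal Modelfile sketch for a TinyLlama chat model.
    # The weights path below is hypothetical; substitute your own GGUF file.
    FROM ./tinyllama-1.1b-chat-q5_k_m.gguf
    SYSTEM You are a helpful AI assistant.
    TEMPLATE """<|system|> {{ .System }}</s> <|user|> {{ .Prompt }}</s> <|assistant|>"""
    PARAMETER stop "<|system|>"
    PARAMETER stop "<|user|>"
    PARAMETER stop "<|assistant|>"
    PARAMETER stop "</s>"

A model built from such a file (e.g. with `ollama create my-tinyllama -f Modelfile`) stops generating whenever one of the listed stop strings appears in the output.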

Readme

The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. Training started on 2023-09-01.

TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, so it can be used as a drop-in replacement in many open-source projects built on Llama. With only 1.1B parameters, it is also compact enough for applications with tight computation and memory budgets; a usage sketch follows below.
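As an illustration of such a drop-in deployment, here is a minimal sketch that queries the model through Ollama's local REST API. The model name `tinyllama` and the default endpoint `http://localhost:11434` are assumptions based on a standard Ollama install with the model already pulled:

    # Minimal sketch: query TinyLlama through a locally running Ollama server.
    # Assumes `ollama pull tinyllama` has been run and the default port 11434.
    import json
    import urllib.request

    payload = {
        "model": "tinyllama",  # model name as pulled from the registry
        "prompt": "Explain what a tokenizer does in one sentence.",
        "stream": False,       # return one JSON object instead of a stream
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

Because the template and stop tokens are baked into the model image, the client only supplies a prompt; the server applies the chat formatting shown above.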