tinyllama:1.1b-chat-v1-q3_K_M

1.7M pulls · Updated 16 months ago

The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.


Readme

TinyLlama is a compact model with only 1.1B parameters. Its small size makes it suitable for a wide range of applications that require a restricted compute and memory footprint.

References

Hugging Face

GitHub