The TinyLlama project is an open endeavor to train a compact, 1.1B-parameter Llama model on 3 trillion tokens.
1B parameters · 52.2K Pulls · Updated 4 months ago
Tag                       Size
chat                      638MB
v0.6                      638MB
v1                        638MB
1.1b-chat                 638MB
1.1b-chat-v0.6-q4_0       638MB
1.1b-chat-v1-q4_0         638MB
1.1b-chat-v1-q4_1         702MB
1.1b-chat-v0.6-q4_1       702MB
1.1b-chat-v1-q5_0         767MB
1.1b-chat-v0.6-q5_0       767MB
1.1b-chat-v1-q5_1         832MB
1.1b-chat-v0.6-q5_1       832MB
1.1b-chat-v0.6-q8_0       1.2GB
1.1b-chat-v1-q8_0         1.2GB
1.1b-chat-v1-q2_K         483MB
1.1b-chat-v0.6-q2_K       483MB
1.1b-chat-v0.6-q3_K_S     500MB
1.1b-chat-v1-q3_K_S       500MB
1.1b-chat-v1-q3_K_M       551MB
1.1b-chat-v0.6-q3_K_M     551MB
1.1b-chat-v1-q3_K_L       593MB
1.1b-chat-v0.6-q3_K_L     593MB
1.1b-chat-v0.6-q4_K_S     644MB
1.1b-chat-v1-q4_K_S       644MB
1.1b-chat-v1-q4_K_M       669MB
1.1b-chat-v0.6-q4_K_M     669MB
1.1b-chat-v0.6-q5_K_S     767MB
1.1b-chat-v1-q5_K_S       767MB
1.1b-chat-v1-q5_K_M       783MB
1.1b-chat-v0.6-q5_K_M     783MB
1.1b-chat-v1-q6_K         904MB
1.1b-chat-v0.6-q6_K       904MB
1.1b-chat-v1-fp16         2.2GB
1.1b-chat-v0.6-fp16       2.2GB
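The full tag names above follow a consistent pattern: parameter count, variant, model version, then quantization level (heavier quantization means a smaller download, with fp16 being the unquantized weights). As an illustrative sketch (the parser and its field names are not from this listing, just a reading of the naming convention), the tags can be decomposed like this:

```python
import re

# Illustrative: split a full tag such as "1.1b-chat-v1-q4_K_M" into
# its parts: <params>-<variant>-<version>-<quant>.
TAG_RE = re.compile(
    r"^(?P<params>[\d.]+b)-(?P<variant>chat)-(?P<version>v[\d.]+)-(?P<quant>q\d\S*|fp16)$"
)

def parse_tag(tag: str) -> dict:
    """Return the components of a full tag, e.g. params='1.1b', quant='q4_K_M'."""
    m = TAG_RE.match(tag)
    if not m:
        raise ValueError(f"unrecognized tag: {tag}")
    return m.groupdict()

print(parse_tag("1.1b-chat-v1-q4_K_M"))
# {'params': '1.1b', 'variant': 'chat', 'version': 'v1', 'quant': 'q4_K_M'}
```

Short tags like `chat`, `v1`, or `1.1b-chat` are aliases that omit some of these components rather than following the full pattern.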