Meta Llama 3: The most capable openly available LLM to date
1.3M Pulls · Updated 9 days ago
Tag                    Size
instruct               4.7GB
text                   4.7GB
70b-instruct           40GB
70b-text               40GB
70b-instruct-q4_0      40GB
70b-instruct-q4_1      44GB
70b-instruct-q5_0      49GB
70b-instruct-q5_1      53GB
70b-instruct-q8_0      75GB
70b-instruct-q2_K      26GB
70b-instruct-q3_K_S    31GB
70b-instruct-q3_K_M    34GB
70b-instruct-q3_K_L    37GB
70b-instruct-q4_K_S    40GB
70b-instruct-q4_K_M    43GB
70b-instruct-q5_K_S    49GB
70b-instruct-q5_K_M    50GB
70b-instruct-q6_K      58GB
70b-instruct-fp16      141GB
70b-text-q4_0          40GB
70b-text-q4_1          44GB
70b-text-q5_0          49GB
70b-text-q5_1          53GB
70b-text-q8_0          75GB
70b-text-q2_K          26GB
70b-text-q3_K_S        31GB
70b-text-q3_K_M        34GB
70b-text-q3_K_L        37GB
70b-text-q4_K_S        40GB
70b-text-q4_K_M        43GB
70b-text-q5_K_S        49GB
70b-text-q5_K_M        50GB
70b-text-q6_K          58GB
70b-text-fp16          141GB
8b-text                4.7GB
8b-instruct-q4_0       4.7GB
8b-instruct-q4_1       5.1GB
8b-instruct-q5_0       5.6GB
8b-instruct-q5_1       6.1GB
8b-instruct-q8_0       8.5GB
8b-instruct-q2_K       3.2GB
8b-instruct-q3_K_S     3.7GB
8b-instruct-q3_K_M     4.0GB
8b-instruct-q3_K_L     4.3GB
8b-instruct-q4_K_S     4.7GB
8b-instruct-q4_K_M     4.9GB
8b-instruct-q5_K_S     5.6GB
8b-instruct-q5_K_M     5.7GB
8b-instruct-q6_K       6.6GB
8b-instruct-fp16       16GB
8b-text-q4_0           4.7GB
8b-text-q4_1           5.1GB
8b-text-q5_0           5.6GB
8b-text-q5_1           6.1GB
8b-text-q8_0           8.5GB
8b-text-q2_K           3.2GB
8b-text-q3_K_S         3.7GB
8b-text-q3_K_M         4.0GB
8b-text-q3_K_L         4.3GB
8b-text-q4_K_S         4.7GB
8b-text-q4_K_M         4.9GB
8b-text-q5_K_S         5.6GB
8b-text-q6_K           6.6GB
8b-text-fp16           16GB
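Each entry above is an Ollama tag: the prefix picks the parameter count and variant (8b/70b, instruct/text) and the suffix picks the quantization (q2_K through q8_0 are llama.cpp quantization levels; fp16 is unquantized). A minimal usage sketch with the Ollama CLI, assuming these tags are published under the `llama3` model name:

```shell
# Pull the default tag (the instruct build, ~4.7GB):
ollama pull llama3

# Pull a specific quantization of the 70B instruct model (~43GB):
ollama pull llama3:70b-instruct-q4_K_M

# Start an interactive chat session against the pulled tag:
ollama run llama3:70b-instruct-q4_K_M
```

As a rough rule, lower-bit quantizations (q2_K, q3_K_*) shrink the download and memory footprint at some cost in output quality, while q8_0 and fp16 stay closest to the original weights.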