DeepSeek Coder is a capable coding model trained on two trillion tokens of code and natural language.
Parameter sizes: 1.3B, 6.7B, and 33B.
141.9K Pulls · Updated 4 months ago
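Tags below follow the pattern `deepseek-coder:<size>-<base|instruct>[-<quantization>]`. A minimal sketch of composing a tag and pulling it with a local Ollama install (the `ollama pull` / `ollama run` commands are commented out so the snippet runs without a server):

```shell
# Compose a tag: 6.7B parameters, instruction-tuned, q4_K_M quantization
model="deepseek-coder:6.7b-instruct-q4_K_M"
echo "Selected tag: $model"

# With Ollama installed, download and chat with the model:
# ollama pull "$model"
# ollama run "$model" "write a quicksort in python"
```

Omitting the quantization suffix (e.g. `deepseek-coder:6.7b-instruct`) pulls the default q4_0 build of that size.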
Tag                       Size
base                      776MB
instruct                  776MB
33b-base                  19GB
33b-instruct              19GB
33b-base-q4_0             19GB
33b-base-q4_1             21GB
33b-base-q5_0             23GB
33b-base-q5_1             25GB
33b-base-q8_0             35GB
33b-base-q2_K             14GB
33b-base-q3_K_S           14GB
33b-base-q3_K_M           16GB
33b-base-q3_K_L           18GB
33b-base-q4_K_S           19GB
33b-base-q4_K_M           20GB
33b-base-q5_K_S           23GB
33b-base-q5_K_M           24GB
33b-base-q6_K             27GB
33b-base-fp16             67GB
33b-instruct-q4_0         19GB
33b-instruct-q4_1         21GB
33b-instruct-q5_0         23GB
33b-instruct-q5_1         25GB
33b-instruct-q8_0         35GB
33b-instruct-q2_K         14GB
33b-instruct-q3_K_S       14GB
33b-instruct-q3_K_M       16GB
33b-instruct-q3_K_L       18GB
33b-instruct-q4_K_S       19GB
33b-instruct-q4_K_M       20GB
33b-instruct-q5_K_S       23GB
33b-instruct-q5_K_M       24GB
33b-instruct-q6_K         27GB
33b-instruct-fp16         67GB
6.7b-base                 3.8GB
6.7b-instruct             3.8GB
6.7b-base-q4_0            3.8GB
6.7b-base-q4_1            4.2GB
6.7b-base-q5_0            4.7GB
6.7b-base-q5_1            5.1GB
6.7b-base-q8_0            7.2GB
6.7b-base-q2_K            2.8GB
6.7b-base-q3_K_S          3.0GB
6.7b-base-q3_K_M          3.3GB
6.7b-base-q3_K_L          3.6GB
6.7b-base-q4_K_S          3.9GB
6.7b-base-q4_K_M          4.1GB
6.7b-base-q5_K_S          4.7GB
6.7b-base-q5_K_M          4.8GB
6.7b-base-q6_K            5.5GB
6.7b-base-fp16            13GB
6.7b-instruct-q4_0        3.8GB
6.7b-instruct-q4_1        4.2GB
6.7b-instruct-q5_0        4.7GB
6.7b-instruct-q5_1        5.1GB
6.7b-instruct-q8_0        7.2GB
6.7b-instruct-q2_K        2.8GB
6.7b-instruct-q3_K_S      3.0GB
6.7b-instruct-q3_K_M      3.3GB
6.7b-instruct-q3_K_L      3.6GB
6.7b-instruct-q4_K_S      3.9GB
6.7b-instruct-q4_K_M      4.1GB
6.7b-instruct-q5_K_S      4.7GB
6.7b-instruct-q5_K_M      4.8GB
6.7b-instruct-q6_K        5.5GB
6.7b-instruct-fp16        13GB
1.3b-base                 776MB
1.3b-instruct             776MB
1.3b-base-q4_0            776MB
1.3b-base-q4_1            856MB
1.3b-base-q5_0            936MB
1.3b-base-q5_1            1.0GB
1.3b-base-q8_0            1.4GB
1.3b-base-q2_K            632MB
1.3b-base-q3_K_S          659MB
1.3b-base-q3_K_M          705MB
1.3b-base-q3_K_L          745MB
1.3b-base-q4_K_S          815MB
1.3b-base-q4_K_M          874MB
1.3b-base-q5_K_S          953MB
1.3b-base-q5_K_M          1.0GB
1.3b-base-q6_K            1.2GB
1.3b-base-fp16            2.7GB
1.3b-instruct-q4_0        776MB
1.3b-instruct-q4_1        856MB
1.3b-instruct-q5_0        936MB
1.3b-instruct-q5_1        1.0GB
1.3b-instruct-q8_0        1.4GB
1.3b-instruct-q2_K        632MB
1.3b-instruct-q3_K_S      659MB
1.3b-instruct-q3_K_M      705MB
1.3b-instruct-q3_K_L      745MB
1.3b-instruct-q4_K_S      815MB
1.3b-instruct-q4_K_M      874MB
1.3b-instruct-q5_K_S      953MB
1.3b-instruct-q5_K_M      1.0GB
1.3b-instruct-q6_K        1.2GB
1.3b-instruct-fp16        2.7GB
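A practical rule of thumb is to pick the largest quantization whose file fits your available memory (runtime memory use is somewhat higher than the download size). A small sketch using the 6.7b-instruct sizes from the tag list above; the `largest_fit` helper is illustrative, not part of any Ollama API:

```python
# Approximate download sizes in GB for 6.7b-instruct quantizations,
# taken from the tag list above.
sizes_gb = {
    "q2_K": 2.8, "q3_K_S": 3.0, "q3_K_M": 3.3, "q3_K_L": 3.6,
    "q4_0": 3.8, "q4_K_S": 3.9, "q4_K_M": 4.1, "q4_1": 4.2,
    "q5_0": 4.7, "q5_K_S": 4.7, "q5_K_M": 4.8, "q5_1": 5.1,
    "q6_K": 5.5, "q8_0": 7.2, "fp16": 13.0,
}

def largest_fit(budget_gb):
    """Return the largest 6.7b-instruct quantization whose file fits the budget."""
    fitting = {q: s for q, s in sizes_gb.items() if s <= budget_gb}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)

print(largest_fit(8))  # q8_0  (7.2GB fits in an 8GB budget)
print(largest_fit(4))  # q4_K_S (3.9GB is the largest under 4GB)
```

For example, with 8GB to spare you could `ollama pull deepseek-coder:6.7b-instruct-q8_0`; lower-bit quantizations trade some output quality for a smaller footprint.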