deepseek-coder-v2:236b-base-q3_K_M

1.1M pulls · Updated 12 months ago

An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks.

Available sizes: 16b, 236b

e8d5da110401 · 113GB

Architecture deepseek2 · Parameters 236B · Quantization Q3_K_M
Template (see the fill-in-the-middle request sketch below)
{{- if .Suffix }}<|fim_begin|>{{ .Prompt }}<|fim_hole|>{{ .Suffix }}<|fim_end|> {{- else }}{{ .Prompt }} {{- end }}
License
DEEPSEEK LICENSE AGREEMENT Version 1.0, 23 October 2023 Copyright (c) 2023 DeepSeek Section I: PREAMBLE …
MIT License Copyright (c) 2023 DeepSeek Permission is hereby granted, free of charge, to any person …
{ "stop": [ "System:", "User:", "Assistant:", "<|begin_of_text|>

Readme

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks. DeepSeek-Coder-V2 is further pre-trained from DeepSeek-Coder-V2-Base with 6 trillion tokens sourced from a high-quality and multi-source corpus.
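
Because this is a base (non-chat) tag, it is typically prompted with a raw code prefix rather than a conversation. The following is a hedged sketch of a streaming completion request, again assuming a local Ollama server on the default port; the prompt text and option values are illustrative only.

# Sketch: streaming completion from the base model via a local Ollama server.
# Each line of the HTTP response is a JSON chunk whose "response" field holds
# the next generated tokens; "done" marks the end of the stream.
import json
import urllib.request

payload = {
    "model": "deepseek-coder-v2:236b-base-q3_K_M",
    "prompt": "// C function that reverses a singly linked list\n",
    "stream": True,
    "options": {"num_predict": 128},
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for line in resp:                      # one JSON object per streamed chunk
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            break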

References

Hugging Face