deepseek-coder-v2:236b-base-q4_0
928.5K Downloads · Updated 10 months ago
An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT-4 Turbo on code-specific tasks.
Available sizes: 16b · 236b
model · 2dc89d24571b · 133GB
license · MIT License · Copyright (c) 2023 DeepSeek · 1.1kB
license · DEEPSEEK LICENSE AGREEMENT, Version 1.0, 23 October 2023 · Copyright (c) 2023 DeepSeek · 14kB
template · 115B
{{- if .Suffix }}<|fim_begin|>{{ .Prompt }}<|fim_hole|>{{ .Suffix }}<|fim_end|>
{{- else }}{{ .Prompt }}
{{- end }}
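The template wraps a request in fill-in-the-middle sentinels whenever a Suffix is supplied, and otherwise passes the prompt through unchanged. Below is a minimal sketch of driving that path through Ollama's /api/generate endpoint; it assumes a local server on the default port 11434 that honors the suffix field, and the function being completed is purely illustrative.

import json
import urllib.request

# Minimal sketch (not an official client): exercise the FIM template via a
# locally running Ollama server (default port 11434 assumed) with this tag
# already pulled. The prompt is the code before the hole, the suffix the code
# after it; both are illustrative placeholders.
payload = {
    "model": "deepseek-coder-v2:236b-base-q4_0",
    "prompt": "def fibonacci(n):\n    ",
    "suffix": "\n    return result\n",
    "stream": False,
    "options": {"temperature": 0.2, "num_predict": 128},
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

# "response" holds only the text the model proposes for the hole.
print(body["response"])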
Readme
DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT-4 Turbo on code-specific tasks. It is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens drawn from a high-quality, multi-source corpus.
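Since this tag is a base (non-instruct) model, it is best driven with plain completion prompts rather than chat messages. A minimal usage sketch, assuming the tag has been pulled (for example with ollama pull deepseek-coder-v2:236b-base-q4_0) and an Ollama server is listening on the default port 11434; the prompt is purely illustrative:

import json
import urllib.request

# Minimal sketch: plain completion against the base model through a local
# Ollama server (default port 11434 assumed), no chat template involved.
payload = {
    "model": "deepseek-coder-v2:236b-base-q4_0",
    "prompt": "# Python function that returns True if n is prime\ndef is_prime(n):\n",
    "stream": False,
    "options": {"num_predict": 200},
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])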