818.8K pulls · Updated 8 months ago

An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks.

Sizes: 16b · 236b


Model: 63fb193b3a9b · 8.9GB
Architecture: deepseek2 · Parameters: 15.7B · Quantization: Q4_0
Default parameters: { "stop": [ "User:", "Assistant:" ] }
Template (truncated): {{- if .Suffix }}<|fim▁begin|>{{ .Prompt }}<|fim▁hole|>{{ .Suffix }}<|fim▁end|> {{ …
License (truncated): MIT License Copyright (c) 2023 DeepSeek Permission is hereby granted, free of charge, to any perso…
License (truncated): DEEPSEEK LICENSE AGREEMENT Version 1.0, 23 October 2023 Copyright (c) 2023 DeepSeek Section I: PR…
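The template above wires the model's fill-in-the-middle (FIM) tokens to the prompt and suffix fields of a request, so the model completes the "hole" between code before and after the cursor. Below is a minimal sketch of a FIM call against a local Ollama server, assuming the server runs on its default port and the model was pulled under the `deepseek-coder-v2` tag (the tag name is an assumption, as it is not shown on this page):

```python
# Minimal fill-in-the-middle (FIM) sketch against a local Ollama server.
# Assumptions: Ollama is running on its default port (11434) and the model
# was pulled as "deepseek-coder-v2" (tag name not shown on this page).
import json
import urllib.request

payload = {
    "model": "deepseek-coder-v2",
    # Code before the cursor goes in "prompt", code after it in "suffix";
    # the template maps these onto <|fim▁begin|>/<|fim▁hole|>/<|fim▁end|>.
    "prompt": "def fib(n):\n    ",
    "suffix": "\n    return fib(n - 1) + fib(n - 2)\n",
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # the generated infill for the hole
```

With streaming disabled as above, the whole completion arrives in one JSON object; with streaming enabled, the server returns newline-delimited JSON chunks instead.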

Readme

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT4-Turbo on code-specific tasks. It is further pre-trained from DeepSeek-Coder-V2-Base on 6 trillion tokens drawn from a high-quality, multi-source corpus.
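For interactive use, the model can also be called through Ollama's chat interface. A minimal sketch, assuming a local Ollama server, the 16b size listed above, and the `ollama` Python client installed (`pip install ollama`):

```python
# Chat sketch using the ollama Python client (pip install ollama).
# Assumptions: a local Ollama server and the 16b tag listed on this page;
# the "stop" option mirrors the model's default parameters shown above.
import ollama

response = ollama.chat(
    model="deepseek-coder-v2:16b",  # 236b is the larger published size
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a linked list."}
    ],
    options={"stop": ["User:", "Assistant:"]},
)

print(response["message"]["content"])
```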

References

Hugging Face