deepseek-coder-v2:16b-lite-instruct-q2_K

1.1M pulls · Updated 12 months ago

An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT-4 Turbo on code-specific tasks.

Available sizes: 16b, 236b
Params:

{
  "stop": [
    "User:",
    "Assistant:"
  ]
}
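As a minimal sketch, the snippet below shows one way these stop sequences could also be supplied at request time through Ollama's local HTTP API, assuming a server running at the default http://localhost:11434 endpoint; the prompt text is a hypothetical placeholder.

import requests

# Illustrative example (assumed local Ollama server on the default port):
# request a completion from this model tag and pass the same stop tokens
# listed in the params block above via the "options" field.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-coder-v2:16b-lite-instruct-q2_K",
        "prompt": "Write a Python function that reverses a string.",  # placeholder prompt
        "stream": False,
        "options": {
            "stop": ["User:", "Assistant:"],  # same stop tokens as the params above
        },
    },
)
print(response.json()["response"])

Alternatively, the model can be pulled and run interactively from the command line with `ollama run deepseek-coder-v2:16b-lite-instruct-q2_K`.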