deepseek-v2:16b-lite-chat-q2_K
167.1K Downloads · Updated 1 year ago
A strong, economical, and efficient Mixture-of-Experts language model.
Available sizes: 16b, 236b
params · 19f2fb9e8bc6 · 32B
{
  "stop": [
    "User:",
    "Assistant:"
  ]
}
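These stop parameters cut generation as soon as the model begins a new "User:" or "Assistant:" turn, keeping it from writing both sides of the conversation. Below is a minimal sketch of passing the same option through Ollama's HTTP generate endpoint, assuming a local Ollama server on the default port (localhost:11434); the prompt text is purely illustrative.

# Minimal sketch: query this model via a local Ollama server,
# passing the same "stop" option shown in the params above.
# Assumes Ollama is running on the default port (localhost:11434);
# the prompt is illustrative only.
import json
import urllib.request

payload = {
    "model": "deepseek-v2:16b-lite-chat-q2_K",
    "prompt": "User: Explain Mixture-of-Experts in one sentence.\nAssistant:",
    "stream": False,
    "options": {
        "stop": ["User:", "Assistant:"],  # halt generation at either marker
    },
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read().decode("utf-8"))

print(result["response"])

When the model is run directly with the CLI (ollama run deepseek-v2:16b-lite-chat-q2_K), these params are applied automatically from the model's configuration, so no extra options are needed there.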