
Instruct version of the YandexGPT 5 Lite large language model with 8B parameters and a 32k-token context length (Q5_K_M quantization).

8b
9358f67ad765 · 68B
{
  "stop": [
    "<s>",
    "[SEP]",
    "Response }}\n\n Пользователь:"
  ]
}
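
As a minimal sketch of how these stop sequences map onto an Ollama client call, the snippet below uses the official ollama Python library. The model tag yandexgpt-5-lite:8b is an assumption for illustration only; replace it with the tag actually published for this model.

import ollama

# A sketch, not the canonical usage: the tag below is assumed, and the
# stop sequences and context size simply mirror the params block above.
response = ollama.chat(
    model="yandexgpt-5-lite:8b",  # assumed tag; substitute the real one
    messages=[{"role": "user", "content": "Hello! Who are you?"}],
    options={
        # Same stop tokens as declared in the params JSON above.
        "stop": ["<s>", "[SEP]", "Response }}\n\n Пользователь:"],
        # The model supports a 32k-token context window.
        "num_ctx": 32768,
    },
)
print(response["message"]["content"])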