57 downloads · Updated 3 weeks ago

Instruct version of the YandexGPT 5 Lite large language model: 8B parameters, a 32k-token context length, quantized to Q5_K_M.
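A minimal sketch of querying the model through the Ollama Python client (pip install ollama). The tag yandexgpt5-lite:8b is a placeholder, not the actual tag from this page; substitute the tag you pulled the model under.

import ollama

# Send a single user message; Ollama renders it with the chat template shown below.
response = ollama.chat(
    model="yandexgpt5-lite:8b",  # placeholder tag, replace with the actual one
    messages=[{"role": "user", "content": "Hi! What can you do?"}],
)
print(response["message"]["content"])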

8b
template · b57cf61fe80b · 211B
<s>{{- range .Messages }}{{- if eq .Role "user" }} Пользователь: {{ .Content }}

 Ассистент:[SEP]{{- else if eq .Role "assistant" }}{{ .Content }}</s>{{- end }}{{- end }}
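For illustration, assuming the template above, a single user turn renders roughly as follows before generation; the assistant's reply is then appended and closed with </s>:

<s> Пользователь: Hi! What can you do?

 Ассистент:[SEP]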