Gemma2 but with the max context size
358 Pulls Updated 5 months ago
4f4b5dc2443e · 16GB
model
arch gemma2 · parameters 27.2B · quantization Q4_0 · 16GB
params
{
  "num_ctx": 8192,
  "stop": [
    "<start_of_turn>",
    "<end_of_turn>"
  ]
}
80B
template
<start_of_turn>user
{{ if .System }}{{ .System }} {{ end }}{{ .Prompt }}<end_of_turn>
<start_of_turn
136B
license
Gemma Terms of Use
Last modified: February 21, 2024
By using, reproducing, modifying, distributing …
8.4kB
Readme
This is the regular Gemma 2 model with one change: num_ctx is set to 8192, the maximum context size the model supports. The source model is https://ollama.com/library/gemma2
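
If you would rather rebuild this variant yourself than pull it, a minimal Modelfile sketch is shown below. It assumes the base gemma2:27b tag from the Ollama library; the template and stop sequences are inherited from that base, so only num_ctx needs to be overridden (the stop parameters are repeated only to mirror the params shown above).

# Modelfile (sketch) — recreate this variant from the base Gemma 2 27B model
FROM gemma2:27b

# Raise the context window to Gemma 2's maximum of 8192 tokens
PARAMETER num_ctx 8192

# Stop sequences matching the Gemma chat turn markers (inherited from the base model)
PARAMETER stop "<start_of_turn>"
PARAMETER stop "<end_of_turn>"

Build and run it with the standard Ollama commands, e.g. ollama create gemma2-maxctx -f Modelfile followed by ollama run gemma2-maxctx (the name gemma2-maxctx is just an example). The same setting can also be passed per request via the API options field, e.g. "options": {"num_ctx": 8192}.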