doomgrave/gemma3:12b-it-q2_K

113 pulls · Updated 6 months ago

Gemma 3 12B quantized to Q2_K and Q3_K_S for GPUs with 8 GB of VRAM or less. The vision module is fully working.
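To try this build locally, a minimal sketch assuming the Ollama CLI is installed (the model tag is taken from this page; the session below is illustrative):

    # download the Q2_K build and start an interactive session
    ollama pull doomgrave/gemma3:12b-it-q2_K
    ollama run doomgrave/gemma3:12b-it-q2_K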


ae7bb77a27e0 · 5.6GB · gemma3 · 12.2B · Q2_K
Template (truncated preview):
{{- range $i, $_ := .Messages }} {{- $last := eq (len (slice $.Messages $i)) 1 }} {{- if or (eq .Rol

License (truncated preview):
Gemma Terms of Use Last modified: February 21, 2024 By using, reproducing, modifying, distributing,

Parameters (truncated preview):
{ "stop": [ "<end_of_turn>" ], "temperature": 1, "top_k": 64, "top_p": 0

Readme

No readme provided.