456 pulls · updated 5 months ago

Gemma3:12b with tool-calling support added, quantized to Q2_K and Q3_K_S for GPUs with 8 GB of VRAM or less. The vision module is fully working.
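A minimal sketch of pulling one of the quantized tags and exercising the vision module through the official ollama Python client. The model tag used here is a placeholder, not the actual name of this upload, and the image path is illustrative.

import ollama

# Placeholder tag -- substitute the real model name and quantization tag from this page.
MODEL = "gemma3-tools:12b-q2_k"

# Download the quantized weights (still several GB even at Q2_K).
ollama.pull(MODEL)

# Exercise the vision module by attaching a local image to the user message.
response = ollama.chat(
    model=MODEL,
    messages=[{
        "role": "user",
        "content": "Describe this image in one sentence.",
        "images": ["./example.png"],
    }],
)
print(response["message"]["content"])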

tools
3116c5225075 · 77B
{
  "stop": [
    "<end_of_turn>"
  ],
  "temperature": 1,
  "top_k": 64,
  "top_p": 0.95
}
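The block above is the default sampling configuration shipped with this upload: stop at <end_of_turn>, temperature 1, top_k 64, top_p 0.95. A small sketch of applying or overriding these per request via the options argument of the ollama Python client; the model tag is again a placeholder.

import ollama

MODEL = "gemma3-tools:12b-q3_k_s"  # placeholder tag

# Per-request options take precedence over the defaults in the params blob above.
response = ollama.generate(
    model=MODEL,
    prompt="List three uses for a quantized vision-language model.",
    options={
        "temperature": 1,
        "top_k": 64,
        "top_p": 0.95,
        "stop": ["<end_of_turn>"],
    },
)
print(response["response"])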