
Gemma3:12b with added tool support, quantized to Q2_K and Q3_K_S for GPUs with 8 GB of VRAM or less. The vision module is fully working.
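
Below is a minimal sketch of exercising the tool support through the Ollama Python client (0.4 or newer, which accepts plain Python functions as tools). The model tag `gemma3-tools:12b-q3_k_s` and the `get_weather` helper are placeholders for illustration; substitute the tag you actually pulled.

```python
# Minimal tool-calling sketch against a locally pulled quantized Gemma3 model.
# Assumes the Ollama server is running and the ollama Python package is installed.
import ollama

def get_weather(city: str) -> str:
    """Toy tool the model may choose to call."""
    return f"It is sunny in {city}."

response = ollama.chat(
    model="gemma3-tools:12b-q3_k_s",  # placeholder tag -- use your own
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[get_weather],  # the client converts the function into a tool schema
)

# If the model requested a tool call, run it and show the result.
for call in response.message.tool_calls or []:
    if call.function.name == "get_weather":
        print(get_weather(**call.function.arguments))
```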
