
Gemma3:12b with added tool support, quantized to Q2_K and Q3_K_S for GPUs with 8 GB of VRAM or less. The vision module is fully working.
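
The vision module can be exercised from the Ollama Python client by attaching an image to a chat message. This is a minimal sketch, assuming the `ollama` Python package is installed, an Ollama server is running locally, the model has been pulled, and `photo.jpg` is a placeholder path to a local image; the exact shape of the response object depends on the client version.

```python
import ollama

# 'photo.jpg' is a placeholder path to any local image file.
response = ollama.chat(
    model='doomgrave/gemma3-tools',
    messages=[{
        'role': 'user',
        'content': 'Describe this image.',
        'images': ['photo.jpg'],
    }],
)
print(response.message.content)
```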

ollama run doomgrave/gemma3-tools
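
Tool calling can be tried through the Ollama chat API. Below is a minimal sketch, assuming the `ollama` Python package with tool support and a locally running Ollama server; `get_weather` is a hypothetical example schema, and whether the model actually emits a tool call depends on the prompt and the quantization level.

```python
import ollama

# Hypothetical tool schema in the OpenAI-style function format Ollama accepts.
tools = [{
    'type': 'function',
    'function': {
        'name': 'get_weather',  # hypothetical example tool
        'description': 'Get the current weather for a city.',
        'parameters': {
            'type': 'object',
            'properties': {'city': {'type': 'string'}},
            'required': ['city'],
        },
    },
}]

response = ollama.chat(
    model='doomgrave/gemma3-tools',
    messages=[{'role': 'user', 'content': 'What is the weather in Oslo right now?'}],
    tools=tools,
)

# If the model chose to call a tool, the calls are listed on the message.
for call in response.message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```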
