doomgrave/gemma3
95 Downloads • Updated 5 months ago
Gemma3:12b quantized to Q2_K and Q3_K_S for GPUs with 8 GB of VRAM or less. The vision module is fully working.
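To try one of the quantized variants locally, a minimal sketch might look like the following. It assumes the official `ollama` Python client (`pip install ollama`) and a running local Ollama server; the namespace/model:tag name, the chosen tag, and the image path are illustrative, taken from the variants listed below.

```python
# Minimal usage sketch, assuming the official `ollama` Python client and a
# local Ollama server. MODEL follows the namespace/model:tag convention for
# this page; "photo.jpg" is a placeholder path for the vision example.
import ollama

MODEL = "doomgrave/gemma3:12b-it-q2_K"  # smallest variant listed below (5.6GB)

# Download the quantized variant from the registry.
ollama.pull(MODEL)

# Text-only request.
text_reply = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Explain K-quantization in one paragraph."}],
)
print(text_reply["message"]["content"])

# Vision request: the page states the vision module is fully working; the
# client accepts local image paths via the `images` field of a message.
vision_reply = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Describe this image.", "images": ["photo.jpg"]}],
)
print(vision_reply["message"]["content"])
```

The Q2_K tag is used here because it is the smaller of the two quantizations; the Q3_K_S tag can be substituted if the extra 0.7GB fits your VRAM budget.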
3 models

gemma3:latest • 9eef8cdc5fc2 • 6.3GB • 128K context window • Text input • 5 months ago
gemma3:12b-it-q2_K • ae7bb77a27e0 • 5.6GB • 128K context window • Text input • 5 months ago
gemma3:12b-it-q3_K_S • 9eef8cdc5fc2 • 6.3GB • 128K context window • Text input • 5 months ago