249 Pulls · Updated 3 days ago

Gemma 4 26B MoE (Google DeepMind) with thinking mode disabled. Mixture-of-Experts — 25.2B total / 3.8B active parameters, 256K context. Supports text and image input. Knowledge cutoff: January 2025.

Capabilities: vision · tools · thinking
ollama run bjoernb/gemma4-26b-fast
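
Besides the CLI, the pulled model can also be queried through Ollama's local REST API. The following is a minimal sketch, assuming the default endpoint at localhost:11434; the base64 image payload is a placeholder you would supply yourself.

curl http://localhost:11434/api/chat -d '{
  "model": "bjoernb/gemma4-26b-fast",
  "messages": [
    {
      "role": "user",
      "content": "Describe this image in one sentence.",
      "images": ["<base64-encoded image data>"]
    }
  ],
  "stream": false
}'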

Details

Updated 3 days ago

efcbb7535c94 · 18GB

model      gemma4 · 25.8B · Q4_K_M
license    Apache License, Version 2.0, January 2004 (http://www.apache.org/licenses/) …
params     { "temperature": 1, "top_k": 64, "top_p": 0.95 }
system     You are Gemma 4 26B, a Mixture-of-Experts language model by Google DeepMind (25.2B total / 3.8B active …
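
The default sampling parameters above can be overridden per request instead of editing the model. A minimal sketch, assuming the standard /api/generate endpoint and an illustrative lower temperature:

curl http://localhost:11434/api/generate -d '{
  "model": "bjoernb/gemma4-26b-fast",
  "prompt": "Explain Mixture-of-Experts routing in two sentences.",
  "options": { "temperature": 0.7, "top_k": 64, "top_p": 0.95 },
  "stream": false
}'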

Readme

No readme