223 pulls · Updated 3 weeks ago

Gemma 4 26B MoE (Google DeepMind) with thinking mode enabled. Mixture-of-Experts — 25.2B total / 3.8B active parameters, 256K context. Supports text and image input. Knowledge cutoff: January 2025.

Capabilities: vision · tools · thinking
ollama run bjoernb/gemma4-26b-think
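Beyond the CLI, the model can also be called through Ollama's HTTP API. The snippet below is a sketch of a request body for the `/api/generate` endpoint, using the sampling defaults listed under Details; the `think` field assumes an Ollama version with thinking-mode support, and the prompt text is only a placeholder:

```python
import json

# Sketch of an Ollama /api/generate request payload (not sent here).
# "think" asks a thinking-capable model to return its reasoning
# separately from the final answer; "options" carries the sampling
# defaults published on this model page.
payload = {
    "model": "bjoernb/gemma4-26b-think",
    "prompt": "Why is the sky blue?",   # placeholder prompt
    "think": True,
    "options": {"temperature": 1, "top_k": 64, "top_p": 0.95},
    "stream": False,
}

print(json.dumps(payload, indent=2))
```

POSTing this body to `http://localhost:11434/api/generate` on a machine running the Ollama server would produce a single (non-streamed) JSON response.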

Details

c3295d01fe2d · 18GB

Architecture: gemma4
Parameters: 25.8B
Quantization: Q4_K_M
License: Apache License, Version 2.0, January 2004 (http://www.apache.org/licenses/)
Sampling defaults: { "temperature": 1, "top_k": 64, "top_p": 0.95 }
System prompt (truncated): <|think|> You are Gemma 4 26B, a Mixture-of-Experts language model by Google DeepMind (25.2B total /
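The sampling defaults above can be pinned locally with an Ollama Modelfile. The fragment below is a minimal sketch; the `FROM` target assumes the tag shown on this page, and the custom name in the usage commands is arbitrary:

```
FROM bjoernb/gemma4-26b-think

# Sampling defaults published on this model page
PARAMETER temperature 1
PARAMETER top_k 64
PARAMETER top_p 0.95
```

Build and run it with `ollama create my-gemma4-think -f Modelfile`, then `ollama run my-gemma4-think`.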

Readme

No readme