257 3 days ago

Gemma 4 26B MoE (Google DeepMind) with thinking mode disabled. A Mixture-of-Experts model with 25.2B total / 3.8B active parameters and a 256K context window. Supports text and image input. Knowledge cutoff: January 2025.

vision · tools · thinking
56380ca2ab89 · 42B
{
  "temperature": 1,
  "top_k": 64,
  "top_p": 0.95
}
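If this card is served by an Ollama-style runtime (an assumption, as is the model tag `gemma4:26b`), these defaults can be overridden per request through the `options` field of the generate API. A minimal sketch that builds such a request payload:

```python
import json

# Default sampling parameters from the model card above.
options = {"temperature": 1, "top_k": 64, "top_p": 0.95}

# Hypothetical payload for POST /api/generate on a local Ollama-style
# server; the model tag "gemma4:26b" is an assumption, not from the card.
payload = {
    "model": "gemma4:26b",
    "prompt": "Why is the sky blue?",
    "options": options,
    "stream": False,
}

# Serialize exactly as it would be sent in the request body.
body = json.dumps(payload, indent=2)
print(body)
```

Any key omitted from `options` falls back to the card's defaults shown above, so a request only needs to list the parameters it changes.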