249 pulls · Updated 3 days ago

Gemma 4 26B MoE (Google DeepMind) with thinking mode disabled. A Mixture-of-Experts model with 25.2B total / 3.8B active parameters and a 256K context window. Supports text and image input. Knowledge cutoff: January 2025.

vision · tools · thinking
ollama run bjoernb/gemma4-26b-fast
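Besides the CLI, a locally running Ollama server also exposes an HTTP API (by default on `localhost:11434`). A minimal sketch of a chat request payload for this model, assuming the standard `/api/chat` endpoint; the image bytes here are a placeholder, since vision models accept base64-encoded images in the `images` field:

```python
import base64
import json

# Placeholder image bytes; in practice, read a real file, e.g.
# open("photo.jpg", "rb").read().
raw_image = b"<raw image bytes>"

# Chat request payload for Ollama's /api/chat endpoint. The model tag
# matches the `ollama run` command above.
payload = {
    "model": "bjoernb/gemma4-26b-fast",
    "messages": [
        {
            "role": "user",
            "content": "Describe this image.",
            # Vision-capable models take base64-encoded images here.
            "images": [base64.b64encode(raw_image).decode("ascii")],
        }
    ],
    # Return one complete JSON response instead of a token stream.
    "stream": False,
}

body = json.dumps(payload)
# POST `body` to http://localhost:11434/api/chat (e.g. with urllib or curl)
# while the Ollama server is running.
print(body[:80])
```

With the server up, the same request can be sent from the shell via `curl http://localhost:11434/api/chat -d @payload.json`.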

Applications

Claude Code: ollama launch claude --model bjoernb/gemma4-26b-fast
Codex: ollama launch codex --model bjoernb/gemma4-26b-fast
OpenCode: ollama launch opencode --model bjoernb/gemma4-26b-fast
OpenClaw: ollama launch openclaw --model bjoernb/gemma4-26b-fast

Readme

No readme