bjoernb/gemma4-26b-fast:latest
257 Downloads · Updated 3 days ago
Gemma 4 26B MoE (Google DeepMind) with thinking mode disabled. Mixture-of-Experts — 25.2B total / 3.8B active parameters, 256K context. Supports text and image input. Knowledge cutoff: January 2025.
Capabilities: vision, tools, thinking
gemma4-26b-fast:latest
...
params · 56380ca2ab89 · 42B
{
"temperature": 1,
"top_k": 64,
"top_p": 0.95
}
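The `params` block above sets the model's default sampling settings. As a rough illustration of what these three knobs mean (a minimal sketch, not Ollama's or Gemma's actual decoding code), the following filters a vector of raw logits by temperature, top-k, and top-p:

```python
import math

def filter_logits(logits, temperature=1.0, top_k=64, top_p=0.95):
    """Return a renormalized token -> probability dict after applying
    temperature scaling, top-k truncation, and top-p (nucleus) truncation.
    This is an illustrative sketch of the parameters above, not the
    implementation used by the model runtime."""
    # Temperature scaling: <1 sharpens the distribution, >1 flattens it.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # top-k: keep only the k most probable tokens.
    probs.sort(key=lambda pair: pair[1], reverse=True)
    probs = probs[:top_k]
    # top-p: keep the smallest prefix whose cumulative mass reaches top_p.
    kept, mass = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        mass += p
        if mass >= top_p:
            break
    # Renormalize the surviving candidates so they sum to 1.
    z = sum(p for _, p in kept)
    return {i: p / z for i, p in kept}
```

For example, with `top_k=2` only the two most likely tokens survive, and the nucleus cutoff can shrink that set further when a few tokens already carry most of the probability mass.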