bjoernb/gemma4-26b-think
223 Downloads · Updated 3 weeks ago
Gemma 4 26B MoE (Google DeepMind) with thinking mode enabled. Mixture-of-Experts — 25.2B total / 3.8B active parameters, 256K context. Supports text and image input. Knowledge cutoff: January 2025.
Capabilities: vision, tools, thinking
1 model

Name:    gemma4-26b-think:latest
Size:    18GB
Context: 256K
Input:   Text, Image
Digest:  c3295d01fe2d
Updated: 3 weeks ago