bjoernb/gemma4-26b-fast
249 Downloads · Updated 3 days ago
Gemma 4 26B MoE (Google DeepMind) with thinking mode disabled. Mixture-of-Experts — 25.2B total / 3.8B active parameters, 256K context. Supports text and image input. Knowledge cutoff: January 2025.
Capabilities: vision, tools, thinking
1 model

Name                    Digest        Size  Context  Input        Updated
gemma4-26b-fast:latest  efcbb7535c94  18GB  256K     Text, Image  3 days ago
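Since this variant supports text and image input with thinking disabled, a request to it would carry a base64-encoded image alongside the prompt and leave the thinking flag off. A minimal sketch, assuming the model is served locally through an Ollama-compatible `/api/chat` endpoint (the endpoint path, field names, and `think` flag follow Ollama's chat API; verify against your server version):

```python
import base64
import json

MODEL = "bjoernb/gemma4-26b-fast:latest"

def build_chat_request(prompt, image_path=None):
    """Build the JSON body for a single-turn chat request.

    Images are sent base64-encoded, matching the model's
    text + image input support; thinking stays disabled,
    matching this "fast" variant of the model.
    """
    message = {"role": "user", "content": prompt}
    if image_path is not None:
        with open(image_path, "rb") as f:
            message["images"] = [base64.b64encode(f.read()).decode("ascii")]
    return {
        "model": MODEL,
        "messages": [message],
        "think": False,   # thinking mode is disabled for this variant
        "stream": False,
    }

if __name__ == "__main__":
    body = build_chat_request("Describe this image in one sentence.")
    # POST this body to http://localhost:11434/api/chat
    # with Content-Type: application/json
    print(json.dumps(body, indent=2))
```

Keeping the payload construction separate from the HTTP call makes it easy to inspect or log the request; note that with a 256K context window, long multi-turn histories fit in a single `messages` list without truncation in most cases.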