Granite 4.0 models are finetuned from their base models using a combination of open-source instruction datasets with permissive licenses and internally collected synthetic datasets. They feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
Please Note: the 3b, 1b, and 350m sizes are alternative options for users on platforms where mamba-2 support is not yet optimized. Model tags denoted -h use the hybrid mamba-2 architecture.
350m
ollama run granite4:350m
350m-h
ollama run granite4:350m-h
1b
ollama run granite4:1b
1b-h
ollama run granite4:1b-h
3b (micro)
ollama run granite4:3b
ollama run granite4:micro
3b-h (micro-h)
ollama run granite4:3b-h
ollama run granite4:micro-h
7b-a1b-h (tiny-h)
ollama run granite4:7b-a1b-h
ollama run granite4:tiny-h
32b-a9b-h (small-h)
ollama run granite4:32b-a9b-h
ollama run granite4:small-h
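Beyond the ollama run commands above, the same tags can be called programmatically. Below is a minimal sketch of a single chat request, assuming the official ollama Python package is installed (pip install ollama), a local Ollama server is running, and the granite4:micro tag has already been pulled; the prompt is only an illustration.

```python
# Minimal chat sketch against a locally pulled Granite 4.0 tag.
# Assumes: pip install ollama, a running Ollama server, and a recent
# client version, plus `ollama pull granite4:micro` done beforehand.
import ollama

response = ollama.chat(
    model="granite4:micro",
    messages=[
        {"role": "user", "content": "List three key clauses to review in a vendor contract."},
    ],
)

# The assistant reply is carried in the message body of the response.
print(response.message.content)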
Supported languages: English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Chinese. Users may finetune Granite 4.0 models for languages beyond this list.
This model is designed to handle general instruction-following tasks and can be integrated into AI assistants across various domains, including business applications.
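As one illustration of the tool-calling capability mentioned above, here is a hedged sketch that registers a single made-up get_weather function as a tool and lets the model decide whether to call it. The tool name, schema, and stub implementation are examples invented for this sketch, not part of Granite 4.0 or Ollama; the same assumptions as the previous example apply (ollama Python package, local server, pulled tag).

```python
# Tool-calling sketch with a Granite 4.0 tag via the Ollama Python client.
# get_weather is a made-up example tool; its JSON schema is passed to the model.
import ollama

def get_weather(city: str) -> str:
    # Stub implementation; a real assistant would query a weather service here.
    return f"Sunny, 22 degrees C in {city}"

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

messages = [{"role": "user", "content": "What is the weather like in Tokyo?"}]
response = ollama.chat(model="granite4:micro", messages=messages, tools=tools)

if response.message.tool_calls:
    # The model chose to call a tool; run the matching local function.
    for call in response.message.tool_calls:
        if call.function.name == "get_weather":
            print(get_weather(**call.function.arguments))
else:
    # The model answered directly without using the tool.
    print(response.message.content)
```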