The IBM Granite 1B and 3B models are long-context mixture of experts (MoE) models designed for low-latency use.
1M Pulls 33 Tags Updated 9 months ago
The IBM Granite 2B and 8B models are language models with a 128K token context length, fine-tuned for improved reasoning and instruction-following capabilities.
672.6K Pulls 3 Tags Updated 6 months ago
A compact and efficient vision-language model, specifically designed for visual document understanding, enabling automated content extraction from tables, charts, infographics, plots, diagrams, and more.
422.5K Pulls 5 Tags Updated 8 months ago
A family of open foundation models by IBM for Code Intelligence
301.3K Pulls 162 Tags Updated 1 year ago
Granite 3.2 is a family of long-context IBM Granite models fine-tuned for thinking capabilities.
163K Pulls 9 Tags Updated 8 months ago
The IBM Granite 2B and 8B models are designed to support tool-based use cases and retrieval-augmented generation (RAG), streamlining code generation, translation, and bug fixing.
128.3K Pulls 33 Tags Updated 11 months ago
The IBM Granite 2B and 8B models are text-only dense LLMs trained on over 12 trillion tokens of data, demonstrating significant improvements over their predecessors in performance and speed in IBM's initial testing.
124.3K Pulls 33 Tags Updated 9 months ago
The IBM Granite Embedding 30M and 278M models are text-only dense biencoder embedding models, with 30M available in English only and 278M serving multilingual use cases.
110.2K Pulls 6 Tags Updated 10 months ago
The IBM Granite 1B and 3B models are the first mixture of experts (MoE) Granite models, designed for low-latency use.
92.4K Pulls 33 Tags Updated 11 months ago
Granite 4 models feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
77.7K Pulls 17 Tags Updated 4 days ago
The IBM Granite Guardian 3.0 2B and 8B models are designed to detect risks in prompts and/or responses.
40.4K Pulls 10 Tags Updated 11 months ago
A general-purpose model family ranging from 3 billion to 70 billion parameters, with the smaller sizes suitable for entry-level hardware.
952.1K Pulls 119 Tags Updated 2 years ago
36.1K Pulls 64 Tags Updated 2 months ago
4,711 Pulls 5 Tags Updated 8 months ago
3,624 Pulls 261 Tags Updated 4 days ago
1,541 Pulls 11 Tags Updated 8 months ago
1,082 Pulls 5 Tags Updated 2 months ago
969 Pulls 16 Tags Updated 2 months ago
4-bit quantized version of instructlab/granite-7b-lab
838 Pulls 1 Tag Updated 1 year ago
822 Pulls 11 Tags Updated 10 months ago