The IBM Granite 1B and 3B models are long-context mixture of experts (MoE) models designed for low-latency usage.
1.2M Pulls 33 Tags Updated 10 months ago
IBM Granite 2B and 8B models are 128K context length language models that have been fine-tuned for improved reasoning and instruction-following capabilities.
699.6K Pulls 3 Tags Updated 7 months ago
A compact and efficient vision-language model, specifically designed for visual document understanding, enabling automated content extraction from tables, charts, infographics, plots, diagrams, and more.
455.6K Pulls 5 Tags Updated 8 months ago
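As a rough illustration of how a visual document understanding model like this might be queried, the sketch below passes a document image to an Ollama-style chat endpoint. The model tag granite3.2-vision and the file name invoice.png are assumptions for illustration, not taken from the listing.

```python
# Minimal sketch: extract content from a document image with a
# vision-language model served via the Ollama Python client.
# Model tag and file path are illustrative assumptions.
import ollama

response = ollama.chat(
    model="granite3.2-vision",  # assumed tag for the document-understanding model
    messages=[{
        "role": "user",
        "content": "List the line items and totals in this invoice as plain text.",
        "images": ["invoice.png"],  # local path to the scanned document
    }],
)
print(response["message"]["content"])
```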
A family of open foundation models by IBM for Code Intelligence
322.7K Pulls 162 Tags Updated 1 year ago
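For a code-intelligence model from this family, a typical use is plain code generation or completion; the short sketch below assumes an Ollama-style generate endpoint and a hypothetical model tag granite-code:3b.

```python
# Minimal sketch: request a code completion via the Ollama Python client.
# The model tag below is an assumed example, not from the listing.
import ollama

prompt = "Write a Python function that returns the n-th Fibonacci number."
response = ollama.generate(model="granite-code:3b", prompt=prompt)
print(response["response"])  # the generated code
```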
The Granite 4 models feature improved instruction-following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
213.4K Pulls 17 Tags Updated 2 weeks ago
Granite 3.2 is a family of long-context IBM Granite models fine-tuned for thinking capabilities.
167.9K Pulls 9 Tags Updated 8 months ago
The IBM Granite 2B and 8B models are designed to support tool-based use cases and retrieval-augmented generation (RAG), streamlining code generation, translation, and bug fixing.
132.7K Pulls 33 Tags Updated 11 months ago
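As a sketch of the tool-based use cases mentioned above, the Ollama chat API accepts a tools list and returns structured tool calls. The model tag granite3-dense:8b, the get_weather helper, and the reliance on a recent client version that accepts plain Python functions as tools are all assumptions for illustration.

```python
# Sketch of tool calling with an Ollama-served Granite model.
# Model tag and the get_weather helper are illustrative assumptions;
# passing functions directly as tools requires a recent ollama client.
import ollama

def get_weather(city: str) -> str:
    """Stand-in tool; a real implementation would call a weather API."""
    return f"Sunny and 22 C in {city}"

messages = [{"role": "user", "content": "What is the weather in Zurich?"}]
response = ollama.chat(
    model="granite3-dense:8b",  # assumed tag
    messages=messages,
    tools=[get_weather],
)

# Execute any tool calls the model requested.
for call in response.message.tool_calls or []:
    if call.function.name == "get_weather":
        print(get_weather(**call.function.arguments))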
The IBM Granite 2B and 8B models are text-only dense LLMs trained on over 12 trillion tokens of data, which demonstrated significant improvements over their predecessors in performance and speed in IBM's initial testing.
128.3K Pulls 33 Tags Updated 10 months ago
The IBM Granite Embedding 30M and 278M models are text-only dense bi-encoder embedding models, with the 30M model available in English only and the 278M model serving multilingual use cases.
116.7K Pulls 6 Tags Updated 11 months ago
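A bi-encoder embedding model is typically used to embed queries and passages separately and rank passages by cosine similarity; the minimal sketch below assumes an Ollama-style embed endpoint and the hypothetical tag granite-embedding:30m.

```python
# Sketch: embed a query and a few passages, then rank passages by
# cosine similarity. The model tag is an assumed illustration.
import ollama
import numpy as np

passages = [
    "Granite models are trained for enterprise use cases.",
    "The 278M embedding model supports multilingual text.",
]
query = "Which embedding model handles multiple languages?"

doc_vecs = np.array(ollama.embed(model="granite-embedding:30m", input=passages)["embeddings"])
query_vec = np.array(ollama.embed(model="granite-embedding:30m", input=query)["embeddings"][0])

scores = doc_vecs @ query_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec))
for score, text in sorted(zip(scores, passages), reverse=True):
    print(f"{score:.3f}  {text}")
```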
The IBM Granite 1B and 3B models are the first mixture of experts (MoE) Granite models, designed for low-latency usage.
96.4K Pulls 33 Tags Updated 11 months ago
The IBM Granite Guardian 3.0 2B and 8B models are designed to detect risks in prompts and/or responses.
44.1K Pulls 10 Tags Updated 11 months ago
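A guardian-style model is usually run as a separate screening step on a prompt or response before the main model is called. The sketch below assumes the tags granite3-guardian:2b and granite3.1-dense:8b and a short yes/no style verdict from the guardian model; all of these are assumptions about usage, not documented behaviour from the listing.

```python
# Sketch: screen a user prompt with a risk-detection model before
# sending it to the main chat model. Model tags and the yes/no output
# convention are assumptions for illustration.
import ollama

def looks_risky(prompt: str) -> bool:
    verdict = ollama.chat(
        model="granite3-guardian:2b",  # assumed tag
        messages=[{"role": "user", "content": prompt}],
    )["message"]["content"].strip().lower()
    return verdict.startswith("yes")   # assumes a Yes/No style answer

user_prompt = "How do I reset my corporate VPN password?"
if looks_risky(user_prompt):
    print("Blocked by guardian model.")
else:
    reply = ollama.chat(
        model="granite3.1-dense:8b",   # assumed tag for the main model
        messages=[{"role": "user", "content": user_prompt}],
    )
    print(reply["message"]["content"])
```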
4-bit quantized version of instructlab/granite-7b-lab
842 Pulls 1 Tag Updated 1 year ago