mistral · Ollama
  • mistral-large-3

    A general-purpose multimodal mixture-of-experts model for production-grade tasks and enterprise workloads.

    cloud

    14.8K Pulls · 1 Tag · Updated 2 months ago

  • mistral-small3.2

    An update to Mistral Small that improves function calling and instruction following and reduces repetition errors (see the tool-calling sketch after this list).

    vision tools 24b

    1.2M Pulls · 5 Tags · Updated 7 months ago

  • mistral-small3.1

    Building upon Mistral Small 3, Mistral Small 3.1 (2503) adds state-of-the-art vision understanding and enhances long-context capabilities up to 128k tokens without compromising text performance (see the vision and long-context sketch after this list).

    vision tools 24b

    569.1K Pulls · 5 Tags · Updated 10 months ago

  • mistral

    The 7B model released by Mistral AI, updated to version 0.3 (a basic chat sketch appears after this list).

    tools 7b

    24.8M Pulls · 84 Tags · Updated 6 months ago

  • mistral-nemo

    A state-of-the-art 12B model with a 128k context length, built by Mistral AI in collaboration with NVIDIA (see the long-context sketch after this list).

    tools 12b

    3.3M Pulls · 17 Tags · Updated 6 months ago

  • mistral-small

    Mistral Small 3 sets a new benchmark in the “small” large language model category, below 70B parameters.

    tools 22b 24b

    2.3M Pulls · 21 Tags · Updated 1 year ago

  • mistral-large

    Mistral Large 2 is Mistral AI's flagship model, significantly more capable in code generation, mathematics, and reasoning, with a 128k context window and support for dozens of languages.

    tools 123b

    477.3K Pulls · 32 Tags · Updated 1 year ago

  • mistral-openorca

    Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset.

    7b

    309.9K Pulls · 17 Tags · Updated 2 years ago

  • mistrallite

    MistralLite is a fine-tuned model based on Mistral with enhanced long-context processing capabilities.

    7b

    166.3K Pulls · 17 Tags · Updated 2 years ago

  • mixtral

    A set of Mixture-of-Experts (MoE) models with open weights, released by Mistral AI in 8x7b and 8x22b parameter sizes.

    tools 8x7b 8x22b

    1.8M Pulls · 70 Tags · Updated 1 year ago

  • codestral

    Codestral is Mistral AI’s first-ever code model designed for code generation tasks.

    22b

    761K Pulls · 17 Tags · Updated 1 year ago

  • dolphin-mistral

    The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8.

    7b

    654.6K Pulls · 120 Tags · Updated 1 year ago

  • bakllava

    BakLLaVA is a multimodal model consisting of the Mistral 7B base model augmented with the LLaVA architecture.

    vision 7b

    475.6K Pulls · 17 Tags · Updated 2 years ago

  • zephyr

    Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants.

    7b 141b

    490.1K Pulls · 40 Tags · Updated 1 year ago

  • openhermes

    OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets.

    385.4K Pulls · 35 Tags · Updated 2 years ago

  • neural-chat

    A fine-tuned model based on Mistral with good coverage of domains and languages.

    7b

    351.7K Pulls · 50 Tags · Updated 2 years ago

  • samantha-mistral

    A companion assistant trained in philosophy, psychology, and personal relationships. Based on Mistral.

    7b

    310.4K Pulls · 49 Tags · Updated 2 years ago

  • mathstral

    MathΣtral: a 7B model by Mistral AI designed for math reasoning and scientific discovery.

    7b

    179.9K Pulls · 17 Tags · Updated 1 year ago

  • yarn-mistral

    An extension of Mistral to support context windows of 64K or 128K.

    7b

    250.2K Pulls · 33 Tags · Updated 2 years ago

  • ministral-3

    The Ministral 3 family is designed for edge deployment and capable of running on a wide range of hardware.

    vision tools cloud 3b 8b 14b

    359.3K Pulls · 16 Tags · Updated 1 month ago
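
Any locally runnable model above can be pulled and queried with the official `ollama` Python client (`pip install ollama`) once an Ollama server is running on its default local port. The sketch below is a minimal example against the base `mistral` model; the prompt text is an illustrative assumption.

```python
# Minimal sketch: pull and chat with the base `mistral` model through a
# locally running Ollama server, using the official Python client
# (`pip install ollama`). The prompt is illustrative.
import ollama

# Download the model weights if they are not already present locally.
ollama.pull("mistral")

# Single-turn chat request; the reply text is on the returned message.
response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Explain what a mixture-of-experts model is in two sentences."}],
)
print(response["message"]["content"])
```

The same two calls work for any other entry on this page by swapping in its name, optionally with a size tag (e.g. `mixtral:8x7b` or `codestral`).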

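Models tagged `vision` (such as `mistral-small3.1`) accept images attached to a chat message, and long-context models such as `mistral-nemo` can be given a larger context window through the `num_ctx` option. The sketch below assumes a hypothetical local file `diagram.png` and a 32k-token `num_ctx` value chosen only for illustration.

```python
# Sketch: image input with a `vision`-tagged model and an enlarged context
# window with `mistral-nemo`. The image path and the num_ctx value are
# illustrative assumptions.
import ollama

# Vision: attach a local image file to the user message.
vision_reply = ollama.chat(
    model="mistral-small3.1",
    messages=[{
        "role": "user",
        "content": "Describe what this image shows.",
        "images": ["diagram.png"],  # hypothetical local file
    }],
)
print(vision_reply["message"]["content"])

# Long context: request a larger context window for a long document.
long_reply = ollama.chat(
    model="mistral-nemo",
    messages=[{"role": "user", "content": "Summarize the following report: ..."}],
    options={"num_ctx": 32768},  # 32k tokens here; mistral-nemo supports up to 128k
)
print(long_reply["message"]["content"])
```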
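Models tagged `tools` (for example `mistral` and `mistral-small3.2`) accept tool definitions in a chat request. The sketch below passes a single hypothetical `get_current_weather` tool; the tool name and schema are assumptions for illustration, and whether the model actually emits a tool call depends on the prompt.

```python
# Sketch: function calling with a `tools`-tagged model such as
# mistral-small3.2. The get_current_weather tool and its schema are
# illustrative assumptions, not part of the model card.
import ollama

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

response = ollama.chat(
    model="mistral-small3.2",
    messages=[{"role": "user", "content": "What is the weather in Paris right now?"}],
    tools=tools,
)

# If the model chose to call the tool, the call appears on the returned
# message as `tool_calls`; otherwise the message carries a plain text reply.
print(response["message"])
```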