deepseek · Ollama
  • deepseek-v4-flash

    DeepSeek-V4-Flash is a preview of the DeepSeek-V4 series, a Mixture-of-Experts model with 284B total parameters and 13B activated, built for efficient reasoning across a 1M-token context window.

    tools thinking cloud

    44.3K Pulls · 1 Tag · Updated 1 week ago

  • deepseek-v4-pro

    DeepSeek-V4-Pro is a frontier Mixture-of-Experts model with a 1M-token context window and three reasoning modes.

    tools thinking cloud

    35.2K Pulls · 1 Tag · Updated 1 week ago

  • deepseek-ocr

    DeepSeek-OCR is a vision-language model that performs token-efficient OCR; an image-input sketch follows the list below.

    vision 3b

    436.1K Pulls · 3 Tags · Updated 5 months ago

  • deepseek-v3.2

    DeepSeek-V3.2 is a model that harmonizes high computational efficiency with superior reasoning and agent performance.

    tools thinking cloud

    156.7K Pulls · 1 Tag · Updated 4 months ago

  • deepseek-v3.1

    DeepSeek-V3.1-Terminus is a hybrid model that supports both thinking and non-thinking modes; a toggle sketch follows the list below.

    tools thinking cloud 671b

    668.4K Pulls · 8 Tags · Updated 7 months ago

  • deepseek-r1

    DeepSeek-R1 is a family of open reasoning models with performance approaching that of leading models such as O3 and Gemini 2.5 Pro; a pull-and-run sketch follows the list below.

    tools thinking 1.5b 7b 8b 14b 32b 70b 671b

    84.6M Pulls · 35 Tags · Updated 10 months ago

  • deepseek-v3

    A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

    671b

    3.8M Pulls · 5 Tags · Updated 1 year ago

  • deepseek-coder

    DeepSeek Coder is a capable coding model trained on two trillion code and natural language tokens.

    1.3b 6.7b 33b

    4.1M Pulls · 102 Tags · Updated 2 years ago

  • deepseek-coder-v2

    An open-source Mixture-of-Experts code language model that achieves performance comparable to GPT4-Turbo in code-specific tasks.

    16b 236b

    2.5M Pulls · 64 Tags · Updated 1 year ago

  • deepseek-v2

    A strong, economical, and efficient Mixture-of-Experts language model.

    16b 236b

    1.1M Pulls · 34 Tags · Updated 1 year ago

  • deepseek-llm

    An advanced language model crafted with 2 trillion bilingual tokens.

    7b 67b

    1.1M Pulls · 64 Tags · Updated 2 years ago

  • deepseek-v2.5

    An upgraded version of DeepSeek-V2 that integrates the general and coding abilities of both DeepSeek-V2-Chat and DeepSeek-Coder-V2-Instruct.

    236b

    271.9K Pulls · 7 Tags · Updated 1 year ago

  • deepscaler

    A fine-tuned version of DeepSeek-R1-Distill-Qwen-1.5B that surpasses the performance of OpenAI’s o1-preview on popular math evaluations with just 1.5B parameters.

    1.5b

    1.2M Pulls · 5 Tags · Updated 1 year ago

  • openthinker

    A fully open-source family of reasoning models built using a dataset derived by distilling DeepSeek-R1.

    7b 32b

    1.1M Pulls · 15 Tags · Updated 1 year ago

  • r1-1776

    A version of the DeepSeek-R1 model post-trained by Perplexity to provide unbiased, accurate, and factual information.

    70b 671b

    399.7K Pulls · 9 Tags · Updated 1 year ago

  • deepseek-140B/DeepSeekAI140B

    5,748 Pulls · 1 Tag · Updated 1 year ago

  • brsilvapimentel/DeepSeek-R1-0528-Qwen3-8B

    DeepSeek-R1-0528-Qwen3-8B

    tools

    19 Pulls · 1 Tag · Updated yesterday

  • casistre/deepseek

    tools

    9 Pulls · 1 Tag · Updated 6 days ago

  • matich123tv/deepseek

    tools

    3 Pulls · 1 Tag · Updated 6 days ago

  • nexusriot/deepseek-r1-abliterated

    thinking 8b 14b

    900 Pulls · 2 Tags · Updated 3 weeks ago
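
Any of the locally tagged entries above, such as deepseek-r1, can be pulled and queried once Ollama is installed. A minimal pull-and-run sketch, assuming the official `ollama` Python client and a local server on its default port; the 7b tag is used purely as an example:

```python
# pip install ollama  -- the official Python client for a locally running Ollama server
import ollama

# Download one of the tags listed above, then ask a single question.
ollama.pull("deepseek-r1:7b")

response = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}],
)
print(response.message.content)
```

The same calls work for any other tag on this page, e.g. `deepseek-coder:6.7b`, provided the machine has enough memory for the chosen size.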

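Several entries (deepseek-v3.1, deepseek-v3.2, the deepseek-v4 previews, deepseek-r1) carry the "thinking" badge, meaning the model can emit a reasoning trace before its final answer. A minimal sketch of toggling that behaviour, assuming the `think` option available in recent Ollama server and client releases; older versions may reject it:

```python
import ollama

resp = ollama.chat(
    model="deepseek-v3.1",
    messages=[{"role": "user", "content": "What is 17 * 24?"}],
    think=True,  # set to False for non-thinking mode
)

# With thinking enabled, the trace is returned separately from the final answer.
print(resp.message.thinking)
print(resp.message.content)
```

The smaller deepseek-r1 tags accept the same option and are more practical to run on a single machine than the 671b-class models.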
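
The deepseek-ocr entry carries the "vision" badge, so it accepts images alongside text. A minimal image-input sketch, again assuming the official Python client; "page.png" is a placeholder path:

```python
import ollama

resp = ollama.chat(
    model="deepseek-ocr",
    messages=[{
        "role": "user",
        "content": "Transcribe the text in this image.",
        "images": ["page.png"],  # the client accepts file paths, raw bytes, or base64 strings
    }],
)
print(resp.message.content)
```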