Ollama
DeepSeek · Ollama
  • deepseek-v4-flash

    DeepSeek-V4-Flash is a preview of the DeepSeek-V4 series, a Mixture-of-Experts model with 284B total parameters and 13B activated, built for efficient reasoning across a 1M-token context window.

    tools thinking cloud

    59K Pulls · 1 Tag · Updated 2 weeks ago

  • deepseek-v4-pro

    DeepSeek-V4-Pro is a frontier Mixture-of-Experts model with a 1M-token context window and three reasoning modes.

    tools thinking cloud

    49.2K Pulls · 1 Tag · Updated 2 weeks ago

  • deepseek-v3.2

    DeepSeek-V3.2 is a model that harmonizes high computational efficiency with strong reasoning and agent performance.

    tools thinking cloud

    735K Pulls · 1 Tag · Updated 4 months ago

  • deepseek-ocr

    DeepSeek-OCR is a vision-language model that can perform token-efficient OCR.

    vision 3b

    441.2K Pulls · 3 Tags · Updated 5 months ago

  • deepseek-v3.1

    DeepSeek-V3.1-Terminus is a hybrid model that supports both thinking mode and non-thinking mode.

    tools thinking cloud 671b

    675.4K Pulls · 8 Tags · Updated 7 months ago

  • deepseek-r1

    DeepSeek-R1 is a family of open reasoning models with performance approaching that of leading models such as OpenAI o3 and Gemini 2.5 Pro.

    tools thinking 1.5b 7b 8b 14b 32b 70b 671b

    85.1M Pulls · 35 Tags · Updated 10 months ago

  • deepseek-coder

    DeepSeek Coder is a capable coding model trained on two trillion tokens of code and natural language.

    1.3b 6.7b 33b

    4.2M Pulls · 102 Tags · Updated 2 years ago

  • openthinker

    A fully open-source family of reasoning models built using a dataset derived by distilling DeepSeek-R1.

    7b 32b

    1.1M Pulls · 15 Tags · Updated 1 year ago

  • r1-1776

    A version of the DeepSeek-R1 model that has been post-trained by Perplexity to provide unbiased, accurate, and factual information.

    70b 671b

    402K Pulls · 9 Tags · Updated 1 year ago

  • deepseek-v2.5

    An upgraded version of DeepSeek-V2 that integrates the general and coding abilities of both DeepSeek-V2-Chat and DeepSeek-Coder-V2-Instruct.

    236b

    273.8K Pulls · 7 Tags · Updated 1 year ago

  • deepseek-v3

    A strong Mixture-of-Experts (MoE) language model with 671B total parameters and 37B activated for each token.

    671b

    3.8M Pulls · 5 Tags · Updated 1 year ago

  • brsilvapimentel/DeepSeek-R1-0528-Qwen3-8B

    DeepSeek-R1-0528-Qwen3-8B

    tools

    96 Pulls · 1 Tag · Updated 1 week ago

  • novaforgeai/deepseek-coder

    NovaForge AI – DeepSeek Coder 6.7B Pro is a professional-grade coding AI built for production-level development.

    2,094 Pulls · 1 Tag · Updated 4 months ago

  • iradukundadev/finetuned-deepseek-r1_7b

    Hugging Face: https://huggingface.co/iradukunda-dev/law-finetuned-DeepSeek-R1-Distill-Qwen-7B

    thinking

    360 Pulls · 1 Tag · Updated 4 months ago

  • DedeProGames/smallcoder

    SmallCoder is a compact reasoning-focused coding model, fine-tuned from DeepSeek-R1 1.5B using a code dataset that includes step-by-step reasoning.

    1.5b

    227 Pulls · 1 Tag · Updated 2 months ago

  • rjeffvalle/coder-pro

    Based on DeepSeek R1, published so that OpenCode can verify tool compatibility against the registry.

    tools

    190 Pulls · 1 Tag · Updated 2 months ago

  • fhagenciadigital/ds-go-pro

    Senior Go & SpecKit engineering agent powered by DeepSeek-v3.1 671B, optimized for idiomatic development and deterministic BDD testing.

    cloud

    58 Pulls · 1 Tag · Updated 1 month ago

  • second_constantine/deepseek-coder-v2

    A Mixture-of-Experts (MoE) model from DeepSeek specializing in coding instructions (quantized IQ4_XS).

    tools 16b

    11.3K Pulls · 3 Tags · Updated 3 months ago

  • LONGTIME/DeepSeek-R1-Qwen3Based

    46 Pulls · 2 Tags · Updated 2 months ago

  • rockn/DeepSeek-R1-0528-Qwen3-8B-IQ4_NL

    DeepSeek-R1-0528-Qwen3-8B-IQ4_NL

    3,724 Pulls · 1 Tag · Updated 11 months ago
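Any of the models listed above can be pulled and run through the Ollama CLI or queried over its local REST API. As an illustrative sketch only (the model tag `deepseek-r1:8b` is taken from the size tags listed above, and the helper name `build_chat_request` is hypothetical), the following builds a JSON body for a `POST /api/chat` request with thinking enabled; actually sending it would require a running Ollama server at `http://localhost:11434`:

```python
import json

def build_chat_request(model: str, prompt: str, think: bool = True) -> str:
    # Hypothetical helper: assembles the JSON body for Ollama's /api/chat
    # endpoint. "think" requests the model's thinking mode on models tagged
    # "thinking" above; "stream": False asks for a single response object.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "think": think,
    }
    return json.dumps(payload)

body = build_chat_request("deepseek-r1:8b", "Why is the sky blue?")
parsed = json.loads(body)
print(parsed["model"])  # deepseek-r1:8b
print(parsed["think"])  # True
```

The same body could then be sent with any HTTP client (for example, `curl http://localhost:11434/api/chat -d @body.json`), assuming the model has first been pulled with `ollama pull deepseek-r1:8b`.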

© 2026 Ollama