deepseek v3 · Ollama Search
  • deepseek-v3

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token (a minimal sketch of querying it through Ollama appears after this list).

    671b

1.3M Pulls · 5 Tags · Updated 4 months ago

  • Leungwaikuen818928/deepseekv.3

For work and study.

  • Seamanlee/deepseekv3

A test upload.

  • dhampir/deepseekv3

First time.

  • easternland/deepseekv3

  • Midagama/deepseekV3

  • coolbamboo/deepseekV3

    deepseekV3

  • nezahatkorkmaz/deepseek-v3

DeepSeek-V3 from Hugging Face: your powerful solution for handling complex requests and advanced coding tasks. Enhance your development workflow with state-of-the-art code assistance and intelligent problem-solving capabilities.

    tools

17.3K Pulls · 1 Tag · Updated 4 months ago

  • huihui_ai/deepseek-v3

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

7,001 Pulls · 2 Tags · Updated 3 months ago

  • milkey/deepseek-v3-UD

(Unsloth Dynamic Quants) A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

1,933 Pulls · 3 Tags · Updated 3 months ago

  • 8b-wraith/deepseek-v3-0324

Quants of deepseek-v3-0324; Q2_K is the lowest offered here. Quantization follows quantized = round((original - zero_point) / scale) (a worked sketch of this formula appears after the list).

737 Pulls · 1 Tag · Updated 1 month ago

  • lwk/v3

    ollama run deepseek-v3

    tools

629 Pulls · 1 Tag · Updated 3 months ago

  • huihui_ai/deepseek-v3-abliterated

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

    671b

619 Pulls · 5 Tags · Updated 1 month ago

  • sunny-g/deepseek-v3-0324

Dynamic quants from Unsloth, merged.

246 Pulls · 1 Tag · Updated 1 month ago

  • huihui_ai/deepseek-v3-pruned

DeepSeek-V3-Pruned-Coder-411B is a pruned version of DeepSeek-V3, reduced from 256 experts to 160. The pruned model is mainly intended for code generation.

    411b

178 Pulls · 5 Tags · Updated 1 month ago

  • MFDoom/deepseek-v3-tool-calling

    tools 671b

101 Pulls · 2 Tags · Updated 3 months ago

  • xiaowangge/deepseek-v3-qwen2.5

This model was developed from the DistilQwen2.5-DS3-0324-Series.

    tools 32b

82 Pulls · 7 Tags · Updated 1 week ago

  • org/deepseek-v3-fast

Single-file version (Dynamic Quants) of a strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

66 Pulls · 4 Tags · Updated 2 months ago

  • lucataco/deepseek-v3-64k

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

18 Pulls · 1 Tag · Updated 3 months ago

  • iriver/deepseek-v3-0324
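
The flagship deepseek-v3 entry activates only 37B of its 671B parameters per token, about 5.5% of the weights per forward pass. As referenced in that entry, here is a minimal sketch of querying a locally pulled copy through Ollama's REST API (the /api/generate endpoint on the default port 11434). The prompt text is illustrative, and the model must already be pulled, e.g. via ollama pull deepseek-v3:

    import requests

    # Ask a locally served deepseek-v3 for a single non-streamed completion.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "deepseek-v3",   # any model tag from the entries above works here
            "prompt": "Summarize Mixture-of-Experts routing in two sentences.",
            "stream": False,          # return one JSON object instead of a token stream
        },
        timeout=600,                  # a 671B MoE can take a while to respond
    )
    resp.raise_for_status()
    print(resp.json()["response"])

The same request works against any variant in the list by swapping the model field, e.g. "huihui_ai/deepseek-v3".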
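
The 8b-wraith/deepseek-v3-0324 entry quotes the round-to-nearest affine quantization formula, quantized = round((original - zero_point) / scale). Below is a minimal sketch of that formula in isolation, assuming a single per-tensor scale and zero point; the actual Q2_K format in GGUF is block-wise with per-block scales, so this only illustrates the arithmetic:

    import numpy as np

    def quantize(x: np.ndarray, n_bits: int = 2):
        """q = round((x - zero_point) / scale), clipped to the n-bit range."""
        qmax = 2 ** n_bits - 1                  # e.g. 3 for a Q2-style 2-bit grid
        zero_point = float(x.min())             # offset so the minimum maps to 0
        scale = (float(x.max()) - zero_point) / qmax
        if scale == 0.0:                        # guard against constant tensors
            scale = 1.0
        q = np.clip(np.round((x - zero_point) / scale), 0, qmax).astype(np.uint8)
        return q, scale, zero_point

    def dequantize(q, scale, zero_point):
        """Approximate inverse: x ≈ q * scale + zero_point."""
        return q.astype(np.float32) * scale + zero_point

    x = np.random.randn(8).astype(np.float32)
    q, s, z = quantize(x)                       # 2 bits -> only 4 representable levels
    print(x)
    print(dequantize(q, s, z))                  # coarse reconstruction, as expected at Q2

Running this shows why Q2_K is "the lowest here": with only four representable levels per block, reconstruction error is large, which is the price paid for shrinking a 671B-parameter model to a loadable size.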

© 2025 Ollama Inc.