MathΣtral: a 7B model designed for math reasoning and scientific discovery by Mistral AI.
80.1K Pulls 17 Tags Updated 1 year ago
A family of efficient AI models under 10B parameters that perform strongly in science, math, and coding thanks to innovative training techniques.
1.6M Pulls 17 Tags Updated 12 months ago
Phi-4-mini brings significant enhancements in multilingual support, reasoning, and mathematics, and now supports function calling.
645.7K Pulls 5 Tags Updated 9 months ago
EXAONE Deep, a family of models ranging from 2.4B to 32B parameters developed and released by LG AI Research, exhibits superior capabilities in various reasoning tasks, including math and coding benchmarks.
310.7K Pulls 13 Tags Updated 9 months ago
Mistral Large 2 is Mistral's new flagship model that is significantly more capable in code generation, mathematics, and reasoning with 128k context window and support for dozens of languages.
297.9K Pulls 32 Tags Updated 1 year ago
Qwen2 Math is a series of specialized math language models built upon the Qwen2 LLMs that significantly outperform open-source models, and even closed-source models (e.g., GPT-4o), in mathematical capability.
198K Pulls 52 Tags Updated 1 year ago
Model focused on math and logic problems
142.8K Pulls 64 Tags Updated 1 year ago
Athene-V2 is a 72B parameter model that excels at code completion, mathematics, and log extraction tasks.
124.4K Pulls 17 Tags Updated 1 year ago
Building upon Mistral Small 3, Mistral Small 3.1 (2503) adds state-of-the-art vision understanding and enhances long context capabilities up to 128k tokens without compromising text performance.
443.3K Pulls 5 Tags Updated 8 months ago
🎩 Magicoder is a family of 7B parameter models trained on 75K synthetic instruction data using OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets.
67.8K Pulls 18 Tags Updated 2 years ago
Google Gemma 2 is a high-performing and efficient model available in three sizes: 2B, 9B, and 27B.
11.4M Pulls 94 Tags Updated 1 year ago
State-of-the-art large embedding model from mixedbread.ai
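An embedding model like this one maps text to fixed-length vectors, which are typically compared with cosine similarity to rank documents against a query. A minimal sketch of that comparison — the short vectors below are made-up placeholders, not real model output (a large embedding model produces vectors with hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional vectors standing in for real embedding output.
query = [0.1, 0.8, 0.3, 0.0]
doc_a = [0.1, 0.7, 0.4, 0.1]   # points in nearly the same direction as the query
doc_b = [0.9, 0.0, 0.1, 0.8]   # mostly orthogonal to the query

# The more similar document scores higher.
print(cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b))  # True
```

In a retrieval setup, every document is embedded once up front, and each incoming query is embedded and scored against the stored vectors this way.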
5.7M Pulls 4 Tags Updated 1 year ago
A state-of-the-art 12B model with 128k context length, built by Mistral AI in collaboration with NVIDIA.
3M Pulls 17 Tags Updated 4 months ago
SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters.
2.2M Pulls 49 Tags Updated 1 year ago
Kimi K2 Thinking is Moonshot AI's best open-source thinking model.
14.1K Pulls 1 Tag Updated 1 month ago
7B parameter text-to-SQL model made by MotherDuck and Numbers Station.
70K Pulls 17 Tags Updated 1 year ago
Sailor2 is a family of multilingual language models built for Southeast Asia, available in 1B, 8B, and 20B parameter sizes.
56.7K Pulls 13 Tags Updated 1 year ago
Chatbot AI Model
15 Pulls 1 Tag Updated 1 year ago