LoTUs5494/mistral-small-3.1
1,110 downloads · Updated 8 months ago
Mistral-Small-3.1-24B-Instruct-2503 is a quantized GGUF variant of Mistral Small 3.1 24B Instruct (2503), optimized to run smoothly on GPUs with as little as 16 GB of VRAM. A minimal usage sketch follows the tag table below.
Tags: tools · 24b
5 models

| Name | Digest | Size | Context | Input | Updated |
|------|--------|------|---------|-------|---------|
| mistral-small-3.1:latest | a21a779aea67 | 15GB | 128K | Text | 8 months ago |
| mistral-small-3.1:Q4_K_L | a21a779aea67 | 15GB | 128K | Text | 8 months ago |
| mistral-small-3.1:24b | 4e994e0f85a0 | 13GB | 128K | Text | 8 months ago |
| mistral-small-3.1:Q4_K_M | 659a973c06d5 | 14GB | 128K | Text | 8 months ago |
| mistral-small-3.1:24b-instruct-2503-iq4_NL | 4e994e0f85a0 | 13GB | 128K | Text | 8 months ago |
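To try one of these tags locally, you can pull it with Ollama and query it through the local REST API. The sketch below is a minimal example, assuming an Ollama server is running on its default port (11434) and that the `LoTUs5494/mistral-small-3.1:Q4_K_M` tag has already been pulled; the prompt text is illustrative, not from this page.

```python
# Minimal sketch: query a locally pulled mistral-small-3.1 tag through
# Ollama's REST API. Assumes `ollama pull LoTUs5494/mistral-small-3.1:Q4_K_M`
# has already been run and the server is listening on localhost:11434.
import json
import urllib.request

MODEL = "LoTUs5494/mistral-small-3.1:Q4_K_M"  # any tag from the table above

payload = json.dumps({
    "model": MODEL,
    "prompt": "Summarize what a GGUF quantized model is in one sentence.",
    "stream": False,  # return one JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

print(body["response"])
```

Any of the tags in the table can be substituted for the `MODEL` value; the Q4_K_M variant (14GB) is one of the builds intended to fit on a 16 GB GPU, per the description above.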