Mistral Small is a lightweight model designed for cost-effective use in tasks like translation and summarization.
tools
22b
43.8K Pulls Updated 8 weeks ago
28fc9c710e5f · 44GB
model
arch llama · parameters 22.2B · quantization F16 · 44GB
params
{"stop":["[INST]","[/INST]","</s>"]}
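The params entry above is a plain JSON blob; as a minimal sketch, it can be decoded to recover the stop tokens the runtime applies (note that `\u003c/s\u003e` is just the JSON unicode escaping of the `</s>` end-of-sequence marker):

```python
import json

# The params blob as it appears on the page; "\u003c" and "\u003e" are
# JSON escapes for "<" and ">", so the last entry decodes to "</s>".
params = '{"stop":["[INST]","[/INST]","\\u003c/s\\u003e"]}'

stop_tokens = json.loads(params)["stop"]
print(stop_tokens)  # ['[INST]', '[/INST]', '</s>']
```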
47B
template
{{- if .Messages }}
{{- range $index, $_ := .Messages }}
{{- if eq .Role "user" }}
{{- if and (le (l
900B
license
# Mistral AI Research License
If You want to use a Mistral Model, a Derivative or an Output for any
11kB
Readme
Mistral Small v24.09 is an advanced small language model of 22B parameters with improved human alignment, reasoning capabilities, and code generation.
Key features
- Cost-efficient: Offers a mid-point between Mistral NeMo 12B and Mistral Large 2 for various use cases.
- Versatile: Excels in tasks such as translation, summarization, and sentiment analysis.
- Flexible deployment: Can be deployed across various platforms and environments.
- Performance upgrade: Significant improvements over the previous Mistral Small v24.02 model.
- Balanced solution: provides a fast and reliable option without resorting to full-blown general-purpose models.
- Long context: 128k sequence length.
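As a sketch of how this model might be queried once pulled (e.g. via `ollama pull mistral-small`), the snippet below builds a request for Ollama's `/api/chat` endpoint against a locally running server; the default port (11434) and the `mistral-small` tag are assumptions based on this page:

```python
import json
import urllib.request

def build_chat_request(prompt, model="mistral-small"):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return a single response object, not a stream
    }
    return json.dumps(payload).encode("utf-8")

body = build_chat_request("Translate 'bonjour' into English.")

# Uncomment to send the request against a running Ollama instance:
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat", data=body,
#     headers={"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```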