Mistral Small is a lightweight model designed for cost-effective use in tasks like translation and summarization.
tools · 22b · 41.6K Pulls · Updated 7 weeks ago
d095cd553b04 · 13GB

model
- arch: llama
- parameters: 22.2B
- quantization: Q4_0

params (47B)
- stop: ["[INST]", "[/INST]", "</s>"]
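The params entry is plain JSON; the escape sequence \u003c/s\u003e decodes to the end-of-sequence token </s>. A minimal sketch of reading the stop tokens from the raw string as it appears on this page:

```python
import json

# Raw params string as published on the model page; "\u003c/s\u003e"
# is the JSON escape for "</s>".
raw_params = '{"stop":["[INST]","[/INST]","\\u003c/s\\u003e"]}'

# Parsing the JSON resolves the escape automatically.
stop_tokens = json.loads(raw_params)["stop"]
print(stop_tokens)  # ['[INST]', '[/INST]', '</s>']
```

These are the sequences at which generation halts: the instruction delimiters [INST] / [/INST] and the end-of-sequence marker.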
template (900B, truncated)

{{- if .Messages }}
{{- range $index, $_ := .Messages }}
{{- if eq .Role "user" }}
{{- if and (le (l
license (11kB, truncated)

# Mistral AI Research License
If You want to use a Mistral Model, a Derivative or an Output for any
Readme
Mistral Small v24.09 is an advanced small language model of 22B parameters with improved human alignment, reasoning capabilities, and code generation.
Key features
- Cost-efficient: Offers a mid-point between Mistral NeMo 12B and Mistral Large 2 for various use cases.
- Versatile: Excels in tasks such as translation, summarization, and sentiment analysis.
- Flexible deployment: Can be deployed across various platforms and environments.
- Performance upgrade: Significant improvements over the previous Mistral Small v24.02 model.
- Balanced solution: Provides a fast and reliable option without the need for full-blown general purpose models.
- Long context: supports a 128k sequence length.
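Once pulled, the model can be served locally by Ollama and queried over its REST API. The sketch below only builds the JSON request body for Ollama's /api/generate endpoint; the model tag mistral-small and the default host URL are assumptions based on this page, and actually sending the request requires a running Ollama server.

```python
import json

# Default Ollama endpoint (assumption; adjust host/port for your deployment).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt: str, model: str = "mistral-small") -> str:
    """Build the JSON body for a non-streaming /api/generate call."""
    body = {
        "model": model,    # model tag as pulled from the library (assumption)
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response instead of a stream
    }
    return json.dumps(body)

# Example body for one of the listed use cases, summarization:
payload = build_generate_request("Summarize: Mistral Small is a 22B model.")
print(payload)
```

The body could then be POSTed to OLLAMA_URL with any HTTP client; the stop tokens from the params entry above are applied server-side, so they do not need to be repeated in the request.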