The 7B model released by Mistral AI, updated to version 0.3.
tools · 7b
5.6M Pulls · Updated 4 months ago
f974a74358d6 · 4.1GB

- model: arch llama · parameters 7.25B · quantization Q4_0 · 4.1GB
- params: `{"stop":["[INST]","[/INST]"]}` · 30B
- template (truncated): `{{- if .Messages }}{{- range $index, $_ := .Messages }}{{- if eq .Role "user" }}{{- if and (eq (l` · 801B
- license: Apache License, Version 2.0, January 2004 · 11kB
Readme
Mistral is a 7B parameter model, distributed under the Apache license. It is available in both instruct (instruction-following) and text-completion variants.
The Mistral AI team has noted that Mistral 7B:
- Outperforms Llama 2 13B on all benchmarks
- Outperforms Llama 1 34B on many benchmarks
- Approaches CodeLlama 7B performance on code, while remaining good at English tasks
Versions
Tag | Date | Notes
---|---|---
v0.3 (latest) | 05/22/2024 | A new version of Mistral 7B that supports function calling.
v0.2 | 03/23/2024 | A minor release of Mistral 7B.
v0.1 | 09/27/2023 | Initial release.
Function calling
Mistral 0.3 supports function calling with Ollama’s raw mode.
Example raw prompt:

```
[AVAILABLE_TOOLS] [{"type": "function", "function": {"name": "get_current_weather", "description": "Get the current weather", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}, "format": {"type": "string", "enum": ["celsius", "fahrenheit"], "description": "The temperature unit to use. Infer this from the users location."}}, "required": ["location", "format"]}}}][/AVAILABLE_TOOLS][INST] What is the weather like today in San Francisco [/INST]
```

Example response:

```
[TOOL_CALLS] [{"name": "get_current_weather", "arguments": {"location": "San Francisco, CA", "format": "celsius"}}]
```
For more information on raw mode, see the API documentation.
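The prompt and response formats above can be assembled and parsed programmatically. Here is a minimal Python sketch; the helper names (`build_raw_prompt`, `parse_tool_calls`) are our own illustration, not part of Ollama:

```python
import json

# Build the raw prompt Mistral v0.3 expects: the tool schemas go in an
# [AVAILABLE_TOOLS] block, followed by the user turn in [INST] tags.
def build_raw_prompt(tools, user_message):
    return (
        f"[AVAILABLE_TOOLS] {json.dumps(tools)}[/AVAILABLE_TOOLS]"
        f"[INST] {user_message} [/INST]"
    )

# Parse a completion that begins with [TOOL_CALLS]: the remainder is a
# JSON array of {"name": ..., "arguments": {...}} objects.
def parse_tool_calls(completion):
    prefix = "[TOOL_CALLS]"
    stripped = completion.strip()
    if not stripped.startswith(prefix):
        return None  # plain-text answer, no tool call
    return json.loads(stripped[len(prefix):])

weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string",
                             "description": "The city and state, e.g. San Francisco, CA"},
                "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location", "format"],
        },
    },
}

prompt = build_raw_prompt([weather_tool],
                          "What is the weather like today in San Francisco")
reply = ('[TOOL_CALLS] [{"name": "get_current_weather", '
         '"arguments": {"location": "San Francisco, CA", "format": "celsius"}}]')
calls = parse_tool_calls(reply)
print(calls[0]["name"])  # get_current_weather
```

In raw mode the template above is bypassed entirely, so your application is responsible for producing exactly this token layout.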
Variations

Variation | Description
---|---
instruct | Instruct models follow instructions.
text | Text models are the base foundation model without any fine-tuning for conversations, and are best used for simple text completion.
Usage
CLI
Instruct:

```
ollama run mistral
```
API
Example:

```shell
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Here is a story about llamas eating grass"
}'
```
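By default the generate endpoint streams its answer as newline-delimited JSON objects, each carrying a `response` fragment and a final object with `"done": true`. A minimal Python sketch of reassembling such a stream (the sample lines are illustrative fragments, not real model output, since no server is assumed running):

```python
import json

# Collect a streamed /api/generate response: each line is a JSON object
# with a "response" text fragment; the last object has "done": true.
def collect_stream(lines):
    text = []
    for line in lines:
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Example stream as it might arrive from the endpoint above.
sample = [
    '{"model":"mistral","response":"Here is","done":false}',
    '{"model":"mistral","response":" a story.","done":false}',
    '{"model":"mistral","response":"","done":true}',
]
print(collect_stream(sample))  # Here is a story.
```

Passing `"stream": false` in the request body instead returns the whole completion in a single JSON object.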