NousResearch/Hermes-2-Pro-Mistral-7B
4,322 Pulls Updated 8 months ago
be0ad79940b4 · 3.2GB

model · arch llama · parameters 7.24B · quantization Q3_K_S · 3.2GB
license · Apache License 2.0 · 20B
params · 59B
{"stop": ["<|im_start|>", "<|im_end|>"]}
template · 156B
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
system · 707B
You are a function calling AI model. You are provided with function signatures within <tools></tools
Readme
github.com/adrienbrault/ollama-nous-hermes2pro
Ollama models of NousResearch/Hermes-2-Pro-Mistral-7B-GGUF.
$ ollama run adrienbrault/nous-hermes2pro:Q4_0 'Hey!'
Hello! How can I help you today? If you have any questions or need assistance, feel free to ask.
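Besides the CLI, the model can be queried over Ollama's local HTTP API. A minimal sketch, assuming a default Ollama server on localhost:11434 and the non-streaming form of the /api/generate endpoint:

```python
import json
import urllib.request

def build_generate_request(model, prompt):
    """Payload for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, host="http://localhost:11434"):
    """Send one generate request and return the model's text response."""
    data = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires a running Ollama server with the model pulled:
# print(generate("adrienbrault/nous-hermes2pro:Q4_0", "Hey!"))
```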
There are -tools and -json tags with the recommended system prompt for function calling and JSON mode.
You provide the tools with the user message:
$ ollama run adrienbrault/nous-hermes2pro:Q4_0-tools "<tools>$(cat examples/tool-stock.json)</tools>
Fetch the stock fundamentals data for Tesla (TSLA)"
<tool_call>
{"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}
</tool_call>
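The model wraps each call in `<tool_call>` tags with a JSON body. A minimal Python sketch for extracting and dispatching such calls, assuming the output format shown above; `get_stock_fundamentals` is a hypothetical stand-in for a real implementation:

```python
import json
import re

def get_stock_fundamentals(symbol):
    # Hypothetical tool implementation; returns placeholder data.
    return {"symbol": symbol, "pe_ratio": None}

# Registry mapping tool names (as declared in <tools>) to local functions.
TOOLS = {"get_stock_fundamentals": get_stock_fundamentals}

def dispatch_tool_calls(model_output):
    """Extract every <tool_call>...</tool_call> block and invoke the tool."""
    results = []
    for block in re.findall(r"<tool_call>\s*(\{.*?\})\s*</tool_call>",
                            model_output, re.DOTALL):
        call = json.loads(block)
        fn = TOOLS[call["name"]]
        results.append(fn(**call["arguments"]))
    return results

output = ('<tool_call>\n'
          '{"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}\n'
          '</tool_call>')
print(dispatch_tool_calls(output))  # [{'symbol': 'TSLA', 'pe_ratio': None}]
```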
Or a schema for the json mode:
$ ollama run adrienbrault/nous-hermes2pro:Q4_0-json "<schema>$(cat examples/user-schema.json)</schema>
Adrien Brault was born in 1991"
{"firstName": "Adrien", "lastName": "Brault", "age": 30}
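JSON mode makes the output parseable, but checking it against the schema's required fields before use is still worthwhile. A small sketch using the sample output above; the field names are inferred from that sample, since the schema file itself is not shown:

```python
import json

raw = '{"firstName": "Adrien", "lastName": "Brault", "age": 30}'
person = json.loads(raw)

# Required keys inferred from the sample output, not from the schema file.
required = {"firstName", "lastName", "age"}
missing = required - person.keys()
assert not missing, f"missing fields: {missing}"
print(person["firstName"], person["age"])  # Adrien 30
```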