185.5K Pulls · Updated 1 year ago

OpenHermes 2.5 is a 7B model fine-tuned by Teknium on Mistral with fully open datasets.

Models


35 models

openhermes:latest

4.1GB · 32K context window · Text · 1 year ago

openhermes:v2

4.1GB · 32K context window · Text · 1 year ago

openhermes:v2.5 (latest)

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q2_K

3.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q3_K_S

3.2GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q3_K_M

3.5GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q3_K_L

3.8GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q4_0

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q4_1

4.6GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q4_K_S

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q4_K_M

4.4GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q5_0

5.0GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q5_1

5.4GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q5_K_S

5.0GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q5_K_M

5.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q6_K

5.9GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-q8_0

7.7GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2-fp16

14GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q2_K

3.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q3_K_S

3.2GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q3_K_M

3.5GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q3_K_L

3.8GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q4_0

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q4_1

4.6GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q4_K_S

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q4_K_M

4.4GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q5_0

5.0GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q5_1

5.4GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q5_K_S

5.0GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q5_K_M

5.1GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q6_K

5.9GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-q8_0

7.7GB · 32K context window · Text · 1 year ago

openhermes:7b-mistral-v2.5-fp16

14GB · 32K context window · Text · 1 year ago

openhermes:7b-v2

4.1GB · 32K context window · Text · 1 year ago

openhermes:7b-v2.5

4.1GB · 32K context window · Text · 1 year ago

Readme

Open Hermes 2 is a Mistral 7B model fine-tuned with fully open datasets. It matches 70B models on benchmarks and has strong multi-turn chat and system prompt capabilities. In total, the model was trained on 900,000 instructions, and it surpasses all previous versions of Nous-Hermes 13B and below.
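
Since the model supports system prompts, one simple way to set a persistent one is with an Ollama Modelfile (a minimal sketch; the model name my-openhermes is arbitrary):

FROM openhermes
SYSTEM "You are a concise assistant that answers in plain language."

ollama create my-openhermes -f Modelfile
ollama run my-openhermes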

Versions

Tag            Date        Notes
v2.5 (latest)  11/02/2023  Added ~100k examples of Code Instructions
v2             10/16/2023  Initial release of Open Hermes 2

Usage

CLI

ollama run openhermes
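
Any tag from the list above can be run directly; for example, the q4_K_M quantization of v2.5 (the default tag resolves to v2.5):

ollama run openhermes:7b-mistral-v2.5-q4_K_M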

API

Example:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "openhermes",
  "prompt": "Here is a story about llamas eating grass"
}'
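
The generate endpoint streams its response as a series of JSON objects by default. To receive a single JSON object instead, add "stream": false to the request body (standard Ollama API behavior, not specific to this model):

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "openhermes",
  "prompt": "Here is a story about llamas eating grass",
  "stream": false
}'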

References

Hugging Face