An extension of Mistral to support context windows of 64K or 128K.

9,592 Pulls Updated 4 months ago

Readme

Yarn Mistral is a model based on Mistral with an extended context window of up to 128k tokens. It was developed by Nous Research, which applied the YaRN method to further train the model to support larger context windows.

CLI

64k context size:

ollama run yarn-mistral

128k context size:

ollama run yarn-mistral:7b-128k

API

Example:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "yarn-mistral:7b-128k",
  "prompt": "Here is a story about llamas eating grass"
}'
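By default, `/api/generate` streams newline-delimited JSON objects whose `response` fields concatenate into the full completion, with the final object carrying `"done": true`. A minimal sketch of assembling such a stream in Python (the sample lines below are illustrative, not real model output):

```python
import json

def collect_stream(ndjson_lines):
    # Each streamed line is a JSON object; join the "response"
    # fragments until an object with "done": true arrives.
    out = []
    for line in ndjson_lines:
        obj = json.loads(line)
        out.append(obj.get("response", ""))
        if obj.get("done"):
            break
    return "".join(out)

# Illustrative stream fragments (hypothetical output)
sample = [
    '{"model":"yarn-mistral:7b-128k","response":"Llamas ","done":false}',
    '{"model":"yarn-mistral:7b-128k","response":"graze.","done":false}',
    '{"model":"yarn-mistral:7b-128k","response":"","done":true}',
]
print(collect_stream(sample))  # Llamas graze.
```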

References

Hugging Face

YaRN: Efficient Context Window Extension of Large Language Models