everythinglm

An uncensored Llama 2-based model with support for a 16K context window.


The Everything Language Model is a Llama 2-based model with a 16k context released by Totally Not An LLM (Kai Howard). It was trained with the EverythingLM Dataset and is uncensored.

CLI

ollama run everythinglm

Once the model is loaded, raise the context size to 16K:

/set parameter num_ctx 16384
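
Instead of setting the parameter each session, the context size can be baked into a derived model with a Modelfile. This is a sketch using Ollama's standard Modelfile syntax; the name `everythinglm-16k` is an arbitrary choice:

```
FROM everythinglm
PARAMETER num_ctx 16384
```

Create and run the derived model with `ollama create everythinglm-16k -f Modelfile`, then `ollama run everythinglm-16k`.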

API

Example:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 16384
  }
}'
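
The same request can be sent programmatically. A minimal Python sketch using only the standard library, assuming a local Ollama server on the default port; by default `/api/generate` streams newline-delimited JSON chunks, which the helper below joins into one string:

```python
import json
import urllib.request

def build_payload(prompt, model="everythinglm", num_ctx=16384):
    """JSON body for /api/generate with the context window raised to 16K."""
    return {
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    }

def generate(prompt, host="http://localhost:11434", **kwargs):
    """POST to /api/generate and join the streamed response chunks."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt, **kwargs)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Each non-empty line is a JSON object carrying a "response" fragment.
        return "".join(
            json.loads(line)["response"] for line in resp if line.strip()
        )
```

Call `generate("Why is the sky blue?")` with the server running to get the full completion as a single string.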

Reference

13b parameters; original source: Totally Not An LLM