everythinglm:13b-16k-q4_1

40.6K pulls · Updated 1 year ago

Uncensored Llama 2-based model with support for a 16K context window.

9108ed6fceeb · 8.2GB · llama · 13B · Q4_1
System prompt: You are a helpful AI assistant.
Parameters: { "stop": [ "User:", "Assistant:" ] }
Template: {{ .System }} User: {{ .Prompt }} Assistant:
License: LLAMA 2 COMMUNITY LICENSE AGREEMENT (Llama 2 Version Release Date: July 18, 2023)
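
The system prompt, stop parameters, and template listed above are the model's packaged defaults. As a rough sketch, the same defaults could be expressed as Ollama Modelfile directives like the following (a customization starting point, not the model's actual build file):

# Sketch: the packaged defaults expressed as Modelfile directives.
FROM everythinglm:13b-16k-q4_1
SYSTEM You are a helpful AI assistant.
PARAMETER stop "User:"
PARAMETER stop "Assistant:"
TEMPLATE """{{ .System }} User: {{ .Prompt }} Assistant:"""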

Readme

The Everything Language Model is a Llama 2-based model with a 16K context window, released by Totally Not An LLM (Kai Howard). It was trained on the EverythingLM Dataset and is uncensored.

CLI

ollama run everythinglm

Once loaded, change the context size to 16K:

/set parameter num_ctx 16384
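
To avoid repeating the /set step every session, the larger context can be baked into a derived model. A minimal sketch, assuming a local file named Modelfile and an illustrative model name everythinglm-16k:

# Create a derived model with the 16K context window set permanently.
cat > Modelfile <<'EOF'
FROM everythinglm:13b-16k-q4_1
PARAMETER num_ctx 16384
EOF
ollama create everythinglm-16k -f Modelfile
ollama run everythinglm-16k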

API

Example:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 16384
  }
}'
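
By default, /api/generate streams the reply back as a sequence of JSON objects. Setting the API's stream field to false returns a single complete JSON object instead, which is often easier to handle in scripts; the request below is the same example with streaming disabled:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": {
    "num_ctx": 16384
  }
}'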

Reference

13b parameters · original source: Totally Not An LLM