Large World Model (LWM) is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset.
7b
663 Pulls · Updated 8 months ago
6434e863af34 · 7.2GB
model · arch llama · parameters 6.74B · quantization Q8_0 · 7.2GB

system · 34B
"""You are a helpful assistant."""

template · 82B
{{ if and .First .System }}{{ .System }}
{{ end }}
USER: {{ .Prompt }}
ASSISTANT:
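For illustration only (this is an assumption from reading the template, not captured model output), with the default system prompt and the prompt used in the API example below, the template renders roughly as:

You are a helpful assistant.
USER: Here is a story about llamas eating grass
ASSISTANT: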
Readme
A family of 7B-parameter models (LWM-Text, LWM-Text-Chat) capable of processing long text documents of over 1M tokens.
CLI
ollama run ifioravanti/lwm
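A prompt can also be passed directly on the command line; a minimal sketch reusing the example prompt from the API section below:

ollama run ifioravanti/lwm "Here is a story about llamas eating grass"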
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "ifioravanti/lwm",
"prompt": "Here is a story about llamas eating grass"
}'
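The generate endpoint streams the response as a series of JSON objects by default; a sketch of the same request returning a single non-streaming response, assuming the standard Ollama API "stream" field:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ifioravanti/lwm",
  "prompt": "Here is a story about llamas eating grass",
  "stream": false
}'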
Memory requirements
7B models generally require at least 8GB of RAM, but because this model supports a context window of over 1M tokens, actual memory use can grow far beyond that depending on how much context you pass.
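To keep memory in check, the context window used for a request can be capped with the num_ctx option (a sketch assuming Ollama's standard "options" field; 32768 is only an example value, not a recommendation):

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ifioravanti/lwm",
  "prompt": "Here is a story about llamas eating grass",
  "options": { "num_ctx": 32768 }
}'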