Large World Model is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset.
7b
663 Pulls · Updated 8 months ago
e234bfbb6ed0 · 3.8GB
model    arch llama · parameters 6.74B · quantization Q4_0    3.8GB
system    """You are a helpful assistant."""    34B
template
{{ if and .First .System }}{{ .System }}
{{ end }}
USER: {{ .Prompt }}
ASSISTANT:
82B
Readme
A family of 7B parameter models (LWM-Text, LWM-Text-Chat) capable of processing long text documents of over 1M tokens.
CLI
ollama run ifioravanti/lwm
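You can also pass a prompt directly on the command line instead of using the interactive session. A minimal sketch (the prompt text is illustrative):
ollama run ifioravanti/lwm "Summarize a story about llamas eating grass."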
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "ifioravanti/lwm",
"prompt": "Here is a story about llamas eating grass"
}'
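By default, /api/generate streams the reply as a series of JSON objects. The sketch below, assuming the standard Ollama generate API, disables streaming and raises the context window through the options field; the num_ctx value is illustrative and should match what your hardware can hold:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ifioravanti/lwm",
  "prompt": "Here is a story about llamas eating grass",
  "stream": false,
  "options": { "num_ctx": 32768 }
}'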
Memory requirements
7B models generally require at least 8GB of RAM, but because this model supports a context window of up to 1M tokens, actual memory usage can be far higher depending on how much context is passed.
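One way to keep memory in check is to cap the context window in a custom variant. A minimal sketch using a standard Modelfile, where the model name and num_ctx value are illustrative:
# Modelfile: reduced-context variant of lwm
FROM ifioravanti/lwm
PARAMETER num_ctx 8192
Build and run the variant with:
ollama create lwm-8k -f Modelfile
ollama run lwm-8k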