Large World Model (LWM) is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset.
7b · 686 Pulls · Updated 10 months ago
9a9a14cb0bde · 4.8GB

model
arch llama · parameters 6.74B · quantization Q5_K_M · 4.8GB
system (34B)
"""You are a helpful assistant."""
template (82B)
{{ if and .First .System }}{{ .System }}
{{ end }}
USER: {{ .Prompt }}
ASSISTANT:
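For illustration only (assuming the default system prompt above and a user prompt of "Hello"; exact whitespace may differ), the template renders to:

You are a helpful assistant.
USER: Hello
ASSISTANT: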
Readme
A family of 7B-parameter models (LWM-Text, LWM-Text-Chat) capable of processing long text documents of over 1M tokens.
CLI
ollama run ifioravanti/lwm
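The CLI also accepts a one-shot prompt as an argument (the prompt below is just an example):

ollama run ifioravanti/lwm "Here is a story about llamas eating grass"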
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "ifioravanti/lwm",
"prompt": "Here is a story about llamas eating grass"
}'
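The response is streamed back as newline-delimited JSON chunks by default. As a minimal variant of the same request (illustrative prompt), a single non-streamed response can be requested with the standard "stream" flag:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ifioravanti/lwm",
  "prompt": "Here is a story about llamas eating grass",
  "stream": false
}'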
Memory requirements
7B models generally require at least 8GB of RAM, but with a context window of up to 1M tokens, memory usage can grow far beyond that depending on how much context is passed in.
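If memory is a concern, one practical mitigation is to cap the context window rather than relying on the full 1M tokens. A minimal sketch, assuming Ollama's standard num_ctx option (the 8192 value is purely illustrative):

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ifioravanti/lwm",
  "prompt": "Here is a story about llamas eating grass",
  "options": { "num_ctx": 8192 }
}'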