452 Downloads Updated 1 year ago
ollama run robmaster/Spatz-14B-de
1e7b36961a25 · 16GB
Note: the context size must be specified explicitly, otherwise it defaults to only 2048 tokens. The model supports up to 128,000 tokens; 32,000 is a good value. Start the Ollama server with the following parameter:
OLLAMA_CONTEXT_LENGTH=32000 ollama serve
You can then test the model with:
ollama run robmaster/Spatz-14B-de
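If restarting the server is not an option, the context window can also be requested per call through Ollama's REST API via the `num_ctx` option. A minimal sketch, assuming the server is running locally on the default port 11434 and the hypothetical prompt below:

```shell
# Ask the model a question with an explicit 32,000-token context window
# (num_ctx overrides the server default for this request only)
curl http://localhost:11434/api/generate -d '{
  "model": "robmaster/Spatz-14B-de",
  "prompt": "Wer bist du?",
  "options": { "num_ctx": 32000 },
  "stream": false
}'
```

Setting `num_ctx` per request avoids reserving the large context for every model the server hosts.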