916 Downloads Updated 7 months ago
67a2a027906e · 20GB
Quantized with YaRN RoPE scaling to a 128k context window (scaling factor 4). Requires Ollama >= 0.6.6 to run. `num_ctx` in the Modelfile defaults to 64k only because I don't have gobs of VRAM.
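If you do have the VRAM, a minimal sketch of restoring the full 128k window is a derived Modelfile that inherits everything and overrides only the context length (`mymodel` below is a placeholder for whatever name you pulled this model under):

```
# Derived Modelfile: inherit the base model, override only num_ctx.
# "mymodel" is a hypothetical placeholder name, not the actual tag.
FROM mymodel
PARAMETER num_ctx 131072
```

Build it with `ollama create mymodel-128k -f Modelfile`, or set it per-session in the REPL with `/set parameter num_ctx 131072`. Note that KV-cache memory use grows with `num_ctx`, which is why the default here stays at 64k.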