974 Downloads Updated 9 months ago
c1a3db3b81c6 · 19GB
The other Mistral Small 3.1 24B models on here didn't offer Q6 quantization together with tool support. This one fits entirely in VRAM on a single RTX 3090 and gives good results.