19.4K pulls · Updated 4 days ago

Specialized uncensored/abliterated quants of OpenAI's new 20B MoE (Mixture of Experts) model, running at 80+ tokens/s (quantized to Q5_1).

thinking 20b
d8ba2f9a17b3 · 18B

Parameters:
{
  "temperature": 1
}
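
The JSON block above is the tag's sampling-parameter set. In an Ollama Modelfile, the same setting is expressed with a `PARAMETER` directive; a minimal sketch (the `FROM` value is a placeholder, not the actual model reference from this page):

```
# Placeholder base model; substitute the real GGUF file or model tag.
FROM ./model-q5_1.gguf
PARAMETER temperature 1
```

A custom variant can then be built from this file with `ollama create <name> -f Modelfile`.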