
Specialized uncensored/abliterated quants of OpenAI's new 20B Mixture of Experts (MoE) model, running at 80+ tokens/second in Q5_1 quantization; a loading sketch follows below.

Tags: thinking, 20b
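
A minimal sketch of loading one of these GGUF quants with llama-cpp-python, assuming a CUDA/Metal build for GPU offload; the model filename is a placeholder, not the actual file in this repo, and real throughput depends on your hardware.

```python
# Minimal sketch: load a Q5_1 GGUF quant with llama-cpp-python and generate.
from llama_cpp import Llama

llm = Llama(
    model_path="openai-20b-moe-q5_1.gguf",  # placeholder; use the actual file from this repo
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU; needed to approach the quoted 80+ T/S
)

out = llm("Explain mixture-of-experts routing in one paragraph.", max_tokens=256)
print(out["choices"][0]["text"])
```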