
Specialized uncensored quants of the new OpenAI 20B MoE (Mixture of Experts) model, running at 80+ tokens/second. The "HERETIC" method produces the uncensored model, quantized here at Q5_1.

thinking 20b