second_constantine/gpt-oss-u

27.9K Downloads · Updated 1 month ago
Specialized uncensored quants of the new OpenAI 20B MoE (Mixture of Experts) model, running at 80+ tokens/second. The "HERETIC" method results in an uncensored model, quantized to Q5_1.
Tags: thinking · 20b
1 model

| Name          | Digest       | Size | Context | Input | Updated     |
|---------------|--------------|------|---------|-------|-------------|
| gpt-oss-u:20b | 2359bd174e97 | 16GB | 128K    | Text  | 1 month ago |