Uncensored Llama 2 model by George Sung and Jarrad Hope.
1.5M Pulls 34 Tags Updated 2 years ago
2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research.
851.9K Pulls 15 Tags Updated 1 year ago
Uncensored 8x7B and 8x22B fine-tuned models based on the Mixtral mixture-of-experts models, which excel at coding tasks. Created by Eric Hartford.
774.8K Pulls 70 Tags Updated 12 months ago
The uncensored Dolphin model based on Mistral that excels at coding tasks. Updated to version 2.8.
460.6K Pulls 120 Tags Updated 1 year ago
Wizard Vicuna Uncensored is a 7B, 13B, and 30B parameter model based on Llama 2, uncensored by Eric Hartford.
285.5K Pulls 49 Tags Updated 2 years ago
A 7B and 15B uncensored variant of the Dolphin model family that excels at coding, based on StarCoder2.
141K Pulls 35 Tags Updated 1 year ago
An uncensored version of the WizardLM model.
113.5K Pulls 18 Tags Updated 2 years ago
An uncensored Llama 2-based model with support for a 16K context window.
77.9K Pulls 18 Tags Updated 1 year ago
An uncensored version of the original Llama-3.2-3B-Instruct, created via abliteration.
768.8K Pulls 1 Tag Updated 1 year ago
An uncensored version of Llama 3. 🦅
111.5K Pulls 1 Tag Updated 1 week ago
Uncensored versions of the Dolphin family of models. 🐊
91K Pulls 1 Tag Updated 1 week ago
Qwen3, but Josiefied and uncensored.
81.6K Pulls 52 Tags Updated 1 month ago
A small fine-tuned uncensored model for training. 🐁
34.9K Pulls 1 Tag Updated 1 week ago
Abliterated v3 Llama 3.1 8B with an uncensored prompt.
33.1K Pulls 38 Tags Updated 1 year ago
Dolphin is an uncensored, multilingual, chat-tuned model by Eric Hartford that follows the system prompt and is good at coding and a variety of other tasks.
26.6K Pulls 15 Tags Updated 1 year ago
Specialized uncensored/abliterated quantizations of the new OpenAI 20B mixture-of-experts (MoE) model, running at 80+ tokens/s (quantized to Q5_1).
25.6K Pulls 1 Tag Updated 3 months ago