1,210 pulls · 7 months ago

Arcee-Blitz (24B) is a new Mistral-based 24B model distilled from DeepSeek, designed to be both fast and efficient. We view it as a practical “workhorse” model that can tackle a range of tasks without the overhead of larger architectures.

24b · 7 months ago · d31a66a19728 · 14GB
  • Architecture: llama
  • Parameters: 23.6B
  • Quantization: Q4_K_M
  • License: Apache 2.0
  • Params: { "stop": [ "</s>" ] }
  • Template (truncated): {{- /* Define default system message */ -}} {{- $default_system_message := "You are Virtouso Medium
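The stop sequence and template above come from the model's Ollama configuration. A minimal Modelfile sketch showing how such a configuration would be declared (the GGUF filename here is a hypothetical placeholder, not the actual artifact name):

```
# Minimal Ollama Modelfile sketch; the GGUF filename is a hypothetical placeholder.
FROM ./arcee-blitz-q4_k_m.gguf

# Stop generation at the end-of-sequence token, matching the params above.
PARAMETER stop "</s>"
```

A model is built from such a file with `ollama create <name> -f Modelfile`.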

Readme

This is an uncensored version of arcee-ai/Arcee-Blitz created with abliteration (see remove-refusals-with-transformers to learn more). Abliteration is a crude, proof-of-concept technique for removing refusals from an LLM without using TransformerLens.
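The core idea behind abliteration can be sketched in a few lines: estimate a "refusal direction" in activation space (typically the difference of mean activations on harmful vs. harmless prompts) and project it out of the model's weight matrices. The toy NumPy example below illustrates only the math; the function names and random stand-in activations are assumptions for illustration, not the actual remove-refusals-with-transformers code, which operates on real transformer weights.

```python
import numpy as np

def refusal_direction(harmful_acts, harmless_acts):
    """Unit vector from the harmless mean activation to the harmful mean."""
    d = harmful_acts.mean(axis=0) - harmless_acts.mean(axis=0)
    return d / np.linalg.norm(d)

def ablate_weight(W, r):
    """Project the refusal direction out of a weight matrix's output space:
    W' = W - r r^T W, so W' x has no component along r for any input x."""
    r = r.reshape(-1, 1)
    return W - r @ (r.T @ W)

# Toy stand-ins for layer activations and a weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
r = refusal_direction(rng.normal(1.0, 0.1, size=(16, 8)),
                      rng.normal(0.0, 0.1, size=(16, 8)))
W2 = ablate_weight(W, r)

x = rng.normal(size=8)
print(abs(r @ (W2 @ x)))  # numerically ~0: the output has no refusal component
```

Applying this orthogonalization to every relevant weight matrix is what removes the refusal behavior without retraining.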

References

Huggingface

x.com/support_huihui

Donation

Your donation helps us continue development and improvement; even a cup of coffee makes a difference.
  • bitcoin:
  bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge