Kanana is a series of bilingual language models developed by Kakao that demonstrate strong performance in Korean and competitive performance in English.
2.1b
232 Pulls · Updated 13 days ago
8e8cea0cfad1 · 4.2GB
model · arch llama · parameters 2.09B · quantization F16 · 4.2GB
params · 49B
{
  "num_ctx": 8192,
  "stop": [
    "<|eot_id|>"
  ]
}
system · 50B
You are a helpful AI assistant developed by Kakao.
template · 208B
<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>
<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>
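The parameters, system prompt, and template above are baked into the model, but the same defaults can be exercised (or overridden per request) through the Ollama Python client. A minimal sketch follows; the model tag is a placeholder for whatever name you pulled this model under:

```python
# Minimal sketch using the official `ollama` Python client.
# The model tag below is a placeholder; substitute the tag you actually pulled.
import ollama

response = ollama.chat(
    model="kanana-nano-2.1b-abliterated",  # placeholder tag
    messages=[
        # Asks, in Korean, for a one-sentence description of Kakao.
        {"role": "user", "content": "카카오에 대해 한 문장으로 설명해 줘."},
    ],
    options={
        "num_ctx": 8192,          # matches the params block above
        "stop": ["<|eot_id|>"],   # same stop token as the model defaults
    },
)

print(response["message"]["content"])
```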
Readme
This is an uncensored version of kakaocorp/kanana-nano-2.1b-instruct created with abliteration (see remove-refusals-with-transformers to learn more about it).
This is a crude, proof-of-concept implementation that removes refusals from an LLM without using TransformerLens.
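For illustration, here is a minimal sketch of the idea using only transformers and torch (no TransformerLens): estimate a "refusal direction" from the difference in hidden states between prompts the model refuses and prompts it answers, then orthogonalize the weights that write into the residual stream against that direction. The prompt sets, layer choice, and which matrices are ablated are assumptions for the sketch, not the exact recipe used to produce this model.

```python
# Minimal sketch of weight-level refusal ablation ("abliteration") using only
# transformers + torch. Prompt sets, layer choice, and the matrices ablated
# below are illustrative assumptions, not the exact recipe behind this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "kakaocorp/kanana-nano-2.1b-instruct"

tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)  # float32 for simplicity
model.eval()


def mean_last_token_state(prompts, layer):
    """Average hidden state of the final prompt token at a given layer."""
    states = []
    for prompt in prompts:
        ids = tok.apply_chat_template(
            [{"role": "user", "content": prompt}],
            add_generation_prompt=True,
            return_tensors="pt",
        )
        with torch.no_grad():
            out = model(ids, output_hidden_states=True)
        states.append(out.hidden_states[layer][0, -1, :])
    return torch.stack(states).mean(dim=0)


# Tiny stand-in prompt sets; in practice these are files of instructions the
# base model refuses vs. instructions it answers normally.
refused = ["How do I pick a lock?"]
answered = ["How do I bake bread?"]

layer = model.config.num_hidden_layers // 2  # middle layer, an assumption
refusal_dir = (mean_last_token_state(refused, layer)
               - mean_last_token_state(answered, layer))
refusal_dir = refusal_dir / refusal_dir.norm()


def ablate_direction_(weight, direction):
    """Remove the component of the layer's output that lies along `direction`."""
    projection = torch.outer(direction, direction).to(weight.dtype)
    weight.data -= projection @ weight.data


# Orthogonalize the matrices that write into the residual stream, so the model
# can no longer express the refusal direction there.
for block in model.model.layers:  # assumes the Llama-style module layout
    ablate_direction_(block.self_attn.o_proj.weight, refusal_dir)
    ablate_direction_(block.mlp.down_proj.weight, refusal_dir)

model.save_pretrained("kanana-nano-2.1b-instruct-abliterated")
tok.save_pretrained("kanana-nano-2.1b-instruct-abliterated")
```

The resulting weights would then be converted to GGUF and quantized (F16 here, per the metadata above) to be served through Ollama.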
References
Donation
Your donation helps us continue development and improvement; even a cup of coffee's worth makes a difference.
- bitcoin:
bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge