huihui_ai/deepseek-v3-abliterated:671b-Q2_K
1,940 Downloads · Updated 5 months ago
A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
params · d318c0731575 · 160B
{
  "num_gpu": 1,
  "stop": [
    "<|begin▁of▁sentence|>",
    "<|end▁of▁sentence|>",
    "<|User|>",
    "<|Assistant|>"
  ]
}
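
These defaults (num_gpu, the number of layers offloaded to the GPU, and stop, DeepSeek's special chat tokens) can be overridden per request through Ollama's REST API. The sketch below is a minimal example, assuming a local Ollama server on the default port 11434 and that this tag has already been pulled; the prompt text is purely illustrative, and the options keys mirror the params file above.

  # Minimal sketch: query the model via Ollama's local REST API,
  # overriding the default parameters shown in the params file above.
  # Assumes the model has been pulled, e.g. with:
  #   ollama pull huihui_ai/deepseek-v3-abliterated:671b-Q2_K
  import requests

  response = requests.post(
      "http://localhost:11434/api/generate",
      json={
          "model": "huihui_ai/deepseek-v3-abliterated:671b-Q2_K",
          "prompt": "Explain Mixture-of-Experts routing in two sentences.",
          "stream": False,
          # These keys match the model's params file; adjust num_gpu
          # to offload more layers if VRAM allows.
          "options": {
              "num_gpu": 1,
              "stop": [
                  "<|begin▁of▁sentence|>",
                  "<|end▁of▁sentence|>",
                  "<|User|>",
                  "<|Assistant|>",
              ],
          },
      },
  )
  # With "stream": False, the full completion arrives in one JSON object.
  print(response.json()["response"])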