(for 24 GB) Uncensored L3.3 70B, IQ2_XXS and IQ2_XS versions
tools
1,706 Pulls · Updated 4 months ago
2b48a7eb95ee · 19GB
model · 19GB
arch llama · parameters 70.6B · quantization IQ2_XXS
params · 111B
{
  "num_ctx": 6144,
  "stop": [
    "<|start_header_id|>",
    "<|end_header_id|>",
    …
template · 1.5kB
{{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
{{- if .System }}
{{ .System
…
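
The params blob above bakes a default 6144-token context window and the Llama 3 header stop tokens into the model. As a minimal sketch, these options can also be overridden per request with the ollama Python client; the model tag below is a placeholder, not the name this model is actually published under:

import ollama

response = ollama.chat(
    model="llama3.3-70b-abliterated:iq2_xxs",  # placeholder tag
    messages=[{"role": "user", "content": "Hello"}],
    options={
        "num_ctx": 6144,  # default from the params blob above
        "stop": ["<|start_header_id|>", "<|end_header_id|>"],  # partial stop list shown above
    },
)
print(response["message"]["content"])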
Readme
Llama 3.3 70B Abliterated (Uncensored) for 24 GB VRAM
IQ2_XS: Higher quality, 4096-token context
IQ2_XXS: Lower quality, 6144-token context
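
To keep the context window in step with whichever quant you pulled, num_ctx can be set per call. A small sketch under the same assumptions (the variant tags are placeholders):

import ollama

RECOMMENDED_CTX = {
    "iq2_xs": 4096,   # higher quality, smaller context
    "iq2_xxs": 6144,  # lower quality, larger context
}

variant = "iq2_xxs"
response = ollama.generate(
    model=f"llama3.3-70b-abliterated:{variant}",  # placeholder tag
    prompt="Summarize this document.",
    options={"num_ctx": RECOMMENDED_CTX[variant]},
)
print(response["response"])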