https://huggingface.co/wolfram/miquliz-120b-v2.0-GGUF
114 Pulls · Updated 13 months ago
ebfd5ed6b70f · 32GB
model
arch llama · parameters 120B · quantization IQ2_XXS · 32GB
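The weights listed above come from the Hugging Face repository linked at the top. A minimal sketch of fetching the IQ2_XXS quant from that repo follows; the filename is an assumption (check the repo's file listing, since large quants are sometimes split into several parts).

# Sketch: download the IQ2_XXS GGUF from the Hugging Face repo linked above.
# The filename below is an assumption -- verify it against the repo's file list.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="wolfram/miquliz-120b-v2.0-GGUF",
    filename="miquliz-120b-v2.0.IQ2_XXS.gguf",  # assumed filename
)
print(path)  # local cache path of the downloaded file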
params · 59B
{
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ]
}
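These stop tokens are baked into the model's default parameters, and the same values can also be passed per request through Ollama's REST API. A minimal sketch, assuming the model has been imported locally under the hypothetical tag miquliz-120b-v2.0 and that the Ollama server is running on its default port:

# Sketch: send the same stop tokens per request via Ollama's /api/generate.
# The tag "miquliz-120b-v2.0" is a placeholder for whatever name the model
# was imported under locally.
import json
import urllib.request

payload = {
    "model": "miquliz-120b-v2.0",  # assumed local tag
    "prompt": "Summarize the ChatML format in one sentence.",
    "stream": False,
    "options": {"stop": ["<|im_start|>", "<|im_end|>"]},  # same values as above
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])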
template · 155B
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
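This is the standard ChatML layout, which is why <|im_start|> and <|im_end|> appear as stop tokens. As an illustration only, the string it renders for a given system prompt and user message looks like the output of the sketch below; Ollama evaluates the Go template itself, this merely mirrors it by hand.

# Sketch: reproduce what the ChatML template above renders for a given
# system prompt and user message.
def render_chatml(system: str, prompt: str) -> str:
    out = ""
    if system:
        out += f"<|im_start|>system\n{system}<|im_end|>\n"
    if prompt:
        out += f"<|im_start|>user\n{prompt}<|im_end|>\n"
    out += "<|im_start|>assistant"
    return out

print(render_chatml("You are a helpful assistant.", "Hello!"))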