rhundt/GLM-Z1-0414-32b-128k:Q4_K_M
187 Downloads · Updated 4 months ago
GLM-Z1-0414-32b thinking model with YaRN RoPE scaling to 128k context
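To try the model locally, a minimal sketch is a single chat request against the Ollama REST API, assuming Ollama is running at its default address (http://localhost:11434) and the model has already been pulled under the name shown above:

import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "rhundt/GLM-Z1-0414-32b-128k:Q4_K_M",
        "messages": [
            {"role": "user", "content": "Explain YaRN RoPE scaling in two sentences."}
        ],
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])

Since this is a thinking model, the reply may include the model's reasoning before the final answer, depending on how the template surfaces it.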
Capabilities: tools
Updated 4 months ago
693613b2b5db · 20GB
model: arch glm4 · parameters 32.6B · quantization Q4_K_M · 20GB
template (964B): [gMASK]<sop>{{- /* ---------- tools section ---------- */}} {{- if .Tools }} <|system|> # Available
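Because the model is tagged with tools and its template includes a dedicated tools section, it can be driven through Ollama's standard tool-calling interface. A rough sketch, assuming the same local server and model name as above; get_weather is a hypothetical function used only for illustration:

import requests

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, defined only for this example
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "rhundt/GLM-Z1-0414-32b-128k:Q4_K_M",
        "messages": [{"role": "user", "content": "What is the weather in Berlin right now?"}],
        "tools": tools,
        "stream": False,
    },
    timeout=600,
)
resp.raise_for_status()
# When the model chooses to call a tool, the structured call appears in
# message.tool_calls rather than as plain text content.
print(resp.json()["message"].get("tool_calls"))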
params (115B): { "num_ctx": 64000, "stop": [ "<|system|>", "<|user|>", "<|assistant
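The packaged params default num_ctx to 64000, while the YaRN RoPE scaling in the description targets a 128k window. Request-level options override the packaged defaults, so a sketch of asking for the full window (memory permitting, since the KV cache grows with the context length) might look like the following; report.txt is a placeholder for whatever long input you actually have:

import requests

long_document = open("report.txt", encoding="utf-8").read()  # placeholder long input

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "rhundt/GLM-Z1-0414-32b-128k:Q4_K_M",
        "messages": [
            {"role": "user", "content": long_document + "\n\nSummarize the key points above."}
        ],
        "options": {"num_ctx": 131072},  # override the packaged 64000-token default
        "stream": False,
    },
    timeout=1800,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])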
Readme: No readme