ollama run frob/glm-5:744b-a40b-ud-q4_K_XL
Updated 1 month ago
072e21e05bf1 · 431GB
Imported from hf.co/unsloth/GLM-5-GGUF.
Note that Ollama does not yet support this model. To run it, Ollama needs to be patched with #14134.
$ git clone https://github.com/ollama/ollama.git .
$ git checkout 8224cce583e6e7253e2fdeee8f07ab4c8da7bce5
$ curl -L https://github.com/ollama/ollama/pull/14134.diff | patch -p1
$ docker build -t ollama/ollama:0.17.0-frob .
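With the image built, the patched server needs to be running before `ollama run` can talk to it. A minimal sketch of starting it, assuming the default Ollama port and a named volume for model storage (the container name and volume name here are illustrative, not from the original steps):

```shell
# Start the patched server in the background on the default Ollama port.
# The named volume keeps downloaded model blobs across container restarts.
docker run -d --name ollama-frob \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:0.17.0-frob
```

The `ollama run` commands below then reach this server at the default address (`127.0.0.1:11434`); on a machine with NVIDIA GPUs you would also pass `--gpus=all` as described in Ollama's Docker documentation.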
$ ollama run frob/glm-5 --experimental --experimental-yolo
This experimental version of Ollama has the bash tool enabled.
Models can read files on your computer, or run commands (after you allow them).
warning: yolo mode - all tool approvals will be skipped
>>> what is the time?
Thinking...
The user is asking for the current time. I can use the bash command `date`
to get the current date and time.
...done thinking.
running: Bash: date
Mo 16 Feb 2026 19:29:52 CET
Thinking...
The user asked for the time, and I got the result from the `date` command.
The output shows:
Mo 16 Feb 2026 19:29:52 CET
This means it's Monday, February 16th, 2026, at 19:29:52 (7:29:52 PM) in
Central European Time (CET).
I should provide this information to the user in a clear and friendly way.
...done thinking.
The current time is **19:29:52 CET** (Central European Time).
It's Monday, February 16th, 2026.
$ ollama run frob/glm-5 hello --think=false
Hello! 👋 I'm your GLM model trained by Z.ai. How can I help you today?
Whether you have questions, need assistance with tasks, or just want to
chat, I'm here for you. What would you like to discuss?
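The patched server also answers the standard Ollama REST API, so the same model can be queried without the CLI. A minimal sketch, assuming the server from the steps above is listening on the default port:

```shell
# Send a single non-streaming chat request to the local Ollama API,
# disabling thinking just like --think=false does on the CLI.
curl -s http://localhost:11434/api/chat -d '{
  "model": "frob/glm-5",
  "messages": [{"role": "user", "content": "hello"}],
  "think": false,
  "stream": false
}'
```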