ollama run SimonPu/GLM-4.7-Flash
ollama launch claude --model SimonPu/GLM-4.7-Flash
ollama launch codex --model SimonPu/GLM-4.7-Flash
ollama launch opencode --model SimonPu/GLM-4.7-Flash
ollama launch openclaw --model SimonPu/GLM-4.7-Flash
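Once the model is pulled, it can also be called programmatically. Below is a minimal sketch (not from the model card itself) that uses Ollama's OpenAI-compatible endpoint, assuming a local Ollama server on the default port 11434 and the `openai` Python package; the prompt is only a placeholder.

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1 on the local server.
# The API key is not checked by Ollama but the client requires a value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

completion = client.chat.completions.create(
    model="SimonPu/GLM-4.7-Flash",
    messages=[{"role": "user", "content": "Write a haiku about local inference."}],
)
print(completion.choices[0].message.content)
```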
👋 Join our Discord community.
📖 Check out the GLM-4.7 technical blog and the GLM-4.5 technical report.
📍 Use GLM-4.7-Flash API services on the Z.ai API Platform.
👉 Try GLM-4.7 with one click.
GLM-4.7-Flash is a 30B-A3B Mixture-of-Experts (MoE) model: roughly 30B total parameters with about 3B active per token. As the strongest model in the 30B class, GLM-4.7-Flash offers a new option for lightweight deployment that balances performance and efficiency.
Default settings for most tasks, from the Run GLM-4.7-Flash guide: temperature 1.0, top_p 0.95, min_p 0.01, repetition penalty 1.0.
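A minimal sketch of passing these defaults through the Ollama Python client. The `ollama` package, the mapping of "repetition penalty" to Ollama's `repeat_penalty` option, and the example prompt are assumptions, not part of the guide.

```python
import ollama

# Recommended defaults for most tasks (see the settings listed above).
options = {
    "temperature": 1.0,
    "top_p": 0.95,
    "min_p": 0.01,
    "repeat_penalty": 1.0,  # assumed mapping for "repetition penalty"
}

response = ollama.chat(
    model="SimonPu/GLM-4.7-Flash",
    messages=[{"role": "user", "content": "Summarize what a Mixture-of-Experts model is."}],
    options=options,
)
print(response["message"]["content"])
```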
If you find our work useful in your research, please consider citing the following paper:
@misc{5team2025glm45agenticreasoningcoding,
title={GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models},
author={GLM Team and Aohan Zeng and Xin Lv and others},
year={2025},
eprint={2508.06471},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2508.06471},
}