ollama launch

January 23, 2026

ollama launch is a new command that sets up and runs your favorite coding tools, like Claude Code, OpenCode, and Codex, with local or cloud models. No environment variables or config files needed.

Get started

Download Ollama v0.15+, then open a terminal and run:

# ~23 GB of VRAM required with a 64,000-token context length
ollama pull glm-4.7-flash

# or use a cloud model (with full context length)
ollama pull glm-4.7:cloud
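Before launching a tool, you can confirm the pull succeeded. A minimal sketch, assuming the ollama CLI is on your PATH (ollama list prints the models you have downloaded):

```shell
# Check whether glm-4.7-flash is already pulled; degrade gracefully
# if the ollama CLI is not installed yet.
if command -v ollama >/dev/null 2>&1 && ollama list 2>/dev/null | grep -q 'glm-4.7-flash'; then
  status="ready"
else
  status="missing"
fi
echo "glm-4.7-flash: $status"
```

If the model shows as missing, re-run the pull command above before continuing.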

One-command setup

Claude Code:

ollama launch claude

Ollama Launch Claude Code

OpenCode:

ollama launch opencode

Ollama Launch OpenCode

This will guide you through selecting a model and then launch your chosen integration.

Supported integrations

Note: Coding tools work best with a full context length. Update the context length in Ollama’s settings to at least 64,000 tokens. See the context length documentation for how to make this change.
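Besides the settings UI, one way to raise the limit on the command line is the OLLAMA_CONTEXT_LENGTH environment variable for the server process. This is a sketch under that assumption; the settings page described above is the documented path:

```shell
# Assumption: OLLAMA_CONTEXT_LENGTH sets the server's default context
# window. Export it, then restart `ollama serve` for it to take effect.
export OLLAMA_CONTEXT_LENGTH=64000
echo "requested context length: $OLLAMA_CONTEXT_LENGTH"
```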

Local models:

Cloud models:

Extended coding sessions

If you have trouble running these models locally, Ollama also offers a cloud service with hosted models that provide the full context length and generous limits, even at the free tier.

With this update, Ollama now offers increased usage and an extended 5-hour coding session window. See ollama.com/pricing for details.

Configure only

To configure a tool without launching it immediately:

ollama launch opencode --config
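A typical configure-then-launch session might look like the comments below; the guard itself is a hypothetical helper that only checks both CLIs are installed before you run the --config step:

```shell
# Configure now, launch yourself later:
#   ollama launch opencode --config
#   opencode
# Guard (sketch): verify both CLIs are on PATH first.
missing=""
for cmd in ollama opencode; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done
if [ -z "$missing" ]; then
  echo "ready: run 'ollama launch opencode --config'"
else
  echo "install first:$missing"
fi
```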