ollama launch
15 days ago
- #cloud-models
- #ollama
- #coding-tools
- Ollama introduces a new command, 'ollama launch', that sets up and runs coding tools such as Claude Code, OpenCode, and Codex with local or cloud models.
- Setup requires no environment variables or config files.
- Users can download Ollama v0.15+ and pull models like 'glm-4.7-flash' or cloud variants.
- Supported integrations include Claude Code, OpenCode, Codex, and Droid.
- Recommended models for coding require a context length of at least 64,000 tokens.
- Local models include 'glm-4.7-flash', 'qwen3-coder', and 'gpt-oss:20b'.
- Cloud models include 'glm-4.7:cloud', 'minimax-m2.1:cloud', and 'gpt-oss:120b-cloud'.
- Ollama offers a cloud service with hosted models for extended coding sessions, with usage measured over a 5-hour window.
- The '--config' flag sets up a tool without launching it immediately.
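A minimal sketch of the workflow described above, using the commands and flags mentioned in the notes (model names as listed; actual prompts and output may differ, and the exact tool-selection syntax is not specified here):

```shell
# Pull one of the recommended local coding models
# (requires Ollama v0.15 or later)
ollama pull glm-4.7-flash

# Or use a hosted model via Ollama's cloud service
ollama pull glm-4.7:cloud

# Set up and run a coding tool (e.g. Claude Code, OpenCode, Codex);
# no environment variables or config files are needed
ollama launch

# Configure a tool without launching it immediately
ollama launch --config
```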