Request to integrate Octomind as a new option for `ollama launch`, giving users access to its session-based AI development workflow.
## Summary

Add **Octomind** as a new integration for `ollama launch`. Octomind is a session-based AI development assistant written in Rust with MCP support and **native Ollama provider support**.

## What is Octomind?

**Repository**: https://github.com/muvon/octomind
**Website**: https://muvon.io
**License**: Apache 2.0

Features:

- **7 LLM providers** including native Ollama support (`ollama:model-name`)
- **MCP Protocol Support** - Full Model Context Protocol implementation
- **Session-based workflow** - Persistent context across conversations
- **Plan-first architecture** - Multi-step planning with validation
- **Built-in tools** - Shell, file editing, code search (ast_grep), web browsing
- **Native binary** - Written in Rust, no Node.js required

## Native Ollama Support

Octomind already has native Ollama provider support:

```toml
# ~/.config/octomind/config.toml
model = "ollama:qwen3-coder"

[providers.ollama]
# Uses OLLAMA_API_KEY or defaults to localhost
```

Environme
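For context, the `ollama:qwen3-coder` value in the config above follows a `provider:model` identifier scheme. A minimal sketch of how such an identifier could be resolved against a local Ollama instance (the helper name and default base URL below are illustrative assumptions, not Octomind's actual implementation; `http://localhost:11434` is Ollama's standard local endpoint):

```python
# Sketch: splitting a "provider:model" identifier such as
# "ollama:qwen3-coder" into its provider and model parts.
# resolve_model() is a hypothetical helper for illustration only.

OLLAMA_DEFAULT_BASE_URL = "http://localhost:11434"  # Ollama's default local server


def resolve_model(identifier: str) -> tuple[str, str]:
    """Split a 'provider:model' string into (provider, model)."""
    provider, sep, model = identifier.partition(":")
    if not sep or not model:
        raise ValueError(f"expected 'provider:model', got {identifier!r}")
    return provider, model


provider, model = resolve_model("ollama:qwen3-coder")
print(provider, model)  # -> ollama qwen3-coder
```

With `provider == "ollama"`, requests would go to the local Ollama server unless overridden (per the config comment, via `OLLAMA_API_KEY` / provider settings).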