User requests integration with Ollama, noting that its OpenAI-identical API is available from localhost and suggesting that running it locally would be a useful feature.
### What problem does this solve?

Ollama exposes an OpenAI-identical API, but it is served from [localhost:3000](http://localhost:3000/). It would be neat to have this as a feature!

### Proposed solution

It would be neat to be able to run from localhost as a simple feature!

### Additional context

_No response_
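For reference, a minimal sketch of what talking to an OpenAI-compatible endpoint on localhost looks like with the standard `openai` Python client. The `base_url` here is an assumption (Ollama's own OpenAI-compatible endpoint defaults to `http://localhost:11434/v1`, while the issue mentions `localhost:3000`), and the model name is a placeholder for whatever model has been pulled locally:

```python
# Sketch: pointing the OpenAI client at a local Ollama server.
# Assumptions: Ollama's default OpenAI-compatible endpoint at
# localhost:11434/v1; "llama3" stands in for any locally pulled model.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local Ollama endpoint
    api_key="ollama",  # the client requires a key, but Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```

Because the request shape is identical to OpenAI's, supporting this feature would largely come down to letting users override the API base URL to point at their local server.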