User requests built-in support for using models from OpenRouter or run locally (Ollama, vLLM, LM Studio, etc.).
@lexfridman Feature request: built-in support for using models from OpenRouter or even locally (Ollama, vLLM, LM Studio, etc.). Would love to hear what they think about this.
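
For context, a minimal sketch of what such support could look like, assuming the tool talks to an OpenAI-compatible chat-completions API (OpenRouter, Ollama, vLLM, and LM Studio all expose one): the feature would largely amount to letting the user swap the base URL, API key, and model name. The endpoint values and model names below are commonly documented defaults used for illustration, not settings taken from this request.

```python
# Hypothetical backend registry; values are illustrative defaults.
from openai import OpenAI

BACKENDS = {
    # Hosted aggregator; requires an OpenRouter API key.
    "openrouter": {
        "base_url": "https://openrouter.ai/api/v1",
        "api_key": "sk-or-...",  # placeholder key
        "model": "meta-llama/llama-3.1-8b-instruct",
    },
    # Local servers; the key is usually ignored but the client requires one.
    "ollama": {
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",
        "model": "llama3.1",
    },
    "vllm": {
        "base_url": "http://localhost:8000/v1",
        "api_key": "none",
        "model": "meta-llama/Llama-3.1-8B-Instruct",
    },
    "lmstudio": {
        "base_url": "http://localhost:1234/v1",
        "api_key": "lm-studio",
        "model": "local-model",
    },
}

def chat(backend: str, prompt: str) -> str:
    """Send one prompt to the selected backend and return the reply text."""
    cfg = BACKENDS[backend]
    client = OpenAI(base_url=cfg["base_url"], api_key=cfg["api_key"])
    resp = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    # Example: query a locally running Ollama server.
    print(chat("ollama", "Say hello in one sentence."))
```

Because all four targets speak the same wire protocol, a single configurable client like this would cover the hosted and local cases without separate integrations.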