Add support for configuring a custom URL for OpenAI API compatible providers. This would allow Archon to instantly connect with various inference servers like LM Studio or OpenWebUI (Ollama backend) that offer OpenAI API compatibility.
Can you add a custom URL option for the OpenAI API connection? Many inference servers are OpenAI API compatible, so it would instantly enable the use of LM Studio or OpenWebUI (running Ollama behind it) and many others.