WordPress 7.0 will ship with a built-in AI Client, but it only includes connectors for OpenAI, Anthropic, and Google Gemini. I built a small plugin that adds support for **any OpenAI-compatible endpoint** – so you can use self-hosted and third-party models too.

**Works with:**

* **Ollama** – run Llama, Mistral, Gemma, etc. locally
* **LM Studio** – one-click local LLM server
* **OpenRouter** – unified API for 100+ models
* **vLLM, text-generation-webui, LocalAI** – any OpenAI-compatible server

Just enter your endpoint URL (and optionally an API key) under Settings → OpenAI Compatible. The plugin auto-discovers all available models and registers them with WordPress's AI Client, so other AI-powered plugins can use them automatically.

It also handles practical stuff like extended HTTP timeouts for slow local inference and non-standard port support (Ollama on 11434, LM Studio on 1234, etc.).

Passes the WordPress Plugin Check with zero errors. Single-file, ~450 lines, no dependencies beyond the WP 7.0 beta.

GitHub: [https://github.com/Ultimate-Multisite/openai-compatible-connector](https://github.com/Ultimate-Multisite/openai-compatible-connector)

It will work with the official [AI Experiments](https://wordpress.org/plugins/ai/) plugin and hopefully all future AI plugins.

Would love feedback – especially if you're running other OpenAI-compatible servers I haven't tested against yet.
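For anyone curious what "auto-discovers all available models" means in practice: OpenAI-compatible servers expose a standard `GET /v1/models` endpoint that lists what they can serve. The plugin itself is PHP, but here's a rough Python sketch of that lookup (function names are illustrative, not the plugin's actual code; the generous timeout mirrors the extended-timeout handling mentioned above):

```python
import json
from urllib.request import Request, urlopen


def parse_model_list(payload):
    """Extract model IDs from an OpenAI-style list response.

    OpenAI-compatible servers (Ollama, LM Studio, vLLM, ...) wrap the
    models as {"object": "list", "data": [{"id": ...}, ...]}.
    """
    return sorted(m["id"] for m in payload.get("data", []))


def discover_models(base_url, api_key=None, timeout=120):
    """Query /v1/models on an OpenAI-compatible server.

    The API key is optional because local servers like Ollama
    (http://localhost:11434) typically don't require one. The long
    timeout accommodates slow local inference hosts.
    """
    headers = {"Accept": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    req = Request(base_url.rstrip("/") + "/v1/models", headers=headers)
    with urlopen(req, timeout=timeout) as resp:
        return parse_model_list(json.load(resp))


# Shape of a typical response, e.g. from Ollama:
sample = {"object": "list",
          "data": [{"id": "llama3.2", "object": "model"},
                   {"id": "mistral", "object": "model"}]}
print(parse_model_list(sample))  # ['llama3.2', 'mistral']
```

The plugin does the WordPress-side equivalent and then registers each discovered ID with the AI Client, so nothing needs to be configured per model.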