feat: built-in cpu-only llama.cpp integration

Request to integrate llama.cpp for CPU-only inference directly into Open WebUI.
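One possible shape for this integration is to have Open WebUI launch llama.cpp's `llama-server` (which exposes an OpenAI-compatible API) and route chat requests to it. The sketch below is a minimal illustration, not a proposed implementation: the port number, model path, and helper names are placeholders chosen for this example.

```python
import json

# Hypothetical port reserved for the bundled llama.cpp server (placeholder).
LLAMA_SERVER_PORT = 8081

def llama_server_cmd(model_path: str, threads: int = 4) -> list[str]:
    """Build a launch command for a CPU-only llama.cpp server.

    --model, --port, and --threads are real llama-server flags; the
    binary is assumed to be on PATH.
    """
    return [
        "llama-server",
        "--model", model_path,
        "--port", str(LLAMA_SERVER_PORT),
        "--threads", str(threads),
    ]

def chat_payload(prompt: str, model: str = "local") -> str:
    """Build an OpenAI-compatible chat-completions request body,
    as accepted by llama-server's /v1/chat/completions endpoint."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

cmd = llama_server_cmd("/models/example.Q4_K_M.gguf")
print(" ".join(cmd))
print(chat_payload("Hello"))
```

Since llama.cpp already speaks the OpenAI API, an integration along these lines could reuse Open WebUI's existing OpenAI-compatible backend plumbing rather than adding a new inference code path.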