spent a few days getting this working, so figured I'd write it up since the existing guides are either outdated or skip the parts that actually break.

goal: run Ollama as a persistent app on TrueNAS SCALE (Electric Eel), accessible from the same WebUI as my other services, with models stored on my NAS pool rather than eating the boot drive.

what the guides don't tell you:

1. the app catalog version of Ollama doesn't expose the model directory as a configurable path by default. you have to override it via the OLLAMA_MODELS env variable and point it at a dataset you've already created. if you set the variable but the dataset doesn't exist yet, Ollama silently falls back to the default location. cost me an hour.

2. Open WebUI's default Ollama URL assumes localhost. on SCALE it needs to be the actual bridge IP of the Ollama container (usually something in the 172.x range), not 127.0.0.1. this isn't documented anywhere obvious.

3. GPU passthrough on SCALE with an AMD iGPU is still a mess. Nvidia works fine with the official plugin. AMD needs manual ROCm config and I gave up after 3 hours; just running on CPU for now, which is fine for the 7B models I'm using daily.

current setup that's stable: Qwen2.5-7B-Instruct-Q6_K for general use, Nomic-embed-text for embeddings, everything stored on a mirrored vdev. the WebUI is clean, history persists, and it's been running for 3 weeks without a restart.

anyone gotten AMD iGPU passthrough working on SCALE? or is the answer just "get a cheap Nvidia card and be done with it"?
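edit: since a couple of people asked, here's roughly what the model-directory setup looks like on my box. pool/dataset names ("tank", "apps/ollama-models") are mine, adjust to yours — this is a config sketch, not a copy-paste recipe:

```shell
# create the dataset FIRST — if it doesn't exist when the app starts,
# Ollama silently falls back to its default model location (see point 1)
zfs create -p tank/apps/ollama-models

# then in the app config (Apps → Ollama → Edit):
#   environment variable:  OLLAMA_MODELS=/mnt/tank/apps/ollama-models
#   host path mount:       /mnt/tank/apps/ollama-models
#                          mounted into the container at the same path

# after pulling a model, confirm the blobs actually landed on the pool
# and not on the boot drive:
ls /mnt/tank/apps/ollama-models
```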
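and for point 2, here's how I found the bridge IP to feed Open WebUI. Electric Eel runs apps under Docker (the older k3s stuff is gone), so `docker inspect` from the NAS shell works — container name will differ on your system:

```shell
# list running containers to find the Ollama one (name varies per install)
docker ps --filter "name=ollama" --format '{{.Names}}'

# print its bridge IP — this is what goes in Open WebUI's Ollama URL,
# not 127.0.0.1 (substitute the name from the previous command)
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <container-name>

# sanity check: Ollama's root endpoint answers "Ollama is running"
curl http://<bridge-ip>:11434/
```

if curl hangs here, check that the Ollama app is actually up before blaming Open WebUI's config.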