hello all, I'll be moving from Asia to Europe and I need a good local LLM for my MacBook Air M4 with 16GB RAM. I've downloaded all my movies and series, but I don't think I can stand watching them for 4 hours straight.

My use case:

- coding, mainly JS/TS and Go
- I wanna vibe code, so is it possible to connect a local LLM to Claude Code?

My knowledge so far: I tried tinyllama-1.1b-chat from this [guide](https://medium.com/@raviyadav0675/running-llama-models-locally-on-your-machine-macos-a-complete-guide-with-llama-cpp-808f6c806b95), got it running locally, and realised it only lives in my CLI, where the output looks weird, like raw `` ```python `` fences. I think it's supposed to be rendered as markdown?

Any feedback is great, thanks.

edit: holy crap, not even 1 hour after posting this and you guys are the most helpful people out of all the forums I've been on here on reddit. I feel like cryin rn

edit 2: models that are working well on my MacBook Air M4 16GB RAM via LM Studio:

- ministral 3 14b reasoning
- qwen/qwen3-vl-8b
- codellama-7b-instruct
- deepseek/deepseek-r1-0528-qwen3-8b
- qwen3-8b-deepseek-v3.2-speciale-distill
- rnj-1 (8b)

atp I need to set them up with opencode/Claude Code.
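For anyone finding this later: LM Studio's local server speaks the OpenAI chat completions API (by default on `http://localhost:1234` once you start the server from its Developer tab), so any tool that can hit an OpenAI-style endpoint can use these models. Here's a minimal TS sanity check, just a sketch: the port is LM Studio's default, and the model name is a placeholder, so use whatever identifier your LM Studio instance actually lists.

```typescript
// Minimal sanity check against LM Studio's local OpenAI-compatible server.
// Run with: npx tsx check-local-llm.ts (Node 18+ for the global fetch).
// Assumptions: the server is on LM Studio's default port 1234, and
// "qwen3-8b" is a placeholder for a model name your instance lists.
async function main() {
  const res = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen3-8b", // placeholder, use the name LM Studio shows
      messages: [
        { role: "user", content: "Write a hello-world HTTP server in Go." },
      ],
      temperature: 0.2,
    }),
  });

  if (!res.ok) {
    throw new Error(`LM Studio returned ${res.status}: ${await res.text()}`);
  }

  // OpenAI-style response: the reply text is in choices[0].message.content
  const data = await res.json();
  console.log(data.choices[0].message.content);
}

main().catch(console.error);
```

From what I've read, opencode can point at an OpenAI-compatible endpoint like this directly, while Claude Code talks the Anthropic Messages API, so it needs some kind of translation proxy sitting between it and LM Studio. Treat that as a pointer to verify, not a tested setup.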