User has built a mobile app called ShellX that connects to n8n webhooks and wants feedback on additional features that could be included. They are looking for suggestions on what workflows users would want to run from mobile and any features that could enhance the app.
I've been running n8n workflows for my business and kept hitting the same problem: accessing them from mobile meant either opening a browser or building a Telegram bot. Both felt like workarounds, not solutions. So I built **ShellX** - a dedicated mobile app that connects to n8n/Make/Zapier webhooks.

**Streaming responses** - Used SSE so AI responses appear word-by-word instead of dumping a full block of text. Makes it feel like actual chat.

**Context switching** - Added "labels" so one webhook can route to different workflows. Tap [Home] to control my house, [Work] to query company data, [Personal] for my AI assistant. No bot commands, just seamless switching.

**Native features** - Built-in QR scanner, voice notes, push notifications. Stuff that makes sense on mobile but is hard to hack into Telegram.

**Privacy approach** - The app doesn't see your data. It's just a UI layer that talks directly to your webhooks. Everything stays on your infrastructure.

**What I'm curious about:**

* What workflows would you want to run from mobile?
* Is the Telegram workaround actually a problem for others or just me?
* Any features you'd want to see?

Live on iOS/Android app stores (free). Happy to answer technical questions about the implementation.
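For the curious, the client side of the streaming piece is mostly SSE buffer handling: network chunks can split mid-event, so the app accumulates bytes and only emits complete `data:` payloads. A minimal sketch (my assumption of the general approach, not ShellX's actual code; `parse_sse_chunk` is a hypothetical helper name):

```python
def parse_sse_chunk(buffer: str) -> tuple[list[str], str]:
    """Split a raw SSE buffer into complete `data:` payloads plus leftover text.

    SSE events are separated by a blank line; each event carries one or
    more `data:` lines. Returns (complete payloads, unfinished remainder).
    """
    # Everything before the last blank-line separator is complete events;
    # everything after may still be mid-event, so carry it to the next read.
    events, _, remainder = buffer.rpartition("\n\n")
    payloads = []
    for event in events.split("\n\n"):
        data_lines = [line[5:].lstrip() for line in event.split("\n")
                      if line.startswith("data:")]
        if data_lines:
            # Multi-line data fields within one event are joined with newlines.
            payloads.append("\n".join(data_lines))
    return payloads, remainder
```

In use, the app would append each HTTP chunk to the remainder from the previous call and render each returned payload immediately, which is what produces the word-by-word effect.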
[shellx.app](http://shellx.app)

Example streaming-chat workflow (importable n8n JSON):

```json
{
  "nodes": [
    {
      "parameters": {
        "promptType": "define",
        "text": "={{ $items('Webhook')[0].json.body.chatInput }}",
        "options": {
          "systemMessage": "You are my assistant.\nAnswer the question directly and only ask a follow-up question if it is strictly necessary.",
          "enableStreaming": true
        }
      },
      "type": "@n8n/n8n-nodes-langchain.agent",
      "typeVersion": 3.1,
      "position": [-32, -16],
      "id": "e4503e0f-0815-46ce-8b2e-3b26547cfeb9",
      "name": "AI Agent Chat",
      "onError": "continueRegularOutput"
    },
    {
      "parameters": {
        "httpMethod": "POST",
        "path": "shellx-groq-chat",
        "responseMode": "streaming",
        "options": {}
      },
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 2.1,
      "position": [-240, -16],
      "id": "ce7a6c2a-bdc9-49c9-99dd-53010f9bee50",
      "name": "Webhook",
      "webhookId": "shellx-groq-chat"
    },
    {
      "parameters": {
        "sessionIdType": "customKey",
        "sessionKey": "={{ $items('Webhook')[0].json.body.sessionId }}",
        "contextWindowLength": {}
      },
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "typeVersion": 1.3,
      "position": [64, 160],
      "id": "d0182671-d9e9-4142-9863-c9b681aff93f",
      "name": "Simple Memory"
    },
    {
      "parameters": {
        "model": "openai/gpt-oss-120b",
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.lmChatGroq",
      "typeVersion": 1,
      "position": [-176, 192],
      "id": "b221af9c-0245-4e83-846b-20679efeee99",
      "name": "Groq Chat Model"
    }
  ],
  "connections": {
    "Webhook": {
      "main": [[{ "node": "AI Agent Chat", "type": "main", "index": 0 }]]
    },
    "Simple Memory": {
      "ai_memory": [[{ "node": "AI Agent Chat", "type": "ai_memory", "index": 0 }]]
    },
    "Groq Chat Model": {
      "ai_languageModel": [[{ "node": "AI Agent Chat", "type": "ai_languageModel", "index": 0 }]]
    }
  },
  "meta": { "templateCredsSetupCompleted": false }
}
```