SoyLM adds source-ingestion capabilities to local LLMs: you upload documents, URLs, or videos as sources, and a local model analyzes them and makes them available for search and chat.
## What it does

Upload documents, URLs, or YouTube videos as sources. SoyLM analyzes them with a local LLM, stores structured summaries in SQLite, and lets you chat with your sources using RAG (FTS5 + BM25) and optional web search (DuckDuckGo).

## Features

- **Source ingestion** — files, web URLs (with a Playwright JS-rendering fallback), and YouTube transcripts
- **Local LLM** — Nemotron-Nano-9B via vLLM (OpenAI-compatible API), with thinking mode for inference
- **RAG search** — SQLite FTS5 full-text search with BM25 ranking
- **Web search** — DuckDuckGo integration for supplementing source data
- **SSE streaming** — real-time streamed responses
- **Chat history** — persistent chat logs with JSON export
- **Deduplication** — SHA-256 hashing prevents duplicate sources

If you want to build it: [https://github.com/soy-tuber/SoyLM](https://github.com/soy-tuber/SoyLM)

My media: [https://media.patentllm.org/en/](https://media.patentllm.org/en/)
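To illustrate the storage side described above, here is a minimal sketch of how FTS5 + BM25 retrieval and SHA-256 deduplication can fit together in SQLite. The table names, columns, and function names are my own assumptions for illustration, not SoyLM's actual schema; it only assumes a Python build with the FTS5 extension enabled (the default on most platforms).

```python
import hashlib
import sqlite3

# Hypothetical schema (not SoyLM's real one): a sources table keyed by the
# SHA-256 hash of the content, plus an FTS5 virtual table for BM25 search.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sources (hash TEXT PRIMARY KEY, title TEXT)")
conn.execute("CREATE VIRTUAL TABLE chunks USING fts5(hash, content)")

def ingest(title: str, text: str) -> bool:
    """Store a source unless its SHA-256 hash is already present."""
    digest = hashlib.sha256(text.encode()).hexdigest()
    try:
        conn.execute("INSERT INTO sources VALUES (?, ?)", (digest, title))
    except sqlite3.IntegrityError:
        return False  # same hash already stored: duplicate, skip it
    conn.execute("INSERT INTO chunks VALUES (?, ?)", (digest, text))
    return True

def search(query: str, k: int = 3):
    """Full-text search ranked by bm25(); lower score means better match."""
    return conn.execute(
        "SELECT hash, content, bm25(chunks) AS score "
        "FROM chunks WHERE chunks MATCH ? ORDER BY score LIMIT ?",
        (query, k),
    ).fetchall()

ingest("doc1", "SQLite FTS5 provides full-text search with BM25 ranking.")
ingest("doc1", "SQLite FTS5 provides full-text search with BM25 ranking.")  # rejected as duplicate
```

The nice property of this design is that deduplication is content-based rather than name-based: re-uploading the same file under a different title still hits the same hash and is skipped before it pollutes the search index.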