Implement an optional long-context memory extension to allow agents to reason over large codebases or long-lived knowledge without loading the full context into the prompt, acting as a pre-inference memory layer.
**Feature Description**

Add an optional long-context memory extension that allows agents to reason over large codebases or long-lived knowledge without loading full context into the prompt. The feature would act as a pre-inference memory layer, complementary to the existing MVI / ContextScout system.

**Problem It Solves**

OAC is extremely efficient for pattern-level context (standards, conventions, workflows), but some tasks still require access to large raw material, such as:

- Full repositories (many thousands of lines)
- Large legacy files
- Long documentation sets
- Cross-file or cross-module reasoning over time

Currently, these cases either:

- Exceed native model context limits, or
- Force users to manually trim or re-inject context, breaking the MVI flow

This creates a gap between:

- Pattern knowledge (well handled by OAC)
- Large factual/code memory (hard to fit safely in context)

**Proposed Solution**

Introduce an optional, opt-in memory extension layer that:

- Compresses large documents
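To make the pre-inference idea concrete, here is a minimal sketch of such a memory layer: large sources are split into chunks, indexed ahead of time, and only the top-scoring chunks are injected into the prompt. All names (`MemoryStore`, `ingest`, `retrieve`) are illustrative assumptions, not part of the OAC or ContextScout API, and the keyword-overlap scoring stands in for whatever compression/retrieval scheme the extension would actually use.

```python
# Hypothetical pre-inference memory layer (illustrative only, not OAC API).
# Large sources are chunked and indexed offline; at prompt-build time only
# the chunks most relevant to the query are surfaced.

def chunk(text: str, size: int = 400) -> list[str]:
    """Split raw text into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

class MemoryStore:
    """Toy store: holds chunks and retrieves them by query-term overlap."""

    def __init__(self) -> None:
        self.chunks: list[str] = []

    def ingest(self, text: str) -> None:
        """Add a large document to the store, pre-chunked."""
        self.chunks.extend(chunk(text))

    def retrieve(self, query: str, k: int = 3) -> list[str]:
        """Score chunks by shared terms with the query; return the top k."""
        terms = set(query.lower().split())
        scored = sorted(
            self.chunks,
            key=lambda c: len(terms & set(c.lower().split())),
            reverse=True,
        )
        return scored[:k]

# Usage: only the retrieved chunk(s) would enter the prompt, keeping the
# MVI flow intact even when the full corpus far exceeds the context window.
store = MemoryStore()
store.ingest("def parse_config(path): reads the YAML config file")
store.ingest("legacy billing module: computes invoices from usage logs")
relevant = store.retrieve("where is the config file parsed?", k=1)
```

In a real implementation the overlap scoring would likely be replaced by embeddings or summarization, but the shape of the layer, ingest once and retrieve a small relevant slice per task, is the point.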