Context-aware refinement
DOM-scraped thread context · last 6 turns · 8k cap
The only prompt enhancer that reads your ongoing chat. Before a rewrite, the extension scans each platform's conversation DOM, extracts up to the last 6 turns (user + assistant), caps the total at 8,000 characters, and ships that context to the optimizer. The result: a vague mid-thread follow-up like "make it shorter" becomes a precise rewrite that actually references what you were just talking about.
const ctx = getConversationContext();
// up to the last 6 turns, capped at 8,000 chars total
// attached to the optimize request for the LLM
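The capping logic described above can be sketched as follows. This is a hypothetical illustration, not the extension's actual code: the `data-message-role` selector, the `capContext` helper, and the drop-oldest-first policy are all assumptions layered on the stated limits (last 6 turns, 8,000 characters total).

```javascript
// Assumed limits, taken from the description above.
const MAX_TURNS = 6;
const MAX_TOTAL_CHARS = 8000;

// Pure helper: keep the most recent turns, then drop the oldest
// until the combined text fits under the character cap.
function capContext(turns) {
  let kept = turns.slice(-MAX_TURNS);
  let total = kept.reduce((n, t) => n + t.text.length, 0);
  while (kept.length > 1 && total > MAX_TOTAL_CHARS) {
    total -= kept[0].text.length;
    kept = kept.slice(1);
  }
  // If a single remaining turn still exceeds the cap, truncate it.
  if (kept.length === 1 && kept[0].text.length > MAX_TOTAL_CHARS) {
    kept = [{ ...kept[0], text: kept[0].text.slice(0, MAX_TOTAL_CHARS) }];
  }
  return kept;
}

// Browser-only DOM side: the selector is per-platform and purely
// illustrative; each platform's chat markup differs.
function getConversationContext() {
  const nodes = document.querySelectorAll('[data-message-role]');
  const turns = Array.from(nodes).map((el) => ({
    role: el.getAttribute('data-message-role'),
    text: el.textContent.trim(),
  }));
  return capContext(turns);
}
```

Splitting the pure capping step from the DOM scrape keeps the size logic testable outside the browser, while the scrape itself stays a thin per-platform adapter.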