Scan Aggregation

How individual events become meaningful scans.

Overview

Multiple hook events can constitute a single “scan” - a logical unit representing one AI interaction session.

Aggregation Rules

Time Window

Events within 30 seconds of each other are grouped:

  Event 1: 10:00:00
  Event 2: 10:00:15  → Same scan
  Event 3: 10:00:45  → New scan
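
A minimal sketch of the window rule, assuming the 30-second gap is measured against the previous event and that reaching the gap starts a new scan (the exact boundary handling in the CLI may differ):

  from datetime import datetime, timedelta

  WINDOW = timedelta(seconds=30)  # assumed aggregation window

  def group_by_time(events):
      """Split a time-ordered list of (timestamp, payload) events into scans."""
      scans, current = [], []
      for ts, payload in events:
          # A gap of 30 s or more to the previous event starts a new scan.
          if current and ts - current[-1][0] >= WINDOW:
              scans.append(current)
              current = []
          current.append((ts, payload))
      if current:
          scans.append(current)
      return scans

  events = [
      (datetime(2024, 1, 1, 10, 0, 0), "chat"),
      (datetime(2024, 1, 1, 10, 0, 15), "chat"),  # 15 s gap -> same scan
      (datetime(2024, 1, 1, 10, 0, 45), "chat"),  # 30 s gap -> new scan
  ]
  print([len(s) for s in group_by_time(events)])  # [2, 1]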

Same Action

Events for the same action type are grouped:

  Event 1: chat, tokens=500
  Event 2: chat, tokens=750
  → One scan, 1250 tokens
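
As a rough sketch, merging same-action events into one scan could simply sum the token counts and track how many events were folded in; the dictionary fields here are illustrative, not the CLI's actual event shape:

  def merge_events(events):
      """Fold events that share one action into a single scan summary."""
      assert len({e["action"] for e in events}) == 1, "one action per scan"
      return {
          "action": events[0]["action"],
          "tokens": sum(e["tokens"] for e in events),  # 500 + 750 -> 1250
          "event_count": len(events),
      }

  print(merge_events([
      {"action": "chat", "tokens": 500},
      {"action": "chat", "tokens": 750},
  ]))  # {'action': 'chat', 'tokens': 1250, 'event_count': 2}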

Different Actions

Different action types create separate scans:

  Event 1: chat     → Scan A
  Event 2: complete → Scan B

Cross-Tool Sessions

Events from different tools in the same time window remain separate:

  Event 1: cursor, chat → Scan A
  Event 2: claude, chat → Scan B
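
Taken together, the last three rules amount to grouping on a composite key. The sketch below assumes that key is (source tool, action), with the time-window check applied inside each group; different actions or different tools therefore never share a scan:

  from itertools import groupby

  def scan_key(event):
      # Events only merge when both the source tool and the action match.
      return (event["source"], event["action"])

  events = [
      {"source": "cursor", "action": "chat"},
      {"source": "claude", "action": "chat"},
      {"source": "claude", "action": "complete"},
  ]
  for key, group in groupby(sorted(events, key=scan_key), key=scan_key):
      print(key, "->", len(list(group)), "event(s)")
  # ('claude', 'chat') -> 1 event(s)
  # ('claude', 'complete') -> 1 event(s)
  # ('cursor', 'chat') -> 1 event(s)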

Pattern Detection

During aggregation, the CLI detects:

  • Retry loops - Same request repeated 3+ times
  • Token bloat - Single event with excessive tokens
  • Timeout chains - Sequential timeout errors

These patterns are flagged in the scan metadata.
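
A hedged sketch of those checks; only the "3+ repeats" rule comes from this page, while the token and timeout thresholds and field names such as request_hash are assumptions for illustration:

  from collections import Counter

  TOKEN_BLOAT_LIMIT = 10_000  # assumed "excessive tokens" threshold

  def detect_patterns(events):
      patterns = set()

      # Retry loops: the same request hash appearing 3 or more times.
      repeats = Counter(e.get("request_hash") for e in events if e.get("request_hash"))
      if repeats and max(repeats.values()) >= 3:
          patterns.add("retry_loop")

      # Token bloat: any single event with an unusually large token count.
      if any(e.get("tokens", 0) > TOKEN_BLOAT_LIMIT for e in events):
          patterns.add("token_bloat")

      # Timeout chains: consecutive events that both ended in a timeout error.
      errors = [e.get("error") for e in events]
      if any(a == b == "timeout" for a, b in zip(errors, errors[1:])):
          patterns.add("timeout_chain")

      return sorted(patterns)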

Implementation

Aggregation runs locally via the CLI:

  1. Query events from SQLite buffer
  2. Group by source + time window
  3. Create or update scan record
  4. Run pattern detection
  5. Mark events as processed
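
A minimal end-to-end sketch of that loop, assuming an events buffer table with id, source, action, tokens_in, tokens_out, created_at, and processed columns (its real layout isn't documented here) and reusing detect_patterns from the pattern-detection sketch above; only the create path of step 3 is shown:

  import json
  import sqlite3
  import uuid
  from datetime import datetime, timedelta
  from itertools import groupby

  WINDOW = timedelta(seconds=30)

  def build_groups(rows):
      """Group buffered event rows by (source, action) and the 30-second window."""
      for _, same_key in groupby(rows, key=lambda r: (r["source"], r["action"])):
          batch = []
          for row in same_key:
              ts = datetime.fromisoformat(row["created_at"])
              if batch and ts - datetime.fromisoformat(batch[-1]["created_at"]) >= WINDOW:
                  yield batch
                  batch = []
              batch.append(row)
          if batch:
              yield batch

  def aggregate(db_path="buffer.db"):
      conn = sqlite3.connect(db_path)
      conn.row_factory = sqlite3.Row

      # 1. Query unprocessed events from the SQLite buffer.
      rows = conn.execute(
          "SELECT * FROM events WHERE processed = 0 "
          "ORDER BY source, action, created_at"
      ).fetchall()

      # 2. Group by source + action + time window.
      for group in build_groups(rows):
          # 3. Create the scan record (the update path is omitted for brevity).
          conn.execute(
              """INSERT INTO scans (id, source, action, tokens_in, tokens_out,
                                    event_count, patterns, started_at, ended_at)
                 VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)""",
              (
                  str(uuid.uuid4()),
                  group[0]["source"],
                  group[0]["action"],
                  sum(r["tokens_in"] or 0 for r in group),
                  sum(r["tokens_out"] or 0 for r in group),
                  len(group),
                  # 4. Run pattern detection and store the result as JSON.
                  json.dumps(detect_patterns([dict(r) for r in group])),
                  group[0]["created_at"],
                  group[-1]["created_at"],
              ),
          )
          # 5. Mark the source events as processed.
          conn.executemany(
              "UPDATE events SET processed = 1 WHERE id = ?",
              [(r["id"],) for r in group],
          )
      conn.commit()
      conn.close()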

Storage Schema

  CREATE TABLE scans (
    id TEXT PRIMARY KEY,
    source TEXT NOT NULL,
    action TEXT NOT NULL,
    tokens_in INTEGER,
    tokens_out INTEGER,
    event_count INTEGER,
    patterns TEXT,        -- JSON array of detected patterns
    started_at DATETIME,
    ended_at DATETIME,
    synced_at DATETIME
  );