Flagship Engine

Evidence Compression Engine

Turn noisy logs, tickets, alerts, RCAs, and runbooks into compact, redacted, replayable EvidencePacks for simulation and safe AI reasoning.

The Problem: Raw Data is Too Noisy for Reasoning

Token Explosion

Dumping millions of log lines into an LLM context window causes hallucinations and massive costs.

Repeated Stack Traces

400 instances of the exact same connection timeout drown out the root cause signal.

Unsafe Secrets

Raw data contains live API keys, customer PII, and internal infrastructure topology.

Conflicting Timestamps

A mix of UTC, local-time, and ambiguously formatted timestamps makes incident correlation impossible.

Fragmented Systems

Tickets in Jira, logs in Datadog, and runbooks in Notion cannot be reasoned over natively.

Weak Provenance

AI chatbot answers provide no trail back to the source log or original SOP.

The Evidence Pipeline

Raw Sources → Connector Layer → Canonical Normalization → Redaction → Deduplication → Semantic Clustering → Timeline Correlation → Evidence Scoring → EvidencePack Builder → Simulation Runtime
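
The stages above can be sketched as a simple function chain. The stage names and stubs below are hypothetical placeholders, not MazeLabs' actual API; they only illustrate how each stage's output feeds the next.

```python
from functools import reduce

def normalize(events):
    # Stage stub: stamp each event with the canonical-schema marker.
    return [dict(e, schema="canonical") for e in events]

def redact(events):
    # Stage stub: flag events as redacted (real masking is shown later).
    return [dict(e, redacted=True) for e in events]

def deduplicate(events):
    # Stage stub: keep only the first occurrence of each message.
    seen, out = set(), []
    for e in events:
        if e["message"] not in seen:
            seen.add(e["message"])
            out.append(e)
    return out

PIPELINE = [normalize, redact, deduplicate]

def run_pipeline(raw_events):
    # Feed the output of each stage into the next, in order.
    return reduce(lambda evts, stage: stage(evts), PIPELINE, raw_events)
```

Each stage takes and returns a list of event dicts, so stages can be added, removed, or reordered without touching their neighbors.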
1. Multi-source Ingestion & Canonical Normalization

Vendor-native formats (CloudWatch, Datadog, ServiceNow, raw JSON) are converted into a common canonical event schema before any reasoning takes place.

RAW LOG (Vendor-Specific)
{
  "timestamp": "2026-05-10T02:11:00Z",
  "message": "DB timeout after 30s",
  "req_id": "8x991a"
}
CANONICAL EVENT
{
  "source": "cloudwatch",
  "type": "log_event",
  "service": "payments-db",
  "severity": "critical",
  "timestamp": 1778379060,
  "payload": { "timeout": 30 }
}
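
A minimal sketch of such a mapping, assuming the field names from the example above (the severity rule and payload shape are simplified placeholders, not the engine's real logic):

```python
from datetime import datetime, timezone

def to_canonical(raw, source="cloudwatch", service="payments-db"):
    """Map a vendor-specific log record onto the canonical event schema."""
    # Parse the ISO-8601 timestamp and convert it to UTC epoch seconds.
    ts = datetime.fromisoformat(raw["timestamp"].replace("Z", "+00:00"))
    return {
        "source": source,
        "type": "log_event",
        "service": service,
        # Placeholder severity inference; real classifiers are richer.
        "severity": "critical" if "timeout" in raw["message"].lower() else "info",
        "timestamp": int(ts.astimezone(timezone.utc).timestamp()),
        "payload": {"message": raw["message"], "req_id": raw.get("req_id")},
    }

raw = {"timestamp": "2026-05-10T02:11:00Z",
       "message": "DB timeout after 30s",
       "req_id": "8x991a"}
event = to_canonical(raw)
# event["timestamp"] == 1778379060 (epoch seconds for 2026-05-10T02:11:00Z)
```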
3. Redaction Engine

Secrets and PII are masked before AI reasoning. We detect API keys, JWTs, passwords, emails, internal IPs, account IDs, and DB connection strings.

Connecting to db://admin:superSecret99@10.4.2.8
→ Connecting to db://[REDACTED_CREDS]@[MASKED_IP]
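
A pattern-based redactor for the example above might look like this. The three patterns are illustrative only; a production redactor covers many more secret classes (API keys, JWTs, account IDs, connection strings):

```python
import re

# Illustrative patterns; order matters, since later patterns run on the
# already-masked output of earlier ones.
PATTERNS = [
    (re.compile(r"//[^@/\s]+:[^@/\s]+@"), "//[REDACTED_CREDS]@"),  # user:pass in URLs
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), "[MASKED_IP]"),   # IPv4 addresses
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[MASKED_EMAIL]"),    # email addresses
]

def redact(line):
    for pattern, mask in PATTERNS:
        line = pattern.sub(mask, line)
    return line

redact("Connecting to db://admin:superSecret99@10.4.2.8")
# -> 'Connecting to db://[REDACTED_CREDS]@[MASKED_IP]'
```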
4. Deduplication & Semantic Clustering

482 repeated "connection refused" lines are collapsed into a single semantic cluster indicating the pattern, count, and active window. "DB timeout", "SQL timeout", and "connection timeout" are grouped under DATABASE_TIMEOUT.
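
The collapse step can be sketched as follows. The keyword rules stand in for real semantic clustering (production systems would use embeddings or learned signatures), and the label names are assumptions for illustration:

```python
from collections import defaultdict

def cluster_label(message):
    # Hypothetical keyword rules standing in for semantic clustering.
    m = message.lower()
    if "timeout" in m:
        return "DATABASE_TIMEOUT"
    if "connection refused" in m:
        return "CONNECTION_REFUSED"
    return "UNCLASSIFIED"

def collapse(events):
    """Collapse repeated events into clusters with a count and active window."""
    clusters = defaultdict(lambda: {"count": 0, "first": None, "last": None})
    for e in events:
        c = clusters[cluster_label(e["message"])]
        c["count"] += 1
        c["first"] = e["ts"] if c["first"] is None else min(c["first"], e["ts"])
        c["last"] = e["ts"] if c["last"] is None else max(c["last"], e["ts"])
    return dict(clusters)

events = [{"ts": 100 + i, "message": "connection refused"} for i in range(482)]
events += [{"ts": 600, "message": "DB timeout"}, {"ts": 601, "message": "SQL timeout"}]
summary = collapse(events)
# 482 identical lines become one CONNECTION_REFUSED cluster, and the two
# timeout variants merge into DATABASE_TIMEOUT.
```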

9. EvidencePack Output

After normalizing timelines, correlating cross-system events, and scoring causal relevance, the engine produces an immutable EvidencePack.

  • Compressed incident timeline
  • Causal graph of failures
  • Redaction metadata map
  • Verified source provenance
  • Simulation-ready deterministic context
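
A minimal sketch of assembling such a pack, assuming hypothetical field names (this is not MazeLabs' actual schema). Fingerprinting a canonical serialization is one way to make the "same inputs, same pack" guarantee checkable:

```python
import hashlib
import json

def build_evidence_pack(timeline, clusters, redaction_map):
    """Assemble an EvidencePack and fingerprint it for deterministic replay."""
    body = {
        "timeline": timeline,          # compressed incident timeline
        "clusters": clusters,          # semantic failure clusters
        "redactions": redaction_map,   # what was masked, and where
    }
    # Sorted keys and fixed separators make the serialization canonical,
    # so identical inputs always yield the identical digest.
    canonical = json.dumps(body, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode()).hexdigest()
    return {"pack": body, "digest": digest}

pack_a = build_evidence_pack([{"t": 1}], {"DATABASE_TIMEOUT": 2}, {"ips": 1})
pack_b = build_evidence_pack([{"t": 1}], {"DATABASE_TIMEOUT": 2}, {"ips": 1})
# pack_a["digest"] == pack_b["digest"]: same inputs, same EvidencePack.
```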

Why Evidence Compression Matters

BEFORE MAZELABS
  • 500MB of noisy raw logs
  • Fragmented tickets and chats
  • Live secrets exposed to AI models
  • Unpredictable AI hallucinations
AFTER MAZELABS
  • 8MB compressed EvidencePack
  • Fully redacted and sanitized
  • Bounded context for AI reasoning
  • Deterministic simulation replay

Enterprise-Grade Technical Trust

MazeLabs does not engage in uncontrolled AI scraping. The Evidence Compression Engine is deterministic: identical operational inputs always produce an identical EvidencePack.

  • Deterministic Output
  • Strict Audit Trails
  • Source Provenance
  • Redaction-First Policy