Context Stability

Why AI Answers Change When Your Data Didn’t

Fluctuating AI answers are usually caused by inconsistent context, not by changes in the underlying data.

Context Variance
Unstable context yields different answers to the same question, even against the same data.

TL;DR

  • Same data can yield different answers if context shifts.
  • Stability requires predictable filter paths.

The problem (in plain terms)

  • AI answers vary across runs even when data is unchanged.
  • The model can reach the facts through more than one context path, so the same question resolves differently across runs.

Why it matters

  • Users lose trust when answers are unstable.
  • Decisions can flip based on hidden context changes.

Symptoms

  • Two similar prompts return different totals.
  • Slightly different filters change the result dramatically.
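The second symptom can be sketched in a few lines. This is a hypothetical mini-model (all table, column, and function names are illustrative assumptions, not part of the original): the same question, "total 2024 sales," answered through two different date filter paths yields two different numbers.

```python
# Toy fact table; order_date vs ship_date are two possible filter paths
# (a classic role-playing date scenario). Names are assumptions.
sales = [
    {"amount": 100, "order_date": "2024-01-05", "ship_date": "2023-12-30"},
    {"amount": 200, "order_date": "2023-12-20", "ship_date": "2024-01-02"},
    {"amount": 300, "order_date": "2024-02-10", "ship_date": "2024-02-12"},
]

def total_by(date_column: str, year: str) -> int:
    """Sum sales where the chosen date column falls in the given year."""
    return sum(r["amount"] for r in sales if r[date_column].startswith(year))

# "Total 2024 sales" depends on which date relationship is active:
print(total_by("order_date", "2024"))  # 400
print(total_by("ship_date", "2024"))   # 500
```

Neither number is wrong; they answer different questions. Instability appears when the choice of path is left implicit and varies between runs.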

Root causes

  • Ambiguous relationships or bidirectional filters.
  • Hidden default filters in measures.
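A hidden default filter can be sketched as a function parameter the caller never sees. This is a minimal illustration with assumed names, not a real measure definition: the first measure silently narrows its context, while the second makes the filter explicit.

```python
# Toy fact table; column names are illustrative assumptions.
sales = [
    {"amount": 100, "channel": "web"},
    {"amount": 250, "channel": "store"},
]

def total_sales(rows, channel="web"):
    """Hidden default: silently restricts context to channel='web'."""
    return sum(r["amount"] for r in rows if r["channel"] == channel)

def total_sales_explicit(rows, channel=None):
    """Explicit context: no filter unless one is requested."""
    return sum(r["amount"] for r in rows
               if channel is None or r["channel"] == channel)

print(total_sales(sales))           # 100 -- silently filtered
print(total_sales_explicit(sales))  # 350 -- full, unfiltered context
```

Two callers asking for "total sales" get different numbers depending on whether they hit the hidden default, which is exactly the instability described above.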

What good looks like

  • Deterministic filter paths to facts.
  • Explicit context documented for key measures.
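One lightweight way to make context explicit is to document, in a single machine-readable place, the one filter path each key measure assumes. The structure and names below are assumptions, a sketch rather than a prescribed format:

```python
# Hypothetical context registry: one documented filter path per KPI.
KPI_CONTEXT = {
    "total_sales":   {"fact": "Sales", "filter_path": "Date[order_date] -> Sales"},
    "shipped_sales": {"fact": "Sales", "filter_path": "Date[ship_date] -> Sales"},
}

for kpi, ctx in sorted(KPI_CONTEXT.items()):
    print(f"{kpi}: filters reach {ctx['fact']} via {ctx['filter_path']}")
```

Keeping this registry under version control makes any change to a KPI's filter path visible in review rather than discovered as a flipped answer.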

How to fix (steps)

  • Map filter paths for each KPI.
  • Reduce ambiguity by simplifying relationships.
  • Add context tests for repeatable queries.
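The third step, context tests, can be as simple as pinning the expected result of a repeatable query against a fixed fixture, so any silent change in filter context fails loudly. A minimal sketch, with assumed names and data:

```python
# Hypothetical KPI with an explicit, optional region filter.
def kpi_revenue(rows, region=None):
    return sum(r["amount"] for r in rows
               if region is None or r["region"] == region)

def test_kpi_revenue_context():
    fixture = [
        {"amount": 10, "region": "EMEA"},
        {"amount": 20, "region": "APAC"},
    ]
    # Pin the exact context each number assumes:
    assert kpi_revenue(fixture) == 30          # no filter: grand total
    assert kpi_revenue(fixture, "EMEA") == 10  # explicit region filter

test_kpi_revenue_context()
print("context tests passed")
```

Run these in CI alongside the model: if a relationship or default filter changes the path to the facts, the pinned totals break before users see an unstable answer.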

Pitfalls

  • Using bidirectional filtering as a shortcut.
  • Ignoring role‑playing dimensions.

Checklist

  • Filter paths documented for top KPIs.
  • Context tests created and reviewed.
  • Ambiguous relationships resolved.