Semantic Integrity
Semantic Drift: How Definitions Quietly Change Over Time
Semantic drift happens when a metric's meaning changes without the change being clearly communicated.
[Diagram: a sequence showing metric logic updates causing conflicting answers.]
TL;DR
- Drift is usually accidental and cumulative.
- AI amplifies drift by spreading inconsistent definitions.
The problem (in plain terms)
- Metric definitions change as business rules evolve.
- Old reports and AI prompts still assume previous meanings.
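The mechanics are easy to see in miniature. Below is a sketch using a hypothetical "active customers" metric: the business rule changes, both definitions run against the same data, and the same question now has two answers. All names and the threshold are illustrative, not from the source.

```python
# Hypothetical order data; nothing about the data itself changes.
orders = [
    {"customer": "a", "value": 5},
    {"customer": "a", "value": 20},
    {"customer": "b", "value": 8},
    {"customer": "c", "value": 15},
]

def active_customers_v1(orders):
    """Original rule: any order makes a customer active."""
    return {o["customer"] for o in orders}

def active_customers_v2(orders):
    """Revised rule: only orders worth 10 or more count."""
    return {o["customer"] for o in orders if o["value"] >= 10}

print(len(active_customers_v1(orders)))  # 3 — a, b, c
print(len(active_customers_v2(orders)))  # 2 — b silently drops out of "active"
```

An old report built on v1 and a new one built on v2 will disagree even though the underlying data is identical.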
Why it matters
- Drift makes trend analysis misleading.
- AI may mix old and new definitions in a single answer.
Symptoms
- Historical comparisons suddenly look “off.”
- Same question yields different results before and after a model update.
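These symptoms can be caught mechanically before users notice them. One simple approach, sketched below with hypothetical monthly figures, is to evaluate a metric under both the old and new logic and flag the periods where they diverge:

```python
# Hypothetical monthly revenue under the old and new definition.
monthly_revenue_v1 = {"2024-01": 100, "2024-02": 120, "2024-03": 130}
monthly_revenue_v2 = {"2024-01": 100, "2024-02": 120, "2024-03": 110}  # e.g. refunds now excluded

def drift_periods(old, new, tolerance=0.0):
    """Return the periods where the two definitions disagree beyond a tolerance."""
    return [p for p in old if p in new and abs(old[p] - new[p]) > tolerance]

print(drift_periods(monthly_revenue_v1, monthly_revenue_v2))  # ['2024-03']
```

Running a check like this as part of model deployment turns “historical comparisons look off” from a user complaint into a reviewable diff.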
Root causes
- Changes to DAX logic without updating descriptions.
- No change control for metric definitions.
What good looks like
- Definition changes are versioned and documented.
- Historical data is recomputed or clearly segmented.
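One way to get both properties at once is a versioned metric registry where each definition carries a validity window, so any historical period maps to exactly one definition. The structure and field names below are an assumption, not a prescribed schema:

```python
from datetime import date

# Hypothetical registry: each definition of a metric has a validity window.
DEFINITIONS = [
    {"metric": "active_customers", "version": 1,
     "valid_from": date(2023, 1, 1), "valid_to": date(2024, 6, 30),
     "rule": "any order counts"},
    {"metric": "active_customers", "version": 2,
     "valid_from": date(2024, 7, 1), "valid_to": None,
     "rule": "orders of value >= 10 only"},
]

def definition_for(metric, day):
    """Pick the definition that was in force on a given day."""
    for d in DEFINITIONS:
        if d["metric"] == metric and d["valid_from"] <= day and \
           (d["valid_to"] is None or day <= d["valid_to"]):
            return d
    raise LookupError(f"no definition of {metric} covers {day}")

print(definition_for("active_customers", date(2024, 3, 1))["version"])  # 1
print(definition_for("active_customers", date(2024, 8, 1))["version"])  # 2
```

With this in place, a report (or an AI answer) can state which version produced each number instead of silently mixing them.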
How to fix (steps)
- Add explicit version notes to metric descriptions.
- Implement change review for KPI definitions.
- Communicate drift events to report owners.
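The steps above can be made concrete with a minimal change-log entry that captures the old rule, the new rule, who approved the change, and who was told. The fields are an assumed minimum, not a standard:

```python
# Hypothetical definition change log; each entry doubles as the drift announcement.
changelog = []

def record_change(metric, old_rule, new_rule, approved_by, notified):
    """Append a reviewed KPI definition change and return the entry."""
    entry = {
        "metric": metric,
        "old_rule": old_rule,
        "new_rule": new_rule,
        "approved_by": approved_by,        # evidence the change was reviewed
        "notified": sorted(notified),      # report owners informed of the drift event
    }
    changelog.append(entry)
    return entry

record_change("active_customers", "any order counts", "orders >= 10 only",
              approved_by="data-governance", notified={"sales-dashboard-owner"})
print(len(changelog))  # 1
```

Even a log this small is enough to answer the question drift otherwise makes unanswerable: “what did this metric mean on that date, and who knew?”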
Pitfalls
- Assuming “the data didn’t change” means the metric didn’t change.
- Leaving old measures in place without deprecation.
Checklist
- Definition change log exists.
- Metrics have owners and review cycles.
- Old definitions are deprecated or versioned.
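The checklist itself can be automated. The sketch below audits a metric's metadata against the three items above; the metadata shape is hypothetical and would map onto whatever catalog your team uses:

```python
# A sketch of the checklist as an automated governance check.
def audit_metric(meta):
    """Return the checklist items a metric's metadata fails."""
    failures = []
    if not meta.get("change_log"):
        failures.append("definition change log exists")
    if not meta.get("owner"):
        failures.append("metric has an owner")
    if any(v.get("status") not in ("deprecated", "current")
           for v in meta.get("versions", [])):
        failures.append("old definitions are deprecated or versioned")
    return failures

metric = {"owner": "finance", "change_log": [], "versions": [{"status": "current"}]}
print(audit_metric(metric))  # ['definition change log exists']
```

Run against every metric in the model, a check like this turns the checklist from a document into a gate.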