When Data Lies by Telling the Truth


Confirmation bias shows up quietly. It shows up when the numbers look clean, the dashboards look green, and yet people keep saying something feels wrong. Our test infrastructure looked perfect on paper. The infra metrics showed less than 10% failure, and with retries the numbers looked even better. Yet the test framework built on top of it was failing constantly, with pass rates under 20%. The framework team blamed infra for almost 40% of the failures. The infra team pointed to healthy dashboards. Developers kept complaining. The data and the anecdotes were telling different stories.

The easy mistake is to trust the data and dismiss the anecdotes. But when they disagree, the anecdotes are usually right. Not because the data is wrong, but because we measure the wrong thing. Our infra metrics measured availability, not stability. They measured uptime, not connection paths. They measured retries, not retry cost. It took weeks of pushing to get a proper RCA. When the teams finally traced the problem end-to-end, we found multiple layers of routing and connection-establishment issues that no metric had captured. The system was failing long before the dashboard admitted it.
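The arithmetic behind this gap is worth sketching. A request that crosses several layers, each of which looks healthy on its own dashboard, can fail end-to-end far more often than any single metric suggests, and per-call retries make the dashboard rosier still. The numbers below are hypothetical illustrations, not measurements from the actual incident:

```python
# Sketch: how per-layer "green" metrics can coexist with a failing test suite.
# All rates below are hypothetical, chosen only to show the compounding effect.

def end_to_end_success(per_layer_success: float, layers: int) -> float:
    """Probability a single request survives every layer in its path."""
    return per_layer_success ** layers

def success_with_retries(per_attempt_success: float, retries: int) -> float:
    """Success rate a dashboard reports when each call is retried on failure."""
    fail = 1.0 - per_attempt_success
    return 1.0 - fail ** (retries + 1)

# Each layer's own dashboard: 95% success -- comfortably "green".
per_layer = 0.95

# One request crossing 5 layers (routing, connection setup, auth, ...):
e2e = end_to_end_success(per_layer, layers=5)
print(f"end-to-end request success: {e2e:.1%}")   # ~77.4% -- already shaky

# A test run that makes 10 such requests fails if any one of them does:
run = e2e ** 10
print(f"per-test-run success: {run:.1%}")          # ~7.7% -- "failing constantly"

# Meanwhile, a client that retries twice reports a much better per-call number:
masked = success_with_retries(per_layer, retries=2)
print(f"dashboard rate with 2 retries: {masked:.2%}")  # ~99.99% -- looks perfect
```

The same retry that keeps the availability metric green is exactly what hides the instability; measuring retry count or retry cost, rather than final success, would have surfaced the problem much earlier.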

Anecdotes are not noise. They are early signals. They tell you where to look. They help you question what the data hides. That is Tenet #9 — Confidence Compounds: Beat Confirmation Bias. Confidence comes from learning to trust uncomfortable signals early, before they become undeniable. Our dashboards were green. Our users were red. The anecdotes were right.
