Every dashboard is an argument about what matters. The metrics chosen, the charts displayed, the comparisons made — each is an editorial decision. What is shown is important. What is hidden may be more important.

The typical executive dashboard contains four to eight metrics. Revenue, active users, conversion rate, churn. These numbers are displayed as large figures with trend arrows and sparklines. Green means up, red means down. The executive glances at the dashboard, sees green, and moves on.

This ritual is dangerous.

Aggregation Conceals

The first thing dashboards hide is variance. A single number — "average order value: $47" — conceals the distribution beneath it. Is $47 the center of a tight cluster, or the mean of a bimodal distribution where half of orders are $12 and half are $82? These are radically different situations requiring radically different responses. The dashboard cannot tell you which.
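The point is easy to make concrete. In this sketch (all numbers invented for illustration), two order-value datasets share the same mean yet have completely different shapes:

```python
# Two illustrative order-value datasets with the same mean but very
# different distributions. All figures are made up for this example.
from statistics import mean, stdev

tight_cluster = [45, 46, 47, 47, 48, 49]   # orders clustered near $47
bimodal       = [12, 12, 12, 82, 82, 82]   # half $12, half $82

print(mean(tight_cluster), stdev(tight_cluster))  # mean 47, small spread
print(mean(bimodal), stdev(bimodal))              # mean 47, huge spread
```

A dashboard that surfaces only the mean renders these two businesses identical; showing a spread measure, or the distribution itself, is what separates them.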

Simpson's paradox lurks in every aggregated metric. A company's overall conversion rate may be rising while the conversion rate in every individual segment is falling. This happens when the mix shifts — more traffic comes from a high-converting segment, masking declines everywhere else. The dashboard shows a green arrow. The reality is red.
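A toy example makes the mix-shift mechanism explicit. The segment names and figures below are invented; the structure is what matters — every segment's conversion rate falls while the blended rate rises:

```python
# Hypothetical two-period funnel illustrating Simpson's paradox.
# Each entry is (conversions, visits); all figures are invented.
before = {"organic": (50, 100),  "paid": (90, 900)}   # 50% and 10%
after  = {"organic": (360, 800), "paid": (18, 200)}   # 45% and 9%

def rate(conversions, visits):
    return conversions / visits

def overall(period):
    conversions = sum(c for c, _ in period.values())
    visits = sum(v for _, v in period.values())
    return conversions / visits

# Every individual segment converts worse in the second period...
for segment in before:
    assert rate(*after[segment]) < rate(*before[segment])

# ...yet the blended rate rises, because traffic shifted toward
# the high-converting segment.
print(overall(before))  # 0.14
print(overall(after))   # 0.378
```

The aggregate arrow is green; every segment underneath it is red. Only a dashboard that lets the reader break the number apart reveals this.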

Temporal Framing

Dashboards choose a time window and rarely explain why. "Month over month" is the standard comparison. But what if the previous month was anomalous? What if there is strong seasonality that makes month-over-month comparisons meaningless? The dashboard does not say.

A metric that looks flat over 30 days may reveal a sharp inflection point when viewed over 90 days. A number that looks alarming this week may be a routine fluctuation when viewed over a year. The choice of time window shapes the story entirely, and most dashboards offer the reader no control over this choice.
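One way to see the window effect: fit a trend line over different spans of the same (invented) daily series. A level shift 30 days ago is invisible to a 30-day window and obvious to a 90-day one:

```python
# Illustrative daily metric with a level shift 30 days ago,
# flat since then. Most recent day is last.
series = [1000.0] * 60 + [700.0] * 30   # 90 days of invented data

def slope(values):
    """Least-squares slope per day over the given values."""
    n = len(values)
    mx = (n - 1) / 2
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(values))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

print(slope(series[-30:]))  # ≈ 0: the 30-day window looks flat
print(slope(series))        # clearly negative over 90 days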

Absent Metrics

What is not on the dashboard often matters more than what is. Revenue is displayed; cost of acquisition is not. User growth is shown; engagement depth is not. Conversion rate is tracked; return rate is ignored.

These omissions are not always deliberate. They often reflect the path of least resistance: the metrics that are easy to compute and easy to display. But a dashboard built on convenience rather than completeness is a distortion engine.

At Spotify, the question was never just how many users streamed music. It was how many discovered something new, how many returned the next day, how many shared a track. The surface metric — total streams — could grow while the health of the ecosystem declined. Dashboards that show only the surface number create a false sense of security.

The Comparison Trap

Dashboards encourage comparison against the previous period. This quarter vs. last quarter. This month vs. the same month last year. But these comparisons are only meaningful if the underlying conditions are comparable. A pandemic, a product launch, a competitor's failure — any of these can make historical comparisons worthless.

Better dashboards include context: annotations showing major events, confidence intervals around metrics, and explicit statements about what changed in the environment. Most dashboards include none of these.
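Confidence intervals, at least, are cheap to add. A sketch using the Wilson score interval for a conversion rate (the headline figures are invented):

```python
# Wilson score interval around a conversion rate; z = 1.96 gives
# an approximately 95% interval. Figures are illustrative.
from math import sqrt

def wilson_interval(conversions, visits, z=1.96):
    p = conversions / visits
    denom = 1 + z**2 / visits
    center = (p + z**2 / (2 * visits)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / visits + z**2 / (4 * visits**2))
    return center - half, center + half

# 47 conversions from 1,000 visits: the "4.7%" headline is really
# somewhere in roughly the 3.6%-6.2% range.
low, high = wilson_interval(47, 1000)
print(round(low, 3), round(high, 3))
```

Displaying that range next to the big number reframes a week-over-week wiggle as what it often is: noise.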

What Good Dashboards Do

The best dashboards are not summaries. They are investigative tools. They allow the reader to drill from aggregate to segment, to change the time window, to see distributions rather than averages, and to compare against meaningful baselines rather than arbitrary ones.
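The drill-down itself is not exotic. A minimal sketch with invented order data (field names are assumptions) shows the move from one aggregate to the same metric per segment:

```python
# Drilling from an aggregate metric into its segments.
# The data and field names here are invented for illustration.
from collections import defaultdict
from statistics import mean

orders = [
    {"segment": "new",       "value": 12},
    {"segment": "new",       "value": 14},
    {"segment": "returning", "value": 80},
    {"segment": "returning", "value": 82},
]

# The dashboard number: one aggregate.
print(mean(o["value"] for o in orders))   # 47

# The drill-down: the same metric, broken out by segment.
by_segment = defaultdict(list)
for order in orders:
    by_segment[order["segment"]].append(order["value"])

for segment, values in sorted(by_segment.items()):
    print(segment, mean(values))   # new 13, returning 81
```

The aggregate says $47; the segments say $13 and $81. The first number describes nobody.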

They also include text. A dashboard without written commentary is like a chart without annotation — technically complete but practically opaque. The best practice, rarely followed, is to pair every dashboard with a brief written analysis: what changed, why it matters, and what it does not show.

Tufte argued that the problem with most presentations is too much chrome and not enough data. The same applies to dashboards. Strip away the gradient backgrounds, the 3D effects, the decorative icons. Replace them with more data, more context, and more honest framing of uncertainty.

The goal of a dashboard is not to make the reader feel informed. It is to make the reader actually informed. These are different things, and most dashboards achieve only the first.