The phrase "data storytelling" has become a marketing category. A dozen tools now promise to transform raw data into narratives — complete with insights, charts, and written explanations. The promise is seductive. The reality is that these tools produce summaries, not stories.

A summary states what happened. Revenue increased 12%. Churn decreased. The Nordic region outperformed. These are facts. A story explains why they matter, connects them causally, and leads the reader to a conclusion. Stories require judgment about what is important, what is surprising, and what the audience cares about. No AI tool currently exercises this judgment well.

What the Tools Produce

Most AI data storytelling tools follow the same pattern. They ingest a dataset, run statistical summaries, identify trends and outliers, and generate a written narrative accompanied by charts. The narrative follows a template: "Key findings: [metric] changed by [amount]. [Category] was the top/bottom performer. Notable outlier: [data point]."
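The template-filling step is simple enough to sketch by hand. The following is a minimal illustration of that pattern, not any vendor's actual implementation; the metric name, region figures, and two-sigma outlier rule are all invented for the example:

```python
import statistics

def templated_narrative(metric, series, by_region):
    """Fill the 'Key findings' template from basic statistics.

    metric:    name of the metric, e.g. "revenue"
    series:    chronological values for the metric
    by_region: {region: latest value} used for top/bottom ranking
    """
    pct = (series[-1] - series[0]) / series[0] * 100
    top = max(by_region, key=by_region.get)
    bottom = min(by_region, key=by_region.get)
    # Flag values more than 2 standard deviations from the mean as outliers.
    mean, sd = statistics.mean(series), statistics.stdev(series)
    outliers = [v for v in series if abs(v - mean) > 2 * sd]
    lines = [
        f"Key findings: {metric} changed by {pct:+.1f}%.",
        f"{top} was the top performer; {bottom} was the bottom performer.",
    ]
    if outliers:
        lines.append(f"Notable outlier: {outliers[0]}.")
    return " ".join(lines)

# Example with invented numbers:
print(templated_narrative(
    "revenue",
    [100, 104, 108, 112],
    {"Nordics": 40, "APAC": 22, "Americas": 35},
))
```

Everything here is mechanical: arithmetic, ranking, a threshold. Nothing in the function knows whether a 12% change is good news, expected, or alarming, which is precisely the limitation the rest of this essay describes.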

This is useful for the first ten seconds of analysis. It is the equivalent of a colleague glancing at a spreadsheet and saying "revenue is up, except in APAC." It saves time. It does not save thought.

The charts produced alongside these narratives suffer from the same defaults problem described in previous essays: generic chart types, default palettes, no annotations. The narrative never references specific visual elements in the charts, so text and graphics run in parallel rather than working together. This is the opposite of good data storytelling, in which the narrative guides the reader through the visual evidence.

The Missing Element: Editorial Judgment

Alberto Cairo describes data communication as requiring three components: truthful data, functional visualization, and insightful interpretation. AI tools handle the first adequately, the second poorly, and the third not at all.

Insightful interpretation means knowing that a 12% revenue increase is unremarkable for a startup but extraordinary for a mature company. It means recognizing that a churn decrease during a price reduction is expected and therefore not newsworthy, while a churn decrease during a price increase is significant and demands explanation. It means understanding the business context that gives numbers meaning.

This context is not in the data. It is in the analyst's head. Until AI tools can access and reason about organizational context, their "stories" will remain shallow summaries.

The Narrative Arc

A genuine data story has structure. It opens with a question or a surprise. It builds through evidence. It arrives at a conclusion. The sequence matters: presenting the conclusion first and the evidence second tells a different story from building to the conclusion gradually.

AI-generated narratives have no arc. They present findings in a flat list, ordered by statistical significance or by the column order in the dataset. There is no rhetorical structure, no sense of what the reader knows versus what will surprise them, no building toward a point.

The best data stories from organizations like The New York Times graphics desk or the Financial Times data team are crafted objects. The reporter chooses what to show first, what to withhold, where to let the reader discover something in a chart before the text confirms it. These are authorial decisions. They require knowing the audience and the subject deeply.

Where AI Storytelling Adds Value

Despite these limitations, AI data storytelling tools serve two legitimate functions.

First, they accelerate exploratory analysis. When facing an unfamiliar dataset, an AI-generated summary provides a fast overview of what the data contains. It identifies the basic shape — trends, outliers, distributions — that the analyst would otherwise discover manually. This saves minutes, not hours, but minutes matter.
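That "basic shape" overview is itself a mechanical computation. A stdlib-only sketch of the profiling step, with column names and the two-sigma outlier rule invented for illustration:

```python
import statistics

def quick_overview(columns):
    """Profile each numeric column: range, mean, rough trend, and
    2-sigma outliers -- the kind of shape an AI summary reports.

    columns: {column_name: list of numeric values in row order}
    """
    report = {}
    for name, values in columns.items():
        mean = statistics.mean(values)
        sd = statistics.stdev(values) if len(values) > 1 else 0.0
        report[name] = {
            "min": min(values),
            "max": max(values),
            "mean": round(mean, 2),
            # Crude trend: compare last value to first.
            "trend": "up" if values[-1] > values[0] else "down or flat",
            "outliers": [v for v in values if sd and abs(v - mean) > 2 * sd],
        }
    return report

# Example with invented numbers:
print(quick_overview({"revenue": [100, 104, 108, 112]}))
```

This is the work the tool does in the first ten seconds, and it is genuinely useful at that stage; the point of the section is only that the tool's contribution stops here.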

Second, they produce adequate routine reports. A weekly summary of KPIs, distributed to a mailing list that rarely reads it, does not need editorial craft. It needs accuracy and consistency. AI tools can produce these reports reliably, freeing analysts for work that requires judgment.

The danger comes when organizations mistake these routine summaries for strategic communication. The quarterly board presentation, the product launch analysis, the annual review — these require human judgment about what matters and why. Delegating them to AI produces documents that are technically accurate and strategically empty.