
In a zero-click world, the only content that survives is content AI cannot steal

Google's AI summaries have gutted evergreen publishing and informational content. What remains is a more demanding, more valuable form of journalism. Here is what it looks like, and whether it can actually pay

by Ian Lyall

Photo by Annie Spratt / Unsplash

The most clarifying thing you can say about the zero-click era is this: Google has not killed journalism. It has killed cheap journalism. The question publishers need to answer now is whether the expensive kind can support the business models that replaced it.

To do that, you have to understand what AI Overviews can and cannot do, where breaking news actually stands in the new search hierarchy, and what readers are looking for when AI gives them a summary but they click through anyway.

What Google can and cannot do with breaking news

The received wisdom is that breaking news is safe. Google's AI cannot summarise what has not been written yet, and freshness creates a moat that evergreen content lacks. That is directionally correct. But it is more complicated than it sounds.

Research from NewzDash tracking thousands of trending news queries found a consistent pattern: AI Overviews appear 6 to 9 hours after a major news event, once live interest fades and real-time coverage slows. During the event itself, Top Stories dominate search results. Google's own systems prioritise freshness when the story is moving. Once coverage plateaus, AI steps in and synthesises what happened, and clicks to publishers drop.

That window between breaking and synthesis is real, and publishers that are fast, accurate, and first inside it still earn traffic. Recently published data from a panel of 64 publisher sites tracked by the media analytics firm Define found that breaking news traffic across Google Search, Discover, and Google News rose 103% from November 2024 through early 2026. Organic search traffic to evergreen content fell 42% over the same period.

Breaking news is not safe in any absolute sense. It is safe for the first few hours. After that, Google's AI absorbs it.

The more significant finding in that data concerns Google Discover. For the first time, Discover now drives roughly equal traffic to web search for news publishers, and when breaking news traffic is isolated by surface, Discover accounts for most of the gains. Discover is an algorithmic feed, not a search result. It does not wait for users to type a query. It pushes content to users based on their interests, device history, and engagement patterns. Publishers that optimise for Discover are operating on a different logic from search: it rewards freshness, strong imagery, and a record of reader engagement rather than keyword targeting.

The implication is that publishers serious about capturing what remains of Google's referral traffic need to treat Discover as a standalone product, with its own editorial and technical strategy, rather than as a side effect of search optimisation.

The investigative question

Investigative journalism is the category publishers cite most often when making the case that quality content survives the AI era. The Global Investigative Journalism Network puts it plainly: reporting that involves systematic inquiry, multiple sources, data analysis, and fact-checking cannot be replicated by machine learning. Watchdog reporting offers readers something they cannot get from a summary.

That is true. It is also not the whole picture.

AI cannot conduct an interview. It cannot cultivate a source over months until that source is ready to talk. It cannot obtain a leaked document, verify it, and understand its institutional context. It cannot sit in a courtroom and read the room. These are not tasks that become easier as models improve, because they depend on human presence, trust, and judgment rather than pattern recognition across existing text.

What AI can do is summarise the results of an investigation after it has been published. If the Sunday Times breaks a story about financial fraud at a listed company, an AI Overview can tell users what happened, who was involved, and what the consequences were, without sending anyone to the Sunday Times website. The investigation itself required human journalism. The summary requires none.

This is the investigative content paradox. The work is irreplaceable. The traffic it generates is not protected.

The publishers winning on investigative content are the ones whose readers seek out the investigation itself, not just the facts of it. That is a subscription and brand loyalty question as much as a content question. If your readers come to you because they trust your journalism and want to read it in full, the summary does not displace them. If your readers were arriving from Google to get the gist of a story, the summary already has them.

Publishers including the Guardian, the Financial Times, and ProPublica have found that investigative content drives subscription conversion at rates that commodity news does not. Readers who engage with a long investigation are measurably more likely to subscribe than readers who land on a news brief. That relationship matters more now that the traffic model for informational content has effectively collapsed.

What AI summaries cannot replicate

The categories of content most resistant to AI displacement fall into a few clear groups.

Live and continuously updated content is the most obvious. Sports scores, earnings calls, political debates in real time, developing crime stories, weather emergencies, and market movements all depend on currency that AI summaries cannot manufacture. The value is not in the information itself but in its freshness. No summary of a football match satisfies the reader who wants to know what just happened.

Opinion and voice are harder to summarise without losing what makes them worth reading. A column works because of how it argues, not just what it concludes. AI can extract the conclusion. It cannot replicate the reasoning, the wit, the provocation, or the relationship between the writer and the reader built over years. The writers with genuine audiences, from individual Substack columnists to established newspaper voices, retain readers who specifically want that person's perspective rather than a factual digest.

Local and hyperlocal content is structurally protected. AI has no way to know who attended the council meeting, what was said off the record afterwards, which local official has a conflict of interest, or how the planning decision will affect the specific street in question. Local journalism is expensive precisely because it requires presence. That presence is also what makes it valuable in a zero-click world.

Specialist and proprietary data is another category AI cannot commoditise. Publishers that own databases, polling operations, or access to restricted information have content that no AI can summarise because no AI has it. The Economist's data journalism unit, Bloomberg's Terminal-adjacent editorial product, and trade publishers with exclusive industry data all sit in this category.

Human testimony and reported narrative cannot be extracted into a useful summary. A long reported feature that puts a reader inside an experience, whether a conflict, a hospital, a community, or a crisis, derives its value from the accumulation of detail and the quality of the writing. An AI can tell you what the piece is about. It cannot make you feel what the piece does.

What readers actually want from content, and from search

The Pew Research Center surveyed US adults in August 2025 and found that 65% now encounter AI-generated summaries in search results, and 45% see them often or very often. Opinions were mixed. Users found them useful for quick factual queries but expressed consistent scepticism about accuracy and a desire to verify important information at source.

That scepticism is the publisher's opportunity. AI has created a trust gap at the same moment it has created a traffic gap. Readers who care about the accuracy of what they are reading are more likely to click through to a known publisher, more likely to subscribe, and more likely to return. The audience for verified, accountable journalism has not shrunk. The cheap route to that audience has been cut off.

What publishers often misread is the difference between what readers say they want and what their behaviour shows. Readers say they want depth, investigation, and context. Their behaviour shows they also want speed, brevity, and frequency. The most resilient publishers are the ones that offer both: a fast daily habit product that keeps readers coming back, and deeper work that justifies the subscription.

Research tracking publishers' experiments with moving away from commodity news found that long-form analysis and investigative content retained readers for significantly longer and drove higher return visit rates. Publishers that refocused on explanatory journalism and authorial voice saw measurable improvements in engagement. But engagement without a conversion mechanism is not a business. The traffic gains from investigative content only pay if there is a subscription, a membership, or a sufficiently premium ad product behind them.

The content hierarchy that survives

The publishers most likely to build sustainable businesses in a zero-click environment are those producing content that sits in at least one of four categories: content AI cannot have because it is new, content AI cannot replicate because it requires human presence, content AI cannot compress because its value is in the reading rather than the knowing, and content readers will pay for because they trust the source.

Breaking news is category one, for a few hours. Investigative and reported long-form is category two. Opinion and narrative are category three. Subscription journalism from trusted brands is category four.

Commodity informational content, the how-to piece, the evergreen explainer, the aggregated news brief, sits in none of these categories. It is the content AI has already replaced. Publishers still producing it at scale are not competing with each other. They are competing with a machine that works for free.

The editors who understood this earliest are the ones restructuring their newsrooms accordingly: fewer content producers generating volume, more journalists doing work that requires time, access, and judgment. That is a more expensive newsroom. It is also, in the current environment, the only kind worth building.
