How brands and publishers can survive the era of clickless search
We are moving from a search landscape built around ‘blue links’ to one defined by AI summaries, instant answers, and agentic search. This shift, known as clickless search, is rewriting how consumers discover content, and how publishers and brands are found.
A growing proportion of search queries happen directly within AI interfaces, with the WSJ reporting that tools like ChatGPT and Perplexity captured 5.6% of US desktop search traffic in June 2025 – more than double the 2.48% from June 2024, and quadruple the 1.3% from January 2024.
What this looks like for consumers
From the user perspective, this functionality delivers speed and ease. While some trust issues may remain at this relatively early stage, people will adapt and grow accustomed to these tools quickly – as we always do. In fact, the data suggests we are already well on the way. Although internet users show reluctance when it comes to turning to AI for the big stuff (think medical or financial research), nearly two-thirds of consumers trust it to guide their brand choices.
Instead of typing ‘best trainers under £100’ into Google and browsing 10 sites, consumers can now ask an AI agent to compare reviews, check prices, and even place the order. This seamless journey means fewer visits to ecommerce sites, but potentially more direct, lower-funnel conversions for certain brands.
What it means for brands and publishers
For brands, this creates a paradox: visibility without engagement. Search Console data shows impressions rising while clicks decline, a trend dubbed the ‘Great Dislocation’. But with fewer clicks comes less control. Publishers lose valuable site visits, and with them the revenue from ads and subscriptions. The challenge is not just surviving, but finding new ways to measure and capture value when traffic itself is no longer the main currency.
And remember, this shift isn’t unique to search. Social platforms have long kept users in-app by surfacing content directly, limiting outbound clicks. Search is simply catching up, with AI-powered interfaces doubling down on the same strategy.
In response to this behavioural shift, Google and other search giants have been swiftly embedding AI features. For example, Google’s AI Overviews and AI Mode (launched in the UK this summer) are increasingly delivering conversational answers directly in search results, further reducing reliance on traditional links – and causing some consternation in the process, according to the New York Post: “the News Media Alliance has warned that AI Overviews and other Google-implemented AI features will have devastating consequences for the industry.”
How to remain relevant – and profitable
Chatbots curate content from multiple sources, making it harder for individual brands to stand out unless their content is structured, authoritative, and tailored for AI consumption. As such, a range of solutions and strategies are emerging to counter the effects of declining redirects from search engines:
1. Emerging optimisation practices: Answer Engine Optimisation (AEO), which ensures your content can appear in AI-powered answer boxes; Generative Engine Optimisation (GEO), which focuses on earning citations in AI-generated content; and Large Language Model Optimisation (LLMO), which broadens the view, ensuring a consistent brand presence across platforms like Wikipedia, YouTube, Reddit, and LinkedIn – the sources LLMs reference most.
2. Diversification of formats: AI interfaces increasingly support multi-modal content. Engaging visuals, interactive content and video increase the chances of being pulled out and shared by the AI tool.
3. Writing for both humans and AI: Clear, semantically structured paragraphs that can stand alone make it easier for LLMs to cite your content. Authority still matters, but equally important is clarity, so that individual sentences can be lifted into AI answers without losing meaning.
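As a concrete illustration of these practices, structured data is one of the most direct ways to make content machine-readable. The sketch below builds a minimal schema.org `Article` object in Python; the field values are placeholders, and real markup would be embedded in the page as a `<script type="application/ld+json">` block.

```python
import json

# Minimal schema.org Article markup: the kind of structured, standalone
# metadata that answer engines and LLM crawlers can parse and cite.
# All values here are placeholders for illustration.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Best trainers under £100: our 2025 picks",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2025-06-01",
    # A self-contained summary sentence that an AI answer box
    # can lift without losing meaning (see point 3 above).
    "description": "We tested 20 trainers under £100 and ranked the "
                   "top five for comfort, durability, and price.",
}

print(json.dumps(article_jsonld, indent=2))
```

The same JSON can be reused across channels, which is what LLMO's emphasis on consistency amounts to in practice: one canonical, machine-readable description of the content, surfaced wherever LLMs look.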
Perhaps the most interesting solution to surface in recent months is Anthropic’s Model Context Protocol (MCP), which has already been adopted by major players like OpenAI and Google DeepMind.
What is MCP and how does it help publishers (and brands)?
Described as ‘robots.txt for AI’, MCP is an open-source protocol that establishes a standardised framework for AI models to access structured data. In theory, this means publishers can decide what content AI models can access, and how it is represented.
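The analogy is worth unpacking: robots.txt already lets publishers signal, per crawler, what may be accessed. A typical snippet blocking AI training crawlers (using the publicly documented user agents GPTBot, ClaudeBot, and Google-Extended) while allowing everything else looks like this:

```
# Block AI training crawlers while leaving the site open to other bots
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```

Where robots.txt is a blunt allow/deny signal, MCP goes further: it defines how the permitted content is structured and served.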
In the context of AI-driven search, MCP allows publishers to:
– Expose high-quality, up-to-date data – such as article metadata, summaries, factual updates, and author credentials – directly to AI systems.
– Ensure that AI-generated summaries or responses reference accurate, brand-approved information instead of potentially outdated or inferred content.
– Maintain control over how content is presented and cited by AI models, increasing transparency and reducing the risk of misrepresentation.
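To make the above less abstract, the sketch below shows the kind of exchange MCP standardises. MCP is built on JSON-RPC 2.0; the method name and field layout here are simplified for illustration, and the URI scheme is hypothetical, so treat this as a conceptual sketch rather than a spec-exact implementation.

```python
import json

# Hypothetical MCP-style exchange: an AI client requests an article
# resource from a publisher's MCP server, and the server replies with
# brand-approved, up-to-date content instead of an inferred or stale
# crawl snapshot. Field names are simplified for illustration.

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",
    "params": {"uri": "article://example-publisher/clickless-search"},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,  # JSON-RPC responses echo the request id
    "result": {
        "contents": [
            {
                "uri": "article://example-publisher/clickless-search",
                "mimeType": "text/markdown",
                # Publisher-controlled text the AI should cite verbatim.
                "text": "Summary: clickless search is reshaping how "
                        "consumers discover content.",
            }
        ]
    },
}

print(json.dumps(response, indent=2))
```

The key point for publishers is that the `text` payload is authored and versioned by them, which is what gives MCP its promise of accuracy and attribution control.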
But the reality is more complex. Competing protocols from Microsoft (NLWeb) and Google (Agent2Agent) threaten to fragment adoption. Some publishers question whether MCP will truly provide more value than web crawling. And until there’s a proven market for monetising content through MCP servers, adoption will likely remain experimental.
While the clickless era has clear benefits for the consumer, it poses a serious threat to publishers relying on organic traffic. The future playbook rests on three pillars: visibility strategies (AEO, GEO, LLMO), monetisation innovations (licensing deals, MCP servers, pay-per-query models), and regulatory action (with the CMA and others pushing for fairness and transparency in AI search).
Ultimately, AI is not just a threat but a new audience. Optimising for AI means creating content ecosystems, ensuring structured and trusted data, and actively shaping the standards and regulations that will define the agentic web. Those who adapt quickly will not just survive the zero-click era – they may discover new ways to thrive in it.