Agent Readability: Why Your Business Is Invisible to AI Agents

McKinsey projects $1 trillion in AI agent-mediated retail by 2030. 51% of internet traffic is already automated. If agents can't find you, customers won't either.


51% of all internet traffic comes from bots. Not humans. The Imperva Bad Bot Report 2025, analyzing global web traffic data from 2024, confirmed this for the first time in a decade: automated traffic has surpassed human activity (1).

At the same time, McKinsey projects that by 2030, up to $1 trillion in US retail revenue alone will be orchestrated by AI agents. Globally: $3 to $5 trillion (2). Gartner predicts organic search engine traffic will drop 50% by 2028 as consumers shift to AI-powered search (3).

Three numbers. One reality: attention is moving from humans to machines. And attention has always been the precursor to revenue. If your business remains invisible to AI agents over the next 6 to 12 months, you will not participate in this trillion-dollar shift.

I am a physician. I do not look at data flows and API architectures. I look at the person making the decisions. And what I see is a pattern I recognize from clinical practice: organizations building walls out of a fear of scarcity. Walls that now separate them from their future customers.

20 Years of Anti-Bot Architecture

For two decades, enterprise IT has built defenses against automated access. CAPTCHAs. JavaScript-heavy frontends. Rate limiting. Bot detection. IP blocking. The entire web architecture of the last 20 years was optimized to keep machines out.

That was rational. The Imperva report shows that 37% of all internet traffic consists of malicious bots: scraping, credential stuffing, DDoS attacks (1). The defense was necessary.

But something is happening that most IT departments have not registered. The same walls that protect against malicious bots now block the AI agents that make purchasing decisions. The defense mechanism has become a business risk.

McKinsey describes the shift: shopping is transforming from a sequence of discrete steps (searching, comparing, buying) into "a continuous, intent-driven flow powered by autonomous AI systems" (2). If your system is not readable by these agents, you do not exist in this flow.

An API Is Not Enough

The standard IT response: "We have an API." That is not sufficient.

An API returns raw data. Product numbers, prices, availability. An AI agent needs more. It needs interpreted data. Not "SKU-4892, price $119.99, delivery 3-5 business days." Instead: "Running shoe, under budget, correct size, delivery by Thursday possible, flexible returns."

The difference between an API and an MCP server (Model Context Protocol) is the same difference as between a filing cabinet and a competent employee. The filing cabinet contains the information. The employee understands the question.

McKinsey's report emphasizes that product catalogs must be optimized for "agent readability." Open protocols like MCP enable agents to read data, negotiate with other agents, and transact safely (2). The prerequisite: the data must be interpretable.
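As a rough sketch of this difference (the field names and the intent format are hypothetical, not part of the MCP specification): a plain API returns the raw record, while an agent-facing layer answers the question the agent actually asked.

```python
# Hypothetical product record as a plain API would return it.
RAW_RECORD = {
    "sku": "SKU-4892",
    "price_usd": 119.99,
    "category": "running_shoe",
    "sizes_in_stock": [42, 43, 44],
    "delivery_days_max": 5,
    "returns_days": 30,
}

def answer_agent_query(record: dict, intent: dict) -> dict:
    """Interpret raw fields against an agent's stated intent
    instead of dumping them unprocessed."""
    return {
        "matches_category": record["category"] == intent["category"],
        "within_budget": record["price_usd"] <= intent["budget_usd"],
        "size_available": intent["size"] in record["sizes_in_stock"],
        "arrives_in_time": record["delivery_days_max"] <= intent["needed_in_days"],
        "flexible_returns": record["returns_days"] >= 14,
    }

# An agent's intent: "running shoe, under $130, size 43, needed within 5 days".
intent = {"category": "running_shoe", "budget_usd": 130, "size": 43, "needed_in_days": 5}
print(answer_agent_query(RAW_RECORD, intent))
```

An MCP server wraps exactly this kind of interpretation behind a standardized protocol, so any agent can ask the question without a custom integration per shop.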

And this is where the problem begins that no IT consultant addresses.

80% of Your Knowledge Does Not Exist

In most companies, 80% of product value lives in what organizational researchers call "tribal knowledge." Fair-trade certification. Specific material properties. Why this product fits better than the competitor's. This knowledge lives in marketing copy, in the heads of sales reps, in internal wikis that have not been updated in three years.

No AI agent has access to any of it. And without that access, the agent makes its decision based on the 20% stored in structured database fields. Price. Availability. Delivery time. When two providers score equally on these 20%, the one whose remaining 80% is machine-readable wins.

Making implicit knowledge explicit is not a technical project. It is a psychological one. Because it requires an organization to admit what it has failed to document. And organizations avoid this admission for the same reason people postpone uncomfortable doctor visits.
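One way to make that 80% explicit is to attach it to the product's structured data, sketched here with Schema.org's Product vocabulary (the property names and values are illustrative examples, not a mandated schema):

```python
import json

# Hedged sketch: "tribal knowledge" expressed as Schema.org
# additionalProperty entries on a product record.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "sku": "SKU-4892",
    "name": "Trail Running Shoe",
    "offers": {"@type": "Offer", "price": "119.99", "priceCurrency": "USD"},
    # The 80%: facts a sales rep would otherwise explain on the phone.
    "additionalProperty": [
        {"@type": "PropertyValue", "name": "certification", "value": "Fair Trade"},
        {"@type": "PropertyValue", "name": "upperMaterial", "value": "recycled mesh"},
        {"@type": "PropertyValue", "name": "bestFor", "value": "overpronation, wide feet"},
    ],
}

# Serialized as JSON-LD, this can be embedded directly in a product page.
print(json.dumps(product, indent=2))
```

Embedded on a product page, entries like these are what turn an agent's comparison from "same price, same delivery time" into a reason to recommend this product.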

The Pattern: Scarcity, Control, Invisibility

Sendhil Mullainathan and Eldar Shafir, in their 2013 research on the psychology of scarcity, described a phenomenon they called "tunneling": under conditions of scarcity, attention narrows to the immediate threat at the expense of everything else. Cognitive bandwidth for long-term planning drops measurably. Mullainathan and Shafir demonstrated that merely triggering scarcity-related thoughts reduces cognitive performance by a full standard deviation (4).

In my work with executives, I see this tunneling around AI agent readiness in three stages:

Stage 1: Scarcity fear. "We are losing control of our data." "Agents will scrape our pricing." "We are making ourselves vulnerable." The perceived scarcity is security. The response is predictable.

Stage 2: Control. Reinforcing CAPTCHAs. Restricting APIs. Hiding data behind logins. Every measure feels like protection. Every measure reduces visibility to AI agents.

Stage 3: Invisibility. The agent does not find the company. The agent does not recommend the product. The agent completes the purchase elsewhere. The human behind the agent never sees the offer.

This is experiential avoidance at the infrastructure level. Hayes, Strosahl, and Wilson described the mechanism in 1999: people avoid unpleasant internal experiences, even when the avoidance causes more long-term damage than the avoided feeling itself (5). The short-term relief ("Our data is secure") outweighs the long-term consequence ("We are invisible").

The Deloitte Trap

In their analysis for Deloitte Insights, Monahan, Cotteleer, and Fisher described three ways a scarcity mindset degrades cognitive performance: constant internal interruption, compulsive focus on the immediate threat, and mental exhaustion from endless trade-off decisions (6).

All three mechanisms appear in executives who face the agent readability decision.

The constant interruption: "What if agents undercut our pricing?" This thought runs in the background and blocks strategic thinking.

The compulsive focus: All energy flows into data security and bot defense. The question "How do we become visible to purchasing agents?" is never asked.

The mental exhaustion: The complexity of the transformation (cleaning data structures, making tribal knowledge explicit, setting up MCP servers) feels overwhelming. So nothing gets decided. I have described why this pattern of decision avoidance is so common in AI strategies.

The Window Is Closing

Gartner projects that search engine volume will drop 25% by 2026 and organic search traffic 50% by 2028 (3). These are not decade-long forecasts. This is happening now.

Salesforce reported in December 2025 that 1 in 5 orders during Cyber Weekend involved an AI agent, totaling $67 billion in sales (7). The shift has begun.

The data cleanup required to achieve agent readability takes months. Not weeks. Transforming implicit knowledge into machine-readable structures is a quarterly project, not a sprint. Those who "wait and see" lose 6 to 12 months. And in 6 to 12 months, the market will have moved.

In my article on the fear of productivity, I described why companies use AI to shrink rather than grow. Agent readability is the growth side. This is not about replacing employees with agents. This is about being visible to the agents that make purchasing decisions.

What Agent Readability Actually Requires

Agent readability is not an IT project. It is a transparency project.

First: your implicit knowledge must become explicit. Everything an experienced sales rep carries in their head and would explain to a customer on the phone must be converted into machine-readable structures.

Second: your data must be interpreted, not merely available. An MCP server does not deliver raw data. It delivers answers to questions an agent asks. This is a fundamental architectural shift.

Third: your defense mechanisms must differentiate. Block malicious bots. Let purchasing agents in. This requires a different mindset than "block everything that is not human."
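The third step, differentiated defense, can be sketched as a simple routing decision. The user-agent tokens below are examples of real AI crawlers, but any production allowlist should be checked against the vendors' published documentation and IP ranges, since user-agent strings can be spoofed:

```python
# Illustrative allowlist of known AI agent user-agent tokens.
# Verify against vendor documentation before relying on these.
KNOWN_AGENT_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot")

def classify_request(user_agent: str, passed_human_check: bool) -> str:
    """Route a request: human, known AI agent, or suspect automation."""
    if passed_human_check:
        return "human"
    if any(token in user_agent for token in KNOWN_AGENT_TOKENS):
        return "agent"      # serve agent-readable content
    return "challenge"      # rate-limit / CAPTCHA as before

print(classify_request("Mozilla/5.0 (compatible; GPTBot/1.0)", False))  # → agent
```

The point is not the ten lines of code but the policy shift they encode: three outcomes instead of the binary "human or blocked."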

None of these three steps fails because of technology. All three depend on the willingness to loosen control over the flow of information. And that willingness is exactly what the feeling of scarcity undermines.

The Question That Remains

McKinsey calls this a "seismic shift" (2). Gartner projects 50% less organic traffic (3). Imperva shows that the majority of internet traffic is already automated (1).

The technical question ("How do I build an MCP server?") is solvable. The psychological question ("Why am I not building one?") is the one nobody asks.

In my work with executives, I start exactly there. Not with the architecture. With the feeling that blocks the decision. Because the rigidity that promises security today costs attention tomorrow. And attention has never been as directly linked to revenue as in a world where agents prepare the purchasing decisions.

Walk the Talk: This website implements Agent Readability itself, with structured data, llms.txt, RSS feeds, and connected Schema.org graphs. All technical details and measures are documented in the Agent Readability Report.
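For illustration, a minimal llms.txt along the lines of the proposed format (an H1 title, a blockquote summary, then sections of annotated links); all names, paths, and descriptions here are placeholders:

```
# Example Company

> Short, machine-oriented summary of what the company offers and for whom.

## Products

- [Product catalog](https://example.com/products.md): structured descriptions including certifications and materials

## Policies

- [Shipping and returns](https://example.com/policies.md): delivery times, return windows, conditions
```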


Sources:

  1. Imperva (2025). 2025 Bad Bot Report: AI-Driven Bots Surpass Human Traffic. Thales Group. https://cpl.thalesgroup.com/about-us/newsroom/2025-imperva-bad-bot-report-ai-internet-traffic

  2. McKinsey & Company (2025). The Agentic Commerce Opportunity: How AI Agents Are Ushering in a New Era for Consumers and Merchants. McKinsey QuantumBlack. https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-agentic-commerce-opportunity-how-ai-agents-are-ushering-in-a-new-era-for-consumers-and-merchants

  3. Gartner (2024). Gartner Predicts Search Engine Volume Will Drop 25% by 2026, Due to AI Chatbots and Other Virtual Agents. Gartner Newsroom. https://www.gartner.com/en/newsroom/press-releases/2024-02-19-gartner-predicts-search-engine-volume-will-drop-25-percent-by-2026-due-to-ai-chatbots-and-other-virtual-agents

  4. Mullainathan, S. & Shafir, E. (2013). Scarcity: Why Having Too Little Means So Much. Times Books/Henry Holt and Company. https://en.wikipedia.org/wiki/Scarcity:_Why_Having_Too_Little_Means_So_Much

  5. Hayes, S.C., Strosahl, K.D. & Wilson, K.G. (1999). Acceptance and Commitment Therapy: An Experiential Approach to Behavior Change. Guilford Press. https://contextualscience.org/publications/hayes_strosahl_wilson_1999

  6. Monahan, K., Cotteleer, M. & Fisher, J. (2016). A Behavioral Understanding of the Scarcity Mind-Set. Deloitte Insights. https://www.deloitte.com/us/en/insights/topics/leadership/scarcity-mind-set-improving-decision-making.html

  7. Salesforce (2025). Salesforce Confirms AI Agents Hit 20% of Cyber Week Orders. https://paz.ai/blog/salesforce-confirms-ai-agents-hit-20-of-cyber-week-orders-the-tipping-point-has-arrived
