The Internet is at Risk of Splitting: Humans and Bots

Experts warn that the internet is on the brink of a major split into two separate worlds: one for humans and one for bots. The shift threatens to break the internet’s oldest “contract,” under which websites and search engines benefit together.
For decades, websites have welcomed crawlers (bots) from search engines like Google.
Crawling gets content indexed, ranked, and easily discoverable, which in turn sends significant traffic to websites. Historically the ratio was modest, roughly two bot visits for every human visitor, and the relationship was symbiotic: bots collect data, websites receive traffic, and business grows.
However, the rise of generative AI is changing this completely. AI tools like ChatGPT (OpenAI) or Claude (Anthropic) are "devouring" the entire Internet to train their models.
According to Cloudflare, for every visitor OpenAI refers to a website, its bots crawl that site roughly 1,500 times.
For Anthropic the imbalance is even starker: roughly 60,000 crawls per referred visitor. Human traffic is shrinking while automated bot traffic surges, in many cases already outpacing it.
The Existential Threat from AI
The core problem is that these AI bots often don’t link back to the original source material. Instead, they summarize and provide answers directly within their own interfaces, keeping users engaged and cutting websites and content creators out of the value chain.

AI is changing the way we interact with the traditional Internet (Illustration photo: GARP).
Linda Tong, CEO of Webflow, a web design and hosting company, calls this one of the most profound changes she has seen in her 20 years of doing business on the Internet. “It is fundamentally changing the way people find and interact with brands,” she said. “And for some businesses, it is an existential threat.”
From SEO to AEO: The New Era of Search
For the past 30 years, visibility on Google has been the key to any website’s success, and an entire search engine optimization (SEO) industry has grown up to help businesses compete for ranking. But AI doesn’t play by the same old rules.
Instead of linking back to source documents, large language models (LLMs) like ChatGPT, Claude, or even Google's Gemini read and reuse them to answer users' questions directly, largely without attribution.
This shift is giving rise to a new acronym: AEO (AI engine optimization). AEO is a strategy that makes content more visible to AI and more effective when digested by AI, even if the AI’s answer never results in a click. If SEO defined the era of search, then AEO may define the era of generative AI.
The Internet is Split: Humans and Bots
Tong says Webflow has seen AI crawler traffic increase by more than 125% in just six months.
Across the Internet ecosystem, more than 50% of all Internet traffic now comes from bots. As bot traffic soars, some companies are starting to draw the line—literally. They’re building two versions of their websites:
- Human version: With rich, engaging visuals, interactions, and brand stories.
- Bot version: Minimalist and optimized for machine readability, designed to “feed” AI without giving away the most valuable material, preserving the incentive for human readers to click through.

Some publishers now only show summaries or excerpts to crawlers, hoping to attract indexation without compromising their monetization model.
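As a rough illustration of what this “two versions” approach can look like at the server, here is a minimal Python sketch (standard library only) that serves the full page to presumed humans, an excerpt to recognized AI crawlers, and a 403 to disallowed bots. The policy table, the page bodies, and the “BadScraper” name are illustrative assumptions, not any vendor’s actual implementation; matching on User-Agent strings alone is also easily spoofed, a weakness the article returns to below.

```python
# Minimal sketch of user-agent-based tiered serving, standard library only.
# The three-tier policy (full / summary / blocked) is an illustrative
# assumption, not any publisher's real setup.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical policy table: substrings of crawler user-agents mapped to
# the version of the page each crawler is allowed to see.
BOT_POLICY = {
    "GPTBot": "summary",      # OpenAI's documented crawler: excerpt only
    "ClaudeBot": "summary",   # Anthropic's documented crawler: excerpt only
    "BadScraper": "blocked",  # made-up name standing in for a denied bot
}

FULL_PAGE = "<html><body><h1>Story</h1><p>Rich, interactive article for human readers.</p></body></html>"
SUMMARY = "<html><body><p>One-paragraph excerpt for machine readers.</p></body></html>"

def tier_for(user_agent: str) -> str:
    """Return 'full', 'summary', or 'blocked' for a given User-Agent string."""
    for marker, tier in BOT_POLICY.items():
        if marker in user_agent:
            return tier
    return "full"  # default: treat unrecognized agents as human visitors

class TieredHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        tier = tier_for(self.headers.get("User-Agent", ""))
        if tier == "blocked":
            self.send_error(403, "Crawling not permitted")
            return
        body = (SUMMARY if tier == "summary" else FULL_PAGE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TieredHandler).serve_forever()
```

Real deployments typically layer CDN-level bot detection on top of a policy like this rather than trusting the User-Agent header alone.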
Lessons from Facebook Instant Articles
For some companies like Webflow, having data collected by AI can actually be a business benefit. If Webflow is recommended when a user asks ChatGPT about the best website building platform, that’s valuable exposure.
Users who arrive through AI tend to be better informed and show “higher intent,” meaning they are readier to become customers than those who arrive from search.
However, that logic breaks down for businesses that depend on both traffic and readers, especially media outlets, content creators, and anyone whose business model rests on traditional web traffic.
If a chatbot summarizes an article or quotes the core information, the user will probably never click. No clicks means no ad impressions, no email signups, no audience data, no revenue, no real value.
Adam Singolda, CEO of the ad-tech platform Taboola, put it bluntly: “We’ve seen this before. Publishers gave their content to Facebook for Instant Articles, and what happened? No traffic and no money.”
Facebook launched Instant Articles in 2015, promising fast page loading speeds and a seamless mobile experience. However, it failed to generate significant revenue for publishers. Readers stayed on Facebook, bypassing publishers’ websites and, with them, the advertising, email signups, and tracking tools that fueled their business models.
Ultimately, Facebook quietly shut down the program in 2023.
Singolda believes AI tools like Perplexity and ChatGPT are repeating that mistake on a larger scale. Many publishers have reported a 20-30% drop in search traffic over the past year, even as AI tools have become widely adopted.
Pay or Get Creative to Survive
Faced with this gradual decoupling of content from traffic, publishers and platforms are responding. Some, like Reddit, The New York Times, and Vox Media, have signed licensing deals that give certain AI companies access to their content in exchange for substantial fees. But those deals are the exception.
Tong sees a different future: one where publishers control who can access their content and what they can see. Through a partnership between Webflow and Cloudflare, businesses can now differentiate between good bots, bad bots, and LLMs. They can choose to share a portion of the content, a summary, or not share it at all.
Enforcement remains tricky, however. Not all bots respect robots.txt (a site’s crawling policy). Some companies have been accused of using proxy servers to crawl content even after it has been blocked. This means that even when walls are up, crawling continues.
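For reference, robots.txt is a plain-text file at a site’s root that names which crawlers may fetch which paths. Here is a minimal example that opts out of some widely documented AI crawlers while leaving ordinary search indexing untouched (crawler tokens change over time, so treat the exact names as illustrative):

```
# Opt out of AI-training crawlers (tokens as published by their operators)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Google's separate token for AI training, distinct from Googlebot search indexing
User-agent: Google-Extended
Disallow: /

# Everyone else, including ordinary search crawlers, remains allowed
User-agent: *
Disallow:
```

Compliance is voluntary, though: robots.txt is a request, not an access control, which is why publishers increasingly pair it with server-side enforcement like the sketch above.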
In a world where bots answer first, whether a brand is recognized or bypassed could determine the success or failure of entire industries. There are already websites created by AI that are not meant to be read by humans but to be mined by other AIs: a closed loop of content created by machines, for machines.

Media companies and content producers are facing major challenges due to AI (Photo: GARP).
To cope, companies like Taboola are betting on new models, including Deeper Dive — an AI experience built into publishers' own websites.
Instead of losing users to outside bots, it lets readers ask questions and get answers grounded in the publication’s own reporting. “You get the interactivity of AI, but the publisher retains the relationship, the traffic, and the trust,” Singolda said.
That trust could become the most valuable currency of the AI era. In a world of fluent answers, people still want something tangible. “We’re human beings. When it comes to something important, like money, health, or children, we still want to know who’s talking,” Singolda said.
Source: https://dantri.com.vn/cong-nghe/ngay-tan-cua-internet-truyen-thong-dang-den-gan-20250721090232643.htm