- Network vendors are calling for a surge in baseline traffic levels thanks to AI
- But analyst Dean Bubley isn’t convinced
- Telecom veteran Azita Arvani thinks traffic will rise, but flagged security as the bigger issue
Operators may be bracing themselves for a wave of AI traffic. But how worried should they be? Will the AI traffic doomsday ever arrive?
The theory, according to Cisco, goes something like this: Inference workloads today are spiky. Traffic jumps when, for instance, someone asks a generative AI chatbot a question, but returns to a low-level baseline in between queries. But that will all change as AI agents proliferate. These agents will engage “continuously” with end-users, keeping traffic levels consistently high.
“As we move towards agentic AI and the demand for inferencing expands to the enterprise and end user networking environments, traffic on the network will reach unprecedented levels,” Cisco CEO Chuck Robbins said on the company’s recent earnings call. “Network traffic will not only increase beyond the peaks of current chatbot interaction, but will remain consistently high with agents in constant interaction.”
But is this really how things will work?
Disruptive Analysis Founder Dean Bubley isn’t sold on this vision of the future.
“Many in the telecom industry and vendor community are trying to talk up AI as driving future access network traffic and therefore demand for investment, spectrum etc., but there is no evidence of this at present,” he told Fierce.
False alarm?
The prospect of a wave of AI traffic has mobile network vendors ringing alarm bells.
Nokia recently predicted that both direct and indirect AI traffic on mobile networks will grow at a faster pace than regular, non-AI traffic. Direct AI traffic stems from users creating prompts and interacting with AI services and tools, while indirect traffic is related to things like the AI algorithms running behind the curtain for social media applications.
The vendor warned that the AI wave could bring “a potential surge in uplink data traffic that could overwhelm our current network infrastructure if we’re not prepared,” noting that the rise of hybrid on-device and cloud tools will require much more than the 5-15 Mbps uplink available on today’s networks.
All told, Nokia said AI traffic is expected to hit 1088 exabytes (EB) per month by 2033. That means overall traffic will grow 5x in a best-case scenario and 9x in the worst case.
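For a sense of scale, here is a rough back-of-the-envelope sketch of those multipliers in Python. The 1088 EB figure and the 5x/9x growth factors come from the article; the current monthly traffic total used below is a placeholder assumption, since Nokia’s baseline isn’t cited here.

```python
# Back-of-envelope sketch of Nokia's projection, with a PLACEHOLDER baseline.
# The ~1088 EB/month AI-traffic figure and the 5x/9x multipliers come from the
# article; the "assumed_baseline_eb" value is an illustrative round number,
# not a figure from Nokia's report.

AI_TRAFFIC_2033_EB = 1088                  # projected AI traffic per month in 2033
GROWTH = {"best case": 5, "worst case": 9}

assumed_baseline_eb = 300                  # assumption: total traffic today (EB/month)

for scenario, multiplier in GROWTH.items():
    total_2033 = assumed_baseline_eb * multiplier
    ai_share = AI_TRAFFIC_2033_EB / total_2033
    print(f"{scenario}: ~{total_2033} EB/month overall; "
          f"1088 EB of AI traffic would be ~{ai_share:.0%} of that")
```

Plugging in a different baseline changes the absolute totals, but the point of the exercise stands: under either multiplier, AI traffic would account for a large slice of the 2033 total.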
Nokia’s report notably didn’t mention agentic AI. Ericsson, however, did point to traffic increases stemming from the use of AI-based assistants in its 2024 Mobility Report. In particular, it predicted the majority of traffic would be related to the use of consumer video AI assistants rather than text-based applications and – outside the consumer realm – forecast increased traffic from “AI agents interacting with drones and droids.”
“Accelerated consumer uptake of GenAI will cause a steady increase of traffic in addition to the baseline increase,” Ericsson said of its middling traffic growth scenario.
Shape of the problem
But Bubley argued AI agents won’t really create much traffic on access networks to homes or businesses. Instead, he said, they will drive traffic “inside corporate networks, and inside and between data centers on backbone networks and ‘inside the cloud.’”
“There might be a bit more uplink traffic if video/images are sent to the cloud for AI purposes, but again that’s hypothetical,” he told Fierce.
Judging from the comments on a recent LinkedIn post by Ookla Analyst Mike Dano, Bubley isn’t the only skeptic. But it still seems to be an open question whether the AI Armageddon will actually materialize.
Azita Arvani, Stanford Sloan Fellow and former CEO of Rakuten Symphony North America, told Fierce that in her view, the situation around AI traffic is a bit nuanced.
AI isn’t beating video services like Netflix or YouTube in terms of traffic generation at the access edge yet, she said, but back-end network providers are cranking up capacity to 800G and beyond because “agentic and model-serving traffic keeps servers talking to other servers all day.”
“When agents cross company boundaries, that is public-internet traffic but it looks more like lots of HTTPS/API requests than a few giant video streams,” she explained. Arvani pointed to recent Cloudflare data showing an increase in AI crawler and agent activity as a bellwether.
The takeaway? “It’s not video-scale in raw bytes today, but it’s steadier and more persistent, which matters for capacity planning, rate-limiting and telemetry at peering edges and CDNs.”
For Arvani, the question around AI traffic is less about whether networks can handle it (yes, it’ll require capex and upgrades) and more about securing the influx of machine-to-machine calls.
“Each hop should prove who it is and why it’s allowed (service identity, short-lived creds, micro-segmentation) and leave a tamper-evident breadcrumb,” she concluded. “Agentic AI amplifies lots of internal machine-to-machine calls that must be verified, authorized, and auditable.”
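To make that checklist concrete, here is a minimal, hypothetical Python sketch of what “prove who it is, why it’s allowed, and leave a tamper-evident breadcrumb” can look like for a single agent-to-service hop: a short-lived HMAC-signed credential stands in for service identity, and a hash-chained log serves as the breadcrumb. The service names, key and TTL are illustrative assumptions, not drawn from any specific product.

```python
# Sketch of the pattern Arvani describes: each machine-to-machine hop presents
# a short-lived, verifiable credential and leaves a tamper-evident audit record.
# All names, keys and TTLs here are illustrative assumptions.
import hashlib, hmac, json, time, uuid

SHARED_KEY = b"demo-only-secret"    # assumption: a per-service signing key
TOKEN_TTL_SECONDS = 300             # short-lived credential (5 minutes)

def issue_token(service_id: str) -> dict:
    """Issue a short-lived, HMAC-signed credential for a calling service."""
    claims = {"sub": service_id, "exp": time.time() + TOKEN_TTL_SECONDS, "jti": str(uuid.uuid4())}
    body = json.dumps(claims, sort_keys=True).encode()
    return {"claims": claims, "sig": hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()}

def verify_token(token: dict) -> bool:
    """Check the signature and expiry before honouring a request."""
    body = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claims"]["exp"] > time.time()

audit_log = []                      # hash-chained "breadcrumbs"

def record_hop(caller: str, callee: str, allowed: bool) -> None:
    """Append a tamper-evident audit entry chained to the previous one."""
    prev_hash = audit_log[-1]["hash"] if audit_log else "genesis"
    entry = {"caller": caller, "callee": callee, "allowed": allowed,
             "ts": time.time(), "prev": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    audit_log.append(entry)

# One agentic hop: a hypothetical "inventory-agent" calls a "pricing-service".
token = issue_token("inventory-agent")
allowed = verify_token(token)
record_hop("inventory-agent", "pricing-service", allowed)
print(allowed, audit_log[-1]["hash"][:16])
```

In a real deployment, workload identity and mutual TLS would typically replace the shared key, but the flow is the same one Arvani points to: issue, verify, record, at every hop.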