Cloudflare Hits 500 Tbps Milestone as AI Traffic Surges
- Cloudflare achieves 500 Tbps global network capacity milestone across 330+ cities
- Automated edge systems now mitigate 30+ Tbps DDoS attacks without human intervention
- AI agents and model training pipelines now account for over 4% of HTML requests on the network
The internet is often conceptualized as a vast, ethereal cloud, but at its foundation, it remains a physical tapestry of cables, servers, and data centers. Cloudflare recently marked a significant milestone in this physical reality, announcing that its global network has surpassed 500 terabits per second (Tbps) of provisioned external interconnection capacity. This capacity is not just a measure of traffic, but a metric of defensive potential—a necessary buffer against the growing scale of modern cyber threats. Sixteen years ago, the company launched from a modest office above a nail salon in Palo Alto with only a single transit provider. Today, it operates a backbone spanning over 330 cities, protecting more than 20% of the public web through a decentralized infrastructure that pushes computing power to the very edge of the network.
As the network's footprint has expanded, so have the threats it faces. A decade ago, neutralizing a massive distributed denial-of-service (DDoS) attack—an attempt to crash a server by flooding it with traffic—would have required significant, often manual, intervention and nation-state-level resources. In 2025, Cloudflare's autonomous systems mitigated a 31.4 Tbps attack in mere seconds. This feat is accomplished by moving intelligence to every server in the fleet, allowing the network to defend itself locally at line rate. Instead of routing potentially malicious traffic to a centralized "scrubbing center," the system uses technologies like eBPF to evaluate and drop unwanted packets before they consume any significant processing power. This shift from centralized to distributed defense is the backbone of modern internet security.
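That "evaluate and drop before the kernel spends cycles" pattern is what eBPF's XDP hook provides: a small, verified program attached at the network driver that can discard a packet before the rest of the networking stack ever sees it. The sketch below is a minimal, generic illustration using the open-source bcc toolkit, not Cloudflare's production code; the interface name and blocklisted address are placeholder assumptions.

```python
# Minimal XDP drop sketch using the bcc toolkit (requires root and a Linux kernel
# with XDP support). Illustrative only: "eth0" and the blocklisted address are
# placeholders, and this is not Cloudflare's actual mitigation code.
import socket
import struct
import time

from bcc import BPF

XDP_PROGRAM = r"""
#define KBUILD_MODNAME "xdp_filter"
#include <uapi/linux/bpf.h>
#include <linux/if_ether.h>
#include <linux/ip.h>

BPF_HASH(blocklist, u32, u8);                  // IPv4 source addresses to drop

int xdp_filter(struct xdp_md *ctx) {
    void *data = (void *)(long)ctx->data;
    void *data_end = (void *)(long)ctx->data_end;

    struct ethhdr *eth = data;
    if ((void *)(eth + 1) > data_end)
        return XDP_PASS;                       // truncated frame: let the stack handle it
    if (eth->h_proto != htons(ETH_P_IP))
        return XDP_PASS;                       // only inspect IPv4 in this sketch

    struct iphdr *ip = (void *)(eth + 1);
    if ((void *)(ip + 1) > data_end)
        return XDP_PASS;

    u32 src = ip->saddr;
    if (blocklist.lookup(&src))
        return XDP_DROP;                       // dropped before any further processing cost
    return XDP_PASS;
}
"""

b = BPF(text=XDP_PROGRAM)
fn = b.load_func("xdp_filter", BPF.XDP)
b.attach_xdp("eth0", fn, 0)                    # attach at the driver-level XDP hook

# Populate the blocklist from user space (documentation-range address, network byte order).
blocklist = b["blocklist"]
addr = struct.unpack("I", socket.inet_aton("198.51.100.7"))[0]
blocklist[blocklist.Key(addr)] = blocklist.Leaf(1)

print("XDP filter attached to eth0; Ctrl-C to detach.")
try:
    while True:
        time.sleep(1)                          # the kernel program keeps filtering
except KeyboardInterrupt:
    b.remove_xdp("eth0", 0)
```

Because the verdict is returned at the driver hook, packets from blocklisted sources never allocate kernel buffers or touch the TCP/IP stack, which is what makes filtering at line rate on every machine feasible.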
The rise of artificial intelligence has further complicated this landscape. For years, the internet was primarily a human-centric medium, defined by users clicking links in browsers. Today, that is shifting as AI crawlers, model training pipelines, and autonomous agents account for more than 4% of all HTML requests across the network—a volume comparable to major search engine bots. These crawlers interact with the web differently from human users; they fetch data at maximum throughput without pauses, creating new challenges in distinguishing legitimate AI activity from malicious automated attacks. To navigate this, the company employs signal analysis—examining TLS fingerprints and behavioral patterns—so that site owners can control which crawlers access their data without sacrificing security or performance.
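That kind of signal analysis can be approximated, very roughly, as a scoring function that combines a hash of the client's TLS handshake parameters (a JA3-style fingerprint) with behavioral features such as request cadence. The sketch below is a generic illustration; the field names, thresholds, and known-crawler fingerprint list are assumptions, not Cloudflare's actual classifier.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical allowlist of TLS fingerprints published by well-behaved crawlers.
KNOWN_CRAWLER_FINGERPRINTS = {"579ccef312d18482fc42e2b822ca2430"}

@dataclass
class RequestSignals:
    tls_client_hello: str          # normalized ClientHello fields, e.g. "771,4865-4866,..."
    user_agent: str
    requests_per_minute: float
    mean_inter_request_gap_s: float

def ja3_style_fingerprint(client_hello: str) -> str:
    """MD5 over the normalized ClientHello string, as in the JA3 scheme."""
    return hashlib.md5(client_hello.encode()).hexdigest()

def classify(sig: RequestSignals) -> str:
    """Very rough three-way split: verified crawler, suspected bot, or likely human."""
    fp = ja3_style_fingerprint(sig.tls_client_hello)
    declares_bot = "bot" in sig.user_agent.lower()

    if declares_bot and fp in KNOWN_CRAWLER_FINGERPRINTS:
        return "verified-crawler"      # self-identified and fingerprint matches the allowlist
    # Machine-like cadence: high sustained rate with no human-scale pauses.
    if sig.requests_per_minute > 120 and sig.mean_inter_request_gap_s < 0.5:
        return "suspected-bot"
    return "likely-human"

print(classify(RequestSignals(
    tls_client_hello="771,4865-4866-4867,0-11-10,29-23-24,0",
    user_agent="ExampleBot/1.0 (+https://example.com/bot)",
    requests_per_minute=300,
    mean_inter_request_gap_s=0.2,
)))
```

A real system weighs far more signals and publishes verification mechanisms for crawler operators, but the shape of the decision—fingerprint plus behavior, evaluated per request—is the same.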
Looking ahead, the infrastructure that powers this security is also serving as a platform for developers. By running code on every server in its network, the platform allows applications to run in every city simultaneously, rather than being confined to a handful of central cloud regions. This architecture, originally built to facilitate rapid packet filtering, now supports containerized workloads and sophisticated edge applications. As the network continues to scale and integrate emerging internet protocols like ASPA—which validates the paths traffic takes across the web—it underscores a vital reality of the digital age: the infrastructure of the internet is not a static utility, but a dynamic, evolving system that must be constantly reinvented to survive.
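For context on the ASPA mechanism mentioned above: an ASPA record lets an autonomous system declare which networks are authorized to act as its upstream providers, so a receiver can flag route announcements whose AS path contains an unauthorized hop. The sketch below is a deliberately simplified illustration of that idea; real ASPA validation (per the IETF SIDROPS specifications) also handles peering links, announcement direction, and "unknown" outcomes in more detail, and the AS numbers here are invented.

```python
# Simplified ASPA-style check: walking an AS path from the origin outward,
# each hop should be a provider that the preceding AS has authorized.
# This sketch covers only the customer-to-provider ("upstream") leg.

# Hypothetical ASPA records: AS number -> set of authorized provider ASes.
ASPA_RECORDS: dict[int, set[int]] = {
    64512: {64620},            # AS64512 authorizes only AS64620 as an upstream provider
    64620: {65000, 65010},
}

def check_upstream_path(as_path: list[int]) -> str:
    """as_path is ordered origin first, receiver's neighbor last."""
    for customer, provider in zip(as_path, as_path[1:]):
        authorized = ASPA_RECORDS.get(customer)
        if authorized is None:
            return "unknown"       # no attestation for this AS, cannot judge the hop
        if provider not in authorized:
            return "invalid"       # hop not authorized: possible route leak or hijack
    return "valid"

print(check_upstream_path([64512, 64620, 65000]))   # valid: every hop is authorized
print(check_upstream_path([64512, 65099]))          # invalid: AS65099 was never authorized
```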