On May 14th, I had the opportunity to attend Fastly’s Xcelerate 2025 customer roadshow in Los Angeles. It was a full day of customer case studies, partner demonstrations, and executive briefings, all of which delivered a clear message: Fastly is in the midst of transforming from a traditional content delivery network (CDN) vendor into an integrated edge services vendor, one that aims to reduce operational friction and operating expenses while opening new avenues for adopting AI applications. The three most prominent themes follow.
A Software-Defined Edge Platform Enables Distributed Cloud Networking Strategies
At Dell’Oro, I’ve been championing Distributed Cloud Networking, an architecture that couples the user edge, the wide-area middle mile, and the application edge using a software-defined control plane that spans multiple clouds and networks. Although still emerging, Distributed Cloud Networking aims to harmonize routing, security, and compute policies wherever applications run. Fastly’s platform vision aligns tightly with this model. Executives described a composable stack that integrates content delivery, DDoS mitigation, Web Application Firewall (WAF), bot controls, object storage, WebAssembly compute, and observability behind a single set of Terraform modules and APIs.
Customers emphasized the operational upside. Several credited Fastly’s new production-equivalent “staging edge,” which lets them trial configurations and code before promoting them. This safeguard has virtually eliminated rollbacks and has enabled WAF users to ship roughly one-third more features each year. Flexible deployment options, such as edge points of presence (POPs), Fastly-managed environments in Amazon Web Services, or on-premises agents, also support data-residency mandates without disrupting toolchains.
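To make the staging-edge workflow concrete, here is a minimal sketch of a trial-then-promote loop. The base URL, endpoints, and response fields are illustrative assumptions, not Fastly’s published API.

```python
import requests

API = "https://api.example-edge.com"   # illustrative base URL, not an actual Fastly endpoint
HEADERS = {"Api-Key": "REDACTED"}      # hypothetical auth header


def deploy_to_staging(service_id: str, config: dict) -> str:
    """Push a candidate config (cache policy, WAF rules, edge code) to a production-equivalent staging edge."""
    r = requests.post(f"{API}/service/{service_id}/staging/config", json=config, headers=HEADERS)
    r.raise_for_status()
    return r.json()["staging_version"]


def staging_checks_pass(service_id: str, version: str) -> bool:
    """Run validation traffic against the staging version and report whether all checks passed."""
    r = requests.post(f"{API}/service/{service_id}/staging/{version}/validate", headers=HEADERS)
    r.raise_for_status()
    return r.json().get("status") == "passed"


def promote(service_id: str, version: str) -> None:
    """Activate the validated version on the production edge."""
    r = requests.put(f"{API}/service/{service_id}/version/{version}/activate", headers=HEADERS)
    r.raise_for_status()


def trial_then_promote(service_id: str, config: dict) -> bool:
    version = deploy_to_staging(service_id, config)
    if staging_checks_pass(service_id, version):
        promote(service_id, version)
        return True
    return False  # nothing reached production, so there is nothing to roll back
```

The point of the pattern is that a candidate configuration only reaches production after passing checks against a production-equivalent environment, which is why customers say rollbacks have largely disappeared.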
However, risks revolve around platform dependence. Enterprises that prefer best-of-breed tools may find the breadth of APIs to be demanding and the exit costs uncertain. Competitor Akamai continues to expand into core cloud services, while Cloudflare layers networking and security features at speed. We see enterprises benchmarking onboarding friction, roadmap transparency, and contractual agility before entrusting mission-critical workloads to any single vendor.
Offloading AI Workloads Closer to Users for Better Performance and Cost
Artificial intelligence was front and center at Xcelerate, less an aspiration than an everyday workload. In a joint demo, Google and Fastly showed a semantic-aware edge cache handling Gemini prompts, returning cached replies in roughly half the time of a cold request while consuming noticeably fewer tokens. For enterprises, that means faster pages and smaller AI bills without involving origin GPUs.
What makes the example interesting is where it happens. Using an intelligent fabric, Google and Fastly direct traffic to the nearest inference node, then keep popular responses in place. It is a textbook illustration of Distributed Cloud Networking’s promise: policies and data move together through a programmable cloud networking fabric, so application teams gain speed while finance teams get predictable costs.
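To make “semantic-aware” concrete, the sketch below shows the basic caching pattern: prompts are keyed by embeddings rather than exact strings, so a paraphrased request can reuse a prior answer. The embedding function, model call, and similarity threshold are assumptions for illustration, not the Google/Fastly implementation.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.92  # assumed cutoff; tuning depends on the model and domain


class SemanticCache:
    """Cache LLM responses keyed by prompt embeddings rather than exact strings."""

    def __init__(self, embed_fn, llm_fn):
        self.embed_fn = embed_fn  # prompt -> vector (any embedding model)
        self.llm_fn = llm_fn      # prompt -> response (cold path, e.g. a Gemini call)
        self.entries = []         # list of (normalized embedding, cached response)

    def query(self, prompt: str) -> str:
        vec = np.asarray(self.embed_fn(prompt), dtype=float)
        vec /= np.linalg.norm(vec)
        # Serve from cache if a previously seen prompt is semantically close enough.
        for cached_vec, cached_resp in self.entries:
            if float(np.dot(vec, cached_vec)) >= SIMILARITY_THRESHOLD:
                return cached_resp  # warm hit: no origin GPU, no new tokens billed
        response = self.llm_fn(prompt)  # cold miss: pay full latency and token cost once
        self.entries.append((vec, response))
        return response
```

At production scale the linear scan would presumably be replaced by an approximate-nearest-neighbor index at each POP, but the hit-versus-miss economics stay the same: cached answers skip origin GPUs and new token charges.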
Shutterstock, the global stock-image and media marketplace, echoed the theme on the training side. Its video-analysis pipeline streams tens of millions of clips across AWS, Azure, and Google, while keeping preprocessing and vector embedding at edge points of presence. Running the heavy lifting in Fastly’s fabric enables Shutterstock to maintain steady throughput across clouds and avoid cross-region egress surprises—a real-world proof that Distributed Cloud Networking fabrics improve both performance and cost control for data-intensive AI jobs.
Challenges remain—semantic caching is young, model versions evolve quickly, and data-residency rules vary—but the direction is clear. Vendors, including Akamai and NVIDIA, are racing to offer similar edge-GPU overlays. Therefore, enterprises should pair any rollout with tight version control, automated rollback, and transparent governance to prevent the benefits from slipping away.
Edge Caching + Integrated Storage: Controlling Spend While Powering “The Best of the Internet”
Edge caching and integrated storage may not be as eye-catching as a software-defined edge-services platform or AI offload. Yet when traffic surges and the finance team wants lower IT spend, the combination of uptime insurance and cost control they deliver often matters most.
For many customers, the most significant benefit of Fastly’s integrated object storage is the cost reduction it delivers while serving massive amounts of data without interruption. Keeping hot data at the edge eliminates per-gigabyte cloud egress fees and shortens time-to-first-byte, as the examples and the back-of-the-envelope sketch below show:
- Fox Sports hit a 99.97% cache-hit ratio during Super Bowl 2025, offloading terabits from its origin and avoiding a game-day cloud-bill spike.
- Shutterstock migrated 35 PB of images once and now serves them approximately 40% faster, while eliminating a six-figure monthly cloud egress line item.
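The arithmetic behind those savings is straightforward. The sketch below uses assumed traffic volumes and egress pricing, not disclosed Fox, Shutterstock, or Fastly figures.

```python
# Rough egress-savings math for an event like the Super Bowl example above.
# All inputs are illustrative assumptions, not actual customer or vendor figures.
total_delivered_tb = 5_000      # total bytes served during the event, in TB (assumed)
cache_hit_ratio = 0.9997        # share of requests answered from the edge cache
origin_egress_per_gb = 0.08     # assumed cloud egress price, USD per GB

origin_tb = total_delivered_tb * (1 - cache_hit_ratio)    # only misses touch the origin
egress_cost = origin_tb * 1_000 * origin_egress_per_gb    # TB -> GB, then price it

print(f"Origin egress: {origin_tb:.1f} TB -> ${egress_cost:,.0f}")
# Dropping the hit ratio to 99.0% would push ~50 TB to the origin, roughly a 33x larger egress bill.
```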
Cost efficiency is not reserved for media giants. Wildfire-alert nonprofit Watch Duty saw sustained spikes of 20,000 to 100,000 requests per second during the devastating Los Angeles fires in early 2025. Fastly provided Watch Duty capacity at a steep discount, an embodiment of the company’s aim to “Power the best of the internet.”
Whether it’s a global streaming platform or a community-safety service, the message was clear: every byte that stays in edge storage is one less byte paid for twice—first in bandwidth and then in user patience.
Conclusion
Fastly Xcelerate 2025 reinforced the company’s commitment to an integrated edge platform that aligns with our vision for Distributed Cloud Networking. Customers repeatedly praised Fastly’s engineers for extracting every microsecond of performance and its high-touch support teams for restoring service stability when seconds mattered most, an operational culture evident in Fox’s Super Bowl war room and Watch Duty’s wildfire surge. We will continue tracking forthcoming roadmap milestones against the backdrop of our Distributed Cloud Network report, while evaluating Fastly tactically in our application security and delivery coverage within the quarterly Network Security report. Further developments deserve close observation.