New developments combining advances in CDN technology with cloud computing at network edges are creating vastly improved streaming environments for OTT services.
Both Akamai and the lesser-known CDN technology supplier Varnish Software nearly simultaneously announced radical departures from previous business models ahead of the April NAB Show. These moves, which capitalize on major advances in computer chip processing, follow ongoing successes registered by Qwilt’s next-gen Open Edge CDN platform, including Comcast’s recent deployment of the solution to support what the cable giant claims is “the most distributed content delivery network in the U.S.”
While the three companies’ approaches differ significantly, they all have in common the goal of putting edge-based cloud computing to work to accommodate improvements not only in distributed CDN caching but also myriad other processing functionalities video service providers are looking for to drive user engagement at lower costs. “We’re making CDN delivery more efficient to where only 50% of CPU processing reserved for our platform is needed for CDN operations, which means customers can use their APIs to activate the remaining processing power for their use cases,” says Adrian Herrera, general manager and chief marketing officer for North America at Varnish Software.
Left unsaid in the publicity surrounding these innovations is the fact that all three companies are up against the tough realities of competing in a price-cutting commoditized market. A Qwilt executive, speaking on background, acknowledges the struggles that come with building business in a market where CDNs have been commoditized with common configurations optimized for video distribution.
“We have the advantage of federated usage of our Open Edge platform through our support for Open Caching,” the executive says. “But the price competition is brutal.”
As noted by Will Law, chief architect for Akamai’s Edge Technology Group, the ramifications are clear. With the CDN business turning into a “commoditized industry,” he says, “A lot of CDNs have died,” including CDN giant Edgio, which after filing for bankruptcy in January sold off some of its assets to Akamai.
The New Akamai Business Model
Law describes Akamai’s strategy as a market-driven transformation that allows it to capitalize on streamers’ growing use of cloud technology in lieu of traditional CDNs to create their own streaming infrastructures. In one of the more dramatic adjustments to new realities in the CDN business, Akamai is leveraging its cloud-supported CDN hardware footprint in conjunction with its acquisition of cloud computing service provider Linode to create a video-optimized cloud computing platform that’s meant to outperform the support DIY CDN builders get from the leading hyperscalers.
Akamai’s plan is to support application-specific media services for contribution, transport, transcoding, packaging, digital rights management, ad insertion, and content security. The open system design allows customers to mix and match these services any way they choose with solutions from other suppliers.
“We’re going from being a pure-play CDN to providing a cloud computing and security platform where our CDN operates as [an optionally available] layer on top of that,” Law says. “We’re creating a compelling environment where customers want to stay because the performance and costs are better, not because we’re forcing them to stay with a walled-garden approach.” Moreover, he adds, in contrast to cloud compute providers like Amazon with AWS and the Prime service, “we’re the cloud provider who doesn’t compete with you.”
And while Law doesn’t mention this, another liability streamers are discovering in working with the hyperscalers is that, with so many huge revenue-generating irons in the fire, their internal teams focused on the fast-moving streaming market’s needs often have trouble getting decisions on key steps needed to address those needs. “Extremely frustrating,” confides an executive overseeing some projects on that side of AWS operations.
Akamai kickstarted its new strategy with Linode in 2022, at which point it had 11 core compute regions to work with. Tapping CDN resources running across some 4,100 edge nodes in 131 countries, Akamai has expanded the cloud compute reach to 30 regions with a full suite of datacenter capabilities, including support for object storage and multiple approaches to virtualized processing based on virtual machine/hypervisor, container/Kubernetes and virtual PC technologies.
While the Akamai CDN currently accounts for a third of company revenues, Law says the goal now is to drive business growth by providing the edge cloud computing resources video-oriented customers need regardless of whether they use the Akamai CDN or build their own. Akamai is making cost savings a big part of the pitch by introducing new efficiencies in edge processing while minimizing or, in cases where customers use the Akamai CDN, eliminating egress fees.
Edge compute efficiency gains come from a recently introduced major advance in what video distributors can get from the cloud: what Akamai calls Cloud Accelerated Compute Instances. For the first time anywhere, Akamai is using cloud servers to support the transcoding efficiencies powered by NETINT video processing units (VPUs), which are application-specific integrated circuits (ASICs) built on the Codensity silicon architecture widely used in NETINT’s purpose-built appliances.
The companies say the implementation of VPUs in cloud servers represents a new cloud computing category that’s meant for workloads where minimizing video streaming costs and energy consumption are priorities. The accelerated cloud server instances are powered by NETINT’s single-chip Quadra T1U VPUs supporting 8- or 10-bit encoding via AV1, HEVC and H.264 at resolutions up to 8K at 60 frames per second. At 1080p 30fps broadcast quality, each Quadra T1U can encode 32 live streams with the ability to scale linearly for processing at resolutions above or below HD.
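To get a feel for the per-chip density figures, here is a rough capacity sketch. It assumes throughput scales linearly with pixel rate, per the companies' linear-scaling claim; only the 32-stream 1080p30 baseline comes from the announcement, and the derived figures at other resolutions are illustrative, not vendor-verified.

```python
# Illustrative capacity math for a single Quadra T1U VPU, assuming
# throughput scales linearly with pixel rate (pixels/second).
# Only the 32-stream figure at 1080p30 is from the vendors.

BASE_STREAMS = 32                     # 1080p30 broadcast-quality streams per VPU
BASE_PIXEL_RATE = 1920 * 1080 * 30    # pixels per second at 1080p30

def streams_per_vpu(width: int, height: int, fps: int) -> int:
    """Estimate concurrent live streams at a given resolution and frame rate."""
    pixel_rate = width * height * fps
    return int(BASE_STREAMS * BASE_PIXEL_RATE / pixel_rate)

print(streams_per_vpu(1920, 1080, 30))  # 32 (the stated baseline)
print(streams_per_vpu(1280, 720, 30))   # 72 at 720p30
print(streams_per_vpu(3840, 2160, 60))  # 4 at 4K60
```

The 4K60 figure shows why "up to 8K" support and high channel density are trade-offs against each other on the same silicon.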
If customers choose to activate Akamai’s CDN software on the company’s cloud compute platform, servers act as origins for incoming streams with the availability of encoding or multi-profile transcoding support from the VPUs. Streams incur just microseconds of latency going from the processing environment into CDN transmission while avoiding any egress costs, Law says.
“If they choose a CDN that’s not collocated with our cloud platform, it’s still a very low-latency transition,” he notes, adding that in these cases the egress fees charged by Akamai are about half what many cloud compute services charge. In cases typically involving larger streaming services where Akamai’s CDN customers prefer that transcoding be performed at the edge, the VPUs can perform that task as streams exit the Akamai CDN.
More broadly, a VPU-based architecture delivers up to 20 times the throughput of one relying solely on CPUs, which NETINT chief revenue officer Randal Horne notes frees up CPUs for other tasks such as dynamic packaging, de-interlacing, real-time speech-to-text captioning, software decoding for standards not supported in the VPU, and running popular applications like FFmpeg and GStreamer. “VPUs are the ultimate cheat code for video streaming profitability,” he jokes.
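To put the "up to 20 times" multiplier in concrete terms, here is a back-of-envelope server-count comparison. The per-server stream count for CPU-only encoding is a hypothetical assumption; real numbers vary widely by codec, preset, and hardware.

```python
# Back-of-envelope comparison of server counts for a live channel lineup,
# using the vendor's "up to 20x" VPU-vs-CPU throughput claim.
# The 8-streams-per-CPU-server figure is an illustrative assumption.

VPU_SPEEDUP = 20            # vendor's "up to" multiplier
CPU_STREAMS_PER_SERVER = 8  # assumed 1080p30 live streams per CPU-only server

def servers_needed(total_streams: int, per_server: int) -> int:
    """Round up, since a partially used server still counts."""
    return -(-total_streams // per_server)  # ceiling division

total = 1000  # hypothetical channel count
cpu_only = servers_needed(total, CPU_STREAMS_PER_SERVER)
with_vpus = servers_needed(total, CPU_STREAMS_PER_SERVER * VPU_SPEEDUP)
print(cpu_only, with_vpus)  # 125 vs. 7
```

Under these assumed numbers, the rack-space and power savings, not just raw speed, are what drive the cost argument.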
Currently, the VPU instances can be mounted as VMs in virtualized datacenter environments, Law says. But with other processors on the Akamai cloud compute platform available for use in the containerized environment supported by Akamai’s Managed Kubernetes service, “I think we’re going to release containers with NETINT as well,” he notes.
The VPUs complement Akamai cloud compute capabilities supported by GPUs. “We have GPUs for people who want to do hardware acceleration,” Law says, which ensures AI processing as well as other acceleration applications are available on the Akamai cloud. But while the VPUs and GPUs are accessible in the same PoP, they’re not offered on the same machine.
Making AI processing available at the edge provides customers the ability to operate AI inferences on a distributed basis, Law notes. “The edge can be a good place for training AI and for doing things like turning voice feeds into text,” he says. Asked whether Akamai might introduce neural processing units (NPUs), chipsets purpose-built to accelerate AI workloads, he replies, “We don’t rule out NPUs,” but adds, “We have good customer adoption with GPUs and NETINT VPUs so far.”
Akamai’s video-optimized cloud compute strategy plays well with the surge in user-generated content (UGC) driven by social media giants like TikTok and Instagram. A big payoff could come soon in a deal with a major social media player that’s nearing completion, Law notes. “One NETINT machine can do all you need to handle processing for users served by a given PoP,” he adds.
Varnish and Intel Launch Streaming-as-a-Service
Taking a different cloud compute-oriented tack to address the DIY CDN trend, Varnish Software, long a provider of CDN technology for DIY builds, has partnered with Intel to create what the companies describe as a fully managed private CDN and edge delivery service optimized for video streaming. Implemented through a new venture known as Ora Streaming, the service is meant to support carriers’ and large network-equipped ISPs’ private single-tenant CDN operations without the complications and cloud-compute usage costs incurred with DIY CDN operations, Herrera says.
Ora is engineered as a turnkey service that allows carriers to offer support for live and on-demand video streaming by large to mid-sized broadcasters, OTT providers, and ad-supported platforms looking for enterprise-grade streaming at multi-terabit scale, Herrera says. It’s meant to provide carriers and ISPs a high-performance, transparent, and scalable content delivery solution without the operational complexity of building or managing a CDN.
Dubbed “streaming-as-a-service,” which is not to be confused with managed streaming platforms operated by online video publishers (OVPs), Ora Streaming leverages carriers’ needs for capacity-based pricing models that cut costs by doing away with the typical cloud compute usage-based pricing. Customers pay a fixed fee per month for a certain capacity, with no additional cost for egress, port fees, traffic or overage, Herrera says, which means the more a customer uses the capacity, the lower its gigabyte/month costs will be.
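The capacity-based model described above can be illustrated with a simple calculation: a fixed monthly fee spread over actual delivered traffic means the effective per-gigabyte cost falls as utilization rises. The fee and capacity figures below are hypothetical, not Ora's actual pricing.

```python
# Sketch of capacity-based pricing economics: a fixed monthly fee for
# committed capacity, with effective per-GB cost falling as the customer
# fills that capacity. All dollar and capacity figures are hypothetical.

MONTHLY_FEE = 50_000.0  # fixed $/month for the committed capacity (assumed)
CAPACITY_GBPS = 100     # committed capacity in Gbps (assumed)

def effective_cost_per_gb(utilization: float) -> float:
    """Cost per delivered gigabyte at a given average utilization (0.0-1.0)."""
    seconds_per_month = 30 * 24 * 3600
    gb_delivered = CAPACITY_GBPS * utilization * seconds_per_month / 8
    return MONTHLY_FEE / gb_delivered

for u in (0.25, 0.5, 0.9):
    print(f"{u:.0%} utilization: ${effective_cost_per_gb(u):.5f}/GB")
```

Doubling utilization halves the effective rate, which is the inverse of usage-based cloud egress pricing, where the bill grows with every gigabyte delivered.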
As a partner in the Ora venture, which is 100% owned by Varnish, Intel is playing a major role by tapping its relationships with carriers to set up the multi-use environment with Ora. Varnish pays a set fee to Intel for Ora’s use of resources assigned to the service, Herrera explains, which in turn allows Ora to charge carriers a single monthly payment for using the allocated resources in both the CDN and ancillary applications.
He says a big contributor to lowering costs, as well as freeing up CPU capacity for other edge compute applications, is the performance achieved with Varnish Enterprise software running on Intel Xeon processors. Last year the combination, tested with Xeon 6 processors on 2RU Supermicro CloudDC servers, delivered record-setting 1.5 Tbps throughput at an energy-saving 1.3 watts per terabyte.
By bringing Varnish Enterprise software into the managed service environment, Ora Streaming is leveraging a feature-rich caching and HTTP accelerator platform that Varnish has been marketing as a major differentiator in the DIY CDN market. Along with high-availability content replication and persistent handling of multi-terabyte data sets, the service supports dual-layer storage, in-core Transport Layer Security (TLS), administrative support, and much else, Herrera says.
As a private CDN option, Ora Streaming is not meant to support a shared CDN environment, he adds. “Single tenancy is a primary value for our customers, which allows them to customize the architecture as needed for additional use cases,” he says.
Adding efficiency to multi-use customization, Varnish customers can use Varnish Configuration Language, a proprietary coding language, to develop logic supporting their use cases. The platform also works with the WebAssembly (WASM) standard, which enables near-native performance for JavaScript and other applications brought into the Ora Streaming domain.
Ora Streaming runs on many leading global cloud computing platforms, including AWS, Azure, Google Cloud, Oracle OCI and others. Varnish, which began as a supplier of open source caching technology 15 years ago, has provided some degree of CDN support to Orange, Disney, Irish public broadcaster RTE, and others over the years, including Paramount+ for help in streaming the Super Bowl, Herrera says. “There are a lot of streamers we can’t name,” he adds, noting that “the U.S. is where our largest customers are.”
Qwilt on a Roll
Nobody has made greater headway pushing next-gen approaches to CDN operations than Qwilt, which got a huge boost in the marketplace starting in 2020 when Cisco Systems and Qwilt forged a partnership with investment support from Digital Alpha. Today, with reported investment funding totaling $135 million, $70 million of which is attributed to a 2023 investment round backed by Cisco, the companies report their joint solution serves close to two billion subscribers to services from more than 200 network service operators and content providers worldwide.
Qwilt built its platform on an Open Edge architecture tied to the Open Caching specifications developed through the Streaming Video Technology Alliance, which Cisco and Qwilt founded with many other streaming market leaders in 2014 and now includes over 100 companies. Here again the idea is to leverage computing resources for CDN and other applications at the edge but with the twist that the Open Caching format allows all service providers using the technology to be federated into a global delivery platform for the streaming industry.
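The federation idea in the paragraph above can be sketched as simple request delegation: an upstream CDN hands a viewer's request off to a cache node inside that viewer's ISP. The hostnames and selection logic below are hypothetical, and the real SVTA Open Caching interfaces cover far more (footprint advertisement, capacity signaling, and so on); this only illustrates the redirect-style handoff at the heart of the concept.

```python
# Minimal sketch of Open Caching-style delegation: route a client's
# request to a federated cache node inside the client's ISP when one
# is available, otherwise serve from the upstream CDN. All hostnames
# and the lookup logic are hypothetical illustrations.

OPEN_CACHES = {
    "isp-a.example": "cache1.isp-a.example",
    "isp-b.example": "cache2.isp-b.example",
}

def route_request(client_isp: str, path: str) -> tuple:
    """Return (status, url): a 302 redirect to the ISP's edge cache if
    that ISP participates in the federation, else the upstream CDN URL."""
    node = OPEN_CACHES.get(client_isp)
    if node:
        return 302, f"https://{node}{path}"          # delegate to ISP cache
    return 200, f"https://origin-cdn.example{path}"  # serve from upstream CDN

print(route_request("isp-a.example", "/video/seg1.m4s"))
```

The payoff of this handoff is that the bytes travel from deep inside the viewer's own access network, which is what lets federated operators collectively act as "a global delivery platform for the streaming industry."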
The Qwilt-Cisco partnership combines Qwilt’s CDN platform with Cisco’s Unified Computing System (UCS) and Nexus 9000 switches to create a comprehensive solution encompassing edge computing with support embedded in operators’ network infrastructures. The result is an emerging “global network of high-quality streaming content from multiple publishers,” says Theodore Tzevelekis, vice president of business development for Cisco’s Mass-Scale Infrastructure Group, in an article written for Light Reading.
As one of the latest customers to sign on, Comcast nine months ago said its deployment of the Qwilt Open Edge platform was made possible by its implementation of a virtualized distributed access architecture that, in the words of Comcast chief network officer Elad Nafshi, put the brains of its network closer to customers. “We’re tapping into the incredible power of edge compute to build a leading content delivery network that provides incredible benefits to our customers and to the providers who distribute content across our network,” Nafshi said in a statement released with the announcement.
Having introduced 4K on its Xfinity service along with Dolby Vision HDR, Dolby Atmos spatial audio, and “ultra-low latency only seconds behind live action,” Comcast says activation of the Qwilt platform enabled what it calls “quality-as-a-service,” based on positioning edge compute and caching clusters deep in its networks. Comcast also noted the benefits of the federated CDN connectivity enabled by Qwilt’s adherence to the Open Caching specifications.
“With this deployment, the global coverage of the Open Edge will now uniquely enable local delivery to the majority of U.S. broadband subscribers – a critical milestone in the world’s largest federated, all-edge CDN, supported by a global, unified edge cloud footprint that benefits content publishers, service providers and customers,” the company says.
The Real-Time Streaming Question
These and all the other vendor benefits described here apply to conventional streaming over HTTP infrastructure. It remains to be seen how the multiple paths to CDN transformation undertaken by Akamai, Varnish and Qwilt play out as the move to ultra-low latency, including real-time streaming, gathers momentum across the video services ecosystem.
Qwilt hasn’t broached the topic publicly or in meetings with us, and Comcast appears satisfied with the fact that last year’s enhancements to its distributed access architecture include the aforementioned seconds-behind-live throughput.
Akamai, with Will Law playing, as previously reported, a major role in the IETF’s standardization of a new Media over QUIC (MoQ) real-time streaming protocol, clearly has positioned itself to support streamers who want to put MoQ to use when it’s ready, whether through DIY efforts or refinements in the Akamai CDN that will likely be made to support MoQ.
Herrera, when asked about the thinking at Varnish about real-time streaming and the possibility of partnering with WebRTC-based platform providers to enable use of its software in that domain, doesn’t hesitate. “We’re looking at the various approaches to ultra-low latency streaming,” he says, noting Ora Streaming’s support for transcoding and other advanced features in real time can be important contributions to real-time streaming. “We’ll be evaluating WebRTC for possible engagements down the road.”