LIVE SPORTS






Real-Time Connectivity Unleashes Sports Shift to Remote Production


By Fred Dawson


Josh Arensberg,
CTO for M&E at Verizon Business
The advent of commercially viable real-time streaming has unleashed a torrent of activity among sports, news and other live programming producers who want to exploit the cost-cutting and other benefits that come with remote production.

Experimentation, and in some cases full engagement, with real-time streaming platforms, especially in sports production, points to what could soon be a major shift away from reliance on in-venue production teams and equipment beyond the cameras and crews engaged in capturing the action. Real-time streaming has entered the picture alongside other factors contributing to the transformation of live production, including in-venue network conversions to 5G and advanced Wi-Fi and migrations to public cloud-based processing.

Verizon Business Pushes Support for Remote Production

For Verizon’s business unit, which, as previously reported, has taken a leading role in enabling advances in sports fans’ in-venue mobile viewing experiences, remote production leveraging real-time connectivity between venues and distant production centers is the next shoe to drop in the live sports streaming revolution. “You can't have remote production unless there's instantaneous connections,” said Josh Arensberg, CTO for M&E at Verizon Business, during a video interview with UltraMedia Pipeline at the IBC trade show in September.

“My goal is to actually enable the industry to do this and support it across a number of partners,” Arensberg said. “In fact, I urge partners to reach out.”

Earlier this year Verizon Business helped the National Hockey League become the first major U.S. pro sports enterprise to take the plunge into full cloud production with the elimination of big in-venue production vehicles. Speaking during a panel discussion at IBC, Grant Nodine, the NHL’s senior vice president of technology, made clear what the initiative has meant to transforming the league’s ability to enhance viewing experiences.

By orchestrating all the camera inputs delivered in a single feed to the cloud, the NHL can give producers “the tools to wrap the broadcast in our data” and choose “what kind of renditions do I want to make of that feed to give fans different views of the game,” Nodine said. Articulating a widely held aspiration among sports producers, he added that this has allowed the league “to start diversifying our content in a way that reacts to the generational change in the way people consume media.” Over time this will include the addition of 4K and 8K display options, fan control over camera angles, and the choice of alternate broadcasts with different commentators.

Currently, with sports program distribution split between traditional TV modes of delivery and streaming, the personalized functionalities are limited to just a portion of the audience. But eventually they’ll apply to all viewers if, as many providers expect, streaming takes over as the dominant distribution mode.

As John Ellerton, head of futures & innovation at BT Media & Broadcast, noted at IBC, “It’s clear we as an industry will move over the coming years from being a broadcast media to being something that is entirely OTT delivered.” The trend in that direction is well underway, facilitated by an OTT environment where cloud technology plays a dominant role in storage, distribution and, increasingly, production.

According to a recent report from researcher eMarketer, total annual streamed-sports viewership in the U.S. surpassed pay TV viewership for the first time in 2023 and surged to 105.3 million viewers versus 85.7 million for pay TV in 2024. The report predicts that by 2027 streamed sports will draw 127.4 million viewers, nearly double the TV viewership, raising the incentive to capitalize on the feature-enriching capabilities of IP technology.

Multi-Access Edge Computing Becomes a Gateway to REMI in Sports

With this perspective taking hold, producers want to know whether they can enable remote collaboration in live production from any location, an approach widely adopted in non-live content production that’s commonly referred to as REMI (remote integration model) production. REMI adoption in live production depends on whether producers can sort through inputs from remote camera feeds, dispersed commentators and archived files of ancillary content unimpeded by the latencies incurred over long distances. This is especially challenging for anyone who wants to eliminate the need for on-site production teams and apparatus beyond what’s required to capture the action.

The challenge can be easier to meet in cases like the NHL’s, where sports entities are willing to rely on hardware support for production provided by public cloud resources. It gets harder to cut latency to the vanishing point between in-venue camera feeds and the point of production processing when hardware resources, whether proprietary appliances or commodity servers, are co-located with production personnel in brick-and-mortar studios.

For example, with the NHL relying on Verizon 5G connections to cloud production supported by AWS facilities, some aspects of the latency issue are resolved through tie-ins between the 5G network and on-ramps to the AWS cloud. This allows bundled camera feeds to be streamed directly to multi-access edge computing (MEC) instantiations of the AWS cloud known as Wavelength Zones through AWS portals co-located with 5G control centers, thereby avoiding latencies induced by multi-hop internet transport to AWS datacenters.

According to Verizon officials, the transmission from venue to cloud takes only about 110ms, which equates to a lag imperceptible to production personnel. Distribution out to mass audiences via conventional streaming still takes multiple seconds, typically well in excess of the time it takes traditional TV signals to reach end users over MVPD networks.
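To put these figures side by side, the comparison can be sketched in a few lines of Python. Only the ~110ms venue-to-cloud figure comes from the reporting above; the other path values and the perception threshold are illustrative assumptions, not measurements.

```python
# Illustrative glass-to-glass latency comparison for the delivery paths
# discussed above. Only the ~110 ms venue-to-cloud figure is cited in the
# article; the rest are assumed typical values for illustration.
PATHS_MS = {
    "venue to MEC cloud (contribution)": 110,
    "real-time interactive streaming (WebRTC-class)": 250,
    "traditional TV over MVPD networks": 5_000,
    "conventional HTTP streaming (HLS/DASH)": 20_000,
}

# Assumed rough threshold below which production staff perceive no lag.
PERCEPTION_THRESHOLD_MS = 200

def describe(paths: dict[str, int]) -> list[str]:
    """Sort paths fastest-first and flag which feel instantaneous."""
    lines = []
    for name, ms in sorted(paths.items(), key=lambda kv: kv[1]):
        tag = "imperceptible" if ms <= PERCEPTION_THRESHOLD_MS else "noticeable"
        lines.append(f"{name}: ~{ms} ms ({tag})")
    return lines

print("\n".join(describe(PATHS_MS)))
```

The ordering makes the article's point concrete: only the contribution leg into the MEC sits below a plausible perception threshold, while conventional distribution paths run one to two orders of magnitude slower.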

But the Wavelength return to mobile phones in the event venue is as fast as the uplink from the camera feeds to the cloud, which is driving the NHL and many other Verizon customers to create highly personalized in-venue viewing experiences that occur in sync with what’s happening on the playing field. The implementation of such capabilities as an incentive to drive live event attendance is exploding worldwide.

In part, this is abetted by the fact that MECs are growing more common. Microsoft now supports Azure MECs used by AT&T and other unnamed partners. Carriers recently reported to be utilizing Wavelength Zones include Verizon, KDDI, SK Telecom, Vodafone, and Bell Canada. As of the latest public accounting in 2023, Verizon had the largest MEC footprint with Wavelength Zones in 13 U.S. cities.

Expanding REMI Beyond Cloud Production

But MECs and 5G connectivity are still the exception when it comes to resolving the remote production latency issue. Such footprints cover only a subset of the scenarios where sports producers are relying on cloud production, and the broader move to cloud production in sports as well as news and other live programming, while accelerating, still has a long way to go. That means there needs to be a way to support real-time connectivity between venues and distant studios if remote production strategies are to be widely adopted among sports and news producers.

At this stage cloud usage in these live production scenarios predominantly involves hybrid approaches, with premises-based production tapping assets held in cloud storage. Moreover, even where full-bore cloud productions are benefitting from in-venue tie-ins to 5G and MECs, producers can’t bring remote commentators or other out-of-venue contributors into their real-time workflows without a way to link video feeds from those locations to the cloud in real time, which means at 250ms or lower latencies.

This is where real-time interactive streaming (RTIS) infrastructure comes into play. As reported elsewhere, the industry is more focused on finding a smooth path to RTIS than ever with consensus centered on WebRTC-based platforms as the best options currently available.

“WebRTC streaming is, I think, finally mature, and everybody is coming around to the fact that this is the right way to do this stuff,” said Chris Allen, CEO of RTIS platform supplier Red5. “The production side of it is getting really hot. We’re working with sports producers and news organizations around the world who want to lower production costs by avoiding the need to send high-cost mobile production vans and staff to venue locations.”

To facilitate cost-effective distributed production operations, Red5 introduced what it calls TrueTime Studio for Production with a demonstration at IBC that showed how the solution can be used by production editors on standard computer screens to view and interact with virtually any number of A/V feeds coming in from cameras and other remote editing posts. Shared multi-screen editing in real time on computers eliminates the need for high-priced switching equipment at workstations across the workflow, Allen noted.

Red5’s cloud-based distribution architecture employs WebRTC as the primary mode of transport to achieve end-to-end latencies routinely tabulated at 250ms or less over transcontinental and even intercontinental distances in both backend and distribution scenarios, Allen said. Touting scalability to millions of end users with support for one-to-many, many-to-one and many-to-many interactive video communications, the company says its Experience Delivery Network (XDN) platform can be implemented with point-and-click configurability through the Red5 Cloud service or in customer-tailored configurations with the aid of Red5 Pro SDKs and toolsets.

Putting real-time interactive connectivity to work immediately in live production scenarios has important implications for wide use in distribution down the road, Allen said. “My theory is [use of RTIS in live production] is going to help accelerate a lot of end user experiences as well,” he explained. When producers “start using this stuff in production, then it makes it a lot easier to transfer to the consumer and create interactive apps and everything else as we go.”

Red5 is also engaged with sports producers who are focused on using RTIS to deliver enhanced in-venue viewing experiences, Allen said. This can be the case even when 5G is in play with Wavelength Zone entry to the cloud.

That’s because, while a very slight lag in the video display is of no consequence as viewers cast their gazes back and forth between the live action and their screens, a mismatch between the audio generated by phones and the crowd noise and other sounds coming from on-site speakers can be a nuisance for anyone who wants to listen to the broadcast commentary. To prevent this, the end-to-end latency must be no greater than 40ms-60ms, which roughly matches the range of speed-of-sound lag times experienced by an end user anywhere from 60 to 120 feet from the nearest venue speaker, Allen said.
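The speed-of-sound arithmetic behind those figures is easy to check. A minimal sketch, assuming sound travels at roughly 1,125 feet per second in air at about 20 degrees C:

```python
# Delay for sound from a venue speaker to reach a listener, as discussed
# above. Assumes ~1,125 ft/s for the speed of sound in air at ~20 C.
SPEED_OF_SOUND_FT_PER_S = 1125.0

def sound_delay_ms(distance_ft: float) -> float:
    """Milliseconds for sound to travel distance_ft through air."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_S * 1000.0

for d_ft in (45, 60, 90, 120):
    print(f"{d_ft:>3} ft -> {sound_delay_ms(d_ft):5.1f} ms")
```

The relationship is linear: roughly 40ms at 45 feet and a little over 50ms at 60 feet, with each additional ~11 feet adding about 10ms of acoustic lag.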

Sports producers working with Red5 are meeting these requirements by positioning the company’s platform nodes at multiple locations within their venues, which relies on the ability of the Red5 architecture to ensure signals to and from each end user’s handset travel over the shortest path to the Red5 platform. In a crowded stadium, latency-reducing cluster orchestration can involve dozens of Red5 network nodes, Allen said.
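The shortest-path behavior described above can be sketched as a simple nearest-node selection. Everything here, including the function name and probe values, is a hypothetical illustration rather than Red5's actual API, assuming each handset can measure round-trip times to the candidate in-venue nodes.

```python
# Hypothetical sketch of steering an in-venue handset to the closest
# platform node, assuming the client can probe round-trip times (RTTs)
# to a set of candidate nodes. Names and values are illustrative only.
def pick_nearest_node(rtts_ms: dict[str, float]) -> str:
    """Return the candidate node with the lowest measured RTT."""
    if not rtts_ms:
        raise ValueError("no candidate nodes probed")
    return min(rtts_ms, key=rtts_ms.get)

# Example probe results for three nodes placed around a stadium.
probes = {"node-north": 12.4, "node-east": 7.9, "node-south": 21.0}
print(pick_nearest_node(probes))  # -> node-east
```

In a real deployment the orchestration layer, not the handset, would typically make this assignment across dozens of nodes, but the selection criterion is the same: minimize the client-to-node path.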

He also noted that customers who aren’t looking to solve the sound issue but want to use more distant Red5 node instantiations in the AWS cloud are benefitting from the fact that Red5 is the only RTIS provider that AWS has authorized for pre-integration with Wavelength. The installation of Red5 nodes in the Wavelength MEC environment helps to lower latency in support of REMI where production remains on remote premises rather than in the cloud while facilitating real-time personalization of in-venue viewing experiences, he explained.

RTIS-Based Remote Production at Global Scale

Nobody has gone further than esports giant Riot Games in demonstrating the power of remote production supported by RTIS to transform business models. The company has built a real-time video streaming network that has grown to be the 14th largest distribution infrastructure in the world in terms of points of presence, according to Rizwan Hamid, head of sports and entertainment for Europe, the Middle East and Africa at Cisco Systems.

Cisco is providing the routing technology Riot uses to manage traffic worldwide as well as the Wi-Fi networking infrastructure the esports producer uses at its live competition venues, which draw audiences in the thousands. The real-time Riot network “allows us to ping content in milliseconds” between competition sites across the world and the producer’s production studios, Hamid told us in a discussion illuminating the role Cisco is playing in converting sports venues to use of its advanced Wi-Fi, routing and other technologies.

Riot Games, producer of the League of Legends esports competitions that reach tens of millions of viewers worldwide, has steadily streamlined its operations so that substantial portions of the production workflow can be handled at its studio in Los Angeles or at regional studios in other countries, even though the venue might be thousands of miles away. This has allowed Riot to transform its business model by extending competitions to multiple locations across the globe.

As explained by Riot’s Esports Technology Group, multiple feeds from cameras and crowd microphones are transmitted to the LA or regional production centers where everything is decoded, transported to relevant control rooms and processed for various streams going out in 19 different languages around the world. This has cut the on-site operations presence from multiple production and satellite trucks and crews to an engineering room housing about ten A/V processing racks.

Hamid said Cisco is making great strides with equipping sports venues to use advanced Wi-Fi connectivity to transform viewing experiences in and beyond these and many other playing arenas. “We’re seeing a huge disruption in content strategies with the shift to SMPTE 2110 and IP,” he reported. “The ability to create new user experiences with our IP Fabric for Media framework brings new approaches to live streaming monetization.”

At the venue level, Cisco is contracting to install advanced Wi-Fi systems with data-gathering intelligence to drive personalized in-venue service reception on Wi-Fi-tuned cell phones, along with data collection and analysis to help owners manage their services and properties. In a partnership with cloud streaming management system provider Wipro, Cisco is enabling diverse viewing experiences tuned for off-site audiences and in-venue viewing, with the latter benefitting from cloud production delivering personalized features to phones just 200ms-300ms out of sync with gameplay.

“Wipro allows us to encode live camera feeds and produce personalized output for delivery to thousands of people attending a sports event,” Hamid said. “We can do this in the context of offering sponsorships to vendors, different viewpoints, fantasy football information, AR (augmented reality) overlays and other features.”