Inch By Inch, Broadcasters’ Cloud Production Expands
Broadcasters have already proven that they can use cloud technology, whether public or private, to replace dedicated on-premise hardware for playout and archiving applications. Now many are looking to expand their use of the cloud into production for everything from ad-hoc events aimed at streaming distribution to mainstream live sports and news coverage for their linear channels.
While most are still years away from a broad adoption of the cloud for their production needs, stations, networks and studios are experimenting today with how they can use the cloud to gain new flexibility and efficiency. Challenges remain in software interoperability and latency, but significant progress has been made in the past year, said top engineers who gathered last week for the TVNewsCheck webinar “Live Production in the Cloud,” moderated by this reporter.
Tegna Awaits AI’s Promise
“The cloud, as we think of it, is ideal for variable workloads, where you can really leverage a consumption-based model and scale up and scale down as you need it,” said Kurt Rao, SVP and CTO, Tegna.
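Rao's point about consumption-based pricing can be made concrete with a toy cost comparison. All figures below are invented for illustration (they are not Tegna's or any cloud provider's numbers): a fixed annual on-prem cost versus an hourly cloud rate that is only paid while a production room is spun up.

```python
# Toy comparison of fixed on-prem capacity vs. consumption-based cloud pricing.
# All prices are invented for illustration; real cloud rates vary widely.

ON_PREM_ANNUAL_COST = 120_000   # hypothetical fixed cost of a dedicated control room
CLOUD_RATE_PER_HOUR = 45.0      # hypothetical all-in hourly rate for a cloud control room

def cloud_annual_cost(production_hours_per_year: float) -> float:
    """Consumption-based: you pay only for the hours the room is spun up."""
    return production_hours_per_year * CLOUD_RATE_PER_HOUR

def cheaper_option(production_hours_per_year: float) -> str:
    """Compare the pay-per-use total against the fixed on-prem cost."""
    if cloud_annual_cost(production_hours_per_year) < ON_PREM_ANNUAL_COST:
        return "cloud"
    return "on-prem"

# A variable workload (occasional special events) favors the cloud;
# a room that runs nearly around the clock favors dedicated hardware.
print(cheaper_option(500))    # light, bursty use -> "cloud"
print(cheaper_option(6000))   # near-continuous use -> "on-prem"
```

The crossover point is what makes the workload "variable" in Rao's sense: the fewer hours a resource is actually needed, the stronger the case for scaling it up and down on demand.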
Rao considers Tegna’s content in three main “buckets”: news, weather and local sports. He said the cloud is currently being leveraged for production at some level across all of them. For example, news cameras are feeding content from the field through the cloud and then back into the studio, where it is packaged using tools like Adobe’s cloud-based editing software for linear or digital distribution.
“We’ve been very purposely looking at what the capabilities are in each of the key workflow steps, and how you can leverage the cloud in elements of that process,” Rao said. “That’s where we are. I would say we’re still very early in the days of adopting end-to-end, but there are certainly a lot of purpose-built tools and processes that can leverage the cloud.”
Rao said that complete camera-to-cloud workflows, which were demonstrated by vendors including Adobe at this year’s NAB Show, are “ready to test” but not necessarily ready for production. He added that he was “very excited” by the focus at the NAB Show on artificial intelligence (AI), which depends heavily on cloud compute.
Rao sees great potential in the possibility of an AI chip in a camera that could work with the cloud to automatically create multiple versions of video as it is being shot and output them from the camera.
“In theory, the promise is you could get a two-minute clip that goes out to linear, but it could summarize that for a 10-second one for Instagram,” Rao said.
“Decomposing” News Workflows At Sinclair
Sinclair was an early adopter of the cloud for playout and disaster recovery functions, for both its stations and its national networks. Now the company is exploring how it can use the cloud for news and sports production, including testing production control room (PCR) software from Ross Video and Grass Valley at several stations and also using private cloud technology for some Tennis Channel productions.
While the benefits of moving functions like playout or DR to the cloud were very clear, the analysis for news production is more complex, said Mike Kralec, SVP and CTO, Sinclair. The company is attempting to “decompose the ecosystem” before migrating to the cloud by taking a hard look at individual components of the news production workflow like field acquisition, archiving and editing.
“Eventually, we’re going to get to this holistic ecosystem of composable parts where we bring all these different cloud workflows together,” Kralec said. “But we’re not at a point where we have the visibility into seeing everything in the cloud all at once.”
Streamlining field acquisition processes with the cloud to get content to digital platforms is interesting, Kralec said, as is the possibility of centralizing Sinclair’s news archives in the cloud to make repurposing content easier. But as they look for viable business models for live news production in the cloud, Sinclair and other broadcasters are in some ways “victims of our efficiency,” Kralec said.
After rolling out production automation systems like Ross Video’s OverDrive, Sinclair has already made its control room operations very efficient, Kralec said. He isn’t sure how much more benefit can be gained by moving those workflows to the cloud, though Sinclair is currently evaluating whether a single cloud-based PCR could support news production across multiple stations, at least for disaster recovery.
“It’s dials and knobs,” Kralec said. “Can we turn the knob on shared resourcing, in terms of, can you deploy 40 production control rooms in a shared service model rather than 75 control rooms? But as you turn that knob down, you turn the other one on network resourcing and encoding and transport and everything else that goes into that. You turn that one up. So, there are tradeoffs.”
Transcending Geography
Ross Video is definitely seeing interest from broadcast customers in slowly moving their productions into the cloud, said Peter Abecassis, that company’s director of product management for production workflow solutions. The most common use case is disaster recovery, with a production control room in the cloud that can be used by any one of a group of stations, at any time, on an emergency basis.
Another application is using the cloud to expand into a new market without the upfront capital investment of building a new facility, whether it be a production control room or an entire station. Instead, they can use the cloud to quickly “spin something up” and experiment with a new product.
“What Ross is doing is we’re moving our production control room and production workflow products into the cloud, which is enabling our customers to use the interfaces and the products that they’ve used for many, many years, but have it in a more centralized location that’s more accessible to people regardless of geography,” Abecassis said. “It helps them reduce the amount of spend that they have in terms of hardware, maintenance and the cost of running the solution. And it also allows them to spin it up and spin it down depending on, do we need it for an emergency, do we need it as overflow for an election, do we need it for a special event?”
That includes migrating its OverDrive automation system into the cloud, which Ross is in the process of doing with one customer. Implementing OverDrive in the cloud will allow a single set of operators to run the same show for multiple time zones, Abecassis said, and “maintain a consistent quality to the show regardless of where you are.”
While a lot of news production can be done in the cloud, Abecassis emphasized that physical equipment, such as robotic cameras in the studio that connect to cloud software, remains vitally important.
“We need to be conscious of the fact that not everything’s going to run in the cloud,” Abecassis said. “There’s going to be a time and a place for things to be in the cloud, and a time and place for things to run on-premise. The trick is, the goal, is to have everything be seamless, so people don’t really know or care where the software is running. It’s just the interface they’re familiar with.”
Mining The ‘Digital Backlot’ At Warner Bros. Discovery
Renard Jenkins, SVP, production integration and creative technology services, Warner Bros. Discovery, is exploring how the cloud can aid production in three different verticals: live-action films, animated films and episodic television. Those businesses are working with far more production lead time than live news at a television station, noted Jenkins, which in some ways makes the cloud easier to adopt. He said WBD’s overall goal is to make things more efficient without disrupting the workflows to which creative personnel are accustomed.
“While making it more efficient, we also want to make it transparent to the creatives,” Jenkins said. “Tech can sometimes get in the way and be a hindrance, and our job is really to make sure that it doesn’t.”
A current project Jenkins is tackling is creating a “digital backlot,” a repository of digital assets, including CGI backgrounds for virtual production, that allows them to be easily accessed and repurposed across the entire company. Jenkins said all of the major studios are pursuing similar initiatives as they seek to extend the lifespan of their assets.
Another goal for Jenkins and his team is to see if they can make visual effects (VFX) a cloud-based process. He said there is still software development work needed to make existing effects tools, most of which were designed to work with on-premise hardware, run well on public cloud compute. He compares the challenge to the issues faced by cloud-based editing when it was introduced a few years ago.
“One of the big complaints was that the software was not ready for editing in the cloud,” Jenkins said. “You could render in the cloud, and you could process in the cloud, but the actual editing function was clunky and wasn’t as smooth. And much like editing, visual effects and color artists, they’re based on rhythm. They need to have that flow and that rhythm. So, your applications need to be responsive, no matter where they live.”
Jenkins said that WBD is working with vendors to virtualize effects applications that were not originally designed to work in the cloud. He said that while file sizes sometimes pose a challenge to artists working remotely, rendering visual effects in the cloud is certainly doable today.
“That’s the easy part,” Jenkins said. “But actually having your application not on-prem, not on hardware that’s right there in the room with the individual, is still a bit of a hurdle because the timing is not going to be there. And I do believe that the processing power is definitely there, because you can scale infinitely. [But] there is work to be done on the application side to make it a little more palatable for the artists.”
That said, Jenkins has seen a lot of progress in the past six months and hopes that WBD’s effects artists will be working with cloud-based tools by early 2024.
Bridging 2110 And The Cloud
Some of TAG Video Systems’ customers have already fully embraced cloud production, said Robert Erickson, TAG’s VP of live production and sports. One example is Apple TV’s coverage of Major League Soccer games for its “MLS Season Pass” subscription service, which is produced with technical support from NEP Group.
Erickson explained that Apple/MLS is an “entirely elastic infrastructure,” with most of the production functions happening in the AWS cloud and the rest being handled by NEP.
“If they’re doing one Major League Soccer game, they spin up just enough resources to do that,” Erickson said. “If they’re doing four or five, they spin up just enough resources to do that. But NEP is also handling some of the transport and some of the production also, and they have their own data center in Dallas.”
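Erickson's "spin up just enough" description is, in effect, resources allocated in proportion to the number of concurrent matches. A rough sketch of the idea (the resource types and per-game counts are hypothetical, not Apple/MLS or NEP figures):

```python
# Sketch of elastic, per-event resource allocation: spin up just enough
# infrastructure for the number of concurrent games, release it afterward.
# The resource types and counts per game are invented for illustration.

RESOURCES_PER_GAME = {"encoders": 4, "multiviewers": 1, "replay_servers": 2}

def resources_for(concurrent_games: int) -> dict:
    """Scale each resource type linearly with the number of simultaneous games."""
    return {name: count * concurrent_games
            for name, count in RESOURCES_PER_GAME.items()}

# One game allocates the baseline; five games allocate five times as much.
print(resources_for(1))
print(resources_for(5))
```

The contrast with a traditional facility is that nothing here is provisioned for the busiest night of the season; capacity exists only while the broadcasts are on the air.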
Other customers, like Fox with its FIFA World Cup and NFL coverage, have adopted a hybrid approach. They are primarily using ST 2110-based on-premise hardware with some support from cloud-based tools, such as in graphics creation. So, TAG has designed its monitoring, multiviewer and probing tools to work with all flavors of IP transport, from uncompressed 2110 to high-end mezzanine formats like JPEG-XS to streaming protocols like SRT, Zixi and NDI.
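Supporting "all flavors of IP transport" behind one monitoring layer amounts to routing each flow by its transport type. TAG's actual interfaces are not described in the article; the dispatch table below is purely illustrative, using the transport names the article mentions:

```python
# Illustrative routing table for a monitoring/probing layer that must ingest
# several IP transport flavors. The categories follow the article's grouping;
# the function and its API are invented for illustration, not TAG's product.

TRANSPORT_CATEGORIES = {
    "st 2110": "uncompressed",
    "jpeg-xs": "mezzanine",
    "srt": "stream",
    "zixi": "stream",
    "ndi": "stream",
}

def classify(transport: str) -> str:
    """Return the broad category a monitoring pipeline would handle this flow as."""
    try:
        return TRANSPORT_CATEGORIES[transport.lower()]
    except KeyError:
        raise ValueError(f"unsupported transport: {transport}")

print(classify("SRT"))      # a compressed streaming protocol
print(classify("ST 2110"))  # uncompressed, as in an on-prem IP facility
```

The point of such a layer is the bridging Erickson describes next: the same monitoring tools can face an on-prem ST 2110 plant and a cloud contribution feed without the operator caring which is which.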
Erickson said that by eliminating physical interfaces like SDI connectors, ST 2110 represented a “first jump” into a software-based production environment. The cloud represents the “next evolutionary step” in making production compute- and location-agnostic.
“We’re seeing technology bridge those gaps now because customers do have existing 2110 facilities, but they want to start to be able to use the workflow tools that the cloud offers,” he said. “Because being able to put compute anywhere, whether compute is local on a server or semi-local in a data center or whether compute is in AWS or [Microsoft] Azure or what not, that’s kind of the power and flexibility that we’re pushing towards.”