Archives - TV News Check
https://tvnewscheck.com/article/tag/archives/

Apple Explores AI Deals With News Publishers
https://tvnewscheck.com/ai/article/apple-explores-ai-deals-with-news-publishers/ (Tue, 26 Dec 2023)

The company has discussed multiyear deals worth at least $50 million to train its generative AI systems on publishers’ news articles.

The post Apple Explores AI Deals With News Publishers appeared first on TV News Check.

Metadata Is Key To Archive Monetization
https://tvnewscheck.com/journalism/article/metadata-is-key-to-archive-monetization/ (Thu, 21 Dec 2023)

Executives from Fox News, Sinclair and Hearst Television discussed efforts underway to organize and capitalize on their massive archives at last week’s NewsTECHForum, where efficient and potentially less expensive methodologies are beginning to emerge.

Broadcasters want to derive more value from their archives by enriching daily news production, creating original programming for multiplatform distribution and generating new revenues from third-party licensing. But to do so they need to be able to easily search through and access old content, no easy task for legacy broadcasters with decades of analog tapes, and even film canisters, sitting in storage.

Several groups have undertaken large-scale digitization efforts to tackle the problem, with some exploring new AI and ML (machine learning) tools to more efficiently tag and index video. Regardless of the method, generating accurate metadata is key to any archive efforts, both for old content and fresh material being created today, said broadcasters last week at TVNewsCheck’s NewsTECHForum in New York City.

Metadata’s Critical Role

“Before we can actually monetize the archives in a reasonable way, we have to have metadata on it,” said Mike Palmer, AVP, advanced technology/media management for Sinclair. “And in many cases, most cases, we have not been putting good metadata on it.”

Palmer, speaking on the panel “Harvesting the Archive for New Content and Opportunities” moderated by this reporter, said archive metadata must not only include enough information to find content using a media asset management (MAM) system. It also needs to have information about the rights attached to the content, since most call-letter stations have a mix of content they shot themselves, and fully own the rights to, and derivative content originally sourced from a network news service.

There isn’t any technical means today to tell whether a station owns a piece of content or not, Palmer said. That question can usually be answered only by calling and (hopefully) finding an employee who was there when it first aired.

“How long have we been talking about archives and metadata, but we’re not bringing back basic information about ownership, what camera it was shot on, the date, the geolocation, all this metadata that is in the cameras that we should be carrying forward,” Palmer said. “And we’re recreating the same problem that we’re trying to solve today with AI and ML because we’re simply not putting the right metadata on that content as it moves into the archive.”

Palmer said the culprit for lost camera metadata is often nonlinear editing systems that strip it out during the production process. To combat the problem going forward he sees a solution in the Coalition for Content Provenance and Authenticity (C2PA) standard, as promoted by the Content Authenticity Initiative (CAI). C2PA specifies provenance metadata that survives all the way from camera to distribution. C2PA not only addresses content ownership, but also content authenticity, an issue of growing importance in the age of AI-generated fake images.
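The real C2PA manifest format is defined by the C2PA specification and secured with cryptographic signatures, which is well beyond a short sketch. Purely as an illustration of the core idea, though, a hash-linked chain of claims shows why provenance metadata can survive, or verifiably break, across the production chain. All actor names and claim fields below are invented.

```python
import hashlib
import json

def claim_hash(claim: dict) -> str:
    """Stable digest of a claim's contents."""
    return hashlib.sha256(json.dumps(claim, sort_keys=True).encode()).hexdigest()

def append_claim(manifest: list, actor: str, action: str, **details) -> list:
    """Add a provenance claim that binds to the previous claim's digest."""
    prev = claim_hash(manifest[-1]) if manifest else None
    return manifest + [{"actor": actor, "action": action, "prev": prev, **details}]

def chain_intact(manifest: list) -> bool:
    """True if every claim still references its predecessor correctly."""
    return all(
        manifest[i]["prev"] == claim_hash(manifest[i - 1])
        for i in range(1, len(manifest))
    )

# A camera-to-distribution chain carrying forward the camera metadata
# Palmer describes (shot date, geolocation).
m = append_claim([], "camera-0042", "captured",
                 shot_date="2023-12-12", geolocation="40.76,-73.98")
m = append_claim(m, "edit-bay-3", "edited")
m = append_claim(m, "playout", "distributed")

print(chain_intact(m))       # the unbroken chain verifies: True
m[1]["actor"] = "tampered"   # altering any intermediate step...
print(chain_intact(m))       # ...breaks verification downstream: False
```

The point of the sketch is the failure mode Palmer describes: if an editing system strips or rewrites a step, the downstream link no longer matches and the break is detectable, rather than silently lost.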

‘A Wildly Human Process’

To improve accessibility of content for its journalists and producers, Hearst Television began digitizing the archives across its stations in 2021. To date it has digitized about 20%-25% of its archive material, representing roughly 45,000 hours of video.

“We parachute into a couple of stations at a time and help them digitize their archives in a systematic way,” said Devon Armijo, director, digital news integration for Hearst Television. “We bring in archival staff that handles not only the physical media but also the paper data that’s associated with it. Not only do we focus on digitization, but they also are not only tagging. They are looking at it in a discovery way, making sure they’re telling about the editorial opportunities, the promotional opportunities and sometimes the sales opportunities that are there in the archives, things that are sealed in the tapes that folks may or may not know that they have.”

While Hearst makes some use of automation, Armijo said that digitization remains “a wildly human process,” particularly when dealing with physical media that is beyond its end of life, such as 40-50 year-old tapes. That is where Hearst’s archivists serve as “the first line of defense.”

“They’re putting tapes through on a daily basis and making so many human decisions, up front at the beginning of digitization, that helps you with any sort of automation that rolls through afterwards,” Armijo said. “We had some automation processes throughout, like black [frame] detection. But that stuff is all secondary to the human decisions, the conversations, and understanding the history of not only the station but the content that’s there in your archive.”

Hearst licenses archive content to third parties, Armijo said, but the group itself remains “our first customer.” So far this year, Hearst has used its archive to produce over 370 pieces of digital original content along with a handful of linear specials and some local streaming content, including the popular true crime series Hometown Tragedy.

Fox is digitizing the archives across its station group as well as Fox News and Fox Business and bringing them into cloud storage. It has taken a different approach than Hearst by outsourcing the work, which encompasses tens of thousands of U-matic, one-inch and two-inch tapes, 16mm and 35mm film and various digital tape formats.

“We have tractor trailers come and pick up the entire library and it goes off to one of our five digitizing vendors, and then it works through their process,” said Ben Ramos, VP, Fox Archive, field and emerging tech, Fox News. “They have around 35 metadata enhancers who watch every frame of it, and kind of tag it as they’re going through it. It’s very manual, we haven’t gotten to too many AI/ML tools yet.”

Fox’s first goal was to preserve “at-risk” content like one-inch, two-inch and U-matic libraries, with the second objective being to generate ROI by licensing content to third-party documentary filmmakers. The initial effort was aimed at 5,000 U-matic tapes.

“What do we have in there, what’s the failure rate, and can we find ROI?” Ramos said. “We found ROI within six months, so that kind of supercharged the process, and then we got to do the rest of the 70,000 U-matic, two-inch and one-inch, and then we started dipping into the more expensive 16mm.”

Fox has experienced a failure rate of 3%-5% on that older content, and those impaired assets are now sitting on two pallets “awaiting further remediation,” Ramos said. That could involve baking them for several weeks to remove moisture, or even cracking tapes open to clean them and rehouse them.

Overall, it is a slow process, and so far, Fox has only digitized about 8% or 9% of its total physical media assets. One of the surprising findings is that newer formats like Beta, DV and DVCPRO tapes are also experiencing similar 3%-5% failure rates during the digitization process, and some of the older one-inch and U-matic tapes are actually playing better depending on how and where they were stored.

“Now everything feels a little bit at risk,” Ramos said.

Finding Answers With AI, ML

Sinclair was early in archiving some of its content in the public cloud, and last year struck a deal with producer Anthony Zuiker to mine its news archives to create original content that can be licensed to third parties. The group has around 23 million assets that were “born digital,” Palmer said, which means they have been archived from a newsroom computer system with a script attached. Those assets have accurate metadata, allowing users to search and access that content across the entire enterprise. Sinclair also has another roughly 10 million assets sitting on shelves on varied physical media.

“The question at this point is what do we want to invest in to bring this back?” Palmer said. “We look at news content, and it’s a fact that most news content has no value in the archive. It is the rare jewel that justifies the expense of all the rest of the work that you put into that. So, we’re focused right now in trying to determine, to the best of our knowledge, which portions of the archive have the highest probability for containing those jewels, and then go mining in that direction. And we may not — I say may, because there are no hard decisions at this point — but we may not want to go back to those 10 million assets and actually digitize them all. It depends on what we find.”

Sinclair has worked with archiving vendor Memnon to digitize cutsheets and tape labels on stored media at a few stations. It plans to use AI tools like optical character recognition (OCR) to analyze them and hopefully generate good descriptions that it can then use to determine what is worth digitizing.
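As a sketch of what the triage downstream of that OCR pass could look like: score each label's recovered text against terms the licensing side cares about, and digitize the high scorers first. Every label, term and weight below is invented for illustration; the article does not disclose Sinclair's actual criteria.

```python
# Hypothetical priority terms and weights for triaging OCR'd tape labels.
PRIORITY_TERMS = {"election": 5, "tornado": 4, "interview": 3, "parade": 1}

def triage_score(label_text: str) -> int:
    """Sum the weights of priority terms found in one OCR'd label."""
    words = set(label_text.lower().split())
    return sum(w for term, w in PRIORITY_TERMS.items() if term in words)

# Invented sample labels, as OCR might return them from cutsheets.
labels = {
    "TAPE-0114": "Mayoral election night interview raw",
    "TAPE-0287": "Weather tease parade b-roll",
}

# Digitize the likeliest "jewels" first; the long tail can wait.
ranked = sorted(labels, key=lambda t: triage_score(labels[t]), reverse=True)
print(ranked)  # → ['TAPE-0114', 'TAPE-0287']
```

A keyword pass this crude would mis-rank plenty of tapes, but it matches the stated goal: spend digitization dollars where the probability of valuable content is highest, not uniformly across 10 million assets.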

Fox Sports has spent several years on its own complex archive project with Google to create a system that allows producers to quickly call up old footage, such as to enhance a halftime package. Ramos said he has been given access to it and has been “playing with it for about six months.” The system uses two kinds of metadata: metadata created by human loggers, and metadata created by the same ML algorithms that form the basis of YouTube search. A user can search by either type.

“It’s definitely working,” Ramos said. “It’s a massive, massive archive, it’s huge. They’ve got a lot of content in there, so it would be really hard to search otherwise.”

Ramos’ own budget for AI/ML tools is more modest, so his team has focused on the least expensive AI tools, speech-to-text and OCR, and runs content through them itself.

“Usually when there’s an anchor or a reporter talking about something, it relates to the video that’s covering that,” Ramos said. “So that’s been a really good way for us to inexpensively find most of what we need. But it’s not 100% of the way there.”
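That speech-to-text shortcut can be sketched in a few lines: once transcripts carry timestamps, a keyword hit points at the footage covering that topic. The segments and the ten-second window below are invented sample values, not Fox's pipeline.

```python
# Time-stamped transcript segments, as a speech-to-text pass might emit them.
# All sample data is invented.
segments = [
    (0.0,  "good evening our top story tonight"),
    (12.5, "crews battled the warehouse fire for six hours"),
    (41.0, "in sports the home team clinched the division"),
]

def find_footage(query: str, segments, window: float = 10.0):
    """Return (start, end) windows whose transcript mentions the query."""
    q = query.lower()
    return [(start, start + window) for start, text in segments if q in text.lower()]

print(find_footage("fire", segments))  # → [(12.5, 22.5)]
```

As Ramos notes, this only works because anchor and reporter speech usually describes the covering video; silent b-roll or mislabeled packages are exactly the content such a search misses.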

Finding Affordability

French company Newsbridge wants to make indexing archive content and searching through it more affordable. The company has developed a cloud-based AI engine called MXT-1 that can quickly sift through archive video and generate human-like descriptions, and do it more affordably than conventional AI systems, said Newsbridge CEO Philippe Petitpont. Its indexing technology can also be applied to ingesting live content.

“With 1,000 hours of archive, there might be three hours that are hidden gems that have a lot of value,” Petitpont said. “So, you need to analyze 1,000 hours but there are maybe only three or four that are relevant. The problem is that current AI, monomodal indexing technology is very expensive. You don’t want to spend $10 million to index something that might be valuable for just two or three hours. So, we took this problem and have been working on it for a few years. We need AI with video understanding that is able to be very efficient, so that it can meet business realities in terms of pricing.”

Petitpont said a key differentiator for Newsbridge’s AI is that it is multimodal, meaning it doesn’t just analyze speech or recognize text but considers multiple types of data within video as a human would. And instead of analyzing each individual frame of video, MXT-1 employs “smart subsampling” and only looks at a few key relevant frames. This cuts down on the use of expensive graphics processing units (GPUs) on public cloud compute and avoids wasting money by “overindexing” content.

“We only process a frame that will really best illustrate the content,” Petitpont said. “So then we’ve reduced by an order of magnitude a lot of traditional sampling.”
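How MXT-1 actually selects frames is proprietary, but the general shape of smart subsampling can be sketched: only hand a frame to the expensive indexing model when it differs enough from the last frame analyzed. The frames below are stand-in brightness values and the threshold is arbitrary.

```python
def select_key_frames(frames, threshold=10):
    """Pick indices of frames that changed enough to be worth indexing."""
    if not frames:
        return []
    selected = [0]          # always index the first frame
    last = frames[0]
    for i, f in enumerate(frames[1:], start=1):
        if abs(f - last) >= threshold:
            selected.append(i)
            last = f        # compare future frames against this new anchor
    return selected

# A static anchor shot (~100) followed by a cut to b-roll (~200), then a
# dissolve back down: only the shot changes get indexed.
frames = [100, 101, 100, 102, 200, 201, 199, 120]
print(select_key_frames(frames))  # → [0, 4, 7]
```

Here 3 of 8 frames go to the model, which is the whole economic argument: GPU cost scales with frames analyzed, and a static talking-head shot yields almost nothing new frame-over-frame.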

Sinclair is not currently a customer of Newsbridge, but Palmer said when he spoke with them he was impressed by their smart subsampling approach. The company obviously had arrived earlier at the same conclusion that his team at Sinclair had reached.

“That was, that you don’t need to look at every frame of video,” Palmer said. “You don’t need to do some of these massive tagging things for every frame of video. Some of these AI models will create pages and pages of metadata for each frame of video, and that is not appropriate for news. Less in some cases, and probably this case, is better.”


Read more coverage of NewsTECHForum 2023 here. Watch this session and all the NewsTECHForum 2023 videos here.

Getty Images And BBC Studios Partner On Platform Enabling ‘Unprecedented’ Access To BBC Archive Video
https://tvnewscheck.com/tech/article/getty-images-and-bbc-studios-partner-on-platform-enabling-unprecedented-access-to-bbc-archive-video/ (Wed, 06 Dec 2023)

Getty Images, a global visual content creator and marketplace, in partnership with BBC Studios, the commercial subsidiary of the BBC, announces the launch of a new platform giving its customers unprecedented access to BBC archive video and a new seamless search, purchase and download experience.

Designed to let end users intuitively select and order content, the platform, powered by MAM software specialist VIDA’s Content OS, allows easy access to over 57,000 programs from the BBC archive that were previously available only offline through a heavily manual process. Customers can now securely search the entire digitized library, then view, annotate, clip, share and download previews for use within projects. Post clearance, the high-resolution masters are available immediately, providing significant benefits for fast-turnaround projects.

Curated collections around upcoming events and commissioning trends support ideation, while the latest Speech to Text tools enable programme makers to ‘transcript search’ all assets, bringing to light previously hidden content from BBC Motion Gallery’s vast archive.

Paul Davis, Getty Images vice president of media and production EMEA, said: “The VIDA platform is a significant breakthrough in making BBC content more accessible for our customers around the world. Our team is increasingly working as a creative partner, able to assist from ideas stage through production and this platform will considerably enable our efforts in supporting producers to derive successful, returnable shows for the global market.”

Chris Hulse, head of BBC Motion Gallery at BBC Studios added: “Our partnership with Getty Images is focused on getting BBC content into the hands of program makers around the world. This new platform is a gamechanger for us, surfacing a wealth of previously offline BBC content, now accessible on-demand for the first time ever.” 

Symon Roue, managing director at VIDA, commented: “We’re delighted to power the integration of BBC Motion Gallery and Getty Images. As the archive footage licensing industry grows year over year, customers expect an integrated and efficient supply chain for servicing of catalogue. Accessibility to the clips that matter, using the latest cloud and AI technology, is where Content OS excels.”

The new platform expands the on-demand offering of BBC content for Getty Images customers who can already access more than 200,000 BBC editorial and creative clips on gettyimages.com.

Getty Images offers an expansive video library of over 25 million video clips, including footage from a range of world-renowned broadcast, studio, film-maker and archive partners. Available to customers are contemporary, archive and creative footage, including more than 12 million clips in 4K and vast offline archives from BBC Motion Gallery and NBC News.

NewsTECHForum: Harvesting Archives For New Content And Opportunities
https://tvnewscheck.com/journalism/article/newstechforum-harvesting-archives-for-new-content-and-opportunities/ (Mon, 20 Nov 2023)

Leading executives from Sinclair, Fox News, Hearst Television and Newsbridge will share the latest technologies and methodologies they’re employing to harness the full content potential of their vast archives for new shows and revenue streams in a panel at TVNewsCheck’s NewsTECHForum conference at the New York Hilton on Dec. 12. Register here.

Media companies are beginning to use technology to tag, index and search for video to enhance storytelling, create new shows and, eventually, find new revenue by licensing content. “Harvesting the Archive for New Content and Opportunities,” a panel at TVNewsCheck’s NewsTECHForum conference on Dec. 12 at the New York Hilton, will look at how the newest advances in AI are making these tasks significantly easier, among other archive-harnessing developments.

Speakers are Devon Armijo, director of digital news integration, Hearst Television; Mike Palmer, AVP advanced technology/media management, Sinclair; Philippe Petitpont, CEO, Newsbridge; Ben Ramos, VP, Fox archive, field and emerging tech, Fox News. TVNewsCheck Contributing Editor Glen Dickson will moderate the discussion.

“AI has fundamentally changed the process of metatagging archives and making them more thoroughly searchable, which in turn has offered pathways to new content creation and licensing opportunities drawing from those archives,” said Michael Depp, chief content officer, NewsCheckMedia and editor, TVNewsCheck. “This session will look closely at how a well-organized, easy-to-retrieve-from archive can have numerous benefits for newsrooms under pressure.

“The panel will also look at the emerging challenge of how media companies can authenticate their deep trove of archival content and determine rights ownership,” he added.

NewsTECHForum, now in its 10th year, is co-located with the Sports Video Group Summit. The conference’s theme for 2023 is Adapting to a Culture of Continuous Crisis.

Featured sessions are:

  • Keynote: Democracy, Technology, TV Journalism and the 2024 Election
  • Reassessing the Streaming News Content Strategy
  • Chasing AI: Threatening or Enhancing the News?
  • Adapting to a Culture of Continuous Crisis
  • Agility in News Production
  • Building the Architecture of More Collaborative Content Creation

Register here.

TV Archives Are Vital History, And Too Often Discarded
https://tvnewscheck.com/journalism/article/tv-archives-are-vital-history-and-too-often-discarded/ (Tue, 14 Nov 2023)

The Library of American Broadcasting Foundation is doing critical work to help media find and preserve the treasure in their vast archives.


Mary Collins

About 15 years ago, Antioch College’s student radio station, WYSO-FM, was preparing to move to a different building on the school’s Yellow Springs, Ohio, campus. Staff was responsible for clearing out everything that had accumulated since the station’s founding in 1958. In a storeroom they found recordings dating back to the station’s earliest days, all of which had been put away with the same care college students use when packing at the end of the year.

You can imagine — reel-to-reel tapes, cassettes and other audio storage media all stuffed into boxes and garbage bags. Many of the items were so covered in mold that labels were unreadable. The magnetic tapes themselves were in various states of decomposition. Fortunately, someone took the time to do some investigation instead of relegating the whole mess to a dumpster. What they found was an audio record of the prior 50 years composed of concerts, interviews, lectures and voices ranging from those of Susan Sontag and Cesar Chavez to Dick Gregory and Martin Luther King, Jr.

The items have since been conserved, with “several hundred hours of the most significant tapes … made accessible to the public.” The digitized material was also used to produce a series of feature stories about both the Civil Rights Movement and the Vietnam War.

For better or worse, the same story is playing out across the country, often with very different results. Consider the pioneering local journalist or station manager, since retired, who dies without making time to catalog 50 years’ worth of notes, scripts, tapes, awards and other ephemera. Grieving relatives don’t have the energy, or knowledge, to sort what’s left behind and no one wants to bring even more stuff into their own home. It’s a fair bet that the best part of the collection will end up in the garbage.

Another case is that of the station changing format or ownership. An out-with-the-old-and-in-with-the-new philosophy typically results in the loss of valuable material.

In general, local media outlets aren’t in the business of preserving the content they create. Media moments are fleeting; there is no secondary market for newscasts. Generally, we don’t appreciate that we are experiencing history in the making until much, much later. The objective is to complete the story and get it on the air (or in print) and move on to the next piece. I cannot imagine that anyone covering the initial story of the break-in at the Watergate Hotel, early rumors about Harvey Weinstein’s treatment of female actors or even the original story of George Floyd’s death at the hands of Minneapolis police realized that they were seeing the beginning of something that would shape the future.

Why Save Media And Related Materials?

In the case of television and radio broadcasts, actually seeing and/or hearing the content tells a person so much more than they could ever get from reading a transcript. Specifics such as how the words were spoken, how the speaker was dressed, details about the location and even where the speaker was looking are rarely included in the written account.

The compilation of CBS News’ coverage of the JFK assassination provides an excellent example of this. Walter Cronkite’s words are simple. It’s the image of him in his shirtsleeves, at his cluttered desk in the CBS newsroom that really drives home the point of how unprecedented the situation was. The desktop, littered with papers, also includes three telephones, a typewriter and a stand microphone. Cronkite looks right into the camera, first telling viewers that Kennedy and Texas Gov. John Connally have been taken to a Dallas hospital with nothing else known about their condition. About 20 minutes later, Cronkite is reporting that the president has been given last rites. Soon afterwards, Cronkite tells viewers that the president has died.

In the second and third reports, Cronkite is seen removing his glasses even as he begins addressing his audience. When saying that the president has been declared dead at 1 p.m. Central, Cronkite is visibly shaken. He can be seen looking up and to his left, twice, at what must be the newsroom’s clock so he can calculate that the declaration occurred 38 minutes prior to his report. No transcript would be able to provide the audio and visual details that convey the unprecedented gravity of the reporting.

Consider that today’s media include female, non-white and some LGBTQIA+ reporters, anchors, disk jockeys, hosts and other presenters. That wasn’t always the case. The stories of the media pioneers who paved the way for the current generation are an important part of our cultural legacy. Unfortunately, most of their stories are missing.

While national networks like CBS can, hopefully, be counted upon to preserve their history, there are any number of other collections that are at risk or have already been lost. Archivists such as those at the Library of American Broadcasting (LAB) at the University of Maryland can help people with access to potentially interesting materials determine if what they have identified is worth preserving. In fact, one of the LAB’s online resources is a comprehensive guide (“libguide”) that outlines what to save, explains how to handle the items and lists a number of places that might welcome such a donation.

If you are interested in learning more about this topic, I encourage you to take a look at the video called Preserving Broadcast History, which NAB produced as part of this year’s 100th anniversary celebration. It features: Jack Goodman, co-chair, LABF; April Carty-Sipp, EVP, industry affairs, NAB; Laura Schnitker, Ph.D., C.A., curator, mass media and culture, University of Maryland’s Special Collections and University Archives; and Mike Henry, reference specialist, University of Maryland’s special collections and university archives.

I have the privilege of serving as the volunteer treasurer for the Library of American Broadcasting Foundation (LABF), which provides major funding for the Library of American Broadcasting. LABF raises funds to support the library through its annual Giants of Broadcasting and the Electronic Arts Award Luncheon along with individual donations to the foundation. Next year’s Luncheon will be held on Nov. 12, 2024, at Gotham Hall in New York City. In the meantime, I encourage you to consider making a year-end donation via the Foundation’s website. Broadcasting history won’t wait.

Former president and CEO of the Media Financial Management Association and its BCCA subsidiary, Mary M. Collins is a change agent, entrepreneur and senior management executive. She can be reached at MaryMCollins1@comcast.net.

Inch By Inch, Broadcasters’ Cloud Production Expands
https://tvnewscheck.com/tech/article/inch-by-inch-broadcasters-cloud-production-expands/ (Thu, 29 Jun 2023)

Executives from Warner Bros. Discovery, Sinclair, Tegna, Ross Video and TAG shared the latest in how they’re using the cloud to gain new flexibility and efficiency, from unlocking AI’s progress to “decomposing” news workflows and building “digital backlots,” in a TVNewsCheck webinar last week.

Broadcasters have already proven that they can use cloud technology, whether public or private, to replace dedicated on-premise hardware for playout and archiving applications. Now many are looking to expand their use of the cloud into production for everything from ad-hoc events aimed at streaming distribution to mainstream live sports and news coverage for their linear channels.

While most are still years away from a broad adoption of the cloud for their production needs, stations, networks and studios are experimenting today with how they can use the cloud to gain new flexibility and efficiency. Challenges remain in software interoperability and latency, but significant progress has been made in the past year, said top engineers who gathered last week for the TVNewsCheck webinar “Live Production in the Cloud,” moderated by this reporter.

Tegna Awaits AI’s Promise

“The cloud, as we think of it, is ideal for variable workloads, where you can really leverage a consumption-based model and scale up and scale down as you need it,” said Kurt Rao, SVP and CTO, Tegna.

Rao considers Tegna’s content in three main “buckets”: news, weather and local sports. He said the cloud is currently being leveraged for production at some level across all of them. For example, news cameras are feeding content from the field through the cloud and then back into the studio, where it is packaged using tools like Adobe’s cloud-based editing software for linear or digital distribution.

“We’ve been very purposely looking at what the capabilities are in each of the key workflow steps, and how you can leverage the cloud in elements of that process,” Rao said. “That’s where we are. I would say we’re still very early in the days of adopting end-to-end, but there are certainly a lot of purpose-built tools and processes that can leverage the cloud.”

Rao said that complete camera-to-cloud workflows, which were demonstrated by vendors including Adobe at this year’s NAB Show, are “ready to test” but not necessarily ready for production. He added that he was “very excited” by the focus at NAB show on artificial intelligence (AI), which is heavily dependent on cloud compute.

Rao sees great potential in the possibility of an AI chip in a camera that could work with the cloud to automatically create multiple versions of video as it is being shot and output them from the camera.

“In theory, the promise is you could get a two-minute clip that goes out to linear, but it could summarize that for a 10-second one for Instagram,” Rao said.
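Server-side, the same idea could be sketched as deriving per-platform renditions from one master clip. The presets, durations and output names below are illustrative stand-ins, not Tegna's actual specs or any shipping in-camera capability; the ffmpeg commands are built as strings, not executed.

```python
# Hypothetical per-platform presets; all values are illustrative only.
PRESETS = {
    "linear":    {"duration": 120, "scale": "1920:1080"},
    "instagram": {"duration": 10,  "scale": "1080:1080"},
}

def render_commands(master: str) -> list:
    """Build one ffmpeg invocation per distribution target."""
    cmds = []
    for name, p in PRESETS.items():
        cmds.append(
            f"ffmpeg -i {master} -t {p['duration']} "
            f"-vf scale={p['scale']} {name}_{master}"
        )
    return cmds

for cmd in render_commands("clip.mp4"):
    print(cmd)
```

The interesting part of Rao's vision is not the transcode itself (trivial today) but the summarization step: an AI model choosing *which* ten seconds of the two-minute clip carry the story, which is where cloud compute comes in.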

“Decomposing” News Workflows At Sinclair

Sinclair was an early adopter of the cloud for playout and disaster recovery functions, for both its stations and its national networks. Now the company is exploring how it can use the cloud for news and sports production, including testing production control room (PCR) software from Ross Video and Grass Valley at several stations and also using private cloud technology for some Tennis Channel productions.

While the benefits of moving functions like playout or DR to cloud were very clear, the analysis for news production is more complex, said Mike Kralec, SVP and CTO, Sinclair. The company is attempting to “decompose the ecosystem” before migrating to the cloud by taking a hard look at individual components of the news production workflow like field acquisition, archiving and editing.

“Eventually, we’re going to get to this holistic ecosystem of composable parts where we bring all these different cloud workflows together,” Kralec said. “But we’re not at a point where we have the visibility into seeing everything in the cloud all at once.”

Streamlining field acquisition processes with the cloud to get content to digital platforms is interesting, Kralec said, as is the possibility of centralizing Sinclair’s news archives in the cloud to make repurposing content easier. But as they look for viable business models for live news production in the cloud, Sinclair and other broadcasters are in some ways “victims of our efficiency,” Kralec said.

After rolling out production automation systems like Ross Video’s OverDrive, Sinclair has already made its control room operations very efficient, Kralec said. He isn’t sure how much more benefit can be gained by moving those workflows to the cloud, though Sinclair is currently evaluating whether a single cloud-based PCR could support news production across multiple stations, at least for disaster recovery.

“It’s dials and knobs,” Kralec said. “Can we turn the knob on shared resourcing, in terms of, can you deploy 40 production control rooms in a shared service model rather than 75 control rooms? But as you turn that knob down, you turn the other one on network resourcing and encoding and transport and everything else that goes into that. You turn that one up. So, there are tradeoffs.”
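Kralec's tradeoff can be made concrete with a toy cost model: consolidating 75 on-prem control rooms into 40 shared cloud PCRs turns one knob down (room count) while turning another up (network, encoding and transport per station). Every dollar figure below is an illustrative placeholder, not Sinclair's numbers.

```python
# Toy model of the "dials and knobs" tradeoff between dedicated on-prem
# control rooms and a smaller pool of shared cloud PCRs.

def annual_cost(rooms, cost_per_room, stations, transport_per_station):
    # Total spend = control-room cost plus per-station transport/encoding.
    return rooms * cost_per_room + stations * transport_per_station

on_prem = annual_cost(rooms=75, cost_per_room=500_000,
                      stations=75, transport_per_station=0)
shared = annual_cost(rooms=40, cost_per_room=450_000,
                     stations=75, transport_per_station=120_000)

print(f"on-prem: ${on_prem:,}  shared cloud: ${shared:,}")
```

Whether the shared model wins depends entirely on how far the transport knob turns up, which is exactly the evaluation Kralec describes.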

Transcending Geography

Peter Abecassis

Ross Video is definitely seeing interest from broadcast customers in slowly moving their productions into the cloud, said Peter Abecassis, that company’s director of product management for production workflow solutions. The most common use case is disaster recovery, with a production control room in the cloud that can be used by any one of a group of stations, at any time, on an emergency basis.

Another application is using the cloud to expand into a new market without the upfront capital investment of building a new facility, whether it be a production control room or an entire station. Instead, they can use the cloud to quickly “spin something up” and experiment with a new product.

“What Ross is doing is we’re moving our production control room and production workflow products into the cloud, which is enabling our customers to use the interfaces and the products that they’ve used for many, many years, but have it in a more centralized location that’s more accessible to people regardless of geography,” Abecassis said. “It helps them reduce the amount of spend that they have in terms of hardware, maintenance and the cost of running the solution. And it also allows them to spin it up and spin it down depending on, do we need it for an emergency, do we need it as overflow for an election, do we need it for a special event?”

That includes migrating its OverDrive automation system into the cloud, which Ross is in the process of doing with one customer. Implementing OverDrive in the cloud will allow a single set of operators to run the same show for multiple time zones, Abecassis said, and “maintain a consistent quality to the show regardless of where you are.”

While a lot of news production can be done in the cloud, Abecassis emphasized that physical equipment, such as robotic cameras in the studio that connect to cloud software, remains vitally important.

“We need to be conscious of the fact that not everything’s going to run in the cloud,” Abecassis said. “There’s going to be a time and a place for things to be in the cloud, and a time and place for things to run on-premise. The trick is, the goal, is to have everything be seamless, so people don’t really know or care where the software is running. It’s just the interface they’re familiar with.”

Mining The ‘Digital Backlot’ At Warner Bros. Discovery

Renard Jenkins

Renard Jenkins, SVP, production integration and creative technology services, Warner Bros. Discovery, is exploring how the cloud can aid production in three different verticals for Warner Bros: live-action films, animated films and episodic television. Those businesses are working with far more production lead time than live news at a television station, noted Jenkins, which in some ways makes the cloud easier to adopt. He said WBD’s overall goal is to make things more efficient without disrupting the workflows to which creative personnel are accustomed.

“While making it more efficient, we also want to make it transparent to the creatives,” Jenkins said. “Tech can sometimes get in the way and be a hindrance, and our job is really to make sure that it doesn’t.”

A current project Jenkins is tackling is to create a “digital backlot,” a repository of digital assets, including CGI backgrounds for virtual production, that allows them to be easily accessed and repurposed across the entire company. Jenkins says all of the major studios are pursuing similar initiatives as they seek to extend the lifespan of their assets.

Another goal for Jenkins and his team is to see if they can make visual effects (VFX) a cloud-based process. He said there is still software development work needed to make existing effects tools, most of which were designed to work with on-premise hardware, run well on public cloud compute. He compares the challenge to the issues faced by cloud-based editing when it was introduced a few years ago.

“One of the big complaints was that the software was not ready for editing in the cloud,” Jenkins said. “You could render in the cloud, and you could process in the cloud, but the actual editing function was clunky and wasn’t as smooth. And much like editing, visual effects and color artists, they’re based on rhythm. They need to have that flow and that rhythm. So, your applications need to be responsive, no matter where they live.”

Jenkins said that WBD is working with vendors to virtualize effects applications that were not originally designed to work in the cloud. He said that while file sizes sometimes pose a challenge to artists working remotely, rendering visual effects in the cloud is certainly doable today.

“That’s the easy part,” Jenkins said. “But actually having your application not on-prem, not on hardware that’s right there in the room with the individual, is still a bit of a hurdle because the timing is not going to be there. And I do believe that the processing power is definitely there, because you can scale infinitely. [But] there is work to be done on the application side to make it a little more palatable for the artists.”

That said, Jenkins has seen a lot of progress in the past six months and hopes that WBD’s effects artists will be working with cloud-based tools by early 2024.

Bridging 2110 And The Cloud

Robert Erickson

Some of TAG Video Systems’ customers have already fully embraced cloud production, said Robert Erickson, TAG’s VP of live production and sports. One example is Apple TV’s production of Major League Soccer games for its “MLS Season Pass” subscription service, which is produced with technical support from NEP Group.

Erickson explained that Apple/MLS is an “entirely elastic infrastructure,” with most of the production functions happening in the AWS cloud and the rest being handled by NEP.

“If they’re doing one Major League Soccer game, they spin up just enough resources to do that,” Erickson said. “If they’re doing four or five, they spin up just enough resources to do that. But NEP is also handling some of the transport and some of the production also, and they have their own data center in Dallas.”

Other customers, like Fox with its FIFA World Cup and NFL coverage, have adopted a hybrid approach. They are primarily using ST 2110-based on-premise hardware with some support from cloud-based tools, such as in graphics creation. So, TAG has designed its monitoring, multiviewer and probing tools to work with all flavors of IP transport, from uncompressed 2110 to high-end mezzanine formats like JPEG-XS to streaming protocols like SRT, Zixi and NDI.

Erickson said that by eliminating physical interfaces like SDI connectors, ST 2110 represented a “first jump” into a software-based production environment. The cloud represents the “next evolutionary step” in making production compute- and location-agnostic.

“We’re seeing technology bridge those gaps now because customers do have existing 2110 facilities, but they want to start to be able to use the workflow tools that the cloud offers,” he said. “Because being able to put compute anywhere, whether compute is local on a server or semi-local in a data center or whether compute is in AWS or [Microsoft] Azure or what not, that’s kind of the power and flexibility that we’re pushing towards.”

The post Inch By Inch, Broadcasters’ Cloud Production Expands appeared first on TV News Check.

Talking TV: AP Resets The Game On AI-Based Archive Search
https://tvnewscheck.com/journalism/article/talking-tv-ap-resets-the-game-on-ai-based-archive-search/
Fri, 09 Jun 2023

Paul Caluori, AP’s VP of global products, and Derl McCrudden, AP’s VP and head of global news production, discuss the organization’s new AI-based archive search tool that circumvents the need for metatags and the shadow that generative AI casts over the industry at large. A full transcript of the conversation is included.

What do broadcasters do when they know what they’re looking for in the archives but may not have the right metatags at hand to find it?

The Associated Press has just released a new AI-powered search tool for its AP newsroom platform that may mitigate the problem. The tool uses descriptive language to connect searchers with their targets, bypassing metatags altogether, and it may end up being a game-changer in the fast-developing effort to make media asset management systems and archives more searchable.

In this week’s Talking TV conversation, Paul Caluori, AP’s VP of global products, and Derl McCrudden, AP’s VP and head of global news production, share what’s underpinning the new AI tool and its wider implications for the industry. They also look at developments in generative AI applications like ChatGPT and how they may problematize content authenticity on the one hand, but could mature into a valuable tool on the other.

Episode transcript below, edited for clarity.

Michael Depp: The Associated Press recently announced the launch of an AI-powered search tool on its AP newsroom platform for multimedia content. Rather than just using a conventional metadata search, the new tool understands descriptive language and offers up search results based on the description a user provides. Just think of the implications for all those broadcasters who spent untold hours with knotted brows searching through inadequately tagged archives.

I’m Michael Depp, editor of TVNewsCheck, and this is Talking TV. Today, a conversation with Paul Caluori, AP’s VP of global products, and Derl McCrudden, AP’s VP and head of global news production. We’ll talk about how AI is enabling a more accommodating kind of searchability across a massive archive and the wider implications for AI’s usage in broadcast media asset management systems. We’ll also talk about the widening use of AI at the AP, which was a vanguard adopter of the technology, and the ethical and operative guidelines it’s adopting around AI’s usage. We’ll be right back with that conversation.

Welcome Paul Caluori and Derl McCrudden, to Talking TV.

Paul, this new tool allows users to search through AP’s vast photo and video library without needing very specific metatags to do it. How were you able to build that?

Paul Caluori: We actually are working with another company called Merlin One, which is based in Boston, and they specialize in AI applications for visual assets. So, what we’ve done is we’ve adopted their engine after working with them over the last year to prepare for this.

And what it does is look at the description that a user puts in and is able to understand concepts in a way that keywords and regular tags can’t; those are just very, very blunt instruments. It is also able to translate those concepts into elements within a visual context, so it can look for a moment within a video or a component within a photo, and it makes for a much more specific kind of search than we typically get.

What’s really great about it is that it can find things that we don’t have tags for. And if you think about our whole archive: you know, our photos go back to the 1840s. The AP was founded in 1846. We have photos that go all the way back to the beginnings of photography. Nobody was thinking about metadata back then. And so, a lot of these things are not very well tagged and are difficult to find. This is a way to unlock all of that, and we’re very excited about it.
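The internals of the AP/Merlin One engine aren't described here, but tag-free descriptive search is commonly built on joint text/image embeddings: every archive asset is embedded once, and a free-text query is embedded into the same vector space and ranked by cosine similarity. A minimal sketch, where `index` holds toy 3-d vectors standing in for real multimodal model output:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query_vec, index, top_k=3):
    """index: {asset_id: embedding}; returns best-matching asset ids."""
    scored = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [asset for asset, _ in scored[:top_k]]

# Toy embeddings; a real system would use a vision-language model and
# an approximate-nearest-neighbor index over millions of assets.
index = {"churchill_garden.mp4": [0.9, 0.1, 0.0],
         "soccer_match.mp4":     [0.0, 0.9, 0.2],
         "taxi_1977.mp4":        [0.1, 0.2, 0.9]}
print(search([0.8, 0.2, 0.1], index, top_k=1))  # → ['churchill_garden.mp4']
```

This is why untagged 1840s-era material becomes findable: the match happens in embedding space, not against metadata that was never written.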

Well, this is fascinating how this searchability works. So, how accommodating can it be exactly for the user who has only a kind of very abstract or imprecise idea of what exactly they’re looking for?

Paul Caluori: That’s such a great question because it gets right to the heart of one of the things that this is going to change, which is the way people approach search. Right, so typically people approach search with that sort of broad scope of what they’re looking for, because that’s how search works. I’m going to put in one or two keywords, then I’m going to start sifting through a bunch of results to see if anything sort of grabs my fancy.

If you want to search that way with this, you can say, I’m looking for, you know, soccer games with a blue sky, or… excuse me, Derl might call it a football game… I’m looking for a soccer game with people who are wearing yellow uniforms, right? Because I’m an art director and I’m looking for a particular look. So, you can look for abstract ideas like that. If you were looking for something and weren’t quite sure what you wanted, it will return a whole lot of results. And then you start sifting through them the same way.

And that seems to be the problem. I suppose the danger here is that the user is then flooded with search results. So, how can you avoid that or winnow that down in a more user-friendly way?

Paul Caluori: I think that while we’re always flooded with search results, anything you search for, you get millions and millions of results, and you get past the first 10 and it gets pretty far away from what you were looking for. So, I think the way around it is for users to start thinking a little bit more specifically about what they want.

You know, it’s hard to find something if you don’t quite know what you’re looking for. So, you know, if you do want something very specific, you can enter that into this search engine, and it gives rather strikingly precise results. I spent some time playing around with it, so I started looking around, show me pictures of Winston Churchill or video of Winston Churchill in a garden. And then somebody said, Well, let’s see if we can find him feeding birds. So, all right. I literally typed in Winston Churchill in a garden feeding birds. That’s pretty specific. And it came up with like five videos, and it went right to the moment where that exact thing was happening. I didn’t know that that was part of our archive. And I guarantee you we don’t have anything tagged like that. But I mean, knowing exactly what you’re looking for, you can find out pretty quickly whether we’ve got it.

So, the more language you throw into this search field, the more it is going to winnow it down.

Paul Caluori: That’s right. That’s right. So, instead of doing it after your search, you do it ahead of your search and find the things you’re looking for. We have research teams here both in the U.K. and in the U.S. working with our different markets. And, you know, our great hope is that this makes them more able to serve their customers effectively.

Broadcasters are going to be quite interested in the underlying technology here because so many of them are wrestling with the archives that are woefully undertagged, as you know. For those who started to try to impose some order on all of that chaos, they’ve been going about it by having AI wending through and adding tags. And it seems like this isn’t the same model now with this new tool, which would just appear to kind of circumvent the tagging process altogether. Do I have that right?

Paul Caluori: I think that’s right, yeah. In fact, I was just talking with one of the people who worked on the engine before this conversation, and I asked whether they anticipate that their service would add in tags. The answer was no. Actually, I really think that the AP would be better served by using our own tags, along with a third-party search engine or any sort of AI engine, to find content, because we have a specific set of tags that we use to identify things.

And, you know, my colleagues who work with Derl in our news department have specific ways of tagging things. And I’ll bet you that most organizations have their own ways of tagging things. And to the extent that any one organization can be consistent about how they tag things, that’s miraculous in and of itself.

Getting different organizations to be consistent in the way they tag across multiple organizations, though, I think that’s a much higher peak to climb. I think philosophically it makes sense not to try and impose that. That said, I think that some level of metadata tagging and AI search combined will give us the best results.

Because for breaking news, for example, an AI agent that knows how to find particular things, you know, concepts within a visual, is very good, but it won’t know that at 3:02 p.m. yesterday a dam broke, or that it was in this particular product. It’s not going to know that kind of specific information, and that’s where metadata is really, really valuable.
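Caluori's point, that metadata carries the hard facts (what broke at 3:02 p.m. yesterday) while AI similarity carries the visual concepts, suggests a hybrid search: filter on exact tags first, then rank the survivors by semantic score. The field names, asset ids and the constant `score` stub below are all illustrative.

```python
# Hybrid search sketch: exact metadata filter, then semantic ranking.

def hybrid_search(assets, required_tags, score, top_k=5):
    """assets: list of dicts with a 'tags' set; score: semantic scorer."""
    # Metadata does the precise part: only assets carrying every
    # required tag survive the filter.
    candidates = [a for a in assets if required_tags <= a["tags"]]
    # The AI engine does the fuzzy part: rank survivors by similarity.
    return sorted(candidates, key=score, reverse=True)[:top_k]

assets = [
    {"id": "dam-break-0302", "tags": {"dam", "breaking", "2023-06-08"}},
    {"id": "dam-stock-1998", "tags": {"dam", "archive"}},
]
hits = hybrid_search(assets, required_tags={"dam", "breaking"},
                     score=lambda a: 1.0)  # stand-in for AI similarity
print([a["id"] for a in hits])  # → ['dam-break-0302']
```

In practice `score` would be the cosine-similarity ranking of an embedding engine; the structure is what matters: tags narrow, embeddings rank.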

Yeah, on the idea that every company might tag in exactly the same way: if wishes were horses, then beggars would ride. Is this tool public-facing or is it only available to AP members?

Paul Caluori: It’s public facing. It’s both. We have an e-commerce function at AP newsroom that walk-up users can use to find visuals. It’s not our entire archive that’s available through that. The experience that our subscribers have will yield a look at a much larger content set. And also, you know, our subscribers have the benefit of working with our staff — those researchers I talked about and other experts who worked on this. So, it’s there for the walk-up public. It’s better for our subscribers.

Got it. Derl, let me bring you into this conversation. As I mentioned at the top, AP has been at the vanguard of AI usage, and I’m thinking about years back when it started using AI to generate earnings reports for many smaller publicly traded companies and minor league baseball scores. What’s been the widening use of AI at AP since then?

Derl McCrudden: So fundamentally, the use cases we’re looking at are really about the same thing. It’s about making more impactful journalism. The earnings reports you’re talking about, dating back to 2014, were about this: instead of taking a small group of financial journalists and getting them to do as many earnings reports as they could, which had a finite number, we were able to automate and template a lot of that work. Then we could make a lot more of those reports and free up the time of those journalists to do higher-value work.

That’s the common thread of everything that we’re looking at. So, the examples I’d give: a few years ago, we started working with a transcription company that takes audio, usually stripped from video, and converts it to text. Think of any news producer who has ever had to spend time looking at a 30-minute interview and rushes from three camera outputs, trying to find the sound bite that they know they had in the moment of the interview, but whose own transcription in their notebook is not as good as it should be.

This has been a lifesaver for us. The company we use happens to be Trint; there are others out there. We create about 27,000 hours of live video — we’re the biggest wholesaler of live video in the world — and we default to transcribing all of that content. Instead of putting the pressure on a producer or an editor, that allows them to focus on the journalism and let the tool do the heavy lifting, so that we can find the sound bites we need, or discover the bits we need, and do it at speed.

I’ll just add one more thing. It also allows us to work in a different way, and that involves a mind shift. For instance, our media asset management system now integrates with the cloud. Instead of having to shuttle up and down the timeline, we’re able to go to the printed transcription, highlight what we want, and then it drops down into the edit. It’s a different routine, if you like, from what we’d gotten used to.
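The highlight-to-edit workflow McCrudden describes depends on the transcription service returning word-level timestamps, so a highlighted phrase can be mapped back to in/out timecodes for the edit. A minimal sketch under that assumption (the data format below is illustrative, not Trint's actual API):

```python
# Map a highlighted transcript phrase back to source timecodes.

def find_soundbite(words, phrase):
    """words: list of (word, start_s, end_s); returns (in, out) or None."""
    target = phrase.lower().split()
    tokens = [w.lower() for w, _, _ in words]
    for i in range(len(tokens) - len(target) + 1):
        if tokens[i:i + len(target)] == target:
            # In-point of the first word, out-point of the last.
            return (words[i][1], words[i + len(target) - 1][2])
    return None

words = [("we", 12.0, 12.2), ("stand", 12.2, 12.6), ("by", 12.6, 12.8),
         ("our", 12.8, 13.0), ("journalism", 13.0, 13.7)]
print(find_soundbite(words, "stand by our journalism"))  # → (12.2, 13.7)
```

The returned pair is what "drops down into the edit": an in/out range the editing system can cut on directly, instead of a producer shuttling through the timeline by ear.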

Beyond the sports and the earnings reports I mentioned before, have you found a wider application of sort of templatizable stories using AI to generate those?

Derl McCrudden: Well, we’re not doing that. But anything that revolves around a verifiable data set is fair game. The one application we are looking at is around localizing our content. So, for instance, when we do stories that are about, I don’t know, the price of gas in different states that we can then localize, we can then give our customers and our members the tools to localize that content so they can drill down into specific datasets around a bigger story.

And that allows a degree of reformatting content and reformatting stories and applying them in a different way than would have been possible otherwise.
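The localization McCrudden describes follows the same pattern as AP's earnings-report automation: fill a vetted story template from a verifiable dataset, once per locality. A sketch using his gas-price example; the template wording and all figures are invented for illustration.

```python
# Templated story localization from a per-state dataset.

TEMPLATE = ("Average gas prices in {state} stood at ${price:.2f} a gallon "
            "this week, {direction} from ${prev:.2f} the week before.")

def localize(row):
    # The only "writing" is choosing words from the verifiable data.
    direction = "up" if row["price"] > row["prev"] else "down"
    return TEMPLATE.format(direction=direction, **row)

data = [{"state": "Ohio", "price": 3.41, "prev": 3.35},
        {"state": "Texas", "price": 3.02, "prev": 3.10}]
for row in data:
    print(localize(row))
```

One national dataset yields one story per state, which is the "drill down into specific datasets around a bigger story" capability members get.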

AP is a very venerable and storied news organization, going back, as you said earlier, to the 1840s. And so, I’m sure that its forays into AI usage are coupled with some pretty serious ethical and operative principles. Can you describe how that process is playing out at AP, Derl?

Derl McCrudden: That is a real-time conversation. I suspect every newsroom in the world is having this right now. Our standards and principles are what we live by, how we operate. And, you know, not just our journalists, but all of our staff members subscribe to those principles.

But they can’t be set in concrete or in stone. They have to be relevant to the environment in which we operate. And what generative AI has created is an ever-changing landscape, which for some people has come out of nowhere. But for others, as you know, we’ve seen it developing over a long period of time.

And so, where we’re at now is going back to basics. We expect our journalists not to use those nascent generative AI tools to create journalism unless it’s something that is like a device within a story — like “we asked ChatGPT to give us a comment on this thing about generative AI.” We’ve done stories around that kind of device. But otherwise, we actually sent an all-staff memo not that long ago saying, just a reminder, in case you’re unclear: we’re not using this kind of technology today.

What it really leads us to is thinking very carefully about how we do take these tools and how they apply to us. We’ve all seen examples of tools out there that will create scenes based on an archive or a database that it draws on. We’re about eyewitness journalism and about putting journalists into the heart of a story and faithfully telling that story. So, the tools will help us in that work, but not create something out of nothing.

For both of you, where are your own key areas of concern around AI’s usage in news right now?

Derl McCrudden: For me, I would say it’s about understanding how we can spot things that are not real, or that are not what they purport to be. And that really means getting under the hood of generative tools to understand them, working with some of the big players in the market to understand how they are tagging metadata, how what’s created out of a camera or a microphone is [real], and then how we use that and make that available within the newsroom. That’s the direction in which it’s headed.

Watermarking content when it airs?

Derl McCrudden: Watermarking it, and not just adding … I’m a journalist, not a technologist, so I’ll get out of my lane pretty quickly if I go into detail. We need journalists to do journalism. And that means using a gut check: if something seems too good to be true, it probably is.

So, we still do journalism one on one. For us, it’s about trying to spot the fake. So, to answer your question about what keeps me awake at night, it’s the fakes that the generative AI tools can lead to, although it has a lot more positive uses as well.

Paul, what about you?

Paul Caluori: I’m right there with the question of whether this is real or not. So, you know, I’m responsible for products and our customers were already asking us, how are you going to guarantee that what you’re sending to us is authentic?

And that is just central to the way our relationship with AI has to play out. We have to be able to be authentic all the way through, particularly when we are looking at UGC. You know, it’s one thing to work within our own journalism and our own people. It’s another to identify other sources.

And we need to do that. We need to do that on a regular basis, and we need to be able to stand behind it so that our subscribers can feel confident. I mean, that’s always been our goal; what we strive for is for our subscribers to feel confident: this is from the AP, this is something I can stand behind, right?

So, we need to be able to sort that out. And as Derl points out, there may be some great ways that we can deploy tools, and that’s an ongoing conversation. But the thing that I’m most concerned about is authenticity. You know, I can imagine multiple scenarios, whether it’s visuals or data or, you know, fill in the blank. We need to be certain that something hasn’t been created by an untrustworthy source.

Right. Well, I mean, this technology seems to be getting exponentially, or almost exponentially, more sophisticated, and it’s potentially also encroaching more and more on the journalist’s role. Do either of you have a concern about that? And does the whole industry need to come together around this issue to come up with some broader guidelines and principles by which everybody should be operating now?

Derl McCrudden: I think in an ideal world, yes. But you alluded before to metadata and tagging: if there were one system, wouldn’t that be great? I think getting an industry-wide consensus is difficult when something is developing at pace and is going to wreak so much change ahead of it. So, yes, I think there needs to be a wide-scale industry discussion about this, and I think that’ll be ongoing.

There’s a lot of shortcutting there that potentially some less ethically oriented organizations might take advantage of in AI, it seems.

Derl McCrudden: Yeah, but could I just add one thing? I do want to just make clear, we are not looking at this as a way of cost cutting. We’re looking at this technology as a way of supercharging our journalism and putting our journalists at the heart of stories, whether they are desk editors, editing text copy or photo editors editing photographs, or people in the field creating amazing video and the amazing storytelling we do every day. And we don’t see this technology replacing it.

We see this technology as taking the heavy lifting out of mundane tasks so we can do that higher-value work I talked about before.

Paul, you want the last word on this?

Paul Caluori: There are things that are really exciting about this in the future. To the point I was just making, I can imagine that a large language model that’s been trained on the right things could be a fantastic resource for a journalist to ask questions. It could be like the sort of colleague who knows everything. If you’re confident in what the thing’s been trained on and it becomes a resource for you, it’s valuable.

That’s not the sort of thing that is, you know, to Derl’s point, that’s aiming to just undercut jobs. It’s aiming to make it easier for journalists to do better work. So, I think there’s a lot of discussion about how these tools can harm journalism. I think it’s helpful for us to think about ways in which we can help as well.

That said, to our earlier point, we need to keep our guard up. Authenticity is the name of the game, and being trustworthy is what we’re all about. So, it’s a balance, particularly with generative AI. As far as the other types of AI, like the search and recognition tool that we’ve just launched, I’m nothing but excited about our ability to find things that we couldn’t find before. It’s important to remember that AI is not simply a matter of generative tools that we have to scratch our heads over.

Sure. Well, it’s a fascinating tool that you’ve built there, and I’m sure broadcasters are going to look at it with great interest. Paul Caluori and Derl McCrudden, it’s undoubtedly a Pandora’s box that we’ve opened here. Thank you for sharing your thoughts about it.

You can watch and listen to past episodes of Talking TV on TVNewsCheck.com and on our YouTube channel, as well as all the major platforms on which you get your podcasts. We’re back most Fridays with a new episode. Thanks very much for tuning in to this one and see you next time.

News Organizations Find ‘Pure Gold’ In Their Archives
https://tvnewscheck.com/tech/article/news-organizations-find-pure-gold-in-their-archives/
Thu, 27 Apr 2023

Executives from The Weather Channel, Fox News and Capitol Broadcasting have rolled up their sleeves and dived into their organizations’ deep and messy archives. They told a panel at last week’s Programming Everywhere event that doing so has yielded untold — and very monetizable — treasures. Pictured (l-r): Nora Zimmett, The Weather Channel; Sam Peterson, Bitcentral; Jon Accarrino, Capitol Broadcasting; and Ben Ramos, Fox Archive. (Alyssa Wesley photo)

LAS VEGAS — Content stuck on tape from 40 years ago, when digitized and properly tagged, can be “pure gold,” allowing broadcasters to create fresh pieces and sell rights to the video.

While preserving all that old video sounds like an overwhelming project, only by doing so can broadcasters learn what assets they have access to, industry experts said during the Mining the Archives for New Shows panel at TVNewsCheck’s Programming Everywhere event on April 16 at the NAB Show.

Fox ran a proof-of-concept preservation project, converting 5,000 magnetic tapes and applying metadata to the old video, said Ben Ramos, VP, Fox Archive, field and emerging technology at Fox News.

Some of the content was damaged, and about 5% of the original conversions failed, but what made it through the process “was just pure gold, amazing content that hasn’t seen the light of day in 40 years,” he said.

The organization saw a return on the investment within nine months, he added, and expanded the project by an additional 45,000 tapes.

“It’s not until you make an effort, touch it, make an effort, that’s when you find how many assets you really have,” Ramos said.

A 60-minute tape costs Fox about $100 for the “Cadillac version” of conversion and tagging, Ramos said, while the “Kia version” with no bells and whistles runs around $20. But the higher-end version can yield “so many products” such as three seconds of New York City taxi cabs from 1977 and frame grabs of famous people, he said.

But Fox is also “reaching out to old news directors and photographers and assignment editors and people who were there that day” and asking them to provide detailed metadata on the old videos, Ramos said.

“It’s onerous and expensive,” he said, but added that the company “thinks it’s necessary with this specific subset of product,” although the return on investment for this undertaking remains to be seen. He called it a “curated white glove service that AI can’t replicate” that makes the quality of Fox’s archive “that much more special.”

Overall, he said, he needs “around $100 million” to digitize all the archives, but the steps to date have led to significant funding for more preservation efforts.

Nora Zimmett, The Weather Channel president, news and original series, said her organization has thousands of tapes in climate-controlled storage and is in the process of digitizing them.

“I didn’t appreciate how much of a process that is,” she said. “It’s not just the process, but the metadata, where to put it, and where to store it and how much to download.”

The metadata is critical, she said. “Your archive isn’t worth anything if you can’t find video by keywords,” Zimmett said. “It’s one thing to digitize and put it in the cloud, it’s another if there’s a user experience so producers and users can find the materials.”
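The keyword findability Zimmett describes boils down to maintaining an inverted index from metadata tags to clips. Below is a minimal sketch of the idea; the clip names and tags are invented examples, not Weather Channel data.

```python
from collections import defaultdict

# Minimal inverted index from metadata keywords to clip ids.
# Clip names and tags below are invented examples.

def build_index(clips):
    index = defaultdict(set)
    for clip_id, tags in clips.items():
        for tag in tags:
            index[tag.lower()].add(clip_id)  # case-insensitive keywords
    return index

def find(index, *keywords):
    """Ids of clips tagged with ALL of the given keywords."""
    sets = [index.get(k.lower(), set()) for k in keywords]
    return set.intersection(*sets) if sets else set()

clips = {
    "hurricane_1999": {"hurricane", "Florida", "aerial"},
    "blizzard_1993": {"blizzard", "Atlanta"},
    "hurricane_2004": {"hurricane", "Florida", "interview"},
}
index = build_index(clips)
print(sorted(find(index, "hurricane", "Florida")))  # -> ['hurricane_1999', 'hurricane_2004']
```

This is the user experience Zimmett points to: once every clip carries keywords, a producer's query is an intersection lookup rather than a crawl through tape labels.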

The Weather Channel is bringing archived content into its current projects.

“You can slice and dice it so many different ways,” she said. “One piece of content can have so many lives now that we are well beyond a linear environment.”

And there’s no time like the present to focus on making the archives easier to use. “Every hour that goes by, we’re creating more video,” although that is being better tagged in the moment, she said. “It’s like the roadrunner — you’re never caught up.”

Zimmett said she wonders whether licensing an organization’s content to a big studio devalues the content. “After you answer the question of ‘can we,’ sometimes we wrestle with ‘should we?’” she said.

Jon Accarrino, VP of transformation and strategic initiatives for Capitol Broadcasting Co., said much of his company’s early content was destroyed due to improper storage, but in 2014 CBC digitized all of its tapes. It took truckloads, he said, to move the 36,000 tapes that needed to be converted.

“We sent off all these trucks, had all that content digitized and they mailed back this tiny little hard drive” full of SD video, he said.

In 2007, the company moved into fully digital operations with Bitcentral’s Oasis, and as such is working through archiving that content as well.

“A lot of it is older, MPEG-2 formats we need to recompress and move to a new archive system we’re building,” he said.

And while a lot of companies are opting to store their archives in the cloud as a primary location, despite the egress costs, CBC relies on two physical locations with cloud as the backup. CBC is soft-launching its archives soon, he said.

Sam Peterson, Bitcentral COO, said many of the industry’s archives are not very organized. “The state of metadata and how interconnected it is, and the process they used to get it there, runs the gamut,” he said. “Some are thinking for the long-term, but some do not have the foresight.”

It is important to work with the end in mind. “What do we want to end up with, and how do we get there, are things you have to work through pretty quickly to not make it worse,” he said.

Peterson cautioned that archive maintenance may start simply, but it is not complete once a project is over.

“Know it will be an iterative approach” because tools are rapidly evolving, he said. “The main thing is not to lose any more content. Let’s get it captured, at least.”

Read more from Programming Everywhere here.


]]>
https://tvnewscheck.com/tech/article/news-organizations-find-pure-gold-in-their-archives/feed/ 0
DataCore Buys Object Storage Pioneer Object Matrix https://tvnewscheck.com/tech/article/datacore-buys-object-storage-pioneer-object-matrix/ https://tvnewscheck.com/tech/article/datacore-buys-object-storage-pioneer-object-matrix/#respond Wed, 25 Jan 2023 18:42:05 +0000 https://tvnewscheck.com/?p=291731 DataCore Software has acquired Object Matrix, an object storage pioneer and media archive specialist. Object Matrix’s best-of-breed appliances and cloud offerings will become part of the DataCore Perifery portfolio that […]

The post DataCore Buys Object Storage Pioneer Object Matrix appeared first on TV News Check.

]]>
DataCore Software has acquired Object Matrix, an object storage pioneer and media archive specialist. Object Matrix’s best-of-breed appliances and cloud offerings will become part of the DataCore Perifery portfolio that specializes in delivering end-to-end, application-centric solutions for edge and high-growth markets, including media and entertainment. This acquisition, DataCore says, “reinforces the Perifery line of edge devices and solutions, while adding unparalleled talent and expertise to the Perifery team.”

Abhijit Dey, general manager of DataCore’s Perifery business, said: “Gartner predicts that more than 50% of enterprise-managed data will be created and processed at the edge by 2025. We’re excited to expand our Perifery portfolio with innovative solutions that will enable us to lead in edge markets. The Object Matrix product line perfectly complements our world-class solution portfolio, increasing reliability and agility for customers in the fast-growing media and entertainment edge market. Working with the Object Matrix team, we look forward to bringing industry-leading innovation, and providing continued technology and customer support.”

Object Matrix develops MatrixStore on-premises appliances, hybrid solutions and cloud storage offerings that let broadcasters and service providers securely manage content at every stage of the media lifecycle, offering “significant operational and financial benefits.” Object Matrix customers include NBC Universal, Warner Bros. Discovery, MSG-N, ATP Media, BT, and the BBC. They will also benefit from DataCore’s well-established worldwide presence and award-winning support.

Acquiring Object Matrix accelerates the DataCore.NEXT vision to aid customers in moving from core to the edge and cloud. By strategically aligning the Perifery and Object Matrix solution offerings, DataCore said it is “redefining how storage resources are best utilized to ensure optimal results and the lowest cost for customers, turning technology barriers into breakthroughs.”

Jonathan Morgan, Object Matrix CEO, said: “This announcement signifies an exciting new stage for Object Matrix, allowing us to extend our reach and product ambitions within DataCore while continuing to develop state-of-the-art on prem, cloud, and hybrid media storage solutions. By leveraging DataCore’s experienced leadership team, worldwide distribution — consisting of over 400 channel partners and more than 10,000 global customers — combined with world-class engineering, sales, and marketing, we are in an excellent position.”

The acquisition of Object Matrix follows DataCore’s acquisitions of Caringo in January 2021 and MayaStor in November 2021. Caringo’s Swarm Object Storage software currently serves hundreds of customers in more than 25 countries, growing in excess of 35% in 2022. DataCore Bolt, developed through the MayaStor acquisition, is a Kubernetes storage and data services platform for containerized applications. MayaStor is approaching 1 million downloads a year and has seen over 50% user growth since 2021, the company says.


]]>
https://tvnewscheck.com/tech/article/datacore-buys-object-storage-pioneer-object-matrix/feed/ 0
WRAL Moves To Digitally Save Its Identity https://tvnewscheck.com/tech/article/wral-moves-to-digitally-save-its-identity/ https://tvnewscheck.com/tech/article/wral-moves-to-digitally-save-its-identity/#respond Tue, 13 Sep 2022 09:30:25 +0000 https://tvnewscheck.com/?p=282255 This Thursday, WRAL will announce a partnership with Eon Media, a Toronto-based tech company focused on artificial-intelligence video streaming solutions, that will generate boundless access to the station’s archives. The vast cache of now metadata-encoded video — amounting to half a million hours’ worth of content, according to Accarrino — will soon be made easily available to not only the WRAL newsroom, but also the general public.

The post WRAL Moves To Digitally Save Its Identity appeared first on TV News Check.

]]>
Around 30 years ago, an employee at WRAL, Capitol Broadcasting’s NBC affiliate in Raleigh, N.C., made what Jon Accarrino calls an “unfortunate decision” to toss roughly two decades of archival video into the trash. “Maybe they ran out of space or they just didn’t know what to do with it,” Accarrino says. “I’m not sure.”

Jon Accarrino

What Accarrino, Capitol’s VP of strategic business development, does know is that upon hearing of the loss, management was frustrated. They only grew more upset when they learned that, like many other stations looking to conserve storage space, WRAL had been taping over additional archival tapes with new content.

Accarrino says the boss “saw the legacy of the station and its history starting to disappear.” And though it took a while — as well as a sizable amount of cash — in 2015 WRAL digitized its remaining archives: 26,000 tapes, some dating back to 1976.

The initiative better ensured the safety and sustainability of the archives. It also set the stage for an even bigger technological undertaking, of which the benefits to WRAL and the community it serves will start to become known in a few months.

This Thursday, WRAL will announce a partnership with Eon Media, a Toronto-based tech company focused on artificial-intelligence video streaming solutions, that will generate boundless access to the station’s archives. The vast cache of now metadata-encoded video — amounting to half a million hours’ worth of content, according to Accarrino — will soon be made easily available to not only the WRAL newsroom, but also the general public. It will take the form of what WRAL calls a “digital storefront.”

Hitting the web with a soft launch toward the end of this year, and with a fully operational iteration arriving sometime in Q1 of 2023, WRALArchives.com will serve as a portal where community members, as well as “media clients and content creators across the globe,” according to a statement from Capitol, can purchase access to the archives. For a fee, the videos will be available for download; like on YouTube, interested parties can also stream content with ads.

The initiative creates a suite of revenue opportunities for WRAL. Accarrino says the station receives dozens of requests on a monthly basis from community members hoping to get their hands on old clips, not only of newscasts but other shows that have been beamed across WRAL airwaves. Soon they’ll be able to access the archives on their own through the portal, with an optimized search engine, and purchase or stream whatever videos they want. They’ll also be able to create NFTs from the clips.

“If somebody wants an NFT of their grandma in the news, we’re not going to stop them from acquiring that digital souvenir,” Accarrino says.

Furthermore, content creators, such as documentarians, will be able to buy videos from the archives for use in their own productions. However, the archives also make internal content production far more manageable than before.

Pete Sockett

Pete Sockett, WRAL’s director of engineering and operations and the project lead, envisions increased output, particularly from the station’s investigative unit, making the most of the station’s new digital archive library, which he calls an “online history museum of the city of Raleigh.”

None of this would be possible without the help of Eon Media and its artificial intelligence tools. Sockett says he and his team were “blown away” by the AI capabilities Eon Media presented once the company got a hold of the videos and encoded metadata throughout.

“The value of an archive is fully dependent on its metadata and searchability,” Accarrino says. “The AI, though it was expensive, was completely necessary for this project.”

Hiring human beings to scan through the archives — tapes of which were sometimes labeled with cryptic phrases like “Fire downtown, June 6, 1982,” says Sockett — and encode metadata to make the content searchable would have taken perhaps decades and not been worth the cost, Accarrino says. But for the Eon Media AI, he says, three-hour tapes were encoded in a few minutes.

Testing the new search engine on one occasion, Sockett says his team looked for footage of a “helicopter.” Among the many results, there was one standout: a years-old video of dubious quality shot on three-quarter-inch tape by the WRAL helicopter that happened to capture a competitor’s news helicopter off in the distance.

“When we saw that we were like, ‘Oh my gosh, this is just a whole different planet now,’” Sockett says.

In addition to such object identification, the AI’s facial recognition capabilities mean searchers of the archive can, among other feats, find every piece of footage that includes a particular WRAL reporter, even as they age across their career — helpful at a place like WRAL where some on-air talent has been with the station for four decades, Accarrino says.

The AI can even identify “sentiment,” Accarrino says. “So if I need a clip of ‘happy kids,’ the AI will go find that,” he adds.

The AI-encoded metadata makes still more capabilities possible. Among the most valuable, Accarrino says, is the station’s ability to research the quality of its content and how well it resonates with viewers, turning the archive into a “business intelligence tool.”

In the recent past, WRAL invested in a year-long study of its content, seeking insights from a consultancy team.

“The AI can do what that small team did in a year in just a couple minutes,” Accarrino says. “It allows us to be smarter, make faster decisions and reduces the amount of money we have to spend doing research.”

The station committed a major investment to the WRALArchives.com project, according to Accarrino. Sockett praises the Goodmon family, which still owns Capitol Broadcasting, for “believ[ing] in the technology that will drive the business,” and making such a sizable investment into it.

Will it prove worthwhile?

Says Sockett: “We’re going to find out.”


]]>
https://tvnewscheck.com/tech/article/wral-moves-to-digitally-save-its-identity/feed/ 0
Print Archives Show Past Impeachments. Where Will We Go To Find The History Being Made Today? https://tvnewscheck.com/digital/article/print-archives-show-past-impeachments-where-will-we-go-to-find-the-history-being-made-today/ https://tvnewscheck.com/digital/article/print-archives-show-past-impeachments-where-will-we-go-to-find-the-history-being-made-today/#respond Wed, 20 Nov 2019 13:28:19 +0000 https://tvnewscheck.com/?post_type=more_news&p=241460 The post Print Archives Show Past Impeachments. Where Will We Go To Find The History Being Made Today? appeared first on TV News Check.

]]>

]]>
https://tvnewscheck.com/digital/article/print-archives-show-past-impeachments-where-will-we-go-to-find-the-history-being-made-today/feed/ 0
AP Acquires British Movietone Film Archive https://tvnewscheck.com/uncategorized/article/ap-acquires-british-movietone-film-archive/ https://tvnewscheck.com/uncategorized/article/ap-acquires-british-movietone-film-archive/#respond Tue, 27 Sep 2016 08:04:27 +0000 http://production.tvnewscheck.com/2016/09/27/ap-acquires-british-movietone-film-archive/ The post AP Acquires British Movietone Film Archive appeared first on TV News Check.

]]>

]]>
https://tvnewscheck.com/uncategorized/article/ap-acquires-british-movietone-film-archive/feed/ 0
Tedial Provides Integrated MAM, Archive Solution https://tvnewscheck.com/uncategorized/article/tedial-provides-integrated-mam-archive-solution/ https://tvnewscheck.com/uncategorized/article/tedial-provides-integrated-mam-archive-solution/#respond Thu, 28 May 2015 15:08:39 +0000 http://production.tvnewscheck.com/2015/05/28/tedial-provides-integrated-mam-archive-solution/ Tedial, an independent MAM technology solutions specialist, sold its Tarsys enterprise MAM system to Indian public service broadcaster, Doordarshan. The Tedial system, installed at Doordarshan’s Kolkata facility provides its operators […]

The post Tedial Provides Integrated MAM, Archive Solution appeared first on TV News Check.

]]>
Tedial, an independent MAM technology solutions specialist, sold its Tarsys enterprise MAM system to the Indian public service broadcaster Doordarshan.

The Tedial system, installed at Doordarshan’s Kolkata facility, provides its operators with core and browser-based desktop media tools as well as a fully integrated archive system able to scale in volume and throughput. The Tedial archive will connect to Doordarshan’s sites in other regions as well as its central archive in Delhi. The installation was carried out by Tedial’s Indian partner MediaGuru.

Doordarshan provides television, radio, online and mobile services throughout metropolitan and regional India and overseas through the Indian Network and also Radio India. As well as delivering a fully equipped archive, Tedial’s Tarsys MAM will allow Doordarshan to ingest its existing U-matic, Betacam SP, DVCPRO50 and XDCAM HD422 program content, stored on tapes and optical disc media, into the system in both high- and low-resolution formats.

Tarsys is fully integrated with 30TB of HP Online Storage and a 48-slot HP LTO6 tape library as well as Tektronix Cerify QC for automated quality control of file-based content. It lets operators at the facility browse, catalogue, search and store content via the Tedial web-client.

Ashish De, director of engineering at Doordarshan, says: “The Tedial MAM system provides us with an integrated archive that is fully scalable enabling us to expand the system, as and when the time comes. Tarsys was recommended to us by MediaGuru following a rigorous tender process and we are very happy with the system.”


]]>
https://tvnewscheck.com/uncategorized/article/tedial-provides-integrated-mam-archive-solution/feed/ 0
Dialogue Search Aids WSB Archives Project https://tvnewscheck.com/uncategorized/article/dialogue-search-aids-wsb-archives-project/ https://tvnewscheck.com/uncategorized/article/dialogue-search-aids-wsb-archives-project/#respond Thu, 05 Dec 2013 12:01:18 +0000 http://import.tvnewscheck.com/2013/12/05/dialogue-search-aids-wsb-archives-project/

WSB Atlanta is the first station to use Nexidia Dialogue, a speech search technology also being used by major cable news networks, including MSNBC. The technology uses basic language sounds called phonemes to find any audible words in any digital library. WSB can now quickly access more than 40,000 hours of material dating back to the 1950s.

The post Dialogue Search Aids WSB Archives Project appeared first on TV News Check.

]]>
Following a three-year initiative to digitize more than 40,000 hours of film and tape dating back to the 1950s, employees at WSB Atlanta still had a difficult time finding specific clips, now stored on LTO tape, because of the lack of metadata.

“We were relying on tape labels, sticky notes — anything we could to figure out what was on a tape,” says Gary Alexander, director of engineering at WSB.

But last spring they discovered a product that has helped them zero in on just what they are looking at and attach proper metadata.

The product is Nexidia Dialogue. The phonetic search technology is also being used by major cable news networks, including MSNBC. The Cox-owned ABC affiliate is the first TV station to put it to work.

“We started Nexidia over a weekend and let it index all of our files, and today, it’s really unbelievable that we can find things we never knew even existed,” Alexander says.

The technology uses basic language sounds called phonemes to find any audible words in a digital library.

“There are about 40 phonemes that make up the English language and about 400 total for all of human speech,” says Drew Lanham, SVP and GM of media and entertainment at Nexidia. “What we did is create an index of those phonemes and regardless of the word, it’ll be able to find it.”

Obamacare, for instance, isn’t a word found in modern dictionaries, and basic speech-to-text recognition software isn’t trained to know that word, Lanham says. “But by breaking down those phonetic elements, we can help a station find every time Obamacare is mentioned in a story,” he says.
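The phonetic approach Lanham describes can be illustrated with a toy sketch: index each clip as a stream of phonemes rather than dictionary words, so an out-of-vocabulary term still matches whenever its sound sequence does. The phoneme table below is an invented stand-in for a real acoustic model, not Nexidia's implementation.

```python
# Toy illustration of phonetic indexing: clips are stored as phoneme
# streams, and search matches sound sequences rather than vocabulary.
# The phoneme mappings are invented stand-ins for a real acoustic model.

PHONEMES = {
    "obama": ["OW", "B", "AA", "M", "AH"],
    "care": ["K", "EH", "R"],
    "health": ["HH", "EH", "L", "TH"],
}

def to_phonemes(word):
    # A real system derives phonemes from the audio itself; here we fake
    # it by greedily splitting on known sub-words.
    word = word.lower()
    for known, phs in PHONEMES.items():
        if word.startswith(known):
            rest = word[len(known):]
            return phs + (to_phonemes(rest) if rest else [])
    raise KeyError(word)

def index_clip(clip_id, words, index):
    """Store each clip as one flat phoneme stream."""
    stream = []
    for w in words:
        stream.extend(to_phonemes(w))
    index[clip_id] = stream

def search(query, index):
    """Return ids of clips whose phoneme stream contains the query's phonemes."""
    target = to_phonemes(query)
    hits = []
    for clip_id, stream in index.items():
        for i in range(len(stream) - len(target) + 1):
            if stream[i:i + len(target)] == target:
                hits.append(clip_id)
                break
    return hits

index = {}
index_clip("clip-1", ["obama", "care"], index)   # two adjacent spoken words
index_clip("clip-2", ["obamacare"], index)       # out-of-dictionary coinage
index_clip("clip-3", ["health", "care"], index)
print(search("obamacare", index))  # -> ['clip-1', 'clip-2']
```

Because matching happens at the phoneme level, "obamacare" finds both the clip where it was spoken as one word and the clip where "Obama" and "care" happened to be adjacent, which is exactly why coinages absent from any dictionary remain findable.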

Dialogue licenses start at about $10,000 and can go as high as $250,000, based upon how many hours of media it needs to index and how many users have access to it.

Earlier this year, WSB put together a special package for the 50th anniversary of Martin Luther King’s “I Have A Dream” speech and found the phrase “I have a dream” in thousands of clips.

“What we then did was search for an obscure part of the speech,” Alexander says. “And at the click of a mouse, we found some real gold for our story.”

The goal is to add metadata, a detailed description of the video, so that producers can easily recall clips inside their media asset management solution, says Richie Murray, president of Bridge Digital Inc., who served as a consultant on WSB’s project.

“Right now, it’s a two-step process,” he says. “They need to find it in Nexidia, then export it over to Avid Interplay, their [media asset management] system. The goal is to have just one interface.”

That’s still a ways off. Employees tag content with metadata as they search for clips and use them in the course of daily production, but the station will likely have to hire someone dedicated to the job.

Hiring the right person is critical, Murray says. “It sounds like a job for an intern, but the problem is that there could be this huge gap in knowledge.”

One staffer tagged a clip as “two old men shaking hands,” says Murray. “But in reality, it was two important governors. To whomever was watching that, it was just two old men. You needed to know who they were to get an accurate description.”

After the Cox family decided the preservation of the station’s content was worth the undisclosed investment, Atlanta-based Crawford Communications three years ago began digitizing all of WSB’s content — film, one-inch tape, three-quarter-inch tape — to a Spectra Logic LTO tape library at the station.

The station has a partnership with the University of Georgia, which will also get a digital copy of all the content.

Nearly all of the archives are digitized today. After being digitized, high-resolution versions of the content were ingested into Avid Interplay using SGL Flashnet, a content storage solution geared for the broadcasting industry.

To speed searches, low-res proxies of the content were created using ProMedia Carbon, a file-based transcoder by Harmonic, and then stored on a network attached storage (NAS) server, also located on site.

Stations interested in taking on a major archiving project should first evaluate the value of their content, Murray says. “And in my opinion, that’s the hardest thing to do. The mechanics are there and it’s fairly simple once you get going. But you need to determine if you can generate money from doing something like this, or derive some kind of value.”


]]>
https://tvnewscheck.com/uncategorized/article/dialogue-search-aids-wsb-archives-project/feed/ 0
Cox D.C. Bureau Updates Video Archive System https://tvnewscheck.com/uncategorized/article/cox-d-c-bureau-updates-video-archive-system/ https://tvnewscheck.com/uncategorized/article/cox-d-c-bureau-updates-video-archive-system/#respond Wed, 09 Oct 2013 08:17:00 +0000 http://import.tvnewscheck.com/2013/10/09/cox-d-c-bureau-updates-video-archive-system/ The post Cox D.C. Bureau Updates Video Archive System appeared first on TV News Check.

]]>

]]>
https://tvnewscheck.com/uncategorized/article/cox-d-c-bureau-updates-video-archive-system/feed/ 0