AWS for M&E Blog

Managing the history of the NFL

With such an abundance of National Football League (NFL) programming available to modern fans, it’s easy to forget that coverage was once limited to game day broadcasts. Since launching in 2003, NFL Media has evolved into a sports entertainment powerhouse, publishing content through its own and partner channels, and a direct-to-consumer app. In addition to produced segments and full shows, the assets NFL Media manages include raw content.

Organizing all the associated data—which Brad Boim, NFL Media Senior Director, Media Management & Post Production, estimated to run around a million assets—is no small feat. He shared more detail about the NFL’s process during a fireside chat at a recent Amazon Web Services (AWS) for Media & Entertainment Symposium. Leveraging artificial intelligence (AI) and machine learning (ML) tools from AWS for an assist, Boim and his team are now able to focus on how to best package content and make it available for fans.

Early origins

NFL Media currently oversees the league’s digital presence, and Boim has been with the media services department since day one when it set up shop in Culver City, CA. He started at NFL Media as an editor, with a setup that included a few videotape machines and three Avid Adrenaline systems. There was no shared storage in place, and tapes were physically passed between editors. After a few seasons, his team began using Apple Final Cut Pro and deployed a media asset management (MAM) system. However, at this point, the MAM was simply a media repository with no metadata to power any search capability. Over the years, NFL Media slowly improved and evolved its MAM approach, with help from cloud-based solutions. In 2021, the organization relocated to the just-finished SoFi Stadium complex in nearby Inglewood, CA.

“We embrace the idea of nothing is static,” Boim said. “At launch, NFL Media didn’t have a digital presence, it was a linear network with a traditional broadcast approach. Then, all of our digital properties evolved, and today, we’re in this place where I don’t know that you can really differentiate these experiences. We’re all painting with the same brush. Game footage is needed everywhere, so we aim to get this content distributed on all platforms and to everybody that we possibly can.”

New beginnings

After 18 years in the same space, Boim and his team were given the reins to build out the broadcast facility of their dreams from the ground up. They were able to select and deploy the latest technology, but the incredible opportunity also presented the challenge of migrating all the data from the Culver City studio into the new facility and new environment.

“Fortunately, we had moved a lot of our archiving workflows into private object storage, and that was managed in a Reach Engine interface,” Boim explained. “We knew that we were going to deploy a brand-new MAM environment in the new facility, so we used our archive as a data lake so that we could push out assets from our legacy MAM and cut the cord as quickly as possible.”

Part of the challenge was sorting through all the content, including proxies and metadata, and determining what to preserve, archive, or delete. Then, the team built an API integration, orchestrated by Vidispine, between their cloud storage and their new MAM. With the legacy content transfer underway, NFL Media then started to focus on its live recording workflow for capturing anything from game day shows to press conferences. The new pipeline uses an EVS ingest tool, which supports the ingest of 42 real-time channels into the MAM.

The circle of metadata

Along with managing footage, NFL Media also maps that footage to NFL Next Gen Stats, statistical data capturing every play of every NFL game, to make the content highly searchable for elements like coach reactions or player celebrations. It also enables producers to search for specific scenarios, such as touchdowns by a certain player when a specific formation is used. An API integration that pushes NFL Next Gen Stats data to the MAM and the EVS ingest system helps automate much of the data logging, though some is done manually.
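To make the approach concrete, the sketch below flattens a play-level stats record into searchable keyword tags, the kind of mapping a MAM could index for searches like "touchdowns by a certain player out of a specific formation." The field names and tag scheme are illustrative assumptions, not the actual Next Gen Stats schema or NFL Media's taxonomy.

```python
# Hypothetical sketch: flattening a per-play stats record into flat
# keyword tags for MAM search. Field names are illustrative only.

def stats_to_keywords(play: dict) -> list[str]:
    """Map a play record to keyword tags a MAM could index."""
    keywords = [
        f"player:{play['ball_carrier']}",
        f"team:{play['offense_team']}",
        f"play_type:{play['play_type']}",
        f"formation:{play['formation']}",
    ]
    if play.get("is_touchdown"):
        keywords.append("event:touchdown")
    return keywords

# A producer's query for "touchdown + shotgun + player" would then be an
# intersection of clips carrying all three tags.
play = {
    "ball_carrier": "P. Mahomes",
    "offense_team": "KC",
    "play_type": "pass",
    "formation": "shotgun",
    "is_touchdown": True,
}
print(stats_to_keywords(play))
```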

NFL Media has also set up an integration between its MAM and NFL Films, the film and television arm of the NFL. This connection sends broadcast footage to the NFL Media MAM, as well as all the source footage coming from OB trucks and from photographers and videographers on the field. Considering that NFL Films has its own MAM and taxonomy, Boim and team worked to standardize game terminology to make it simple for end users to search for content.

“We want to consume as much metadata as we can get our hands on and then map that into keywords and tags so that people can find it,” Boim noted. When it had a local MAM in Culver City, NFL Media had to run AI and ML workflows on a dedicated machine and content searches were limited to that machine. Now, Boim and his team have integrated the ability to search and stream proxies from a web browser, with more than 400 active users, primarily editors, tapping the system. Live shows for broadcast are sent to a playout server, with an additional layer of asset management for organization, then through a live control room and onto distribution; digitally distributed assets are sent through a separate pipeline. 

Prepping for Sunday

Sundays are typically the biggest game days in the NFL. For each matchup, NFL Media receives live game records from multiple feeds, and editors use those feeds to cut highlights. The next day, the data deluge arrives, including radio calls of virtually every play of the game from both the home and away announcers, sent from NFL Films. Boim and his team must register all the content onto the NFL Media MAM, then run it against the Next Gen Stats. They also receive all the different camera ISOs from the broadcast trucks to create melts—highlight reels that show key moments from multiple camera angles—which could amount to 800 individual clips per game.

“If there’s 11 angles of a specific play, we concatenate it into one clip and then run the Next Gen Stats API call to put all that metadata on that clip and then get it into the MAM system,” detailed Boim. “All of this content gets used by everybody, but it’s the bread and butter for our four-hour Sunday morning pre-game show. They want all these camera ISOs, they want every angle we could possibly provide, they want the radio calls. They want all these tools, so we run a 24-hour operation Monday, Tuesday, Wednesday to try to get it all processed and make it available for them to start editing on Thursdays.”
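The melt step Boim describes—gathering every angle of one play and joining them into a single clip—can be sketched as grouping per-angle ISO files by play and emitting a concat list for a tool like ffmpeg. The file-naming convention and the use of ffmpeg's concat demuxer are our assumptions for illustration, not NFL Media's actual pipeline.

```python
# Illustrative sketch of a "melt" step: group per-angle ISO clips by play,
# then render an ffmpeg concat demuxer list so all angles of one play can
# be joined into a single clip. Filenames here are hypothetical.

from collections import defaultdict

def group_by_play(clips: list[str]) -> dict[str, list[str]]:
    """Group clip filenames like 'play0042_angle03.mxf' by play ID."""
    plays = defaultdict(list)
    for clip in clips:
        play_id = clip.split("_")[0]
        plays[play_id].append(clip)
    return {pid: sorted(angles) for pid, angles in plays.items()}

def concat_list(angle_clips: list[str]) -> str:
    """Render an ffmpeg concat demuxer input file for one play's angles."""
    return "\n".join(f"file '{c}'" for c in angle_clips) + "\n"

clips = ["play0042_angle02.mxf", "play0042_angle01.mxf", "play0107_angle01.mxf"]
plays = group_by_play(clips)
print(concat_list(plays["play0042"]))
# The list file would then feed a command such as:
#   ffmpeg -f concat -safe 0 -i play0042.txt -c copy play0042_melt.mxf
```

Once concatenated, the single clip is the unit that gets the Next Gen Stats metadata attached and registered in the MAM, as Boim describes.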

The NFL’s custom transcription and closed captioning API

In a traditional linear broadcast workflow, NFL Media would send its content to master control for live captioning, so the team could simply focus on content creation and delivery. However, for video on demand (VOD) or free ad-supported TV (FAST) channels, content captioning is increasingly an added requirement for the NFL Media team.

The NFL’s custom transcription and closed captioning API helps lighten the load by automating content transcription and closed captioning. Users can drag video or audio files into the UI, set an output language and speaker labels, then submit the file to the custom API built using Amazon Transcribe, which converts speech to text. The tool allows for a custom vocabulary, so NFL Media can tune player name spellings and terminology as needed, and it returns editable transcriptions quickly back into the UI. From the main dashboard, users can download a variety of file formats, including Avid ScriptSync and SRT for use in Adobe Premiere.
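A request to such a tool might look like the sketch below, which builds the parameter set for Amazon Transcribe's StartTranscriptionJob API (the dict would be passed to boto3's `transcribe.start_transcription_job(**params)`). The job name, bucket, and vocabulary name are made-up placeholders; the parameter shape follows the documented API, but the NFL's actual wrapper around it is not public.

```python
# Sketch: building parameters for an Amazon Transcribe job with a custom
# vocabulary and speaker labels. Names and URIs are hypothetical.

def build_transcribe_params(media_uri: str, vocabulary: str,
                            max_speakers: int = 4) -> dict:
    return {
        "TranscriptionJobName": "pregame-show-wk01",  # hypothetical name
        "LanguageCode": "en-US",
        "Media": {"MediaFileUri": media_uri},
        "Settings": {
            # A custom vocabulary tunes player-name spellings and NFL terms.
            "VocabularyName": vocabulary,
            # Speaker labels distinguish hosts and guests in the transcript.
            "ShowSpeakerLabels": True,
            "MaxSpeakerLabels": max_speakers,
        },
    }

params = build_transcribe_params(
    "s3://example-media-bucket/pregame.mp4", "nfl-player-names"
)
print(params["Settings"]["VocabularyName"])
```

The returned transcript JSON would then be converted downstream into editor-friendly formats such as SRT or Avid ScriptSync, as the article describes.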

“Our captioning and transcription tool is on our front burner in real-time applications and our internal solutions are getting better all the time,” Boim said. “There’s a lot of mandated captioning that we need to do for a lot of content, and certainly some things are done live, but this gives us the ability to deliver embedded captions.”

Enhancing the NFL’s custom Facial Recognition API

NFL Media hosts an expansive library of digitized NFL media guides dating back to the 1940s. To enhance its player search capabilities, NFL Media built an image consumption portal leveraging Amazon Rekognition to support the internal web app. It has indexed about 140,000 NFL faces thus far and can compare any uploaded image against that dataset for more accurate results than simply going into a search engine. If a face is recognized as an existing image in the collection, the app will automatically label it and the NFL Media team can confirm the identity. The result shows both the person’s name along with an image accuracy confidence score. Noted Boim, “The image recognition is pretty amazing.”
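The matching step can be sketched by parsing the kind of response Amazon Rekognition's SearchFacesByImage API returns (in production it would come from boto3's `rekognition.search_faces_by_image(CollectionId=..., Image=...)`). Here we parse a hand-built sample with the same shape; the use of `ExternalImageId` to carry a player identifier is our assumption about how the labels could be stored.

```python
# Hedged sketch: interpreting a Rekognition SearchFacesByImage-style
# response to surface the top face match and its confidence score.

def best_match(response: dict, threshold: float = 90.0):
    """Return (label, similarity) for the top face match above threshold."""
    matches = response.get("FaceMatches", [])
    if not matches:
        return None
    top = max(matches, key=lambda m: m["Similarity"])
    if top["Similarity"] < threshold:
        return None
    # ExternalImageId is set when the face is indexed into the collection;
    # storing a player name or roster ID there is an assumption on our part.
    return top["Face"]["ExternalImageId"], top["Similarity"]

sample = {
    "FaceMatches": [
        {"Similarity": 99.1, "Face": {"ExternalImageId": "player_1234"}},
        {"Similarity": 71.5, "Face": {"ExternalImageId": "player_5678"}},
    ]
}
print(best_match(sample))
```

Surfacing both the label and the similarity score mirrors the app behavior Boim describes, where a human confirms the suggested identity alongside a confidence figure.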

Evolving AI applications

Along with deploying AI-powered solutions, NFL Media is looking at how its MAM workflow might benefit from generative AI (gen AI) technology. For example, an AI chatbot could create a summary of key points from an interview transcript, saving producers from having to review it manually.

“What I really enjoy about my job is trying to think ahead about what’s coming next. I’m really interested in gen AI, especially as it relates to what can you build and how can you create tools for content creators that they don’t currently have. We’re trying to figure out pie in the sky ideas, then how to get internal support for those ideas so that we can actually try to execute them,” Boim concluded.

To learn more about how NFL Media manages its massive archive, view the full fireside chat with Boim. You can also check out more highlights from the AWS for M&E Symposium for additional insights into how customers are transforming industries using AWS.

Lisa Epstein

Lisa Epstein is a Senior Industry Marketing Manager at Amazon Web Services.