AWS for M&E Blog

Maple Leaf Sports & Entertainment debuts generative AI video editor to deliver content to fans faster

This blog post was co-authored by Ryan Khurana, Lead, Machine Learning and Advanced Analytics, MLSE.

In today’s social media-fueled 24/7 news and entertainment cycle, speed is of the essence, especially when it comes to sports coverage. Every pivotal moment has endless re-watch value, enabling a plethora of stories to be produced from vast archives. This is why Maple Leaf Sports & Entertainment (MLSE) approached Amazon Web Services (AWS) to harness the power of generative artificial intelligence (AI) to accelerate its media production and deliver fans more of the content they crave.

MLSE owns Toronto’s premier sports franchises, including the Toronto Maple Leafs, Toronto Raptors, Toronto Football Club (FC), and Toronto Argonauts, and operates the venues in which these teams play. To enhance team productivity and uncover new ways of engaging fans, MLSE looked to AWS to leverage generative AI across its organization.

The result of these efforts will be in production this fall, as MLSE debuts a new generative AI implementation—a web-based video editor that allows MLSE creatives to quickly and easily tap into a wealth of content to entertain fans like never before. Taking the heavy lifting out of media asset management (MAM), the new solution creates rich embeddings of video assets that allow video producers to find and edit relevant content entirely from text descriptions. They can search for and access clips directly, instead of relying on a media team to pull them. With an hours-long process reduced to minutes, editors can now focus on developing and posting creative content faster.

“With generative AI, we can mine neglected aspects of videos, both in and out of game, to create content tailored to each fan while maintaining the brand identity of each team preserved in team-specific generative models. This revolutionary approach allows creative teams to focus on high-level strategy while providing an unprecedented scale of personalized, engaging content,” said Farah Bastien, Director, Data Science & Engineering • Technology & Digital Office at MLSE.

Kicking off production

MLSE collaborated with Twelve Labs to build the generative AI video editor, which leverages Anthropic’s Claude large language model (LLM) in Amazon Bedrock and Amazon OpenSearch Service. The Twelve Labs multimodal foundation models run within the application on AWS to create rich embeddings detailing every aspect of a video, allowing editors to conduct text-based searches with low latency. MLSE started development on the new solution in June 2024 and expects to begin using it in production across its franchises this fall.

Instead of working late nights, the content marketing team will be able to produce more compelling highlights as soon as the final whistle blows. This frees the team to focus on innovation and new strategic projects.

How it works

MLSE currently stores data for all its teams in separate silos, using different providers for on-premises MAM and cloud-based digital asset management (DAM). By using Amazon Bedrock as the endpoint and tapping into Twelve Labs, the generative AI solution can quickly and cost-efficiently index proxies, allowing editors to search the full content library at extremely low latency.

As a first step, MLSE teams exported proxy versions of content and uploaded them to Amazon Simple Storage Service (Amazon S3). Content was then embedded using Twelve Labs’ Marengo 2.6 model, which creates vector representations of the content and stores them in OpenSearch Service for efficient indexing and retrieval. The model’s embeddings allow for semantic search of time-dependent actions such as “player getting emotional after a win” or “camaraderie before a game starts,” making video content searchable via text. As games are played, proxies are exported and uploaded in near-real time so that footage is available to search immediately following games.
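
To make the ingest path concrete, the following Python sketch shows what uploading a proxy and indexing its embeddings could look like. The bucket, OpenSearch domain, index and field names, segment length, and embedding dimension are illustrative assumptions rather than details from MLSE’s implementation, and embed_video_proxy() is a hypothetical stand-in for the Twelve Labs Marengo 2.6 call.

```python
"""
Minimal sketch: upload a proxy to Amazon S3, embed it, and index clip embeddings
in Amazon OpenSearch Service for k-NN retrieval. Names, sizes, and the
embed_video_proxy() helper are assumptions, not MLSE's actual implementation.
"""
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection

INDEX = "video-clip-embeddings"   # hypothetical index name
EMBED_DIM = 1024                  # assumed embedding size

# Upload the exported proxy to S3 (bucket and key are hypothetical).
s3 = boto3.client("s3")
s3.upload_file("game_proxy.mp4", "mlse-proxies", "raptors/2024-10-25-game.mp4")

# Connect to the OpenSearch Service domain (endpoint is hypothetical).
search_client = OpenSearch(
    hosts=[{"host": "search-mlse-demo.us-east-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)

# Create a k-NN enabled index so clip embeddings can be searched by vector similarity.
search_client.indices.create(
    index=INDEX,
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "embedding": {"type": "knn_vector", "dimension": EMBED_DIM},
                "s3_uri": {"type": "keyword"},
                "start_sec": {"type": "float"},
                "end_sec": {"type": "float"},
            }
        },
    },
    ignore=400,  # ignore "index already exists"
)


def embed_video_proxy(s3_uri: str) -> list[list[float]]:
    """Hypothetical stand-in for the Marengo 2.6 embedding call.
    Returns one vector per segment of the proxy video."""
    raise NotImplementedError


def index_proxy(s3_uri: str, segment_sec: float = 6.0) -> None:
    """Embed a proxy and index one document per video segment."""
    for i, vector in enumerate(embed_video_proxy(s3_uri)):
        search_client.index(
            index=INDEX,
            body={
                "embedding": vector,
                "s3_uri": s3_uri,
                "start_sec": i * segment_sec,
                "end_sec": (i + 1) * segment_sec,
            },
        )


# Example usage: index_proxy("s3://mlse-proxies/raptors/2024-10-25-game.mp4")
```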

Once proxies are indexed, authenticated users, such as video producers or broadcasters, can access content via a web-based user interface (UI). They can either enter a sentence describing a specific clip or set of clips to retrieve, or upload a script that describes multiple clips and edits to perform. When a script is uploaded, the LLM agent, built on Claude in Amazon Bedrock, creates an action plan and runs a series of searches for relevant clips. In all scenarios, the LLM agent retrieves the requested clips and assembles edits. Users then provide feedback on the pulled clips and can adjust the search process using natural language until they’ve reached their desired result.
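
A rough sketch of that script-to-clips loop, under the same assumptions as the indexing sketch above, might look like the following. The Claude model ID, the prompt, and the embed_text_query() helper (standing in for a text embedding in the same vector space as the clips) are all illustrative, not MLSE’s actual implementation.

```python
"""
Minimal sketch: Claude on Amazon Bedrock breaks an uploaded script into clip
descriptions, and each description drives a k-NN search against the index built
in the previous sketch. Model ID, prompt, and embed_text_query() are assumptions.
"""
import json

import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
search_client = OpenSearch(
    hosts=[{"host": "search-mlse-demo.us-east-1.es.amazonaws.com", "port": 443}],  # hypothetical
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)


def plan_clip_searches(script: str) -> list[str]:
    """Ask Claude to turn a script into a JSON array of short clip descriptions."""
    prompt = (
        "Break the following highlight script into a JSON array of short, "
        "self-contained clip descriptions suitable for video search. "
        "Return only the JSON array.\n\n" + script
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model choice
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    text = json.loads(response["body"].read())["content"][0]["text"]
    return json.loads(text)


def embed_text_query(query: str) -> list[float]:
    """Hypothetical stand-in for a text embedding in the same vector space as the clips."""
    raise NotImplementedError


def find_clips(script: str, k: int = 5) -> list[dict]:
    """Retrieve the top-k matching segments for each planned clip description."""
    clips = []
    for query in plan_clip_searches(script):
        hits = search_client.search(
            index="video-clip-embeddings",
            body={
                "size": k,
                "query": {"knn": {"embedding": {"vector": embed_text_query(query), "k": k}}},
            },
        )
        clips.extend(hit["_source"] for hit in hits["hits"]["hits"])
    return clips
```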

At that point, the LLM agent generates an XML file that users can upload to professional video editing software on an MLSE workstation, which pulls the full-resolution source footage from the MAM and places the clips on a timeline with edits applied. This removes the tedious upfront work of clip procurement and organization, freeing editors to more quickly put together highlights such as a supercut of all the fouls in a game. Balancing efficiency and flexibility, the solution can expand clips from the original video, letting editors adjust the final deliverable as they see fit.
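
The post doesn’t specify which editing software or XML schema MLSE targets, but the hand-off step could be as simple as the following sketch, which writes the retrieved segments into a generic, hypothetical timeline XML for import on an editor’s workstation.

```python
"""
Minimal sketch of the hand-off step: write retrieved clip segments into an XML
timeline that desktop editing software could import. Element and attribute names
are hypothetical; the actual schema MLSE's tools use is not specified in the post.
"""
import xml.etree.ElementTree as ET


def clips_to_timeline_xml(clips: list[dict], out_path: str = "timeline.xml") -> None:
    """clips: dicts with 's3_uri', 'start_sec', 'end_sec', as returned by the search step."""
    root = ET.Element("timeline", name="auto-generated-highlights")
    for i, clip in enumerate(clips):
        ET.SubElement(
            root,
            "clip",
            id=str(i),
            source=clip["s3_uri"],  # the editor resolves this back to full-res MAM footage
            start=f"{clip['start_sec']:.2f}",
            end=f"{clip['end_sec']:.2f}",
        )
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)


# Example usage with one hypothetical retrieved segment.
clips_to_timeline_xml([
    {"s3_uri": "s3://mlse-proxies/raptors/2024-10-25-game.mp4",
     "start_sec": 312.0, "end_sec": 324.0},
])
```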

This reference architecture shows how MLSE's content is uploaded through Amazon S3 and leverages Amazon Bedrock to automatically generate video clips and assets based on prompts.

Partners in innovation

Encouraged by progress with the new video editor, MLSE is already considering how else to incorporate generative AI across the organization. Potential applications include scouting, contracts review, and coaching assistance through computer vision.

“After seeing the power of generative AI, we’re eager to explore other use cases that drive business value through scaling fan engagement and operational excellence. AWS has been instrumental in helping us navigate where this new technology can solve specific challenges and we look forward to getting the next initiatives into production,” concluded Sumit Arora, Vice President, Strategy & Analytics • Technology & Digital Office at MLSE.

MLSE selected AWS as its official cloud provider, including for AI and machine learning (ML) services, in 2022. Together, the companies have continued to deliver exceptional moments for fans of Toronto’s premier teams. Learn more about building scalable generative AI applications on AWS.

Ari Entin

Ari Entin is Head, Sports & Entertainment Marketing at AWS, based in Silicon Valley. He joined Amazon in 2021 from Facebook where he led AI communications and marketing. He has driven integrated media campaigns for top-tier consumer electronics, sports and entertainment, and technology companies for decades.