
Generating insights from Zoom Meetings with AWS


Today, you can use AWS machine learning services to get real-time insights from your calls. Machine learning-generated insights include live transcription that removes the need for manual note-taking, user sentiment and phrase detection that drive in-call guidance, and conversation characteristics such as talk speed and non-talk time for in-call coaching.

Organizations use these insights to help improve their customer experience, productivity, and business outcomes, and in some cases, for regulatory compliance. This is why many organizations want to capture insights from all their calls, and with the elasticity of AWS, they can.

There are several existing AWS products and services for generating call insights. Which one you use depends on your use case. For example, Contact Lens is a turnkey solution for Amazon Connect customers; Amazon Transcribe Call Analytics provides batch and streaming APIs for processing audio; and the Amazon Chime SDK now provides call analytics for telephone calls.

Zoom Meetings insights with AWS machine learning

What about when a call is not a phone call but a Zoom Meeting? To close that gap, AWS and Zoom are partnering to provide AWS machine learning insights for your Zoom Meetings. AWS will give organizations the same insights across their phone conversations and Zoom Meetings. Using Amazon Chime SDK call analytics, they can use the same data lake, queries, and dashboards for comparable insights across the two call types.

“We are excited to be working with AWS to provide our customers with more choices,” said Brendan Ittelson, CTO at Zoom. “Now, our mutual customers will have the choice of generating their own analytics, including voice tone analysis, phrase detection, and call categorization, using AWS.”

Like a phone call, a Zoom Meeting offers a convenient way to connect with anyone worldwide. But unlike a phone call, a Zoom Meeting provides users with video and screen sharing from the convenience of their preferred device.

For example, a mortgage broker might have a choice between scheduling a Zoom Meeting or a phone call with a mortgage client. The Zoom Meeting’s video and screen share allow the broker to review the client’s identification documents, inspect the supporting documentation, and guide form completion – just as they would at an in-person appointment. In contrast, a phone call requires the client to digitize all documents and send them asynchronously via email or upload them to a portal.

“AWS is partnering with Zoom so our customers can analyze their Zoom Meetings and generate insights with AWS machine learning services,” said Sid Rao, GM of Amazon Chime SDK at AWS. “Organizations will be able to improve productivity, increase customer satisfaction, and better understand their business with the same underlying AWS machine learning technology services they use and trust today.”

Audio from Zoom Meetings and phone calls flows into Amazon Chime SDK call analytics to deliver consistent alerts, insights, and metadata

Figure 1: Overview of call analytics for Zoom Meetings and phone calls

Amazon Chime SDK call analytics provides insights using built-in voice analytics and integrations with Amazon Transcribe Call Analytics. The insights generated include voice tone analysis, speaker sentiment, call categorization, transcription, and post-call summaries. Insights can be made immediately actionable through real-time alerts triggered by factors such as a specific phrase being spoken or a negative voice tone being detected.
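As a sketch of how a phrase-based alert might be evaluated on the consuming side, the following checks one transcript segment against a set of alert phrases. The event shape (a dict with a `Transcript` key) and the phrase list are illustrative assumptions for this example, not the exact Amazon Transcribe Call Analytics payload.

```python
# Hypothetical sketch: detecting alert phrases in a transcript segment.
# The event shape and phrase list below are assumptions for illustration,
# not the exact Amazon Transcribe Call Analytics event schema.

ALERT_PHRASES = {"cancel my account", "speak to a manager"}

def phrase_alerts(transcript_event: dict, phrases=ALERT_PHRASES) -> list:
    """Return the alert phrases found in one transcript segment,
    matched case-insensitively."""
    text = transcript_event.get("Transcript", "").lower()
    return sorted(p for p in phrases if p in text)
```

A matched phrase could then trigger whatever notification path your application uses, such as a message to a supervisor dashboard.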

Call metadata and insights are delivered in a format readily ingestible by most data lakes. Once ingested, insights are available for further analysis incorporating other user information, such as employee tenure or customer loyalty tier. You can also store a recording of the audio call in an Amazon S3 bucket for traceability of insights back to the source audio or any other purpose.
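To illustrate the ingestion step, here is a minimal sketch that flattens one call-insights JSON record into per-speaker rows suitable for a columnar data lake table. The record schema used here (`CallId`, `Segments`, and their fields) is a hypothetical stand-in, not the documented call analytics output format.

```python
import json

# Sketch of flattening a call-insights record for a data lake table.
# The record schema (CallId, Segments, Speaker, Sentiment, StartMs) is a
# hypothetical stand-in for the actual call analytics output format.

def flatten_call_record(raw: str) -> list:
    """Parse one call-insights JSON record and emit per-speaker rows."""
    record = json.loads(raw)
    call_id = record["CallId"]
    return [
        {
            "call_id": call_id,
            "speaker": seg["Speaker"],
            "sentiment": seg["Sentiment"],
            "start_ms": seg["StartMs"],
        }
        for seg in record.get("Segments", [])
    ]
```

Rows in this shape can be joined against other business data, such as employee tenure or customer loyalty tier, once loaded into the lake.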

DIY Zoom Meetings analytics

If you are an AWS builder, you can get started today by using the AWS Virtual Participant Framework to build a connection from your Zoom Meetings to AWS and retrieve the audio for analysis.

The framework includes sample integrations developed by AWS Prototyping and Cloud Engineering (PACE) and Solutions Architects. The Virtual Participant Orchestrator for Zoom Meeting sample includes a Docker container image and AWS CloudFormation templates for a solution that captures audio from a Zoom Meeting and publishes it to Amazon Kinesis Video Streams (KVS). Once the audio is in a KVS stream, your application can consume it and route it for analysis to AWS machine learning services such as Amazon Transcribe Call Analytics.
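As a rough sketch of the consumption step, the following resolves a KVS stream's `GET_MEDIA` endpoint with boto3 and opens a media stream starting at the newest fragment. The client-injection hooks are there so the flow can be exercised without AWS credentials; the Virtual Participant Framework samples show the complete, supported pipeline, including MKV fragment handling.

```python
# Sketch of consuming Zoom Meeting audio from a KVS stream with boto3.
# Stream naming and downstream fragment handling are assumptions here;
# see the Virtual Participant Framework samples for the full pipeline.

def open_media_stream(stream_name, region="us-east-1",
                      kvs_client=None, media_client_factory=None):
    """Resolve the GET_MEDIA endpoint for a KVS stream and return the
    streaming payload starting at the newest fragment.

    Clients are injectable so the flow can be tested without AWS
    credentials; by default, real boto3 clients are created.
    """
    if kvs_client is None or media_client_factory is None:
        import boto3  # deferred so stubs can be used without boto3 installed
        kvs_client = kvs_client or boto3.client(
            "kinesisvideo", region_name=region)
        media_client_factory = media_client_factory or (
            lambda ep: boto3.client("kinesis-video-media",
                                    endpoint_url=ep, region_name=region))
    # GET_MEDIA requests must go to a stream-specific data endpoint.
    endpoint = kvs_client.get_data_endpoint(
        StreamName=stream_name, APIName="GET_MEDIA")["DataEndpoint"]
    media = media_client_factory(endpoint)
    resp = media.get_media(
        StreamName=stream_name,
        StartSelector={"StartSelectorType": "NOW"})
    return resp["Payload"]  # streaming body of MKV fragments
```

The returned payload can then be read in chunks and forwarded to a streaming transcription client for analysis.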

The DIY approach lets AWS builders get started today and puts them in control of which Zoom Meetings to analyze and which insights to generate. In the meantime, AWS and Zoom are working together to make AWS insights from Zoom Meetings available to everyone with just a few clicks.

Learn More

To learn more about insights for phone calls, review the following resources:

To learn more about the AWS Virtual Participant Framework, review the following resources: