AWS for M&E Blog
Videon uses Amazon IVS to simplify live streaming
Supporting faster, simpler streaming using on-premises edge compute alongside Amazon IVS
Videon Labs (Videon), having been in the video industry for over 20 years and producing live streaming products for more than five years, is by no means new to low-latency live streaming. Videon built one of the first qualified ultra-low-latency encoders to support AWS Elemental MediaStore, an Amazon Web Services (AWS) storage service optimized for media, and AWS Elemental MediaLive, a broadcast-grade live video processing service. After the release of Amazon Interactive Video Service (Amazon IVS), a managed live streaming solution that is quick and easy to set up and ideal for creating interactive video experiences, Videon created a native API integration with Amazon IVS. The integration applies Videon’s custom, Qualcomm-powered edge computing to contribute live video streams into Amazon IVS and deliver a differentiated user experience for customers. Paul Brown, Videon’s cofounder and chief technology officer, says it best: “Videon has done the work for you by using the Amazon IVS SDK and creating an out-of-the-box solution where any user can get a high-quality, low-latency live stream running quickly and easily without requiring deep technical knowledge.”
Amazon IVS is a great option for low-latency, interactive live streaming experiences. Built on the same video infrastructure that supports Twitch, Amazon IVS offers a fully managed solution for developers looking to add live streaming to their offerings. Because Amazon IVS provides a complete solution for live video workflows, along with API-driven tools and video player software development kits (SDKs), builders can focus on innovating. Additionally, customers can use Amazon IVS alongside Videon’s EdgeCaster with ease.
Solving common live stream challenges using Videon’s EdgeCaster
Some of the most common challenges we hear from customers about live streaming concern the contribution of video and audio: What technologies and devices are best suited to stream video? How can one stream to multiple destinations simultaneously? What if the live stream requires low latency?
Videon created a simple way to ingest a live video feed from on-premises hardware with EdgeCaster and deliver low-latency live video using Amazon IVS.
Using Amazon IVS with EdgeCaster
Videon’s devices perform many functions beyond simply encoding. The goal for using EdgeCaster with Amazon IVS was to provide a simple setup and user experience. Using the Amazon IVS suite of APIs and SDKs, Videon created a software interface to streamline the process for customers. Here’s how it works:
API integration between EdgeCaster and Amazon IVS makes it simple to set up and deliver low-latency, interactive live streaming workflows. On Videon devices, users enter their AWS access key and secret access key so that the device can access the Amazon IVS service in their account.
After a customer authenticates their AWS account on the Videon device, the user either selects an existing Amazon IVS channel or creates a new one. The benefit of simply choosing an existing channel is that it eliminates the need to locate the ingest URL, stream key, or any other information required to ingest the video feed into Amazon IVS. Videon’s API integration simplifies the process to a couple of clicks without having to switch between Amazon IVS and the EdgeCaster stream configuration dashboard.
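For readers curious what this looks like at the API level, the following is a minimal sketch, using the AWS SDK for Python (boto3), of the Amazon IVS calls involved in selecting an existing channel and retrieving its ingest details. The credential placeholders, Region, and channel index are hypothetical; this illustrates the underlying service APIs rather than Videon’s actual implementation.

```python
import boto3

# Hypothetical credentials as entered on the device; in practice these come
# from the user, not from source code.
ivs = boto3.client(
    "ivs",
    region_name="us-west-2",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# List the Amazon IVS channels in the account so the user can pick one.
channels = ivs.list_channels()["channels"]
channel_arn = channels[0]["arn"]  # the channel selected in the device UI

# Look up the ingest endpoint and a stream key for the selected channel.
channel = ivs.get_channel(arn=channel_arn)["channel"]
stream_key_arn = ivs.list_stream_keys(channelArn=channel_arn)["streamKeys"][0]["arn"]
stream_key = ivs.get_stream_key(arn=stream_key_arn)["streamKey"]["value"]

# The encoder then contributes to the channel's RTMPS ingest endpoint.
ingest_url = f"rtmps://{channel['ingestEndpoint']}:443/app/"
print(ingest_url, stream_key)
```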
Additionally, using Videon’s API integration with Amazon IVS, users can create new channels from Videon’s control interface, choosing settings such as name, channel type, and latency mode, as applicable.
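Creating a channel maps to a single Amazon IVS API call. Here is a comparable boto3 sketch; the channel name and settings shown are illustrative defaults, not Videon’s.

```python
import boto3

ivs = boto3.client("ivs", region_name="us-west-2")  # assumes credentials are already configured

# Create a new channel with the settings exposed in the control interface.
response = ivs.create_channel(
    name="edgecaster-demo",  # hypothetical channel name
    type="STANDARD",         # channel type, e.g., BASIC or STANDARD
    latencyMode="LOW",       # latency mode: LOW or NORMAL
)

# The response includes both the new channel and a stream key, so the
# device can start contributing immediately.
channel = response["channel"]
stream_key = response["streamKey"]["value"]
print(channel["ingestEndpoint"], channel["playbackUrl"], stream_key)
```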
Unlocking additional live stream use cases
An example of the combined power of EdgeCaster alongside Amazon IVS is the solution created by Jeroen Kosterman, founder of LiveConnect. This solution facilitates high-quality, low-latency live medical streaming of endoscopic procedures and other medical investigations. LiveConnect offers a simple-to-use portal where the medical live streams are hosted with live audience feedback through chat and moderation.
The combination of quality and low latency helps medical professionals remotely connect to the live session and offer advice and opinions using the Amazon IVS live chat feature embedded in the stream portal. During medical procedures, every detail and every moment counts, meaning that the solution has to work reliably and maintain consistent performance for remote viewers.
After comparing options, LiveConnect determined that EdgeCaster and Amazon IVS offer the best results. Using EdgeCaster and Amazon IVS, LiveConnect also offers point-to-point and conference solutions for medical professionals. When discussing LiveConnect solutions, Kosterman made clear that testing was critical to high-quality, reliable performance: “We did several tests . . . with many doctors all over the world. [We included] Australia, India, Spain and it worked flawlessly. [We] never had a complaint about stuttering or low quality.”
Videon’s EdgeCaster also offers multiple Real-Time Messaging Protocol (RTMP) outputs and multiple-bitrate capabilities so that customers can stream to multiple destinations at once, all from the same streaming device. For example, users can live stream to Amazon IVS, YouTube, and Facebook simultaneously, with each stream properly configured to the requirements of its destination. This feature helps users deliver a live stream experience natively on their own website or application, while also extending the reach of content onto third-party services without having to set up multiple live stream feeds.
About Videon
Videon Labs (Videon), a leader in edge computing for video, makes live video processing and distribution faster, more efficient, and less costly. It gives customers the freedom to process video at the source by combining built-in functions running on its local video compute environment with additional features from the cloud. Videon provides tools to develop innovative video applications that handle anything from simple, low-latency encoding and streaming to advanced artificial intelligence–powered use cases.