AWS for Industries
Food delivery with Tiny Mile on AWS Wavelength
Tiny Mile Uses AWS Wavelength to Make Deliveries Faster and at Lower Cost
Little pink Tiny Mile robots in Toronto, Canada are leveraging AWS Wavelength on Bell Canada’s 5G network to make deliveries safer, faster, and more cost effective.
Tiny Mile delivers food, snacks, and other items to customers while making sure that speed and security standards are met. The robots are small, four-wheeled, and travel at walking speed. They can carry six kilograms (about 13 pounds) of goods. Tiny Mile’s API plugs right into a restaurant’s online ordering system, offering robot delivery as an option at checkout. When an order is placed, Tiny Mile dispatches a robot from a central facility to the restaurant to pick up the order and deliver it to the consumer.

Tiny Mile has a long-term goal of achieving fully autonomous operation of its robot fleets. Obstacle detection and collision avoidance are critical capabilities for such an operation, and the following figure shows the architecture that helps Tiny Mile robots implement them. Although these capabilities can be implemented using powerful onboard GPU-based hardware, the complexity and cost of robot maintenance and operations would be high. The memory and processors would also drain the batteries quickly, requiring frequent charging that takes robots off the road during peak hours. Therefore, Tiny Mile turned to AWS Wavelength on Bell Canada’s 5G network to offload its heavy computing needs to the cloud.
Bell Canada is the first Canadian network operator to launch multi-access edge computing (MEC) services using AWS Wavelength. Building on Bell’s agreement with AWS announced last year, the two companies are deploying AWS Wavelength Zones at the edge of Bell’s 5G network, starting in Toronto. AWS Wavelength embeds AWS compute and storage at the edge of communications service providers’ (CSPs) 5G networks to bring access closer to the end user or device, thereby lowering latency and increasing performance for services such as real-time visual data processing, augmented/virtual reality (AR/VR), artificial intelligence and machine learning (AI/ML), and advanced robotics.
Tiny Mile uses real-time video analytics applications running on AWS Wavelength to detect obstacles and automatically issue a stop command to avoid collisions. Safety is the top priority for Tiny Mile, and 5G edge computing with AWS Wavelength helps Tiny Mile raise the bar for safe operations. In addition to the existing pilot supervisor and onboard sensors, this solution provides a reliable mechanism to avoid collisions. Using AWS Wavelength, Tiny Mile was able to stream its robot cameras to the AWS Wavelength Zone via Bell’s 5G network and run ML inference models. The models detect potential obstacles and issue a stop command if an object comes too close. The low latency of 5G paired with edge computing on AWS Wavelength lets Tiny Mile’s robots respond quickly and operate more safely. The back-end technology provided by AWS and Bell Canada allowed Tiny Mile to run real-time ML on a robot’s vision data, avoid collisions, and demonstrate the ability to stop the robot at least four feet away from a detected obstacle.
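As an illustrative sketch only (not Tiny Mile’s actual code), the edge-side stop decision can be reduced to comparing the estimated distance of each detected obstacle against a safety threshold. All names and the distance estimate below are assumptions for illustration:

```python
# Hypothetical sketch of the stop decision made in the Wavelength Zone.
# Detection fields and the fixed threshold are illustrative assumptions.
from dataclasses import dataclass

STOP_DISTANCE_FT = 4.0  # the post cites stopping at least four feet away


@dataclass
class Detection:
    label: str          # object class returned by the vision model
    distance_ft: float  # estimated distance from the robot to the object


def should_stop(detections: list[Detection]) -> bool:
    """Return True if any detected obstacle is within the safety threshold."""
    return any(d.distance_ft <= STOP_DISTANCE_FT for d in detections)


# Example: a pedestrian estimated 3.2 ft ahead triggers a stop command.
frame = [Detection("pedestrian", 3.2), Detection("car", 40.0)]
print(should_stop(frame))  # True
```

In the real system this decision runs against live inference output on each video frame; the sketch only shows the shape of the final threshold check.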
This post describes the technology components and the data flow between Tiny Mile robots and the back-office operators.
Figure 1. Near-real time video processing Architecture using AWS Wavelength
Solution: In this section, we cover the steps involved in the Tiny Mile robot use case:
- A continuous video stream from a robot connected to Bell’s 5G network is streamed into the AWS Wavelength Zone in Toronto. The video stream is funneled through the ANT Media Server running on Amazon EC2 G4dn instances, whose GPUs accelerate video encoding.
- A custom video processing frame listener plugin is registered (refer to the following figure) with the ANT Media Server pipeline, and processes the video streams in near real-time by performing ML inference to detect objects.
Figure 2. Custom frame listener to process video frames
- To reduce inference latency, the ML models are deployed on the EC2 G4dn instances in the AWS Wavelength Zone. Tiny Mile’s engineering team used the Deep Java Library (DJL) to host the ML models and run computer vision inference in near real-time.
- A custom algorithm predicts imminent collisions based on the inference results and robot attributes. The frame listener then sends an MQTT message to the AWS IoT message broker.
- The AWS IoT message broker relays the MQTT messages to the robot subscribed to the AWS IoT topic via the carrier gateway, and the robot stops moving to avoid a collision.
- Robot operators can also view the high-definition robot video streams (and optionally command and control the robots) in a browser over a low-latency WebRTC connection established with the ANT Media Server.
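To make the collision-prediction and messaging steps above concrete, here is a minimal sketch of what a frame listener could do once inference results arrive. The prediction heuristic, topic layout, and payload shape are all assumptions for illustration; the real system publishes through the AWS IoT message broker rather than printing:

```python
# Illustrative sketch of the collision-prediction and stop-command steps.
# Heuristic, topic name, and JSON payload are hypothetical, not Tiny Mile's.
import json
import time


def predict_collision(obstacle_distance_m: float,
                      robot_speed_mps: float,
                      min_gap_m: float = 1.2,
                      horizon_s: float = 2.0) -> bool:
    """Flag a collision if the robot would close the gap within the horizon."""
    if obstacle_distance_m <= min_gap_m:
        return True
    time_to_contact = (obstacle_distance_m - min_gap_m) / max(robot_speed_mps, 1e-6)
    return time_to_contact <= horizon_s


def build_stop_message(robot_id: str) -> tuple[str, str]:
    """Build the (topic, payload) pair a frame listener could publish."""
    topic = f"robots/{robot_id}/commands"  # hypothetical topic layout
    payload = json.dumps({"command": "STOP", "issued_at": time.time()})
    return topic, payload


# A robot moving at walking speed (1.4 m/s) with an obstacle 2 m ahead
# is inside the two-second horizon, so a stop message is constructed.
if predict_collision(obstacle_distance_m=2.0, robot_speed_mps=1.4):
    topic, payload = build_stop_message("tm-042")
    print(topic, payload)
```

The robot subscribed to its command topic receives the message via the carrier gateway and halts, as described in the steps above.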
Figure 3. Object detection on the robot operator dashboard
The Tiny Mile use case is one of many business solutions that require low-latency compute capabilities. Builders developing applications across industries that touch the Internet of Things (IoT), connected vehicles, immersive experiences, media content production, transcoding and distribution, security, robotics, or any other workload that requires near-real-time processing can use AWS Wavelength to place that processing close to where data originates or is consumed.
To get started with AWS Wavelength, review the steps here.