How EyeGage scaled their life-saving app via an Impact Accelerator partnership


To celebrate Black History Month, AWS Startups is featuring posts throughout February highlighting the contributions of Black builders and leaders in tech. Above all, these individuals inspire, empower, and encourage others—especially those historically underrepresented in tech—to prove what's possible.

Dr. LaVonda Brown, founder and CEO of EyeGage, is an expert in artificial intelligence (AI) and eye-analysis technologies. She developed a robotic engagement model based on user eye gaze and pupil size (dilation and constriction).

This published, patented, and licensed model has been applied to several use cases, including improving outcomes in math education and physical therapy. Her work has also led to the use of eye tracking as a viable biomarker for mild cognitive impairment and early-onset Alzheimer’s disease.

LaVonda used her expertise to launch EyeGage, a mobile app that applies eye-analysis techniques to evaluate whether individuals are under the influence of drugs or alcohol, helping to prevent fatal accidents.

EyeGage planned to go to market with their mobile application in December 2022, but first they had to deliver on a key requirement: connecting the frontend app to Amazon SageMaker, the backend cloud service that enables developers to build, train, and deploy machine learning models.

“Our prototype worked well with Amazon Web Services (AWS), and our model was already trained,” says LaVonda. “But we needed to expose the model to SageMaker to ensure we could scale our services as user activity spikes.” Without prior experience with SageMaker, however, working through the documentation proved difficult for the EyeGage team. That’s when LaVonda turned to AWS for help.

AWS facilitates partnerships via Impact Accelerator program

Through her participation in the Black Founders cohort of the AWS Impact Accelerator program, LaVonda gained access to personalized coaching, capital, and technical solutions. The program connected EyeGage with Avahi, a cloud-first consulting company and AWS Global Startup Program partner.

“Avahi impressed us with their knowledge about machine learning models and their understanding of our business,” says LaVonda. “More importantly, they presented previous SageMaker projects they had taken on that were similar to what we needed. That gave us confidence Avahi could do the job.”

Avahi deployed the machine learning model on SageMaker and exposed it as a seamless API to end users of the EyeGage mobile app. This included updating the model code so it could be invoked as event-driven machine learning inference. Avahi also fronted the model with additional services such as AWS Lambda for serverless computing and Amazon API Gateway, a managed service that simplifies creating and maintaining APIs.
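The pattern described here, API Gateway in front of a Lambda function that calls the SageMaker Runtime API, is a common way to expose a deployed model as an API on AWS. The sketch below illustrates that flow in Python; the endpoint name, environment variable, and request shape are hypothetical placeholders rather than EyeGage’s actual implementation.

```python
# A minimal sketch of the integration pattern: an AWS Lambda function behind
# Amazon API Gateway that forwards an inference request to a SageMaker
# endpoint. Names and payload fields are illustrative, not EyeGage's code.
import os

import boto3

# SageMaker Runtime is the API used to invoke a deployed model endpoint.
sagemaker_runtime = boto3.client("sagemaker-runtime")

# Hypothetical endpoint name, supplied through the Lambda environment.
ENDPOINT_NAME = os.environ.get("SAGEMAKER_ENDPOINT_NAME", "eye-scan-model")


def handler(event, context):
    """Handle an API Gateway proxy event and return the model's inference."""
    # API Gateway (proxy integration) delivers the request body as a string.
    payload = event.get("body") or "{}"

    response = sagemaker_runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=payload,
    )

    # The prediction comes back as a streaming body; read it and relay it.
    result = response["Body"].read().decode("utf-8")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": result,
    }
```

In this arrangement, API Gateway handles the HTTP interface while the Lambda function stays stateless, so request spikes from the mobile app can be absorbed without managing servers.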

Scaling to save lives

With the machine learning model exposed as an API, the frontend app could better scale to provide contactless, non-invasive, objective, unbiased, secure, accurate, and quick drug screening results.
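The kind of scaling LaVonda describes is typically handled by SageMaker endpoint auto scaling, which adjusts the number of instances behind an endpoint as invocation volume changes. The sketch below shows one way to register such a policy with Application Auto Scaling; the endpoint name, variant name, and capacity limits are assumptions for illustration, not EyeGage’s configuration.

```python
# A minimal sketch of SageMaker endpoint auto scaling via Application Auto
# Scaling. Endpoint/variant names and capacity limits are assumed values.
import boto3

autoscaling = boto3.client("application-autoscaling")

# Hypothetical endpoint and production variant, for illustration only.
resource_id = "endpoint/eye-scan-model/variant/AllTraffic"

# Register the variant's instance count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    MinCapacity=1,
    MaxCapacity=4,
)

# Scale on invocations per instance so capacity follows user activity spikes.
autoscaling.put_scaling_policy(
    PolicyName="eye-scan-invocations-scaling",
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 100.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance"
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 60,
    },
)
```

With a target-tracking policy like this, capacity scales out during activity spikes and scales back in when traffic drops, keeping screening latency predictable without over-provisioning.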

Through the partnership, Avahi also helped EyeGage streamline the collection of end user data, such as identity and location, which allows the app to compare current and past scanning results and provide users with additional valuable information about their condition.

“Avahi also helped encode the AWS backend to receive JSON web tokens,” adds LaVonda. “This gives us a more secure way to send data back and forth, which is critical, given the sensitivity of the information we process for our customers.”
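“Receiving JSON web tokens” generally means the backend verifies each request’s token (its signature, expiry, and claims) before processing it. The sketch below shows one common way to do that in Python with the PyJWT library; the signing key, audience, and issuer values are hypothetical, and the post does not describe how EyeGage actually issues or validates its tokens.

```python
# A minimal sketch of validating a JSON web token on the backend before
# accepting a request, using the PyJWT library. Key, issuer, and audience
# values are placeholders, not EyeGage's actual token setup.
import jwt  # PyJWT
from jwt import InvalidTokenError

# Hypothetical shared signing key and expected claims, for illustration only.
SIGNING_KEY = "replace-with-a-real-secret-or-public-key"
EXPECTED_AUDIENCE = "eyegage-mobile-app"
EXPECTED_ISSUER = "https://auth.example.com/"


def verify_token(auth_header: str) -> dict:
    """Return the token's claims if it is valid, otherwise raise ValueError."""
    if not auth_header.startswith("Bearer "):
        raise ValueError("Missing bearer token")

    token = auth_header.split(" ", 1)[1]
    try:
        # Signature, expiry, audience, and issuer are all checked here.
        claims = jwt.decode(
            token,
            SIGNING_KEY,
            algorithms=["HS256"],
            audience=EXPECTED_AUDIENCE,
            issuer=EXPECTED_ISSUER,
        )
    except InvalidTokenError as exc:
        raise ValueError(f"Invalid token: {exc}") from exc

    return claims
```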

What’s next for EyeGage?

EyeGage is actively researching and updating its mobile application with features that improve individual and community safety and help decrease accidents. New application features, including Should I Drive? and FriendGage, promote easy, accessible ways to understand impairment levels and stay accountable.

The company’s dataset may also have uses beyond its immediate purpose of detecting substances in the body. “You can identify someone by their eyes or diagnose illnesses, concussions, or diabetes. Or, you can tell something like if you’ve had caffeine, depending on how it responds to light,” LaVonda adds. “Monitoring eye behavior can be used for so much.”

Explore more content that celebrates the achievements of Black innovators.

Bonnie McClure

Bonnie is an editor specializing in creating accessible, engaging content for all audiences and platforms. She is dedicated to delivering comprehensive editorial guidance to provide a seamless user experience. When she's not advocating for the Oxford comma, you can find her spending time with her two large dogs, practicing her sewing skills, or testing out new recipes in the kitchen.
