AWS Cloud Enterprise Strategy Blog

ML in Society: An Interview with Michelle Lee, VP of Amazon Machine Learning Solutions Labs

In my recent blog post I shared a perspective on how to start exploring artificial intelligence (AI) and its subset, machine learning (ML), as business tools. I wrote it as a technology practitioner but still a student of ML. At the other end of the experience continuum, I have the privilege of working with some of the world’s leading experts on these topics. On a course in 2019, proud of everything I’d accomplished during my career, I was humbled to sit between two new Amazon Web Services VPs, one tasked with launching thousands of low-Earth orbit satellites, the other with establishing the Amazon Machine Learning Solutions Lab. The latter, Michelle Lee, has become a regular fixture at AWS advisory councils.

In this two-part blog post, Michelle and I discuss her insights about adopting ML in businesses. In this first part we talk about the impact of ML on society and businesses in general.

Phil


Michelle, AI is widely discussed with views ranging from the promise of great changes to society to doom and gloom articles on job losses. What’s your view on where AI is in terms of maturity?

While I understand and appreciate the concerns about technologies like ML (indeed, leaders at Amazon spend a lot of time considering these issues), I’ve also seen the enormous potential they have to benefit every industry, business, and aspect of our society. For instance, life sciences companies are using ML to accelerate the pace of research into cures for deadly diseases. Last year, data scientists from my organization worked with AstraZeneca to develop computer vision models that accelerate the classification of tissue samples by pathologists, cutting the time taken by 50%. This meant research proceeded faster, and the scarce skills of pathologists could be focused on areas of true differentiation and need.

When I talk to executives, they are no longer asking me why they should do machine learning, but rather how they should approach it. And the good news is that it has never been easier for companies to leverage ML. Previously difficult or insurmountable barriers to entry have been substantially removed. No longer do companies need to make massive capital investments in high-performance computing infrastructure or spend years building technical expertise in ML. They can get started right away for minimal cost. And at AWS, we believe we have an important role to play in helping people take advantage of these new opportunities. My team is part of AWS’s commitment to democratize ML by making it easy for anybody to use. As we continue to remove the barriers to adoption, we expect more customers, from an even broader range of industries, to start using ML in different areas of their business, and we expect adoption to continue to accelerate.

Can you give some examples of how AWS is going about this democratization?

First, we’re focused on removing the undifferentiated heavy lifting that comes with machine learning. We’re proud that AWS has the broadest and deepest set of ML capabilities available to builders of all levels of expertise. At AWS, we think about ML as a stack of three layers, each appealing to a different audience. The bottom layer is the most sophisticated and is designed for use by experts; it provides high-performance computing infrastructure and optimized programming frameworks that enable the development of novel algorithms. The second layer, the ML Platform Services, allows companies to take advantage of ML in an expansive way. Anchored by our Amazon SageMaker service, the features of this layer allow data scientists and developers to build, train, debug, deploy, and monitor ML models, run experiments, and label data at scale. For those who don’t have ML expertise, the top layer, the AI Services layer, supports computer vision, optical character recognition, natural language processing, conversational chatbots, transcription, translation, personalization, fraud detection, and forecasting. Customers can access these amazing deep learning–powered capabilities without having to hire a team of ML experts or attempt to build these functionalities themselves. For example, marketing leaders can use the Amazon Personalize service to tailor their campaigns to specific customers and increase engagement, all through a relatively simple API. Customers do amazing things at all three layers of the stack.
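To make the “relatively simple API” point concrete, here is a minimal sketch of requesting real-time recommendations from Amazon Personalize using the AWS SDK for Python (boto3). The region, campaign ARN, and user ID are placeholder assumptions; a real call requires a campaign you have already trained and deployed in your own account.

```python
import boto3

# Runtime client for Amazon Personalize (real-time recommendations).
personalize_runtime = boto3.client("personalize-runtime", region_name="us-east-1")

# Placeholder ARN -- substitute the ARN of a campaign deployed in your account.
CAMPAIGN_ARN = "arn:aws:personalize:us-east-1:123456789012:campaign/example-campaign"


def recommend_items(user_id: str, num_results: int = 10) -> list[str]:
    """Return the item IDs Personalize recommends for the given user."""
    response = personalize_runtime.get_recommendations(
        campaignArn=CAMPAIGN_ARN,
        userId=user_id,
        numResults=num_results,
    )
    return [item["itemId"] for item in response["itemList"]]


if __name__ == "__main__":
    # Example: fetch recommendations for a hypothetical user ID.
    print(recommend_items("user-42"))
```

The heavy lifting (model training, feature engineering, and hosting) happens inside the managed service; the application code is reduced to a single API call.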

To help customers gain advantage from these capabilities, we have programs like Machine Learning University, which offers a rich curriculum for both business and technology professionals to gain experience and expertise. For those who want a more intimate learning experience, we have our Embark program, which includes intensive online or onsite training led by AWS ML experts. For those who prefer an online learning format, we’ve partnered with Udacity, Coursera, and edX to offer an ML curriculum. Products like AWS DeepRacer, AWS DeepLens, and AWS DeepComposer also make learning a fun experience.

We also have our Amazon ML Solutions Lab program, in which AWS-trained data scientists and ML experts work directly with customers to build working ML solutions for use cases identified by the customer. We’ve got our customers covered on their ML journey, from having the best set of tools and technology, to educational curriculum, to hands-on expertise.

Finally, we think it’s vital to make sure that public high schools are doing a great job of building the next generation of scientists and engineers, and that this effort extends well beyond high school. To help children build successful careers in computer science, we work to both inspire interest in and increase access to Science, Technology, Engineering, and Math (STEM) and computer science education. We’re committed to training the future generation of engineers through our participation in STEM and computer science education programs, in part via an organization called Code.org. We also offer an initiative called Amazon Future Engineer – a four-part, childhood-to-career program aimed at inspiring and educating 10 million students from underrepresented and underserved communities each year to try computer science and coding. Additionally, AWS Educate is Amazon’s global initiative to provide students and educators with the resources needed to accelerate cloud-related learning.

Employee experience is a critical component of digital transformation and the future of work. How can AI/ML be most effectively deployed in service of those goals?

ML is playing an important role in employee experience, as it lends itself to really innovative solutions to long-standing challenges. Take knowledge sharing, for example. Most companies have a decade or more of investments in internal wikis, content management platforms, and shared file stores: massive sets of data spread across systems in which information is often hard to access or find. We have all had the daunting experience of being a new employee and needing to figure out which website or app to turn to for important information, like how to add a newborn baby to your healthcare coverage. Chances are, some thoughtful person has written this down somewhere, but it can be hard to surface that information. ML provides an elegant solution to this problem through natural language queries. Employees can ask questions like “How do I add my newborn son to my insurance coverage?” and ML-powered search retrieves the most relevant content. This is what our enterprise search service, Amazon Kendra, does. Now imagine building a chatbot interface on top of this. It would further accelerate finding the right information through a conversational interface, without having to understand the underlying document repositories or know which keywords to use. Or imagine being able to host a video conference with a global team, with real-time translation into the native languages of all the attendees. There are so many innovative ways to enhance the employee experience through ML.
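As an illustration of the natural language query pattern described above, here is a minimal sketch of querying an Amazon Kendra index with boto3. The index ID and region are placeholder assumptions; in practice you would point it at an index you have already created and populated with your internal documents.

```python
import boto3

# Client for the Amazon Kendra query API.
kendra = boto3.client("kendra", region_name="us-east-1")

# Placeholder index ID -- substitute the ID of an index in your own account.
INDEX_ID = "00000000-0000-0000-0000-000000000000"


def ask(question: str, max_results: int = 3) -> None:
    """Run a natural language query against the index and print the top results."""
    response = kendra.query(IndexId=INDEX_ID, QueryText=question)
    for item in response["ResultItems"][:max_results]:
        title = item.get("DocumentTitle", {}).get("Text", "(untitled)")
        excerpt = item.get("DocumentExcerpt", {}).get("Text", "")
        print(f"{title}\n  {excerpt}\n")


if __name__ == "__main__":
    ask("How do I add my newborn son to my insurance coverage?")
```

A chatbot front end could sit on top of the same query call to provide the conversational experience described above, without employees needing to know which repository holds the answer.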

I’m optimistic about the future: we firmly believe this innovative technology will create an entirely new set of opportunities, just as other disruptive tools have throughout history.

 


About our Guest

Michelle Lee, Vice President of the Amazon Machine Learning Solutions Lab at Amazon Web Services

Prior to joining Amazon in 2019, Lee was an MIT CSAIL (formerly the MIT Artificial Intelligence Lab) computer scientist, a tech executive who helped build a company that, like Amazon, grew quickly into a multinational corporation, a professor, and a public servant. In the latter role, she served as the Under Secretary of Commerce for Intellectual Property and Director of the United States Patent and Trademark Office (USPTO) from 2014 to 2017. In this position, Lee was the chief executive officer of one of the largest intellectual property offices in the world, with around 13,000 employees and an annual budget in excess of $3 billion, and served as the principal advisor to the president on intellectual property policies. She is the first woman to hold this position in American history.

Phil Le-Brun

Phil Le-Brun is an Enterprise Strategist and Evangelist at Amazon Web Services (AWS). In this role, Phil works with enterprise executives to share experiences and strategies for how the cloud can help them increase speed and agility while devoting more of their resources to their customers. Prior to joining AWS, Phil held multiple senior technology leadership roles at McDonald’s Corporation. Phil has a BEng in Electronic and Electrical Engineering, a Master of Business Administration, and an MSc in Systems Thinking in Practice.