AWS Partner Network (APN) Blog

Explore Key Themes in the AWS Machine Learning Visionaries Partners Report

By Dr. Kristof Schum, Global Segment Leader of Machine Learning – AWS
By Kris Skrinak, Machine Learning Segment Tech Lead – AWS
By Dr. Mario Bourgoin, Machine Learning Ops Tech Lead – AWS

The AWS Machine Learning Visionaries Partners Report (AWS Partner Central login required) is a quarterly series that tracks, selects, collates, and distributes horizontal technology capabilities enabled by machine learning (ML) in areas that Amazon Web Services (AWS) expects to be transformative in 1-3 years.

The series’ purpose is to share our insights with AWS Partners and to gather their interest, expertise, and insights in co-building along these prioritized themes. The reports include updates on series topics as we see changes in those areas, and new topics will also be added.

In this post, we present summaries of report topics. As new reports are published, this post will be updated to include new content and topics.

At this time, the topics summarized are federated learning operations, machine learning operations (MLOps) on Kubernetes, machine learning using high-performance computing (HPC), trusted artificial intelligence (AI), decision intelligence (DI), and generative AI.

Federated Learning Operations

Federated learning trains models on distributed datasets without moving large amounts of raw data, which preserves privacy, simplifies data management, and offers potential cost savings. We discuss the key challenges of federated learning and propose building an end-to-end federated learning operations platform that can span regions, accounts/organizations, and clouds.
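
To make the idea concrete, here is a minimal, illustrative sketch of federated averaging (FedAvg), a common aggregation step in federated learning: clients train locally, and only model parameters, never raw data, are sent back to be combined. The update values and client sizes below are hypothetical placeholders, not part of any specific AWS or partner platform.

```python
import numpy as np

def federated_average(client_updates, client_sizes):
    """Combine locally trained model parameters, weighted by each client's dataset size.
    Only parameters travel over the network; raw data stays with each client."""
    total = sum(client_sizes)
    n_layers = len(client_updates[0])
    return [
        sum(update[i] * (size / total) for update, size in zip(client_updates, client_sizes))
        for i in range(n_layers)
    ]

# Hypothetical round: three clients return updated weights for a two-layer model.
updates = [[np.random.rand(4, 2), np.random.rand(2)] for _ in range(3)]
sizes = [1000, 250, 500]
global_weights = federated_average(updates, sizes)
```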

We are currently partnering with Docomo Innovations to work on a federated learning use case for their business. We are building an advanced platform supporting different flavors of federated learning running on AWS.

MLOps on Kubernetes

Kubernetes is a leader in the deployment of end-to-end machine learning services because it provides for the flexible and robust allocation and management of compute resources.

Anticipating what could disrupt Kubernetes’s position will provide AWS and our partners with useful insights into the coming changes in the ML compute landscape. Long-standing challenges for Kubernetes that affect its use for ML are intra-cluster security, the management of multiple clusters, and the sharing of hardware accelerators.
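
As an illustration of the accelerator-sharing challenge, the sketch below uses the Kubernetes Python client to request a single GPU for a training pod; every container that needs an accelerator must declare it explicitly, which is part of what makes fine-grained sharing across teams and clusters hard. The image name and namespace are hypothetical, and the snippet assumes a cluster with the NVIDIA device plugin installed.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig pointing at the cluster

# A training container that explicitly requests one GPU from the device plugin.
container = client.V1Container(
    name="trainer",
    image="example.com/ml/trainer:latest",  # hypothetical training image
    resources=client.V1ResourceRequirements(
        limits={"nvidia.com/gpu": "1", "cpu": "4", "memory": "16Gi"}
    ),
)

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="ml-training-pod"),
    spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```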

ML Using High-Performance Computing

The convergence of natural language processing (NLP) and high-performance computing (HPC) on the cloud brings time savings, flexibility, and cost-effectiveness to companies trying to leverage model-parallel distributed training techniques in their current workloads.

The distributed training of large language models (LLMs) for general and domain-specific purposes is cost-prohibitive for most organizations, leaving a niche for a few large companies. We recommend AWS Partners and customers explore the use of pre-trained models (PTMs) and/or existing NLP solutions in their projects and products.
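
As a minimal sketch of that recommendation, the example below loads a pre-trained NLP model instead of training one from scratch. It assumes the open-source Hugging Face transformers library, which is not named in the report, and the default model it downloads is illustrative only.

```python
from transformers import pipeline  # assumes the Hugging Face transformers library is installed

# Reuse a pre-trained model (PTM) rather than running a costly distributed training job.
classifier = pipeline("sentiment-analysis")
print(classifier("The distributed training job finished ahead of schedule."))
```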

Trusted AI

Earning trust in software built with machine learning presents unique challenges. Outcomes can end up worse than the problems you set out to solve in the first place. Automated algorithmic decisions can reflect and amplify undesirable patterns.

Artificial intelligence will behave as the data used to train its models reflects, and it will do so without regard for the social boundaries within which humans instinctively operate. All stakeholders need to communicate clearly about machine learning. Effective communication about trusted AI begins with knowing the terminology, concepts, and tools.

Decision Intelligence

Decision intelligence (DI) operationalizes AI in various forms to ask questions about what’s likely to happen if you act on a decision today. It empowers users with AI capabilities to facilitate the entire decision-making value chain, from insights to automating actions at scale.

Among the challenges in DI are designing tools that business decision-makers can readily use, and operationalizing machine learning models that support the broad range of decision-making scenarios unique to each business.

Furthermore, trusted AI capabilities need to be ingrained into DI tools. If the toolchain is more complex than mainstream decision support systems, and decision makers either mistrust or misuse the models, DI tools will see the same limited success as predictive analytics does today.

AWS AIDA partner Snowflake is empowering users to train models and request new predictions with Amazon SageMaker Autopilot using a standard SQL query.

“Using the Snowflake and Amazon SageMaker Autopilot integration, teams can simplify the effort of transforming data into ML-powered insights, expanding the power of data science beyond the immediate data science teams,” says Torsten Grabs, Director of Product Management at Snowflake.

“Customers such as Western Union that tailor their customers’ digital banking experiences through propensity and segmentation models will be able to hyper-personalize those experiences with more granular models,” adds Torsten. “This will reduce the heavy lift associated with building ML models and can lower operating costs through the automation of training and deploying state-of-the-art machine learning models from within Snowflake.”
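
Behind an integration like this, an Autopilot experiment can also be started directly against data in Amazon S3. The sketch below uses the AWS SDK for Python (boto3) to launch an Autopilot job; the bucket, target column, and IAM role are hypothetical placeholders, and this is not the Snowflake SQL interface itself, which handles these details on the user's behalf.

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Hypothetical bucket, target column, and role for illustration.
sagemaker.create_auto_ml_job(
    AutoMLJobName="propensity-autopilot-demo",
    InputDataConfig=[{
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://example-bucket/propensity/train/",
            }
        },
        "TargetAttributeName": "converted",
    }],
    OutputDataConfig={"S3OutputPath": "s3://example-bucket/propensity/output/"},
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
)
```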

Generative AI

Generative AI autonomously explores a problem space and creates solutions with optimized outcomes. It augments the innovation process for human co-creators and enables speed to market, enhanced discovery, and higher-quality outcomes.

For example, AWS Partner Quantiphi has developed a super resolution service that generates high-resolution video frames using existing low-resolution frames. The service captures existing content and intelligently generates aligned video frames with higher quality. The result is faster, more cost-effective content enhancement that accelerates the process of monetizing old content.
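
The snippet below is a generic, minimal sketch of frame super-resolution, not Quantiphi's service: it upscales a single low-resolution frame with a publicly available pre-trained model, assuming the opencv-contrib-python package and a locally downloaded EDSR model file.

```python
import cv2  # assumes opencv-contrib-python, which includes the dnn_superres module

# Load a publicly available pre-trained super-resolution model (EDSR, 4x upscaling).
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x4.pb")  # hypothetical local path to the downloaded model file
sr.setModel("edsr", 4)

# Upscale one low-resolution video frame to four times its original size.
frame = cv2.imread("low_res_frame.png")
high_res_frame = sr.upsample(frame)
cv2.imwrite("high_res_frame.png", high_res_frame)
```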

As the metaverse grows in importance, generative AI will be instrumental in the automatic and speedy delivery of the compelling experiences it demands. When coupled with the physics-based methods of photogrammetry, generative algorithms will not only create immersive virtual models of the real world, but also use dynamic real-world data to create novel worlds in the metaverse.

Next Steps

If you and your organization find parts of the report relevant for your business and future customers, please reach out to your AWS Partner representative and reference this publication for a deeper dive.

If you see partnership opportunities, overlap, and synergy between the outlined topics and your capabilities, please reach out to our team at ml-visionary-partners@amazon.com.