Artificial Intelligence

Category: Customer Solutions

How Aviva built a scalable, secure, and reliable MLOps platform using Amazon SageMaker

In this post, we describe how Aviva built a fully serverless MLOps platform based on the AWS Enterprise MLOps Framework and Amazon SageMaker to integrate DevOps best practices into the ML lifecycle. This solution establishes MLOps practices to standardize model development, streamline ML model deployment, and provide consistent monitoring.
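The post centers on standardizing training and deployment with SageMaker Pipelines. As a rough illustration only (not Aviva's actual platform), the sketch below defines a single-step training pipeline; the role ARN, container image, and S3 paths are placeholders you would replace with your own.

```python
# Minimal SageMaker Pipelines sketch: one training step registered as a
# versioned, serverless pipeline definition. ROLE_ARN and the S3 prefixes
# are placeholders for this example.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.steps import TrainingStep

session = sagemaker.Session()
ROLE_ARN = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder
BUCKET = session.default_bucket()

estimator = Estimator(
    image_uri=sagemaker.image_uris.retrieve(
        "xgboost", session.boto_region_name, version="1.7-1"
    ),
    role=ROLE_ARN,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path=f"s3://{BUCKET}/model-artifacts",
    sagemaker_session=session,
)

train_step = TrainingStep(
    name="TrainModel",
    estimator=estimator,
    inputs={"train": TrainingInput(f"s3://{BUCKET}/train/", content_type="text/csv")},
)

pipeline = Pipeline(name="example-mlops-pipeline", steps=[train_step],
                    sagemaker_session=session)
pipeline.upsert(role_arn=ROLE_ARN)  # create or update the pipeline definition
# pipeline.start()                  # kick off an execution
```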

How Schneider Electric uses Amazon Bedrock to identify high-potential business opportunities

In this post, we show how the team at Schneider collaborated with the AWS Generative AI Innovation Center (GenAIIC) to build a generative AI solution on Amazon Bedrock to solve this problem. The solution processes and evaluates each request for proposal (RFP) and then routes high-value RFPs to the microgrid subject matter expert (SME) for approval and recommendation.
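For orientation, here is a minimal sketch of the kind of Amazon Bedrock call such a workflow might use to score an RFP and flag it for SME review. The model ID, prompt wording, and 70-point routing threshold are illustrative assumptions, not details from the post.

```python
# Illustrative sketch (not Schneider Electric's implementation): ask a
# foundation model on Amazon Bedrock to score an RFP and flag high-value
# ones for SME review.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def evaluate_rfp(rfp_text: str) -> dict:
    prompt = (
        "You are screening requests for proposal for microgrid opportunities. "
        "Return JSON with keys 'score' (0-100) and 'rationale'.\n\nRFP:\n" + rfp_text
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.0},
    )
    result = json.loads(response["output"]["message"]["content"][0]["text"])
    result["route_to_sme"] = result["score"] >= 70  # example routing threshold
    return result
```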

How Northpower used computer vision with AWS to automate safety inspection risk assessments

In this post, we share how Northpower has worked with their technology partner Sculpt to reduce the effort and carbon required to identify and remediate public safety risks. Specifically, we cover the computer vision and artificial intelligence (AI) techniques used to combine datasets into a list of prioritized tasks for field teams to investigate and mitigate.

Transforming home ownership with Amazon Transcribe Call Analytics, Amazon Comprehend, and Amazon Bedrock: Rocket Mortgage’s journey with AWS

This post offers insights for businesses aiming to use artificial intelligence (AI) and cloud technologies to enhance customer service and streamline operations. We share how Rocket Mortgage’s use of AWS services set a new industry standard and demonstrate how to apply these principles to transform your client interactions and processes.

Scaling Thomson Reuters’ language model research with Amazon SageMaker HyperPod

In this post, we explore the journey that Thomson Reuters took to enable cutting-edge research in training domain-adapted large language models (LLMs) using Amazon SageMaker HyperPod, an Amazon Web Services (AWS) capability that provides purpose-built infrastructure for distributed training at scale.

Genomics England uses Amazon SageMaker to predict cancer subtypes and patient survival from multi-modal data

In this post, we detail our collaboration in creating two proof of concept (PoC) exercises around multi-modal machine learning for survival analysis and cancer sub-typing, using genomic (gene expression, mutation and copy number variant data) and imaging (histopathology slides) data. We provide insights on interpretability, robustness, and best practices of architecting complex ML workflows on AWS with Amazon SageMaker. These multi-modal pipelines are being used on the Genomics England cancer cohort to enhance our understanding of cancer biomarkers and biology.

How Thomson Reuters Labs achieved AI/ML innovation at pace with AWS MLOps services

In this post, we show you how Thomson Reuters Labs (TR Labs) was able to develop an efficient, flexible, and powerful MLOps process by adopting a standardized MLOps framework that uses Amazon SageMaker, SageMaker Experiments, SageMaker Model Registry, and SageMaker Pipelines. The goal is to accelerate how quickly teams can experiment and innovate using AI and machine learning (ML)—whether using natural language processing (NLP), generative AI, or other techniques. We discuss how this has decreased the time to market for fresh ideas and helped build a cost-efficient machine learning lifecycle.
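As a rough sketch of the building blocks named above (not TR Labs' framework itself), the snippet below logs a training run with SageMaker Experiments and shows, commented out, how a fitted estimator would be registered in the SageMaker Model Registry. The experiment name, metric values, and model package group name are placeholders.

```python
# Minimal sketch of experiment tracking plus model registration with the
# SageMaker Python SDK. Names and values here are illustrative placeholders.
import sagemaker
from sagemaker.experiments.run import Run

session = sagemaker.Session()

# 1. Track parameters and metrics for a training run with SageMaker Experiments.
with Run(experiment_name="nlp-experiments", run_name="baseline-v1",
         sagemaker_session=session) as run:
    run.log_parameter("learning_rate", 3e-5)
    run.log_metric(name="validation_f1", value=0.87)  # value from your training loop

# 2. Register a trained model version for review and deployment.
# `estimator` would be the fitted SageMaker Estimator from the training step.
# estimator.register(
#     model_package_group_name="tr-labs-nlp-models",  # placeholder group name
#     content_types=["application/json"],
#     response_types=["application/json"],
#     inference_instances=["ml.m5.xlarge"],
#     transform_instances=["ml.m5.xlarge"],
#     approval_status="PendingManualApproval",
# )
```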