Posted On: Jun 3, 2021

AWS Solutions has updated the AWS MLOps Framework, an AWS Solutions Implementation that streamlines the pipeline deployment process and enforces architecture best practices for productionizing machine learning (ML) models. This solution addresses common operational pain points that customers face when adopting multiple ML workflow automation tools.

This update adds multi-account capabilities, allowing customers to provision multiple environments (e.g., development, staging, and production) across different accounts. This improves the governance and security of ML workload deployments while protecting production data with appropriate control measures. This version also includes a new pipeline that builds and registers a Docker image for a custom algorithm, which can then be used to deploy the model on an Amazon SageMaker endpoint.
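For context, the snippet below is a minimal boto3 sketch of what deploying such a custom-algorithm image to a SageMaker endpoint involves; the image URI, model artifact location, IAM role, and resource names are placeholder assumptions, and the solution's pipeline performs the equivalent steps automatically.

    import boto3

    sm = boto3.client("sagemaker")

    # Placeholder values -- the solution's pipeline supplies the real ones.
    image_uri = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-custom-algorithm:latest"
    model_data = "s3://my-bucket/models/my-model/model.tar.gz"
    role_arn = "arn:aws:iam::123456789012:role/MySageMakerExecutionRole"

    # Register the custom container image (plus model artifact) as a SageMaker model.
    sm.create_model(
        ModelName="my-custom-model",
        PrimaryContainer={"Image": image_uri, "ModelDataUrl": model_data},
        ExecutionRoleArn=role_arn,
    )

    # Describe how the endpoint should host the model.
    sm.create_endpoint_config(
        EndpointConfigName="my-custom-model-config",
        ProductionVariants=[{
            "VariantName": "AllTraffic",
            "ModelName": "my-custom-model",
            "InstanceType": "ml.m5.large",
            "InitialInstanceCount": 1,
        }],
    )

    # Create the real-time inference endpoint backed by the custom image.
    sm.create_endpoint(
        EndpointName="my-custom-model-endpoint",
        EndpointConfigName="my-custom-model-config",
    )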

This solution provides the following key features:

  • Initiates a pre-configured pipeline through an API call or a Git repository (see the API call sketch after this list)
  • Automatically deploys a trained model and provides an inference endpoint
  • Continuously monitors deployed machine learning models and detects any deviation in their quality
  • Supports running your own integration tests to ensure that the deployed model meets expectations
  • Allows provisioning of multiple environments to support your ML model’s life cycle
  • Provides multi-account support for the bring-your-own-model and model monitor pipelines
  • Allows customers to build and register Docker images for custom algorithms, to be used for model deployment on an Amazon SageMaker endpoint
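
As referenced in the first bullet, pipelines can be initiated through an API call. The sketch below shows what such a call might look like: it signs a JSON request with SigV4 and POSTs it to the solution's Amazon API Gateway endpoint. The endpoint URL, resource path, pipeline type, and payload fields are illustrative assumptions; the actual API schema is defined in the solution's implementation guide. A push to the solution's Git repository can trigger the same provisioning flow.

    import json

    import boto3
    import requests
    from botocore.auth import SigV4Auth
    from botocore.awsrequest import AWSRequest

    # Illustrative values only -- the real endpoint URL and payload schema come
    # from the deployed solution's stack outputs and implementation guide.
    region = "us-east-1"
    api_url = "https://example-api-id.execute-api.us-east-1.amazonaws.com/prod/provisionpipeline"
    payload = {
        "pipeline_type": "byom_realtime_custom",  # hypothetical field names and values
        "model_name": "my-custom-model",
        "model_artifact_location": "s3://my-bucket/models/my-model/model.tar.gz",
    }

    # Assuming the solution's API is IAM-protected, sign the request with SigV4.
    credentials = boto3.Session().get_credentials()
    request = AWSRequest(
        method="POST",
        url=api_url,
        data=json.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    SigV4Auth(credentials, "execute-api", region).add_auth(request)

    response = requests.post(api_url, data=request.body, headers=dict(request.headers.items()))
    print(response.status_code, response.text)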

Additional AWS Solutions are available on the AWS Solutions Implementation webpage, where customers can browse solutions by product category or industry to find AWS-vetted, automated, turnkey reference implementations that address specific business needs.