

Run DL Models faster!

  • By Gavin
  • on 10/18/2023

The Intel Distribution of OpenVINO AMI is a versatile solution for deep learning inference workloads on AWS. The AMI provides a pre-configured OpenVINO environment along with pre-configured Jupyter Notebooks, which make it easy to get started and deploy models quickly.
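For context, running inference with the OpenVINO Python runtime typically takes only a few lines. The following is a minimal sketch, assuming a model already converted to OpenVINO IR format; the "model.xml" path is a placeholder, and the exact notebooks shipped with the AMI may differ.

  # Minimal OpenVINO inference sketch (Python).
  # "model.xml" is a placeholder; substitute any model in OpenVINO IR format.
  import numpy as np
  import openvino as ov

  core = ov.Core()                              # discovers available devices (CPU, GPU, ...)
  model = core.read_model("model.xml")          # loads the IR model and its weights (model.bin)
  compiled = core.compile_model(model, "CPU")   # compiles the model for the target device

  # Build a dummy input matching the model's expected shape and run one inference.
  input_shape = compiled.input(0).shape
  dummy_input = np.random.rand(*input_shape).astype(np.float32)
  result = compiled(dummy_input)[compiled.output(0)]
  print(result.shape)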

