
DeepSparse Inference Runtime

By: Neural Magic
Latest Version: 1.3.2
Linux/Unix

Product Overview

Neural Magic's DeepSparse Inference Runtime AMI lets you launch an EC2 instance that runs state-of-the-art machine learning models with GPU-class performance on x86 instance types. This means you can run machine learning workloads without depending on specialized hardware accelerators: simply select from a broad range of instance types based on the performance and cost requirements of your use case, and deploy.

The deployed DeepSparse instance also comes with built-in benchmarking capabilities to help you assess the performance and cost benefits of your deployed model in a variety of scenarios.
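As a sketch of the benchmarking workflow: after launching an instance from the AMI and connecting over SSH, the `deepsparse.benchmark` CLI bundled with the runtime can measure throughput and latency. The model stub below is an illustrative example from Neural Magic's SparseZoo, and the exact flag names are assumptions based on DeepSparse's documentation; substitute any SparseZoo stub or local ONNX model path you intend to deploy.

```shell
# Benchmark a sparse BERT question-answering model on the current instance.
# Model stub and flags are illustrative; adjust for your own model and scenario.
deepsparse.benchmark \
  "zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/base-none" \
  --batch_size 1 \
  --time 30
```

Running the same command across candidate instance types (for example, c6i vs. c5 families) is a straightforward way to compare the price/performance trade-offs the listing describes.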

Version

1.3.2

Operating System

Linux/Unix, Amazon Linux 2

Delivery Methods

  • Amazon Machine Image
