
Intel Neural Compressor for TF packaged by Bitnami

By: Bitnami by VMware
Latest Version: 1.14.2-5-r06 on Debian 11
Linux/Unix

This version has been removed and is no longer available to new customers.

Product Overview

TensorFlow is an open-source, high-performance machine learning framework. It allows programmers to easily deploy algorithms and experiments without changing the architecture. This image has been optimized with Intel(R) Neural Compressor (INC), an open-source Python library designed to improve the performance of inference with TensorFlow. INC applies quantization, pruning, and knowledge distillation to meet objectives such as inference performance and memory usage while keeping accuracy within expected criteria. This offering accelerates AI inference on Intel CPUs and GPUs, especially those with Intel(R) Deep Learning Boost, and uses AVX-512 instructions to increase performance.
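
For reference, the snippet below is a minimal sketch of INT8 post-training quantization with the neural_compressor Python API, following the quickstart style of newer INC releases; the exact module paths and call signatures may differ in the 1.14 series shipped with this image, and the model path is a hypothetical placeholder.

# Minimal sketch: INT8 post-training quantization of a TensorFlow model with
# Intel Neural Compressor. API shape follows newer INC quickstarts and may
# differ in the 1.14 series bundled with this image.
from neural_compressor.config import PostTrainingQuantConfig
from neural_compressor.data import DataLoader, Datasets
from neural_compressor.quantization import fit

# Dummy calibration data; in practice, feed a representative sample of real
# inputs so INC can calibrate the INT8 activation scales.
dataset = Datasets("tensorflow")["dummy"](shape=(1, 224, 224, 3))
dataloader = DataLoader(framework="tensorflow", dataset=dataset)

# "./model.pb" is a hypothetical path to an FP32 frozen graph or SavedModel.
q_model = fit(
    model="./model.pb",
    conf=PostTrainingQuantConfig(),  # defaults to static INT8 quantization
    calib_dataloader=dataloader,
)
q_model.save("./model_int8")

The quantized model can then be served with the TensorFlow runtime in this image, where the INT8 kernels benefit from AVX-512 and Intel Deep Learning Boost on supported CPUs.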

Trademarks: This software listing is packaged by Bitnami. The respective trademarks mentioned in the offering are owned by the respective companies, and use of them does not imply any affiliation or endorsement.

Why use Bitnami Certified Apps?

Bitnami certified images are always up-to-date, secure, and built to work right out of the box.

Bitnami packages applications following industry standards, and continuously monitors all components and libraries for vulnerabilities and application updates. When any security threat or update is identified, Bitnami automatically repackages the applications and pushes the latest versions to the cloud marketplaces.

Version

1.14.2-5-r06 on Debian 11

Operating System

Linux/Unix, Debian 11

Delivery Methods

  • Amazon Machine Image
