AWS Machine Learning Blog

Amazon and Facebook Collaborate to Optimize Caffe2 for the AWS Cloud

From Apache MXNet to Torch, there is no shortage of frameworks for deep learning practitioners to leverage. Each offering excels at different aspects of the deep learning pipeline and meets different developer needs. The research community tends to gravitate toward frameworks such as Theano, Torch, and most recently PyTorch, while many in industry run Caffe, TensorFlow, or Apache MXNet at scale for production applications. Given this heterogeneity in usage and users, AWS supports a range of frameworks as part of its developer tool offerings, serving a broad spectrum of users.

AWS provides an open environment for deep learning development. As we announced on April 18th, we are excited to further increase developer choice by offering support for Facebook’s newly launched Caffe2 project in the Ubuntu version of the AWS Deep Learning AMI (and coming soon in the Amazon Linux version, too).

What is Caffe2?

Caffe2—architected by Yangqing Jia, the original developer of Caffe—is a lightweight, modular, and scalable deep learning framework. Facebook deployed Caffe2 internally to help researchers train large machine learning models and deliver AI on mobile devices.

Now, all developers have access to many of the same tools for running large-scale distributed training and building machine learning applications for mobile. This allows the machine learning community to rapidly experiment with more complex models and deploy machine learning applications and services for mobile scenarios.

Caffe2 features include:

  • Easy implementation of a variety of models, including CNNs (convolutional neural networks), RNNs (recurrent neural networks), and conventional MLPs (multi-layer perceptrons)
  • Native distributed training interfaces
  • Mixed-precision and reduced-precision computations
  • Graph-based computation patterns that facilitate easy heterogeneous computation across multiple devices
  • Modularity, allowing the addition of custom recipes and hardware without risking codebase collisions
  • Strong support for mobile and embedded platforms in addition to conventional desktops and server environments

Why “yet another” deep learning framework?

The original Caffe framework, with its unparalleled performance and well-tested C++ codebase, is well suited to large-scale conventional CNN applications. However, as new computation patterns emerged (especially distributed training, mobile deployment, reduced-precision computation, and non-vision use cases), Caffe’s design limitations became apparent.

By early 2016, the Facebook team had developed an early version of Caffe2 that improved on Caffe with a modern computation graph design, minimalist modularity, and the flexibility to port easily to multiple platforms. Over the past year, Facebook has fully embraced Caffe2 as a multipurpose deep learning framework and has begun using it in Facebook products.

The Facebook team is very excited about Caffe2’s ability to support a wide range of machine learning use cases, and is equally excited to contribute Caffe2 to the open source community. The team is also looking forward to working with partners like AWS and the open source software community to push the state of the art in machine learning systems.

Getting Started with Caffe2 on AWS

To make it easy to try out Caffe2 on Amazon EC2, we’ve included it in the Ubuntu version of the Deep Learning AMI and will soon add it to the Amazon Linux version. Both versions come prebuilt with NVIDIA CUDA and cuDNN.

The AWS Deep Learning AMI for Amazon Linux or Ubuntu and the AWS Deep Learning CloudFormation template let you quickly deploy and run any of the major deep learning frameworks at any scale. With them, you can create managed, automatically scaling GPU clusters for large-scale training and run inference on trained models. The AMI comes preinstalled with Apache MXNet, TensorFlow, Caffe, Caffe2, Theano, Torch, CNTK, and Keras.

The AWS Deep Learning AMI is provided and supported by Amazon Web Services for use on EC2. There is no additional charge for the AWS Deep Learning AMI—you pay only for the AWS resources needed to store and run your applications. To see how to launch AMIs on EC2, see this blog.
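For readers who prefer the command line, here is a hedged sketch of launching the Deep Learning AMI with the AWS CLI. The AMI ID, key pair name, and security group below are placeholders, not values from this post; look up the current Deep Learning AMI ID for your region before running it.

```shell
# Launch one GPU instance from the Deep Learning AMI (Ubuntu).
# ami-xxxxxxxx, my-key-pair, and sg-xxxxxxxx are placeholders only.
aws ec2 run-instances \
    --image-id ami-xxxxxxxx \
    --instance-type p2.xlarge \
    --key-name my-key-pair \
    --security-group-ids sg-xxxxxxxx \
    --count 1

# Once the instance is running, SSH in and confirm Caffe2 is available:
#   ssh -i my-key-pair.pem ubuntu@<instance-public-dns>
#   python -c "from caffe2.python import workspace; print(workspace.has_gpu_support)"
```

This is a command recipe rather than a runnable script: it requires AWS credentials and real resource IDs from your own account.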

For information on getting started with Caffe2—including documentation, a variety of tutorials, APIs, and more—see the Caffe2 website. We also plan to host Yangqing at our AWS Loft in San Francisco, so keep an eye out for upcoming announcements.

Learn more

To learn more about Amazon AI and the AWS Deep Learning developer tools, see the Amazon AI page and our post announcing the release of the AWS Deep Learning AMI for Ubuntu.

For information on upcoming sessions at the AWS SF Loft and to register, see the San Francisco AWS Pop-up Loft webpage.

About the Authors

Joseph Spisak leads Deep Learning Product Management in Amazon AI.
Yangqing Jia is a research lead and manager in the Facebook Applied Machine Learning organization.