AWS Open Source Blog
Adopting machine learning in your microservices with DJL (Deep Java Library) and Spring Boot
Many AWS customers—startups and large enterprises—are on a path to adopt machine learning and deep learning in their existing applications. The reasons for machine learning adoption are dictated by the pace of innovation in the industry, with business use cases ranging from customer service (including object detection from images and video streams, sentiment analysis) to fraud detection and collaboration. However, until recently, the adoption learning curve was steep and required development of internal technical expertise in new programming languages (e.g., Python) and frameworks, with a cascading effect on the whole software development lifecycle, from coding to building, testing, and deployment. The approach outlined in this blog post enables enterprises to leverage existing talent and resources (frameworks, pipelines, and deployments) to integrate machine learning capabilities.
Introduction
Spring Boot, one of the most popular and widespread open source frameworks for microservices development, has simplified the implementation of distributed systems.
Despite the broad appeal of this framework, there are few options to easily integrate it with Machine Learning (ML). Existing solutions such as stock APIs often do not meet customized application requirements, and developing customized solutions is time-consuming and not cost-effective.
Developers have approached the integration of machine learning capabilities into existing applications in a number of ways. Taking inference as an example, current options vary from using a stock API to wrapping a Python or C++ based application with an API for remote calls. Stock APIs, though based on robust models, may not quite fit your domain or industry, causing problems that are discovered only in production, with few options to address them. In other cases, when running inference at scale (for example, in streaming applications or latency-sensitive microservices), making a remote call may not be a viable option for performance reasons.
Recognizing this challenge, we at AWS have created a few open source projects to facilitate the adoption of ML for Java and microservices, and ultimately to help our customers, partners and the open source community as a whole. These initiatives align closely with the AWS goal to take technology that was traditionally cost-prohibitive and difficult for many organizations to adopt, and make it accessible to a much broader audience.
In this blog post we will demonstrate how Java users can integrate ML into their Spring applications with Spring Boot Starter for Deep Java Library (DJL). We will review how to apply these frameworks in action and integrate ML capabilities into a microservice, demonstrating common deep learning use cases around object detection and classification.
DJL overview
Deep Java Library (DJL) is an open source, high-level, framework-agnostic Java API for deep learning. It is designed to be easy to get started with and simple to use for Java developers. DJL provides a native Java development experience and functions like any other regular Java library.
DJL provides a convenient abstraction layer for using the most popular AI/ML frameworks such as Apache MXNet, PyTorch, and TensorFlow. However, it is not just a convenience on top of the existing libraries (some of which provide a Java API or bindings). With the DJL API, you get a uniform and consistent layer that can interact with all of these frameworks, allowing you to swap out the framework of your choice without any impact on the client code.
This unique feature, in combination with a fairly rich model zoo repository (a repository with pre-trained models), can enable ML engineers to find optimal models for the task at hand regardless of the underlying model implementation.
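To give a flavor of the API, here is a minimal sketch (assuming a recent DJL release with an engine dependency on the classpath) that looks up an object detection model in the model zoo and runs a single prediction; the sample image URL is illustrative:

```java
import ai.djl.Application;
import ai.djl.inference.Predictor;
import ai.djl.modality.cv.Image;
import ai.djl.modality.cv.ImageFactory;
import ai.djl.modality.cv.output.DetectedObjects;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class ObjectDetectionExample {

    public static void main(String[] args) throws Exception {
        // Load a sample image (URL is illustrative)
        Image img = ImageFactory.getInstance()
                .fromUrl("https://resources.djl.ai/images/dog_bike_car.jpg");

        // Describe what we need: an object detection model that takes an Image
        // and produces DetectedObjects; the underlying engine is irrelevant here
        Criteria<Image, DetectedObjects> criteria = Criteria.builder()
                .setTypes(Image.class, DetectedObjects.class)
                .optApplication(Application.CV.OBJECT_DETECTION)
                .build();

        // Look the model up in the model zoo and run a single prediction
        try (ZooModel<Image, DetectedObjects> model = criteria.loadModel();
             Predictor<Image, DetectedObjects> predictor = model.newPredictor()) {
            DetectedObjects detection = predictor.predict(img);
            System.out.println(detection);
        }
    }
}
```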
For more information on DJL please refer to the DJL GitHub Repository and FAQ.
DJL Spring Boot Starter
Spring Boot Starter is a one-stop shop for all Spring and related technologies that you need in your project, without having to hunt through sample code and copy-paste loads of dependency descriptors. Please see the official Spring Boot documentation for more information on starters.
Following this definition, DJL Spring Boot Starter provides all dependencies required to start using DJL in Spring as a single artifact. In addition to dependency management, the starter includes an auto-configuration that makes it possible to automatically wire dependencies based on the configuration file supplied by the user, and make them available as beans in the Spring Application context.
Dependency management
The DJL library is platform-specific, but it provides ways to automatically look up the correct dependency based on the target operating system. DJL can also be configured with different underlying engines (such as MXNet, PyTorch, or TensorFlow); the user is expected to make this choice before the starter is used. However, even after the choice is made, the underlying engine as well as the target operating system architecture can be changed by modifying your Maven (or Gradle) dependency, with no impact on your code.
Starter dependency management is organized so as to provide the most flexibility to the user.
For the MXNet starter, the following operating system classifiers are supported: osx-x86_64 for Mac OS X, linux-x86_64 for generic Linux, win-x86_64 for Windows distributions, and auto for automatic detection of the target operating system. The last option requires connectivity to an external artifact repository (e.g., Maven Central) at runtime, which may be an issue for systems with tight security constraints and restricted egress.
Here’s an example of the MXNet dependency for Linux architectures, optimized for container workloads:
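(The coordinates below follow the starter's naming convention; the version property is a placeholder for the release you are using.)

```xml
<dependency>
    <groupId>ai.djl.spring</groupId>
    <artifactId>djl-spring-boot-starter-mxnet-linux-x86_64</artifactId>
    <!-- placeholder: use the starter release that matches your setup -->
    <version>${djl.starter.version}</version>
</dependency>
```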
And here is the auto dependency, which will download the correct artifact at runtime:
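(Again, the version is a placeholder.)

```xml
<dependency>
    <groupId>ai.djl.spring</groupId>
    <artifactId>djl-spring-boot-starter-mxnet-auto</artifactId>
    <!-- placeholder: use the starter release that matches your setup -->
    <version>${djl.starter.version}</version>
</dependency>
```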
Using PyTorch as the underlying engine, the starter dependency is:
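(As above, the artifact coordinates follow the starter's naming convention and the version is a placeholder.)

```xml
<dependency>
    <groupId>ai.djl.spring</groupId>
    <artifactId>djl-spring-boot-starter-pytorch-auto</artifactId>
    <!-- placeholder: use the starter release that matches your setup -->
    <version>${djl.starter.version}</version>
</dependency>
```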
Gradle dependencies will look similar. It is important to set the JNA version as "jna.version=5.3.0" inside your gradle.properties, since the Spring Boot parent POM uses an older version of JNA, which will not work with the DJL starter. Here is an example of your Gradle build file build.gradle.kts (Kotlin DSL is used in this example), assuming the Spring Boot plugin is registered:
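(A sketch, assuming the MXNet auto starter; the artifact and version shown are illustrative, so substitute the engine-specific starter and release you need.)

```kotlin
// gradle.properties should also contain: jna.version=5.3.0

plugins {
    java
    id("org.springframework.boot")
}

repositories {
    mavenCentral()
}

dependencies {
    // Engine-specific starter; artifact and version shown are illustrative
    implementation("ai.djl.spring:djl-spring-boot-starter-mxnet-auto:0.2")
}
```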
Spring auto-configuration
Once dependencies are configured correctly in your Spring Boot application, the next step is to configure your beans and wire them properly for injection. It is fairly easy to configure DJL-related beans and make them available in the Spring application context, but it requires internal knowledge of the library as well as the peculiarities of individual classes for proper scoping—some beans are thread-safe, others should be scoped per request/thread. To assist with this configuration, the DJL Spring Boot starter provides an auto-configuration.
This component is separate from the dependency component and requires an explicit dependency. We did it this way for a couple of reasons:
- Some developers prefer to have full control over configuration options and may not want the Spring “auto magic”. In such cases, the starter will support just the basic set of dependencies and allow developers to wire components explicitly.
- The auto-configuration component is generic for all kinds of DJL configurations: regardless of the underlying target operating system or the actual engine, the auto-configuration component remains the same. So, using the same auto-configuration, developers can swap underlying dependencies in a single step without any impact on the code.
Declaring the dependency on the auto-configuration in Maven:
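(The version property is a placeholder for the release you are using.)

```xml
<dependency>
    <groupId>ai.djl.spring</groupId>
    <artifactId>djl-spring-boot-starter-autoconfigure</artifactId>
    <!-- placeholder: align with your starter version -->
    <version>${djl.starter.version}</version>
</dependency>
```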
Or in Gradle (build.gradle.kts):
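```kotlin
dependencies {
    // Version shown is illustrative; align it with your starter version
    implementation("ai.djl.spring:djl-spring-boot-starter-autoconfigure:0.2")
}
```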
Once the dependency is declared, the Spring Boot framework will automatically locate the configuration and wire the required components. At present, for inference it will look up the model from the model zoo repository and create a predictor that will be readily available to run inference.
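The sketch below assumes the starter exposes that predictor through a Supplier-style bean, so each caller obtains its own short-lived instance (predictors are not thread-safe); the bean shape and generic types are assumptions matching an object detection setup like the one configured later in this post:

```java
import java.util.function.Supplier;

import ai.djl.inference.Predictor;
import ai.djl.modality.cv.Image;
import ai.djl.modality.cv.output.DetectedObjects;
import ai.djl.translate.TranslateException;
import org.springframework.stereotype.Service;

@Service
public class DetectionService {

    // Assumption: the starter registers a Supplier<Predictor<I, O>> bean whose
    // generic types mirror the input/output classes declared in the configuration
    private final Supplier<Predictor<Image, DetectedObjects>> predictorProvider;

    public DetectionService(Supplier<Predictor<Image, DetectedObjects>> predictorProvider) {
        this.predictorProvider = predictorProvider;
    }

    public DetectedObjects detect(Image image) throws TranslateException {
        // Obtain a fresh predictor per call and close it when done
        try (Predictor<Image, DetectedObjects> predictor = predictorProvider.get()) {
            return predictor.predict(image);
        }
    }
}
```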
Users are expected to supply a standard Spring configuration (application.yml or application.properties) specifying which model to use through one of the supported application types.
For example, in order to run object detection in images, the user can set the application type to OBJECT_DETECTION. DJL-related configuration should be namespaced under the djl root, for example djl.application-type=OBJECT_DETECTION if application.properties is used.
Here is an example of a YAML configuration for the DJL auto-configuration:
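(A sketch of a typical object detection setup; the property names follow the starter's configuration schema, while the input/output classes and model filter values are illustrative and depend on the model you select and the DJL version in use.)

```yaml
djl:
  # Type of application this microservice performs
  application-type: OBJECT_DETECTION
  # Input type the model accepts (illustrative; depends on model and DJL version)
  input-class: ai.djl.modality.cv.Image
  # Output type the model produces
  output-class: ai.djl.modality.cv.output.DetectedObjects
  # Criteria used to look up a matching model in the model zoo (illustrative values)
  model-filter:
    size: 512
    backbone: mobilenet1.0
```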