Posted On: Nov 19, 2021
Today we announce the general availability of Syne Tune, an open-source Python library for large-scale distributed hyperparameter and neural architecture optimization. It provides implementations of several state-of-the-art global optimizers, such as Bayesian optimization, Hyperband, and population-based training. Additionally, it supports constrained and multi-objective optimization, and it allows users to bring their own global optimization algorithm.
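For illustration, a tuning loop with one of these optimizers might look like the following sketch. The training script name is a placeholder (any script that accepts the hyperparameters as command-line arguments and reports a metric via Syne Tune's Reporter would work), and module paths and argument names follow recent releases of the library, so they may differ slightly across versions.

```python
# Minimal sketch of a Syne Tune tuning loop (assumes a recent release;
# module paths and argument names may vary by version).
from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import BayesianOptimization

# Search space for the hyperparameters passed to the training script.
config_space = {
    "lr": loguniform(1e-5, 1e-1),
    "batch_size": randint(16, 128),
    "epochs": 10,
}

tuner = Tuner(
    # "train_script.py" is a hypothetical script that reports a
    # validation loss each epoch via syne_tune.Reporter.
    trial_backend=LocalBackend(entry_point="train_script.py"),
    scheduler=BayesianOptimization(config_space, metric="val_loss", mode="min"),
    stop_criterion=StoppingCriterion(max_wallclock_time=600),
    n_workers=4,  # number of trials evaluated in parallel
)
tuner.run()
```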
With Syne Tune, users can run hyperparameter and neural architecture tuning jobs locally on their machine or remotely on Amazon SageMaker by changing just one line of code. The former is well suited for smaller workloads and fast experimentation on local CPUs or GPUs. The latter is well suited for larger workloads, which would otherwise come with a substantial amount of implementation overhead. Syne Tune makes it easy to use SageMaker as a backend to evaluate a large number of configurations on parallel Amazon Elastic Compute Cloud (Amazon EC2) instances to reduce wall-clock time, while taking advantage of SageMaker's rich set of functionality (e.g., pre-built Docker deep learning framework images, EC2 Spot Instances, experiment tracking, and virtual private networks).
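The one-line change could look like the sketch below: the LocalBackend from the previous example is replaced with a SageMakerBackend wrapping a SageMaker estimator. The instance type, framework versions, and IAM role shown here are illustrative placeholders, not prescribed values.

```python
# Sketch of the backend swap: the same Tuner runs each trial on its own
# SageMaker training instance instead of the local machine.
from sagemaker.pytorch import PyTorch
from syne_tune.backend import SageMakerBackend

trial_backend = SageMakerBackend(
    sm_estimator=PyTorch(
        entry_point="train_script.py",   # same training script as before
        instance_type="ml.g4dn.xlarge",  # placeholder instance type
        instance_count=1,
        framework_version="1.9",
        py_version="py38",
        role="<your-sagemaker-execution-role>",  # placeholder IAM role
        max_run=600,
    ),
    metrics_names=["val_loss"],  # surface the tuned metric in SageMaker
)
# Pass trial_backend to Tuner exactly as with LocalBackend above.
```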
To learn more about the library, check out our GitHub repo for documentation and examples.