Posted On: May 7, 2021

Amazon SageMaker Automatic Model Tuning enables you to find the best version of a model by finding the optimal hyperparameter configuration for your dataset. Starting today, SageMaker Automatic Model Tuning supports running up to 100 parallel training jobs for hyperparameter tuning, a 10x increase in parallel training jobs so you can complete your tuning faster. Additionally, for the "Random" search strategy, SageMaker Automatic Model Tuning now supports exploring up to 10,000 hyperparameter configurations, a 20x increase over the previous limit of 500, improving coverage of the search space and potentially leading to better predictive performance of your model.

Running more training jobs in parallel is the preferred approach with the "Random" search strategy, since it reduces wall-clock time without impacting the predictive performance of the models. With the "Bayesian" search strategy, you may benefit from exploring more hyperparameter combinations when increasing the number of parallel training jobs, in order to manage the tradeoff between wall-clock time, predictive performance, and overall cost.
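As a sketch of how these limits appear in practice, the settings discussed above map to the `Strategy` and `ResourceLimits` fields of the tuning job configuration passed to the `CreateHyperParameterTuningJob` API. The field names below follow that API; the objective metric name is an assumption for illustration only.

```python
# Sketch of a HyperParameterTuningJobConfig using the raised limits.
# Field names follow the SageMaker CreateHyperParameterTuningJob API;
# the metric name "validation:error" is an assumed placeholder.
tuning_job_config = {
    # Random search benefits most from high parallelism, since trials
    # are independent and parallelism does not affect the search quality.
    "Strategy": "Random",
    "ResourceLimits": {
        "MaxNumberOfTrainingJobs": 10000,  # new cap, up from 500
        "MaxParallelTrainingJobs": 100,    # new cap, a 10x increase
    },
    "HyperParameterTuningJobObjective": {
        "Type": "Minimize",
        "MetricName": "validation:error",  # assumed metric, for illustration
    },
}
```

This dictionary would be supplied as the `HyperParameterTuningJobConfig` argument of a `create_hyper_parameter_tuning_job` call, alongside the training job definition.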

The increased limits for Amazon SageMaker Automatic Model Tuning are now available upon request in all AWS Regions where SageMaker Automatic Model Tuning is available, except AWS GovCloud (US). To get started, request a limit increase through the AWS Support Center, or read our documentation to learn more about SageMaker Automatic Model Tuning.