With Amazon SageMaker, you pay only for what you use. Building, training, and deploying ML models is billed by the second, with no minimum fees and no upfront commitments.
Try Amazon SageMaker for free
As part of the AWS Free Tier, you can get started with Amazon SageMaker for free. If you have never used Amazon SageMaker before, you are offered a monthly free tier for your first two months: 250 hours of t2.medium or t3.medium on-demand notebook instances, or t3.medium instances with SageMaker Studio notebooks, for building your models; 50 hours of m4.xlarge or m5.xlarge for training your models; and 125 hours of m4.xlarge or m5.xlarge for deploying your machine learning models for real-time inference and batch transform. The free tier does not cover storage volume usage. Your free tier starts in the month you create your first SageMaker resource.
Amazon SageMaker Studio is free
You can now access Amazon SageMaker Studio, the first fully integrated development environment (IDE) for machine learning, for free. SageMaker Studio gives you complete access and visibility into each step required to build, train, and deploy models. Using SageMaker Studio is free; you pay only for the AWS services that you use within Studio.
Lower total cost of ownership (TCO) with Amazon SageMaker
Amazon SageMaker offers at least 54% lower total cost of ownership (TCO) over a 3-year period compared to other cloud-based self-managed solutions. The complete TCO analysis for Amazon SageMaker is available on the AWS website.
SageMaker Studio Notebooks
Studio notebooks are one-click Jupyter notebooks that can be spun up quickly. The underlying compute resources are fully elastic, and the notebooks can be easily shared with others, enabling seamless collaboration.
On-Demand Notebook Instances
On-Demand Notebook Instances are machine learning (ML) compute instances running the Jupyter Notebook App. You will be billed for usage of the instance type you choose. Each notebook is listed separately on your bill.
SageMaker Processing
SageMaker Processing lets you easily run your pre-processing, post-processing, and model evaluation workloads on fully managed infrastructure.
SageMaker Training
SageMaker makes it easy to train machine learning (ML) models by providing everything you need to train, tune, and debug models. When you use SageMaker Debugger, the built-in rules are free. For custom rules, you need to choose an instance, and you are charged for the duration the instance is in use.
SageMaker Hosting: Real-Time Inference
When you deploy your models as Amazon SageMaker endpoints for real-time inference and enable Amazon SageMaker Model Monitor, you can use built-in rules to monitor your models or write your own custom rules. For built-in rules, you get up to 30 hours of monitoring free each month; beyond that, you are charged for the monitoring instance hours you use.
SageMaker Hosting: Batch Transform
With Batch Transform, there is no need to break down the data set into multiple chunks or manage real-time endpoints. Batch Transform allows you to run predictions on large or small batch datasets.
Pricing Example #1: Studio Notebooks
A data scientist goes through the following sequence of actions while using SageMaker Studio Notebooks.
- Opens notebook 1 in a TensorFlow kernel on an ml.c5.xlarge instance, then works on this notebook for 1 hour.
- Opens notebook 2 on an ml.c5.xlarge instance. It automatically opens on the same ml.c5.xlarge instance that is running notebook 1.
- Works on notebook 1 and notebook 2 simultaneously for 1 hour.
- The data scientist is billed for a total of two (2) hours of ml.c5.xlarge usage. For the overlapped hour, where she worked on notebook 1 and notebook 2 simultaneously, each kernel application is metered for 0.5 hour, so she is billed for 1 hour.
| Kernel Application | Notebook Instance | Hours | Cost per hour | Sub total | Total |
| --- | --- | --- | --- | --- | --- |
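The metering above can be sketched in Python. The hourly rate used here is a placeholder assumption, not a published price; check the SageMaker pricing page for your region.

```python
# Sketch of Studio notebook metering for this example.
RATE_ML_C5_XLARGE = 0.20  # USD per hour (hypothetical placeholder rate)

solo_hours = 1.0       # notebook 1 running alone on the instance
overlap_hours = 1.0    # notebooks 1 and 2 sharing the same instance
# During the overlapped hour, each kernel application is metered for half
# the hour, so the shared instance still accrues exactly one billed hour.
billed_hours = solo_hours + 2 * (overlap_hours / 2)
total_cost = billed_hours * RATE_ML_C5_XLARGE
print(billed_hours)  # 2.0
```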
Pricing Example #2: Processing
Amazon SageMaker Processing only charges you for the instances used while your jobs are executing. When you provide the input data for processing in Amazon S3, Amazon SageMaker downloads the data from Amazon S3 to local file storage at the start of a processing job.
The data analyst runs a Processing job to preprocess and validate data on two ml.m5.4xlarge instances for a job duration of 10 minutes. She uploads a dataset of 100 GB in S3 as input for the processing job, and the output data which is roughly the same size is stored back in S3.
| Hours | Processing Instances | Cost per hour | Total |
| --- | --- | --- | --- |
| 1 * 2 * 0.167 = 0.334 | ml.m5.4xlarge | $1.075 | $0.359 |
| General Purpose (SSD) Storage (GB) | Cost per hour | Total |
| --- | --- | --- |
| 100 GB * 2 = 200 | | $0.0032 |
The sub-total for Amazon SageMaker Processing job = $0.359;
The sub-total for 200 GB of general purpose SSD storage = $0.0032.
The total price for this example would be $0.3622
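The arithmetic above can be reproduced in a short Python sketch; the ml.m5.4xlarge rate and the storage sub-total are taken from the example itself, not looked up independently.

```python
# Reproduces the Processing example's arithmetic.
instances = 2
job_hours = 0.167                  # 10-minute job, rounded as in the example
rate_per_hour = 1.075              # ml.m5.4xlarge (from the example)
compute_cost = round(instances * job_hours * rate_per_hour, 3)  # $0.359
storage_cost = 0.0032              # 200 GB of gp2 for the job duration (from the example)
total_cost = round(compute_cost + storage_cost, 4)              # $0.3622
```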
Pricing Example #3: Training
A data scientist has spent a week working on a model for a new idea. She trains the model 4 times on an ml.m4.4xlarge instance for 30 minutes per training run with Amazon SageMaker Debugger enabled, using 2 built-in rules and 1 custom rule that she wrote. For the custom rule, she specified an ml.m5.xlarge instance. She trains using 3 GB of training data in Amazon S3 and pushes 1 GB of model output into Amazon S3. SageMaker creates General Purpose SSD (gp2) volumes for each training instance and for each rule specified. In this example, a total of 4 General Purpose SSD (gp2) volumes will be created. SageMaker Debugger emits 1 GB of debug data to the customer's Amazon S3 bucket.
| Hours | Training Instance | Debug Instance | Cost per hour | Sub total |
| --- | --- | --- | --- | --- |
| 4 * 0.5 = 2 | ml.m4.4xlarge | | | |
| 4 * 0.5 * 2 = 4 | | Built-in rules (2) | No additional charges for built-in rule instances | $0 |
| 4 * 0.5 = 2 | | ml.m5.xlarge (custom rule) | | |

| | General Purpose (SSD) Storage for Training (GB) | General Purpose (SSD) Storage for Debugger built-in rules (GB) | General Purpose (SSD) Storage for Debugger custom rules (GB) |
| --- | --- | --- | --- |
| Cost | $0.00083 | No additional charges for built-in rule storage volumes | |
The total charges for training and debugging in this example are $2.7811. The compute instances and general purpose storage volumes used by SageMaker Debugger built-in rules do not incur additional charges.
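The structure of this bill can be sketched in Python. The two hourly rates below are placeholder assumptions (they are not stated in the example), so the computed total only approximates the $2.7811 quoted above; the storage figure and the free built-in rules come from the example.

```python
# Sketch of the training-and-debugging bill from Example #3.
TRAIN_RATE = 1.12        # ml.m4.4xlarge training, USD/hour (assumption)
CUSTOM_RULE_RATE = 0.27  # ml.m5.xlarge Debugger custom rule, USD/hour (assumption)

train_hours = 4 * 0.5          # 4 runs x 30 minutes each
custom_rule_hours = 4 * 0.5    # the custom rule runs alongside each training job
builtin_rule_cost = 0.0        # built-in Debugger rules incur no charge
storage_cost = 0.00083         # gp2 volumes (from the example)

total = (train_hours * TRAIN_RATE
         + custom_rule_hours * CUSTOM_RULE_RATE
         + builtin_rule_cost
         + storage_cost)
```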
Pricing Example #4: Inference
The model in Example #3 is then deployed to production on two (2) ml.c5.xlarge instances for reliable multi-AZ hosting. Amazon SageMaker Model Monitor is enabled with one (1) ml.m5.4xlarge instance, and monitoring jobs are scheduled once per day. Each monitoring job takes 5 minutes to complete. The model receives 100 MB of data per day, and inferences are 1/10 the size of the input data.
| Hours per month | Hosting Instances | Model Monitor Instances | Cost per hour | Total |
| --- | --- | --- | --- | --- |
| 24 * 31 * 2 = 1488 | ml.c5.xlarge | | $0.238 | $354.144 |
| 31 * 0.08 = 2.5 | | ml.m5.4xlarge | $1.075 | $2.688 |
| Data In per month - Hosting | Data Out per month - Hosting | Cost per GB In or Out | Total |
| --- | --- | --- | --- |
| 100 MB * 31 = 3100 MB | 10 MB * 31 = 310 MB | $0.02 | $0.006 |
The sub-total for hosting and monitoring = $356.832; the sub-total for 3100 MB of data processed In and 310 MB of data processed Out for Hosting per month = $0.056. The total charges for this example would be $356.887 per month.
Note: for built-in rules with an ml.m5.xlarge instance, you get up to 30 hours of monitoring, aggregated across all endpoints, each month at no charge.
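The hosting and monitoring arithmetic above can be reproduced as follows, using only the rates quoted in the example's own tables (the 2.5 monitoring hours keep the example's rounding):

```python
# Reproduces the hosting and monitoring arithmetic from Example #4.
hosting_hours = 24 * 31 * 2      # two ml.c5.xlarge endpoints running all month
hosting_cost = hosting_hours * 0.238    # -> $354.144
monitor_hours = 2.5              # daily ~5-minute jobs, rounded as in the example
monitor_cost = monitor_hours * 1.075    # ml.m5.4xlarge, billed as $2.688 above
data_out_cost = 0.310 * 0.02     # 310 MB out at $0.02/GB, billed as $0.006 above
subtotal = hosting_cost + monitor_cost  # -> $356.832
```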
Pricing Example #5: Batch Transform
Amazon SageMaker Batch Transform charges you only for the instances used while your jobs are executing. If your data is already in Amazon S3, then there is no cost for reading input data from S3 and writing output data to S3.
The model in Example #3 is used to run Batch Transform. The data scientist runs four separate Batch Transform jobs on 3 ml.m4.4xlarge instances for 15 minutes per job run. She uploads an evaluation dataset of 1 GB to S3 for each run, and inferences are 1/10 the size of the input data and are stored back in S3.
| Hours | Batch Transform Instances | Cost per hour | Total |
| --- | --- | --- | --- |
| 3 * 0.25 * 4 = 3 hours | ml.m4.4xlarge | $1.12 | $3.36 |
| GB data In - Batch Transform | GB data Out - Batch Transform | Cost per GB In or Out | Total |
| --- | --- | --- | --- |
| 1 GB * 4 = 4 | 0.1 GB * 4 = 0.4 | $0 | $0 |
The sub-total for the Batch Transform job = $3.36; the sub-total for 4.4 GB in and out of Amazon S3 = $0. The total charges for this example would be $3.36.
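The Batch Transform arithmetic can be reproduced in a few lines, using the rate quoted in the example's table:

```python
# Reproduces the Batch Transform arithmetic from Example #5.
instances = 3
job_hours = 0.25          # 15 minutes per job
jobs = 4
billed_hours = instances * job_hours * jobs            # 3 hours
rate_per_hour = 1.12                                   # ml.m4.4xlarge (from the example)
compute_cost = round(billed_hours * rate_per_hour, 2)  # $3.36
s3_cost = 0.0    # reading input from S3 and writing output back is free here
total = compute_cost + s3_cost
```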