With Amazon SageMaker, you have two payment options, and you only pay for what you use. On-Demand pricing is billed by the second, with no minimum fees and no upfront commitments, while SageMaker Savings Plans offer a flexible, usage-based pricing model in exchange for a commitment to a consistent amount of usage.
Get started with Amazon SageMaker for free
Amazon SageMaker is free to try. As part of the AWS Free Tier, you can get started with Amazon SageMaker at no cost. Your free tier starts the first month you create your first SageMaker resource. The details of the free tier for Amazon SageMaker are in the table below.
|Amazon SageMaker capability||Free Tier usage per month for the first 2 months|
|Amazon SageMaker Studio notebooks, On-Demand notebook instances||250 hours of ml.t3.medium instance on Studio notebooks, OR 250 hours of ml.t2.medium or ml.t3.medium instance on On-Demand notebook instances|
|Amazon SageMaker Data Wrangler||25 hours of ml.m5.4xlarge instance|
|Amazon SageMaker Feature Store||10M write units, 10M read units, 25 GB storage|
|Training||50 hours of m4.xlarge or m5.xlarge instances|
|Inference||125 hours of m4.xlarge or m5.xlarge instances|
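As a rough sketch, you could track usage against the free-tier allowances in the table above. The limits come from the table; the capability names and usage figures in this snippet are made up for illustration.

```python
# Free-tier allowances per month (first 2 months), from the table above.
# Keys are illustrative names, not official AWS identifiers.
FREE_TIER_HOURS = {
    "studio_notebooks": 250,    # ml.t3.medium on Studio notebooks
    "on_demand_notebooks": 250, # ml.t2.medium or ml.t3.medium
    "data_wrangler": 25,        # ml.m5.4xlarge
    "training": 50,             # m4.xlarge or m5.xlarge
    "inference": 125,           # m4.xlarge or m5.xlarge
}

def billable_hours(capability, hours_used):
    """Hours beyond the free-tier allowance for a capability."""
    return max(0, hours_used - FREE_TIER_HOURS[capability])

print(billable_hours("training", 60))    # 10 hours beyond the 50-hour allowance
print(billable_hours("inference", 100))  # 0 - fully covered by the free tier
```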
Amazon SageMaker Savings Plans
Amazon SageMaker Savings Plans help reduce your costs by up to 64%. The plans automatically apply to eligible SageMaker machine learning (ML) instance usage, including SageMaker Studio Notebooks, SageMaker On-Demand Notebooks, SageMaker Processing, SageMaker Data Wrangler, SageMaker Training, SageMaker Real-Time Inference, and SageMaker Batch Transform, regardless of instance family, size, or Region. For example, you can change usage from a CPU instance ml.c5.xlarge running in US East (Ohio) to an ml.inf1 instance in US West (Oregon) for inference workloads at any time and automatically continue to pay the Savings Plans price. Learn more »
Amazon SageMaker Studio is available at no additional charge
You can now access Amazon SageMaker Studio, the first fully integrated development environment (IDE) for machine learning, for free. SageMaker Studio gives you complete access and visibility into each step required to build, train, and deploy models. Using SageMaker Studio is free; you only pay for the AWS services that you use within Studio.
You can use many services within SageMaker Studio at no additional charge, including:
- SageMaker Pipelines to build and manage automated ML workflows
- SageMaker Autopilot to automatically create ML models with full visibility
- SageMaker Experiments to organize and track your training jobs and versions
- SageMaker Debugger to debug anomalies during training
- SageMaker Model Monitor to maintain high quality models
- SageMaker Clarify to better explain your ML models, and
- SageMaker JumpStart to easily deploy ML solutions for many use cases
You only pay for the underlying compute and storage resources within SageMaker or other AWS services, based on your usage.
Amazon SageMaker Ground Truth
Learn more about pricing for Amazon SageMaker Ground Truth, a fully managed data labeling service that makes it easy to build highly accurate training data sets for machine learning.
Amazon SageMaker Edge Manager
Learn more about pricing for Amazon SageMaker Edge Manager to optimize, run, and monitor ML models on fleets of edge devices.
Lower total cost of ownership (TCO) with Amazon SageMaker
Amazon SageMaker offers at least 54% lower total cost of ownership (TCO) over a 3-year period compared to other cloud-based self-managed solutions. Learn more with the complete TCO analysis for Amazon SageMaker.
Amazon SageMaker Pricing Calculator
You can now estimate your spend for using Amazon SageMaker, using the SageMaker Pricing Calculator. With the pricing calculator, you can get a cost estimate for your use case, export your estimates for offline analysis, and adjust your spend based on your requirements.
Amazon SageMaker Studio Notebooks
Amazon SageMaker Studio Notebooks are one-click Jupyter notebooks that can be spun up quickly. The underlying compute resources are fully elastic, and the notebooks can be easily shared with others, enabling seamless collaboration. You are charged for the instance type you choose, based on the duration of use.
On-Demand Notebook Instances
On-Demand notebook instances are compute instances running the Jupyter notebook app. You are charged for the instance type you choose, based on the duration of use.
Amazon SageMaker Processing
Amazon SageMaker Processing lets you easily run your pre-processing, post-processing, and model evaluation workloads on fully managed infrastructure. You are charged for the instance type you choose, based on the duration of use.
Amazon SageMaker Data Wrangler
Amazon SageMaker Data Wrangler reduces the time it takes to aggregate and prepare data for machine learning from weeks to minutes. You pay for the time used to cleanse, explore, and visualize data. SageMaker Data Wrangler is priced per instance type by the second.*
Amazon SageMaker Feature Store
Amazon SageMaker Feature Store is a central repository to ingest, store and serve features for machine learning. You are charged for writes, reads, and data storage on the SageMaker Feature Store. Writes are charged as write request units per KB, reads are charged as read request units per 4KB, and data storage is charged per GB per month.
Amazon SageMaker Training
Amazon SageMaker makes it easy to train machine learning (ML) models by providing everything you need to train, tune, and debug models. You are charged for usage of the instance type you choose. When you use Amazon SageMaker Debugger to debug issues and monitor resources during training, you can use built-in rules to debug your training jobs or write your own custom rules. There is no charge to use built-in rules to debug your training jobs. For custom rules, you are charged for the instance type you choose, based on the duration of use.
Amazon SageMaker Hosting: Real-Time Inference
Amazon SageMaker provides real-time inference for your use cases needing real-time predictions. You are charged for usage of the instance type you choose. When you use Amazon SageMaker Model Monitor to maintain highly accurate models providing real-time inference, you can use built-in rules to monitor your models or write your own custom rules. For built-in rules, you get up to 30 hours of monitoring at no charge. Additional charges will be based on duration of usage. You are charged separately when you use your own custom rules.
Amazon SageMaker Batch Transform
Using Amazon SageMaker Batch Transform, there is no need to break down your data set into multiple chunks or manage real-time endpoints. SageMaker Batch Transform allows you to run predictions on large or small batch datasets. You are charged for the instance type you choose, based on the duration of use.
Pricing Example #1: Studio Notebooks
A data scientist goes through the following sequence of actions while using Amazon SageMaker Studio Notebooks.
- Opens notebook 1 in a TensorFlow kernel on an ml.c5.xlarge instance, then works on this notebook for 1 hour.
- Opens notebook 2 on an ml.c5.xlarge instance. It will automatically open in the same ml.c5.xlarge instance that is running notebook 1.
- Works on notebook 1 and notebook 2 simultaneously for 1 hour.
- The data scientist is billed for a total of two (2) hours of ml.c5.xlarge usage. For the overlapping hour in which she worked on notebook 1 and notebook 2 simultaneously, each kernel application is metered for 0.5 hour, and she is billed for 1 hour.
|Kernel Application||Notebook Instance||Hours||Cost per hour||Total|
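The billing logic above can be sketched in Python: kernels that share one instance are billed for the instance's total running time, not once per kernel. The merge-overlapping-intervals approach and the $0.204/hour ml.c5.xlarge rate (taken from the Real-Time Inference example later on) are illustrative assumptions; actual rates vary by Region.

```python
# Sketch of Studio notebook billing: notebooks sharing one instance are
# billed for the union of their running time, not per kernel.
RATE_PER_HOUR = 0.204  # assumed ml.c5.xlarge rate, for illustration only

def billed_hours(sessions):
    """sessions: list of (start_hour, end_hour) tuples on one shared instance.
    Returns the instance hours billed (union of all sessions)."""
    events = sorted(sessions)
    total = 0.0
    cur_start, cur_end = events[0]
    for start, end in events[1:]:
        if start <= cur_end:            # overlapping sessions share the instance
            cur_end = max(cur_end, end)
        else:                           # gap: close out the previous interval
            total += cur_end - cur_start
            cur_start, cur_end = start, end
    total += cur_end - cur_start
    return total

# Notebook 1 runs from hour 0 to 2; notebook 2 runs from hour 1 to 2.
hours = billed_hours([(0, 2), (1, 2)])
print(hours, round(hours * RATE_PER_HOUR, 3))  # 2 hours billed
```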
Pricing Example #2: Processing
Amazon SageMaker Processing only charges you for the instances used while your jobs are executing. When you provide the input data for processing in Amazon S3, Amazon SageMaker downloads the data from Amazon S3 to local file storage at the start of a processing job.
A data analyst runs a Processing job to preprocess and validate data on two ml.m5.4xlarge instances for a job duration of 10 minutes. She uploads a dataset of 100 GB in S3 as input for the processing job, and the output data, which is roughly the same size, is stored back in S3.
|Hours||Processing Instances||Cost per hour||Total|
|2 * 0.167 = 0.334||ml.m5.4xlarge||$0.922||$0.308|
|General Purpose (SSD) Storage (GB)||Cost per GB-month||Total|
|100 GB * 2 = 200||$0.14||$0.0032|
The sub-total for Amazon SageMaker Processing job = $0.308;
The sub-total for 200 GB of general purpose SSD storage = $0.0032.
The total price for this example would be $0.3112.
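The arithmetic above can be reproduced with a short sketch. The instance rate and the prorated storage sub-total are taken from the example table; actual Region pricing may differ.

```python
# Processing example: 2 ml.m5.4xlarge instances for a 10-minute job.
instance_rate = 0.922          # $/hour for ml.m5.4xlarge (example rate)
instances = 2
job_hours = round(10 / 60, 3)  # 10 minutes ~ 0.167 hours

compute_cost = round(instances * job_hours * instance_rate, 3)  # $0.308
storage_cost = 0.0032          # prorated SSD storage sub-total from the table
total = round(compute_cost + storage_cost, 4)
print(compute_cost, total)     # 0.308 0.3112
```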
Pricing Example #3: Data Wrangler
As a data scientist, you spend three days using Amazon SageMaker Data Wrangler to cleanse, explore, and visualize your data for 6 hours per day. To execute your data preparation pipeline, you then initiate a SageMaker Data Wrangler job that is scheduled to run weekly.
The table below summarizes your total usage for the month and the associated charges for using Amazon SageMaker Data Wrangler.
|Application||SageMaker Studio Instance||Days||Duration||Total duration||Cost per hour||Cost sub-total|
|SageMaker Data Wrangler||ml.m5.4xlarge||3||6 hours||18 hours||$0.922||$16.596|
|SageMaker Data Wrangler job||ml.m5.4xlarge||-||40 minutes||2.67 hours||$0.922||$2.461|
From the table, you use Amazon SageMaker Data Wrangler for a total of 18 hours over 3 days to prepare your data. Additionally, you create a SageMaker Data Wrangler job to prepare updated data on a weekly basis. Each job lasts 40 minutes, and the job runs weekly for one month.
Total monthly charges for using Data Wrangler = $16.596 + $2.461 = $19.057
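A sketch of the same calculation, using the rates and durations from the table above (the table truncates the job sub-total to $2.461, so the last digit may differ slightly from a straight computation):

```python
# Data Wrangler example: 18 interactive hours plus four weekly 40-minute
# jobs per month, all on ml.m5.4xlarge at the example rate of $0.922/hour.
rate = 0.922
interactive_hours = 3 * 6              # 3 days * 6 hours/day = 18 hours
job_hours = round(4 * 40 / 60, 2)      # four 40-minute runs ~ 2.67 hours

interactive_cost = interactive_hours * rate  # $16.596
job_cost = job_hours * rate                  # ~$2.462
print(round(interactive_cost + job_cost, 2))
```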
Pricing Example #4: Feature Store
You have a web application which issues reads and writes of 25 KB each to the Amazon SageMaker Feature Store. For the first 10 days of a month, you receive little traffic to your application, resulting in 10,000 writes and 10,000 reads each day to the SageMaker Feature Store. On day 11 of the month, your application gains attention on social media and application traffic spikes to 200,000 writes and 200,000 reads that day. Your application then settles into a more regular traffic pattern, averaging 80,000 writes and 80,000 reads each day through the end of the month.
The table below summarizes your total usage for the month and the associated charges for using Amazon SageMaker Feature Store.
|Day of the month||Total Writes||Total Write Units||Total Reads||Total Read Units|
|Days 1 to 10||100,000 writes (10,000 writes * 10 days)||2,500,000 write units (100,000 * 25 KB)||100,000 reads (10,000 reads * 10 days)||700,000 read units (100,000 * 7)++|
|Day 11||200,000 writes||5,000,000 write units||200,000 reads||1,400,000 read units|
|Days 12 to 30||1,520,000 writes (80,000 writes * 19 days)||38,000,000 write units (1,520,000 * 25 KB)||1,520,000 reads (80,000 reads * 19 days)||10,640,000 read units (1,520,000 * 7)++|
|Total chargeable units||-||45,500,000 write units||-||12,740,000 read units|
|Monthly charges||-||$56.875 (45.5 million write units * $1.25 per million write units)||-||$3.185 (12.74 million read units * $0.25 per million read units)|
++ Each 25 KB read consumes 25/4 = 6.25 read units; all fractional read units are rounded up to the next whole number, so each read is billed as 7 read units.
Total data stored = 31.5 GB
Monthly charges for data storage = 31.5 GB * $0.45 = $14.175
Total monthly charges for Amazon SageMaker Feature Store = $56.875 + $3.185 + $14.175 = $74.235
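The write/read unit metering above can be sketched as follows. Unit sizes (1 KB per write unit, 4 KB per read unit, fractional units rounded up) and per-million prices come from this example; the 31.5 GB storage figure is taken as given.

```python
import math

# Feature Store unit metering: writes billed in 1 KB write units,
# reads in 4 KB read units, fractional units rounded up.
RECORD_KB = 25  # record size from the example

def write_units(n_writes, record_kb=RECORD_KB):
    return n_writes * math.ceil(record_kb)       # 25 units per 25 KB write

def read_units(n_reads, record_kb=RECORD_KB):
    return n_reads * math.ceil(record_kb / 4)    # 7 units per 25 KB read

total_writes = 10_000 * 10 + 200_000 + 80_000 * 19  # 1,820,000 writes
total_reads = total_writes                          # same traffic pattern

wu = write_units(total_writes)   # 45,500,000 write units
ru = read_units(total_reads)     # 12,740,000 read units
rw_cost = wu / 1e6 * 1.25 + ru / 1e6 * 0.25    # $56.875 + $3.185
storage_cost = 31.5 * 0.45                     # $14.175, as given above
print(wu, ru, round(rw_cost + storage_cost, 3))
```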
Pricing Example #5: Training
A data scientist has spent a week working on a model for a new idea. She trains the model 4 times on an ml.m4.4xlarge for 30 minutes per training run with Amazon SageMaker Debugger enabled, using 2 built-in rules and 1 custom rule that she wrote. For the custom rule, she specified an ml.m5.xlarge instance. She trains using 3 GB of training data in Amazon S3 and pushes 1 GB of model output into Amazon S3. SageMaker creates General Purpose SSD (gp2) volumes for each training instance and for each rule specified, so in this example a total of 4 General Purpose SSD (gp2) volumes will be created. SageMaker Debugger emits 1 GB of debug data to the customer's Amazon S3 bucket.
|Hours||Training Instance||Debug Instance||Cost per hour||Sub-total|
|4 * 0.5 = 2.00||ml.m4.4xlarge||n/a||$0.96||$1.92|
|4 * 0.5 * 2 = 4||n/a||No additional charges for built-in rule instances||$0||$0|
|4 * 0.5 = 2||n/a||ml.m5.xlarge||$0.23||$0.46|
|General Purpose (SSD) Storage for Training (GB)||General Purpose (SSD) Storage for Debugger built-in rules (GB)||General Purpose (SSD) Storage for Debugger custom rules (GB)||Cost per GB-Month||Sub- total|
|$0||No additional charges for built-in rule storage volumes||$0||$0.10||$0|
The total charges for training and debugging in this example are $2.38. The compute instances and general purpose storage volumes used by Amazon SageMaker Debugger built-in rules do not incur additional charges.
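A sketch of the training-and-debugging arithmetic, using the rates from the example table (actual Region pricing may differ):

```python
# Training example: four 30-minute runs on ml.m4.4xlarge, one custom
# Debugger rule on ml.m5.xlarge; built-in rules incur no charge.
training_hours = 4 * 0.5       # 2 instance-hours of training
custom_rule_hours = 4 * 0.5    # custom rule runs alongside each job

training_cost = training_hours * 0.96        # ml.m4.4xlarge at $0.96/hour
custom_rule_cost = custom_rule_hours * 0.23  # ml.m5.xlarge at $0.23/hour
builtin_rule_cost = 0.0                      # built-in rules are free

total = training_cost + custom_rule_cost + builtin_rule_cost
print(round(total, 2))   # 2.38
```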
Pricing Example #6: Real-Time Inference
The model in Example #5 is then deployed to production on two (2) ml.c5.xlarge instances for reliable multi-AZ hosting. Amazon SageMaker Model Monitor is enabled with one (1) ml.m5.4xlarge instance, and monitoring jobs are scheduled once per day. Each monitoring job takes 5 minutes to complete. The model receives 100 MB of data per day, and inferences are 1/10 the size of the input data.
|Hours per month||Hosting Instances||Model Monitor Instances||Cost per hour||Total|
|24 * 31 * 2 = 1,488||ml.c5.xlarge||n/a||$0.204||$303.552|
|31 * 0.08 = 2.5||n/a||ml.m5.4xlarge||$0.922||$2.305|
|Data In per month - Hosting||Data Out per month - Hosting||Cost per GB In or Out||Total|
|100 MB * 31 = 3,100 MB||n/a||$0.02||$0.06|
|n/a||10 MB * 31 = 310 MB||$0.02||$0.01|
The sub-total for hosting and monitoring = $303.552 + $2.305 = $305.857; the sub-total for 3,100 MB of data processed in and 310 MB of data processed out per month = $0.07. The total charges for this example would be approximately $305.93 per month.
Note: for built-in rules with an ml.m5.xlarge instance, you get up to 30 hours of monitoring, aggregated across all endpoints, each month at no charge.
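A sketch of the monthly hosting arithmetic, using the example's rates and its rounding of 31 daily 5-minute monitoring jobs to 2.5 hours (small rounding differences from the table's last digits are possible):

```python
# Real-time inference example (31-day month): two always-on ml.c5.xlarge
# endpoints, one daily 5-minute Model Monitor job on ml.m5.4xlarge,
# plus per-GB data processing for hosting.
hosting_cost = 24 * 31 * 2 * 0.204   # 1,488 hours at $0.204/hour
monitor_hours = 2.5                  # example rounds 31 * 5 min to 2.5 h
monitor_cost = monitor_hours * 0.922
data_gb = (100 + 10) * 31 / 1000     # MB in + out per day, as GB
data_cost = data_gb * 0.02           # $0.02 per GB in or out

total = hosting_cost + monitor_cost + data_cost
print(round(total, 2))
```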
Pricing Example #7: Batch Transform
Amazon SageMaker Batch Transform only charges you for the instances used while your jobs are executing. If your data is already in Amazon S3, there is no cost for reading input data from S3 and writing output data to S3.
The model in Example #5 is used to run SageMaker Batch Transform. The data scientist runs four separate SageMaker Batch Transform jobs on 3 ml.m4.4xlarge instances for 15 minutes per job run. She uploads an evaluation dataset of 1 GB to S3 for each run, and inferences are 1/10 the size of the input data, which are stored back in S3.
|Hours||Transform Instances||Cost per hour||Total|
|3 * 0.25 * 4 = 3 hours||ml.m4.4xlarge||$0.96||$2.88|
|GB Data In - Batch Transform||GB Data Out - Batch Transform||Cost per GB In or Out||Total|
|1 GB * 4 = 4 GB||0.1 GB * 4 = 0.4 GB||$0||$0|
The sub-total for the SageMaker Batch Transform job = $2.88; the sub-total for the 4.4 GB of data in and out of Amazon S3 = $0. The total charges for this example would be $2.88.
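The compute arithmetic above can be sketched as follows, using the $0.96/hour ml.m4.4xlarge rate from the example:

```python
# Batch Transform example: four jobs, each on three ml.m4.4xlarge
# instances for 15 minutes, at the example rate of $0.96/hour.
jobs = 4
instances = 3
hours_per_job = 0.25                 # 15 minutes

compute_hours = jobs * instances * hours_per_job  # 3 instance-hours
compute_cost = compute_hours * 0.96
print(compute_hours, round(compute_cost, 2))   # 3.0 2.88
```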