In the video on the left, Rendy shows you how
to stop an EC2 instance with AWS Data Pipeline

 

In the video on the right, Rendy walks you through
creating an AWS Data Pipeline role


I don't need to run my Amazon EC2 instances at certain times, such as weekends or nights. How can I stop and start my Amazon EC2 instances at scheduled intervals without terminating the instances?

I would like a convenient method for stopping and starting my EC2 instances at scheduled intervals. There are PowerShell and AWS CLI scripts that can accomplish this, but scripts can be error-prone, require management of access keys, and introduce external dependencies that I would like to avoid.

AWS Data Pipeline is well suited to this task. Data Pipeline uses AWS technologies and can be configured to run AWS CLI commands on a set schedule with no external dependencies. Data Pipeline writes logs to Amazon S3 and runs in the context of an IAM role, which eliminates key management requirements. Data Pipeline is also cost-effective; for example, the Data Pipeline free tier can be used to stop and start instances once per day. For more information, see AWS Data Pipeline Pricing.

Note
An Amazon EC2 t1.micro instance is started as the host environment each time a pipeline runs, and it runs for a default timeout period of 50 minutes. All resources used to host pipeline execution are accrued to your account. Because one pipeline run stops your instances and a second run starts them, each stop/start cycle consumes about 100 minutes of t1.micro host time (50 minutes to stop and 50 minutes to start). To ensure that the method described in this article does not consume more resources than it conserves, use it only to stop and restart one or more t1.micro or larger instances that will remain stopped for more than 100 minutes.
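The break-even arithmetic in this note can be sketched as a quick shell calculation. The 50-minute timeout comes from the note above; the 8-hour stopped window is an assumed example, not a value from this article:

```shell
# Each pipeline run hosts a t1.micro for the 50-minute default timeout.
STOP_RUN_MINUTES=50
START_RUN_MINUTES=50
OVERHEAD_MINUTES=$((STOP_RUN_MINUTES + START_RUN_MINUTES))  # 100 minutes per stop/start cycle

# Assumed example: instances stay stopped overnight for 8 hours.
STOPPED_MINUTES=$((8 * 60))

if [ "$STOPPED_MINUTES" -gt "$OVERHEAD_MINUTES" ]; then
  echo "net savings: instances are stopped longer than the pipeline overhead"
else
  echo "no savings: pipeline overhead exceeds the stopped time"
fi
```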

Data Pipeline requires that you create a customer managed policy for the DataPipelineDefaultResourceRole role if the default policy associated with this role is the AmazonEC2RoleforDataPipelineRole AWS managed policy, because AWS managed policies are read-only. If the policy attached to DataPipelineDefaultResourceRole is not an AWS managed policy, you do not need to create a customer managed policy; you can edit the policy assigned to the role instead. Follow these steps to create a customer managed policy for the DataPipelineDefaultResourceRole role:

  1. In the AWS Management Console, in the Security and Identity section, choose Identity & Access Management (IAM) to open the Identity and Access Management dashboard.
  2. Choose Policies.
  3. Choose Create Policy to open the Create Policy page.
  4. Choose the option to Create Your Own Policy.
  5. Enter a policy name that you will associate with the DataPipelineDefaultResourceRole; for example, DataPipelineDefaultResourceRole_EC2_Policy might be suitable.
  6. Enter a description for the policy; for example, "Policy associated with the DataPipelineDefaultResourceRole when starting and stopping EC2 Instances with Data Pipeline."
  7. Enter the following information into the Policy Document section for the new policy:
    {
         "Version": "2012-10-17",
         "Statement": [
              {
                   "Effect": "Allow",
                   "Action": [
                        "s3:*",
                        "ec2:Describe*",
                        "ec2:Start*",
                        "ec2:RunInstances",
                        "ec2:Stop*",
                        "datapipeline:*",
                        "cloudwatch:*"
                   ],
                   "Resource": [
                        "*"
                   ]
              }
         ]
    }

    Note
    It is recommended that you apply the same permissions described here to any customer managed policy currently associated with the "DataPipelineDefaultResourceRole" role.
  8. Choose Validate Policy; after the policy is validated, choose Create Policy to create the new policy.
  9. After the new policy is created, attach the policy to the DataPipelineDefaultResourceRole role:
    1. Enter DataPipeline or other appropriate prefix for the Policy Type filter expression. Check the box next to the newly created policy and choose the Attach option from the Policy Actions drop-down menu.
    2. On the Attach Policy page, enter the filter expression "datapipeline" and select the check box next to the DataPipelineDefaultResourceRole entry that is returned by the filter.
    3. Choose Attach Policy at the bottom of the page to associate the new policy with the DataPipelineDefaultResourceRole.
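As an alternative to the console steps above, the same policy can be created and attached from the AWS CLI. This is a sketch: the policy name matches the example in step 5, and the account ID in the policy ARN is a placeholder you must replace with your own.

```shell
# Write the policy document from step 7 to a local file.
cat > DataPipelineDefaultResourceRole_EC2_Policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:*",
                "ec2:Describe*",
                "ec2:Start*",
                "ec2:RunInstances",
                "ec2:Stop*",
                "datapipeline:*",
                "cloudwatch:*"
            ],
            "Resource": ["*"]
        }
    ]
}
EOF

# Create the customer managed policy, then attach it to the role.
# Requires IAM permissions; replace 123456789012 with your account ID.
# aws iam create-policy \
#     --policy-name DataPipelineDefaultResourceRole_EC2_Policy \
#     --policy-document file://DataPipelineDefaultResourceRole_EC2_Policy.json
# aws iam attach-role-policy \
#     --role-name DataPipelineDefaultResourceRole \
#     --policy-arn arn:aws:iam::123456789012:policy/DataPipelineDefaultResourceRole_EC2_Policy
```

The aws iam calls are shown commented out; uncomment and run them once after verifying the policy file.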

Follow these steps to create and configure Data Pipeline to run AWS CLI commands that stop and start Amazon EC2 instances at scheduled intervals:

1. Create pipelines

Open the Data Pipeline console. Choose Create New Pipeline and enter the following information to create two pipelines:

Name: for example, "Start EC2 instances" and "Stop EC2 instances".
Description: Provide relevant details about the pipeline as needed.
Source: Choose Build using template and choose the template Run AWS CLI command.
AWS CLI command: This is where you specify what the pipeline does. Create two pipelines, one to run the aws ec2 start-instances command and another to run the aws ec2 stop-instances command.

Note: Both ec2 start-instances and ec2 stop-instances require valid values for the --region parameter.

For example, the following command could be used to start the specified EC2 instances:

aws ec2 start-instances --instance-ids i-abcd1234 i-987a654b i-ba154f3c --region us-east-x

The following command stops the same EC2 instances you started:

aws ec2 stop-instances --instance-ids i-abcd1234 i-987a654b i-ba154f3c --region us-east-x

Important
If any instance IDs passed to the --instance-ids parameter do not exist, the entire command fails and no instances are stopped or started. This is a problem if any of the specified instances has been terminated. For example, if instance i-abcd1234 has been terminated, the aws ec2 start-instances example would not start any of the designated instances. Consider issuing separate, semicolon-delimited commands to guard against this scenario:

aws ec2 start-instances --instance-ids i-abcd1234 --region us-east-x;
aws ec2 start-instances --instance-ids i-987a654b --region us-east-x;
aws ec2 start-instances --instance-ids i-ba154f3c --region us-east-x

For more information about using the AWS CLI to start and stop EC2 instances, see start-instances and stop-instances in the AWS CLI documentation.
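One way to issue the per-instance commands from the Important note is a simple shell loop. This is a sketch using the same placeholder instance IDs and region as the examples above:

```shell
# Start each instance with its own command so one terminated ID
# cannot prevent the remaining instances from starting.
for id in i-abcd1234 i-987a654b i-ba154f3c; do
  aws ec2 start-instances --instance-ids "$id" --region us-east-x \
    || echo "warning: could not start $id" >&2
done
```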

2. Apply Data Pipeline schedules

After creating separate pipelines to start and stop your instances, configure each pipeline with appropriate scheduling information. For more information, see Scheduling Pipelines.
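For reference, a schedule in a Data Pipeline definition looks roughly like the following sketch; the id, period, and startDateTime values are placeholders to adjust for your own schedule:

```json
{
    "id": "DailySchedule",
    "type": "Schedule",
    "period": "1 days",
    "startDateTime": "2016-01-01T01:00:00"
}
```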

3. Configure Data Pipeline logging

Enable logging for each pipeline and specify an S3 bucket in the same region to store the pipeline logs. Data Pipeline logging is not mandatory; however, if it is not enabled, the console displays a warning when you perform Data Pipeline validation.

4. Implement security access

Set the following options for implementing appropriate security access:

IAM Roles: Choose Custom
Pipeline Role: DataPipelineDefaultRole
EC2 Instance Role: DataPipelineDefaultResourceRole

Note
Data Pipeline creates the necessary IAM roles for you.

5. Update role permissions

In the AWS Management Console, choose IAM, and then choose Roles. If the policy attached to the DataPipelineDefaultResourceRole role is not an AWS managed policy, select the role and edit its policy to include the permissions shown in step 7 of the customer managed policy procedure above. Otherwise, attach the customer managed policy you created earlier to the DataPipelineDefaultResourceRole role. For more information, see Editing Your Pipeline.

6. Activate the pipelines

Choose Activate in the console to activate the pipelines. You can monitor Data Pipeline activities in the console to verify that actions are completed successfully and on schedule.
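If you prefer the CLI, pipelines can also be activated and monitored there. This is a sketch; the pipeline ID below is a hypothetical placeholder, which you can look up with aws datapipeline list-pipelines:

```shell
# Hypothetical pipeline ID; find yours with: aws datapipeline list-pipelines
PIPELINE_ID="df-0123456789ABC"

# Activate the pipeline, then inspect its recent runs.
# aws datapipeline activate-pipeline --pipeline-id "$PIPELINE_ID"
# aws datapipeline list-runs --pipeline-id "$PIPELINE_ID"
echo "pipeline to activate: $PIPELINE_ID"
```

The aws datapipeline calls are shown commented out; uncomment them after substituting your own pipeline ID.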




Back to the AWS Support Knowledge Center

Need help? Visit the AWS Support Center.