How do I use AWS Batch as a target for my CloudWatch Events rule?

Last updated: 2019-12-09

I want to use AWS Batch as a target for my Amazon CloudWatch Events rule.

Resolution

The following example shows how you can configure a CloudWatch Events rule to submit an AWS Batch job when a new Amazon Simple Storage Service (Amazon S3) bucket is created.

Create a job definition

1.    Open the AWS Batch console.

2.    From the navigation bar, select your AWS Region.

3.    In the navigation pane, choose Job definitions, and then choose Create.

4.    For Job definition name, enter a unique name for your job definition.

Note: You can use up to 128 letters (uppercase and lowercase), numbers, hyphens, and underscores in your unique name.

5.    For Container image, enter amazonlinux.

6.    For Command, enter the following:

echo Ref::S3bucket

7.    For vCPUs, enter 2.

8.    For Memory (MiB), enter 500.

9.    Choose Next, and then choose Create.
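The console steps above can be sketched as the equivalent AWS Batch RegisterJobDefinition request body. This is a minimal sketch: the job definition name is a hypothetical placeholder, while the image, command, vCPUs, and memory match the console values. Ref::S3bucket is a Batch parameter placeholder that is substituted when the job is submitted.

```python
import json

job_definition = {
    "jobDefinitionName": "my-batch-job-def",  # hypothetical name
    "type": "container",
    "containerProperties": {
        "image": "amazonlinux",
        # Ref::S3bucket is replaced by the "S3bucket" parameter at submit time
        "command": ["echo", "Ref::S3bucket"],
        "vcpus": 2,
        "memory": 500,  # MiB
    },
}

print(json.dumps(job_definition, indent=2))
```

This dict could be passed to boto3's `batch.register_job_definition(**job_definition)`, or saved as JSON for `aws batch register-job-definition --cli-input-json`.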

Get the ARN of your job queue and job definition

1.    Open the AWS Batch console.

2.    In the navigation pane, choose Job queues.

3.    Choose your job queue.

4.    In the Overview section, copy the Queue ARN, and then save it for later.

5.    In the navigation pane, choose Job definitions, and then choose the job definition that you previously created.

6.    In the Job definitions attributes section, copy the Job definition ARN, and then save it for later.
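Instead of copying the ARNs from the console, you can extract them from the Batch DescribeJobQueues and DescribeJobDefinitions responses. The sample dicts below mirror the documented response shapes; in practice they would come from boto3 (for example, `boto3.client("batch").describe_job_queues()`). The queue name, job definition name, Region, and account ID are placeholders.

```python
# Sample response shapes for DescribeJobQueues and DescribeJobDefinitions.
sample_queues = {
    "jobQueues": [
        {
            "jobQueueName": "my-queue",
            "jobQueueArn": "arn:aws:batch:us-east-1:111122223333:job-queue/my-queue",
        }
    ]
}
sample_definitions = {
    "jobDefinitions": [
        {
            "jobDefinitionName": "my-batch-job-def",
            "revision": 1,
            "jobDefinitionArn": "arn:aws:batch:us-east-1:111122223333:job-definition/my-batch-job-def:1",
        }
    ]
}

queue_arn = sample_queues["jobQueues"][0]["jobQueueArn"]
definition_arn = sample_definitions["jobDefinitions"][0]["jobDefinitionArn"]
print(queue_arn)
print(definition_arn)
```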

Create the CloudWatch Events rule

1.    Open the CloudWatch console.

2.    In the navigation pane, choose Rules.

3.    Choose Create rule.

4.    In the Event Source section, choose Event Pattern.

5.    For Service Name, choose Simple Storage Service (S3).

6.    For Event Type, choose Bucket Level Operations.

Note: To trigger AWS Batch jobs on Amazon S3 object-level operations (for example, when a new object is uploaded to an existing bucket), see How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?

7.    Choose Any operation.

8.    In the Targets section, choose Add target.

9.    From the dropdown menu, choose Batch job queue.

10.    For Job queue, paste in the job queue ARN that you copied earlier.

11.    For Job definition, paste in the job definition ARN that you copied earlier.

12.    For Job name, enter a name for your AWS Batch job.

13.    In the Configure input section, choose Input Transformer.

14.    In the first input box, enter the S3 bucket values to be sent when the event is triggered:

{"S3BucketNameValue":"$.detail.requestParameters.bucketName"}

Note: S3BucketNameValue is a key name that you choose. You reference the same key name in the next step.

15.    In the second input box, enter the Parameters structure to be passed to the Batch job:

{"Parameters": {"S3bucket": <S3BucketNameValue>}}

Note: S3BucketNameValue (in angle brackets) must match the key that you defined in the first input box; the input transformer replaces it with the extracted bucket name. Replace S3bucket with the name of the parameter that you defined in your AWS Batch job definition.

16.    Choose either Create a new role for this specific resource or Use existing role.

Note: If you choose an existing role, the role must have an AWS Identity and Access Management (IAM) policy that allows the batch:SubmitJob action.

17.    Choose Configure details.

18.    For Name, enter the name for your rule.

19.    For State, select the Enabled checkbox.

20.    Choose Create rule.
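The rule configured above can be sketched as the equivalent CloudWatch Events (EventBridge) PutRule/PutTargets request. Bucket-level S3 API calls reach CloudWatch Events through CloudTrail, so the event pattern matches "AWS API Call via CloudTrail" events from s3.amazonaws.com. The target ID, job name, and the ARN placeholders in angle brackets are assumptions; substitute the values from the previous sections.

```python
import json

# Event pattern for any bucket-level S3 operation (delivered via CloudTrail).
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {"eventSource": ["s3.amazonaws.com"]},
}

# Target definition for PutTargets.
target = {
    "Id": "batch-target",                          # hypothetical target ID
    "Arn": "<job-queue-arn>",                      # from the previous section
    "RoleArn": "<role-with-batch-submitjob>",      # role from step 16
    "BatchParameters": {
        "JobDefinition": "<job-definition-arn>",   # from the previous section
        "JobName": "my-batch-job",                 # hypothetical job name
    },
    "InputTransformer": {
        # First input box (input paths map):
        "InputPathsMap": {
            "S3BucketNameValue": "$.detail.requestParameters.bucketName"
        },
        # Second input box (template); <S3BucketNameValue> is substituted
        # with the extracted bucket name:
        "InputTemplate": '{"Parameters": {"S3bucket": <S3BucketNameValue>}}',
    },
}

print(json.dumps(event_pattern))
```

With boto3 this would map to `events.put_rule(Name=..., EventPattern=json.dumps(event_pattern), State="ENABLED")` followed by `events.put_targets(Rule=..., Targets=[target])`.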

Test the new rule that you created

1.    Open the Amazon S3 console.

2.    Choose Create bucket.

3.    In the Bucket name field, enter a unique DNS-compliant name for your new bucket.

4.    For Region, choose the same Region where you created the CloudWatch Events rule.

5.    Choose Create.

Check your logs

1.    Open the AWS Batch console.

2.    In the navigation pane, choose Jobs.

3.    Choose your job with a Status of SUCCEEDED.

4.    On the Job details page, in the Attempts section, choose View logs. The log, shown in the CloudWatch console, displays the name of the bucket that you created.
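You can also locate the job's log stream programmatically. The sample dict below mirrors the documented Batch DescribeJobs response shape; in practice it would come from `boto3.client("batch").describe_jobs(jobs=[...])`. The job and log stream names are placeholders.

```python
# Sample DescribeJobs response shape for a completed container job.
sample_jobs = {
    "jobs": [
        {
            "jobName": "my-batch-job",
            "status": "SUCCEEDED",
            "container": {
                # Batch container jobs write to the /aws/batch/job log group
                "logStreamName": "my-batch-job-def/default/0123456789abcdef",
            },
        }
    ]
}

job = sample_jobs["jobs"][0]
if job["status"] == "SUCCEEDED":
    print("log group:  /aws/batch/job")
    print("log stream:", job["container"]["logStreamName"])
```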