How do I use AWS Batch as a target for my EventBridge rule?

Last updated: 2022-01-19

I want to use AWS Batch as a target for my Amazon EventBridge rule.

Resolution

The following example shows how to configure Amazon EventBridge to submit an AWS Batch job when a new Amazon Simple Storage Service (Amazon S3) bucket is created.

Create a job definition

1.    Open the AWS Batch console.

2.    From the navigation bar, select your AWS Region.

3.    In the navigation pane, choose Job definitions, and then choose Create.

4.    For Job definition name, enter a unique name for your job definition.

Note: The name can be up to 128 characters long and can contain uppercase and lowercase letters, numbers, hyphens, and underscores.

5.    For Container image, enter amazonlinux.

6.    For Command, enter the following:

echo Ref::S3bucket

7.    For vCPUs, enter 2.

8.    For Memory (MiB), enter 500.

9.    Choose Next, and then choose Create.
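The console steps above create a job definition similar to the following JSON sketch (the jobDefinitionName is illustrative; use whatever name you entered in step 4). Note the Ref::S3bucket placeholder in the command: AWS Batch replaces it with the value of the S3bucket parameter at job submission time.

```json
{
  "jobDefinitionName": "echo-s3-bucket",
  "type": "container",
  "containerProperties": {
    "image": "amazonlinux",
    "command": ["echo", "Ref::S3bucket"],
    "vcpus": 2,
    "memory": 500
  }
}
```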

Get the ARN of your job queue and job definition

1.    Open the AWS Batch console.

2.    In the navigation pane, choose Job queues.

3.    Choose your job queue.

4.    In the Job queue details section, copy the Queue ARN, and then save it for later.

5.    In the navigation pane, choose Job definitions, and then choose the job definition that you previously created.

6.    In the Job definition details section, copy the Job definition ARN, and then save it for later.
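The two ARNs follow predictable formats. As a quick local sanity check (the Region, account ID, and resource names below are placeholders), you can verify that you copied a job queue ARN and a job definition ARN rather than some other Batch resource:

```python
import re

# AWS Batch ARN formats (values below are illustrative placeholders):
#   job queue:      arn:aws:batch:<region>:<account>:job-queue/<name>
#   job definition: arn:aws:batch:<region>:<account>:job-definition/<name>:<revision>
QUEUE_RE = re.compile(r"^arn:aws:batch:[a-z0-9-]+:\d{12}:job-queue/[\w-]+$")
JOBDEF_RE = re.compile(r"^arn:aws:batch:[a-z0-9-]+:\d{12}:job-definition/[\w-]+:\d+$")

def is_job_queue_arn(arn: str) -> bool:
    return bool(QUEUE_RE.match(arn))

def is_job_definition_arn(arn: str) -> bool:
    return bool(JOBDEF_RE.match(arn))

print(is_job_queue_arn("arn:aws:batch:us-east-1:123456789012:job-queue/my-queue"))  # True
print(is_job_definition_arn("arn:aws:batch:us-east-1:123456789012:job-definition/echo-s3-bucket:1"))  # True
```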

Enable CloudTrail data events logging for objects in an S3 bucket

To trigger AWS Batch jobs on Amazon S3 object-level operations (for example, when a new object is uploaded to an existing bucket), see Enabling CloudTrail event logging for S3 buckets and objects.
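For reference, turning on object-level (data event) logging attaches an event selector like the following sketch to your trail (the bucket ARN is a placeholder). The bucket-level rule in this example relies on management events, which are logged by default.

```json
{
  "ReadWriteType": "All",
  "IncludeManagementEvents": true,
  "DataResources": [
    {
      "Type": "AWS::S3::Object",
      "Values": ["arn:aws:s3:::DOC-EXAMPLE-BUCKET/"]
    }
  ]
}
```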

Create the EventBridge rule

1.    Open the EventBridge console.

2.    Choose Create rule.

3.    Enter a Name for your rule. You can optionally enter a Description.

4.    In Define pattern, select Event pattern.

5.    Select Pre-defined pattern by service.

6.    For Service provider, choose AWS.

7.    For Service name, choose Simple Storage Service (S3).

8.    For Event type, choose Bucket-Level API Call via CloudTrail.

9.    Choose Any operation.

10.    In the Select targets section, choose Batch job queue from the Target dropdown list.

11.    For Job queue, paste in the job queue ARN that you copied earlier.

12.    For Job definition, paste in the job definition ARN that you copied earlier.

13.    For Job name, enter a name for your AWS Batch job.

14.    In the Configure input section, choose Input Transformer.

15.    In the first input box (the input path), enter the JSON path that extracts the S3 bucket name from the event:

{"S3BucketNameValue":"$.detail.requestParameters.bucketName"}

Note: You can replace S3BucketNameValue with a variable name of your choice. Use the same name in the input template in the next step.

16.    In the second input box, enter the input template that builds the Parameters structure passed to the AWS Batch job:

{"Parameters": {"S3bucket": <S3BucketNameValue>}}

Note: S3BucketNameValue must match the variable name that you defined in the input path in the previous step. Replace S3bucket with the name of the parameter that you want to define in your AWS Batch job.

17.    Choose either Create a new role for this specific resource or Use existing role.

Note: If you choose an existing role, that role must have an AWS Identity and Access Management (IAM) policy that allows the batch:SubmitJob action.

18.    Choose Create.
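To see what the input transformer in steps 15 and 16 produces, the following sketch applies the same mapping locally to a trimmed-down, illustrative CloudTrail CreateBucket event: the input path pulls detail.requestParameters.bucketName into the S3BucketNameValue variable, and the input template substitutes it to build the Parameters structure that EventBridge passes to batch:SubmitJob.

```python
import json

# Trimmed, illustrative CloudTrail "CreateBucket" event as EventBridge receives it.
event = {
    "source": "aws.s3",
    "detail-type": "AWS API Call via CloudTrail",
    "detail": {
        "eventSource": "s3.amazonaws.com",
        "eventName": "CreateBucket",
        "requestParameters": {"bucketName": "my-new-bucket"},
    },
}

# Step 15 (input path): "$.detail.requestParameters.bucketName" -> S3BucketNameValue
variables = {"S3BucketNameValue": event["detail"]["requestParameters"]["bucketName"]}

# Step 16 (input template). EventBridge quotes string variables itself, so the
# console template has no quotes around <S3BucketNameValue>; this local sketch
# adds them explicitly before substituting.
template = '{"Parameters": {"S3bucket": "<S3BucketNameValue>"}}'
for name, value in variables.items():
    template = template.replace(f"<{name}>", value)

payload = json.loads(template)
print(payload)  # {'Parameters': {'S3bucket': 'my-new-bucket'}}
```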

Test the new rule that you created

1.    Open the Amazon S3 console.

2.    Choose Create bucket.

3.    In the Bucket name field, type a unique DNS-compliant name for your new bucket.

4.    For Region, choose the same Region where you created the EventBridge rule.

5.    Choose Create.
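Step 3 asks for a DNS-compliant bucket name. As a rough local check before you create the bucket (this sketch covers the common rules only, not every edge case such as names formatted like IP addresses):

```python
import re

# Common S3 bucket naming rules: 3-63 characters; lowercase letters, digits,
# dots, and hyphens; must start and end with a letter or digit.
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def looks_dns_compliant(name: str) -> bool:
    return bool(BUCKET_RE.match(name)) and ".." not in name

print(looks_dns_compliant("my-eventbridge-test-bucket"))  # True
print(looks_dns_compliant("My_Bucket"))                   # False
```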

Check your logs

1.    Open the AWS Batch console.

2.    In the navigation pane, choose Jobs.

3.    Choose your job with a Status of SUCCEEDED.

4.    On the Job details page, in the Attempts section, choose View logs. The log stream opens in the CloudWatch console and displays the name of the bucket that you created.

