AWS Cloud Operations & Migrations Blog

Using OPA to create AWS Config rules

In this blog post, we will show how you can use AWS Config custom rules with Open Policy Agent (OPA) to evaluate the compliance of your AWS resources.

AWS Config enables you to assess, audit, and evaluate the configuration of your AWS resources. The service continuously monitors and records your AWS resource configurations and allows you to automate the evaluation of recorded configurations against desired configurations. AWS Config provides managed rules that address the most common use cases for evaluating compliance. AWS Config also supports custom rules that allow you to define your own logic by using AWS Lambda and one of the programming languages supported by AWS Lambda.

OPA is a general-purpose policy engine that has gained a lot of momentum in the Kubernetes community. It is often used to validate logic in deployment descriptors before applying them to a cluster. One example of this is creating a policy that only allows deployments that reference containers from trusted repositories.

Extending OPA policy evaluation to AWS resources allows you to develop policies in Rego, the policy language used by OPA, and use them to validate the resource configurations in your account. This allows you to reuse policies you might have already developed and deploy them by using AWS Config.

By using AWS Config custom rules and AWS Lambda, you can extend the functionality of custom rules to use OPA. This is useful for a couple of reasons:

  • Many customers already have complex policies that they developed in Rego to evaluate compliance in their environments. By using OPA with AWS Config rules, they can bring those policies into AWS Config and take advantage of features like conformance packs and organizational deployments.
  • By using OPA with AWS Config rules, customers can centralize the collection of compliance data in AWS Config. This eliminates the undifferentiated heavy lifting associated with managing custom solutions to track compliance. It also allows customers to take advantage of other managed compliance services, such as AWS Audit Manager, which uses compliance evaluation data produced by AWS Config to simplify the process of collecting audit evidence.

Solution overview

Figure 1: Solution architecture

Resource changes detected by AWS Config trigger a rule that executes the Lambda code with OPA. The OPA engine evaluates the resource configuration against the policy and returns a result of compliant or non-compliant.

The solution consists of an AWS Config custom rule that triggers a Lambda function when a resource is modified. The Lambda function contains the logic to process rules based on the configuration item (CI) provided by AWS Config and the OPA binary in an AWS Lambda layer. Rego policies used by Lambda are stored in Amazon Simple Storage Service (Amazon S3).

When a resource is created or modified, AWS Config triggers the evaluation of the resource to create a CI that contains all the property values of the resource. That information is sent to the Lambda function, which retrieves the OPA policy associated with the rule from S3. It then uses that policy to evaluate the CI and determine if the resource is compliant. The evaluation results are sent to AWS Config, which determines the compliance state of the resource.
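Conceptually, the evaluation performed inside the Lambda function is similar to running the OPA command-line interface against a configuration item saved as a JSON file. The following command is only an illustration rather than the exact invocation used by the solution; the file names configuration-item.json and policy.rego and the rule path data.example.compliant are hypothetical and depend on how your Rego package is structured.

./opa eval --format pretty --input configuration-item.json --data policy.rego "data.example.compliant"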

Let’s walk through the process of deploying the sample solution.

Prerequisites

This solution requires the deployment of a Lambda function with a Lambda layer. In the following steps, you’ll download, build, and deploy the required packages.

To complete the steps, you need the following:

  • An AWS account with AWS Config enabled in the Region where you will deploy the solution.
  • Access to AWS CloudShell, or a local computer with the AWS Command Line Interface (AWS CLI) configured and a Git client installed.
  • Permissions to create the S3 bucket, IAM roles, Lambda function, and AWS Config rules deployed by the CloudFormation templates in this post.

Create an S3 bucket

Create an S3 bucket where you will store the Lambda artifacts that will be deployed and the Rego policies used by the solution. You can use an existing bucket if you have one. Just make sure you know the bucket name and where the files are placed so you can modify the commands we provide. The following steps use the S3 console; equivalent AWS CLI commands are shown after the steps.

  1. In the Amazon S3 management console, choose Create bucket.
  2. Enter a name for the bucket and select an AWS Region.
  3. In Block public access settings for this bucket, make sure Block all public access is selected to protect the data from being exposed to the internet.
  4. In Default encryption, choose Enable to encrypt the data in the bucket. You can leave the default option of Amazon S3 key (SSE-S3). It is a best practice to encrypt data in S3 buckets.
  5. Choose Create bucket.
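If you prefer the AWS CLI, the following commands are an equivalent sketch of the console steps above. Replace the placeholder bucket name and Region with your own values; for Regions other than us-east-1, the create-bucket call also needs a --create-bucket-configuration LocationConstraint setting.

aws s3api create-bucket --bucket [replace with bucket name] --region us-east-1

aws s3api put-public-access-block --bucket [replace with bucket name] \
--public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

aws s3api put-bucket-encryption --bucket [replace with bucket name] \
--server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'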

Deploy the solution

In this post, we use AWS CloudShell to download and deploy the solution. You can follow the same steps on your local computer, or you can download the solution in ZIP format from the GitHub repository. If you choose to use your own computer, make sure you have installed the AWS Command Line Interface (AWS CLI) and configured it with access to the AWS account you want to deploy to. You will also need a Git client.

In the AWS CloudShell console, open a session. You can choose the CloudShell icon in the AWS Management Console (shown in Figure 2).

Figure 2: CloudShell icon

To download the code to the CloudShell session, use git to clone the repository. (You can also download the repository in ZIP format and then upload it to CloudShell. Choose Actions, choose Upload file, and then point to the ZIP file.) This repository contains a collection of samples and solutions for cloud operations services. The OPA sample is located in the ‘AWSConfig’ directory.

git clone https://github.com/aws-samples/aws-management-and-governance-samples.git

Change into the cloned repository directory and locate AWSConfig/AWS-Config-OPA.
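Assuming the default directory name that git clone creates, the following command takes you to the solution directory:

cd aws-management-and-governance-samples/AWSConfig/AWS-Config-OPA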

Here is the directory structure for the solution:

  • cfn_templates contains the CloudFormation templates to deploy the Lambda function and AWS Config rules.
  • lambda_sources contains the source file for the Lambda function and the OPA binary that is deployed as a layer for the Lambda function. Packaged sources are under the packaged_lambda_assets directory.
  • opa_policies contains Rego policies that correspond to rules deployed by CloudFormation templates.

Create the Lambda function

Download the OPA engine binary and copy it to the lambda_sources/layers/opa/bin directory. The following command downloads version 0.31.0 of the static Linux binary; adjust the version as needed:

curl -L -o opa https://openpolicyagent.org/downloads/v0.31.0/opa_linux_amd64_static
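The downloaded file isn't marked as executable by default. Assuming you ran the curl command from the root of the solution directory, the following commands make the binary executable and place it in the layer's bin directory:

chmod +x opa
mv opa lambda_sources/layers/opa/bin/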

The package_lambda.sh script creates the ZIP files for the OPA layer and the Lambda function. From the AWS CloudShell prompt, run the following command:

./package_lambda.sh

The zipped packages will be under the lambda_sources/packaged_lambda_assets directory. Upload the packaged OPA layer and Lambda source to the S3 bucket by using the following AWS CLI command:

aws s3 sync ./lambda_sources/packaged_lambda_assets s3://[replace with bucket name]/packaged_lambda_assets

Upload the Rego policies

To upload the sample Rego policies to S3, use the following command.

aws s3 sync ./opa_policies s3://[replace with bucket name]/opa_policies

Deploy the Lambda function

We need to create the Lambda function that will run the OPA engine and evaluate the Rego policies. You’ll find a CloudFormation template that creates the Lambda function in cfn_templates/lambda_backend/opa-lambda.yaml. This template uses the resources that were copied to the S3 bucket.

To deploy the Lambda function from the AWS CloudShell prompt, make sure you are in the solution folder and then enter the following. Replace <S3 Bucket Name> with the name of your bucket.

aws cloudformation create-stack --stack-name opa-config-lambda \
--template-body file://cfn_templates/lambda_backend/opa-lambda.yaml \
--parameters ParameterKey=AssetsBucket,ParameterValue=<S3 Bucket Name> \
--capabilities "CAPABILITY_IAM"
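Stack creation usually completes within a few minutes. You can wait for it to finish and confirm its status with the following commands:

aws cloudformation wait stack-create-complete --stack-name opa-config-lambda
aws cloudformation describe-stacks --stack-name opa-config-lambda --query "Stacks[0].StackStatus"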

Create the AWS Config rules

You’ll find CloudFormation templates that create the AWS Config rules in the cfn_templates/config_rules directory, along with a few samples. You can add templates for rules based on your own Rego policies to this directory.

In the following example, we deploy the opa-ebs-encryption rule, which will evaluate Amazon Elastic Block Store (Amazon EBS) resources against the Rego policy defined.

To deploy the opa-ebs-encryption rule, run the CloudFormation template with the following command. Replace <S3 Bucket Name> with the name of your bucket:

aws cloudformation create-stack --stack-name opa-ebs-encryption \
--template-body file://cfn_templates/config_rules/opa-ebs-encryption.yaml \
--parameters ParameterKey=AssetsBucket,ParameterValue=<S3 Bucket Name>

After the CloudFormation template has been deployed, the new rule for EBS encryption will evaluate all EBS volumes in your account.
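After AWS Config has evaluated the volumes, you can review the results in the AWS Config console or query them with the CLI. The command below assumes the deployed rule is named opa-ebs-encryption; if the template uses a different rule name, check the AWS Config console and adjust the command accordingly.

aws configservice get-compliance-details-by-config-rule --config-rule-name opa-ebs-encryption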

Create more rules

To develop a new rule, you need to create one AWS Config rule and one Rego policy. You deploy the rule with CloudFormation and upload the Rego policy to the same S3 bucket used for the other policies. The parameters of the AWS Config rule must match the name of the S3 bucket and the Rego policy. You can find these details when you edit the deployed rule, and an example of the general pattern follows Figure 3.

Figure 3: Parameters in the AWS Config console
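As a rough sketch, deploying an additional rule involves uploading the new Rego policy and then creating a stack from the corresponding template. The file names my-policy.rego and my-rule.yaml and the stack name my-opa-rule are hypothetical placeholders; your rule template might also require a parameter that points to the Rego policy object, so check the sample templates for the exact parameter names.

aws s3 cp ./opa_policies/my-policy.rego s3://[replace with bucket name]/opa_policies/my-policy.rego

aws cloudformation create-stack --stack-name my-opa-rule \
--template-body file://cfn_templates/config_rules/my-rule.yaml \
--parameters ParameterKey=AssetsBucket,ParameterValue=<S3 Bucket Name>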

Clean up

To clean up the solution, delete the CloudFormation stacks and the S3 bucket where the assets were stored.
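For example, assuming you used the stack names from this post and created a dedicated bucket, the following commands remove the deployed resources. Emptying and deleting the bucket is irreversible, so double-check the bucket name before you run these commands.

aws cloudformation delete-stack --stack-name opa-ebs-encryption
aws cloudformation delete-stack --stack-name opa-config-lambda
aws s3 rm s3://[replace with bucket name] --recursive
aws s3 rb s3://[replace with bucket name]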

Conclusion

In this blog post, we showed how you can extend AWS Config custom rules to use Open Policy Agent. You might need to tailor the solution to meet the needs of your organization before you put it into production. For large-scale deployments, consider how the solution scales with the number of rules and policies you use.

About the authors

Andres Silva

Andres Silva (andress@amazon.com) is a Principal Specialist Solutions Architect with the Management Tools team at AWS. He has been working with AWS technology for more than 9 years. Andres works closely with the AWS Service teams to design solutions at scale that help customers implement and support complex cloud infrastructures. When he is not building cloud automation, he enjoys skateboarding with his 2 kids.

Eddie Esquivel

Eddie (esqueddi@amazon.com) is a Sr. Solutions Architect in the ISV segment at AWS. He spent time at several startups focusing on big data and Kubernetes before joining AWS. At AWS, he is focused on Management and Governance and on helping customers make the best use of AWS technology. In his spare time, he enjoys spending time outdoors with his wife and pet dog.

Ionut Dragoi

Ionut (ionudra@amazon.com) is an AWS solutions architect focused on helping customers from various industries solve business problems using AWS Cloud technologies. Ionut has expertise in scaling infrastructure and operations, and designs solutions that make use of infrastructure as code and automation. Outside work he likes to spend time skiing and playing tennis whenever possible.