AWS Security Blog

Use AWS Fargate and Prowler to send security configuration findings about AWS services to Security Hub

May 3, 2021: Since the author wrote this post, Security Hub has launched native features that simplify integration with Prowler as a findings provider. Therefore, Security Hub native integration with Prowler is now the recommended solution for sending findings from Prowler. For more information, see the Prowler documentation.


In this blog post, I’ll show you how to integrate Prowler, an open-source security tool, with AWS Security Hub. Prowler provides dozens of security configuration checks related to services such as Amazon Redshift, Amazon ElastiCache, Amazon API Gateway, and Amazon CloudFront. Integrating Prowler with Security Hub provides posture information about resources not currently covered by existing Security Hub integrations or compliance standards. You can use Prowler checks to supplement the CIS AWS Foundations compliance standard that Security Hub already provides, as well as other compliance-related findings you may be ingesting from partner solutions.

In this post, I’ll show you how to containerize Prowler using Docker and host it on the serverless container service AWS Fargate. By running Prowler on Fargate, you no longer have to provision, configure, or scale infrastructure, and it will only run when needed. Containers provide a standard way to package your application’s code, configurations, and dependencies into a single object that can run anywhere. Serverless applications automatically run and scale in response to events you define, rather than requiring you to provision, scale, and manage servers.

Solution overview

The following diagram shows the flow of events in the solution I describe in this blog post.
 

Figure 1: Prowler on Fargate Architecture

 

The integration works as follows:

  1. A time-based CloudWatch Events rule starts the Fargate task on a schedule you define.
  2. Fargate pulls a Prowler Docker image from Amazon Elastic Container Registry (ECR).
  3. Prowler scans your AWS infrastructure and writes the scan results to a CSV file.
  4. Python scripts in the Prowler container convert the CSV to JSON and load an Amazon DynamoDB table with formatted Prowler findings.
  5. A DynamoDB stream invokes an AWS Lambda function.
  6. The Lambda function maps Prowler findings into the AWS Security Finding Format (ASFF) before importing them to Security Hub.
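
The mapping in step 6 can be sketched in Python. This is an illustrative sketch, not the deployed function: the account ID and Region values are placeholder assumptions (the real Lambda function would read them from its environment or the stream record), and the severity and type values shown are example choices.

```python
import datetime

# Placeholder values for illustration only; the deployed Lambda function
# would read the account ID and Region from its environment.
ACCOUNT_ID = '123456789012'
REGION = 'us-east-1'

def map_to_asff(item):
    """Map one Prowler finding (as stored in DynamoDB) to an ASFF finding."""
    now = datetime.datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%SZ')
    failed = item['RESULT'].strip().upper() != 'PASS'
    return {
        'SchemaVersion': '2018-10-08',
        'Id': 'prowler-{}'.format(item['TITLE_ID']),
        'ProductArn': 'arn:aws:securityhub:{0}:{1}:product/{1}/default'.format(REGION, ACCOUNT_ID),
        'GeneratorId': item['TITLE_ID'],
        'AwsAccountId': ACCOUNT_ID,
        'Types': ['Software and Configuration Checks'],
        'CreatedAt': now,
        'UpdatedAt': now,
        'Severity': {'Label': 'MEDIUM' if failed else 'INFORMATIONAL'},
        'Title': item['TITLE_TEXT'],
        'Description': item['NOTES'] or item['TITLE_TEXT'],
        'Resources': [{'Type': 'AwsAccount',
                       'Id': 'AWS::::Account:' + ACCOUNT_ID}],
        'Compliance': {'Status': 'FAILED' if failed else 'PASSED'},
    }
```

The function would then batch the mapped findings and send them to Security Hub with the BatchImportFindings API (`batch_import_findings` in boto3).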

Except for an ECR repository, you’ll deploy all of the above via AWS CloudFormation. You’ll also need the following prerequisites to supply as parameters for the CloudFormation template.

Prerequisites

  • A VPC with at least two subnets that have internet access, plus a security group that allows outbound access on port 443 (HTTPS).
  • An ECS task role with the permissions that Prowler needs to complete its scans. You can find more information about these permissions on the official Prowler GitHub page.
  • An ECS task execution IAM role to allow Fargate to publish logs to CloudWatch and to download your Prowler image from Amazon ECR.

Step 1: Create an Amazon ECR repository

In this step, you’ll create an ECR repository. This is where you’ll upload your Docker image for Step 2.

  1. Navigate to the Amazon ECR console and select Create repository.
  2. Enter a name for your repository (I’ve named my example securityhub-prowler, as shown in Figure 2), then choose Mutable as your image tag mutability setting, and select Create repository.
     
    Figure 2: ECR Repository Creation

Keep the browser tab in which you created the repository open so that you can easily reference the Docker commands you’ll need in the next step.
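
If you prefer to script this step, you can create the same repository with the AWS SDK for Python (boto3). This sketch is an alternative to the console steps, not part of the original walkthrough; it assumes boto3 is installed and credentials are configured, and it only makes the API call when you set EXECUTE=1:

```python
import os

# Parameters mirroring the console steps above (example values from this post)
create_repo_params = {
    'repositoryName': 'securityhub-prowler',
    'imageTagMutability': 'MUTABLE',
}

if os.environ.get('EXECUTE') == '1':
    import boto3  # AWS SDK for Python
    ecr = boto3.client('ecr', region_name='us-east-1')
    response = ecr.create_repository(**create_repo_params)
    print(response['repository']['repositoryUri'])
```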

Step 2: Build and push the Docker image

In this step, you’ll create a Docker image containing scripts that will map Prowler findings and load them into DynamoDB. Before you begin, ensure your workstation has the necessary permissions to push images to ECR.

  1. Create a Dockerfile via your favorite text editor, and name it Dockerfile.
    
    FROM python:latest
    
    # Declare env vars (overridden at runtime by the ECS task definition)
    ENV MY_DYANMODB_TABLE=MY_DYANMODB_TABLE
    ENV AWS_REGION=AWS_REGION
    
    # Install dependencies
    RUN \
        apt update && \
        apt upgrade -y && \
        pip install awscli && \
        apt install -y python3-pip
    
    # Place scripts
    ADD converter.py /root
    ADD loader.py /root
    ADD script.sh /root
    
    # Install Prowler, move scripts into the Prowler directory
    RUN \
        git clone https://github.com/toniblyx/prowler && \
        mv /root/converter.py /prowler && \
        mv /root/loader.py /prowler && \
        mv /root/script.sh /prowler
    
    # Run Prowler, ETL output with converter and load DynamoDB with loader
    WORKDIR /prowler
    RUN pip3 install boto3
    CMD bash script.sh
    
  2. Create a new file called script.sh and paste in the code below. This script calls the remaining scripts, which you’re about to create, in a specific order.

    Note: Change the AWS Region in the Prowler command on line 3 to the region in which you’ve enabled Security Hub.

    
    #!/bin/bash
    echo "Running Prowler Scans"
    ./prowler -b -n -f us-east-1 -g extras -M csv > prowler_report.csv
    echo "Converting Prowler Report from CSV to JSON"
    python converter.py
    echo "Loading JSON data into DynamoDB"
    python loader.py
    
  3. Create a new file called converter.py and paste in the below code. This Python script will convert the Prowler CSV report into JSON, and both versions will be written to the local storage of the Prowler container.
    
    import csv
    import json
    
    # Paths within the container
    CSV_PATH = 'prowler_report.csv'
    JSON_PATH = 'prowler_report.json'
    
    # Read the Prowler CSV output into a list of dicts
    with open(CSV_PATH, 'r') as f:
        json_list = list(csv.DictReader(f))
    
    # Write the raw JSON version of the report
    with open(JSON_PATH, 'w') as f:
        json.dump(json_list, f)
    
    # Remove data not needed for Security Hub BatchImportFindings
    for element in json_list:
        del element['PROFILE']
        del element['SCORED']
        del element['LEVEL']
        del element['ACCOUNT_NUM']
        del element['REGION']
    
    # Write out the formatted report to a new file, prettified
    with open('format_prowler_report.json', 'w') as f:
        json.dump(json_list, f, indent=2)
    
  4. Create your last file, called loader.py, and paste in the code below. This Python script reads values from the JSON file and sends them to DynamoDB.
    
    from __future__ import print_function # Python 2/3 compatibility
    import boto3
    import json
    import decimal
    import os
    
    awsRegion = os.environ['AWS_REGION']
    prowlerDynamoDBTable = os.environ['MY_DYANMODB_TABLE']
    
    dynamodb = boto3.resource('dynamodb', region_name=awsRegion)
    
    table = dynamodb.Table(prowlerDynamoDBTable)
    
    # CHANGE FILE AS NEEDED
    with open('format_prowler_report.json') as json_file:
        findings = json.load(json_file, parse_float = decimal.Decimal)
        for finding in findings:
            TITLE_ID = finding['TITLE_ID']
            TITLE_TEXT = finding['TITLE_TEXT']
            RESULT = finding['RESULT']
            NOTES = finding['NOTES']
    
            print("Adding finding:", TITLE_ID, TITLE_TEXT)
    
            table.put_item(
               Item={
                   'TITLE_ID': TITLE_ID,
                   'TITLE_TEXT': TITLE_TEXT,
                   'RESULT': RESULT,
                   'NOTES': NOTES,
                }
            )
    
  5. From the ECR console, within your repository, select View push commands to get operating system-specific instructions and additional resources to build, tag, and push your image to ECR. See Figure 3 for an example.
     
    Figure 3: ECR Push Commands

    Note: If you’ve built Docker images previously within your workstation, pass the --no-cache flag with your docker build command.

  6. After you’ve built and pushed your image, note the URI within the ECR console (such as 12345678910.dkr.ecr.us-east-1.amazonaws.com/my-repo), as you’ll need this for a CloudFormation parameter in Step 3.

Step 3: Deploy CloudFormation template

Download the CloudFormation template from GitHub and create a CloudFormation stack. For more information about how to create a CloudFormation stack, see Getting Started with AWS CloudFormation in the CloudFormation User Guide.
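
You can also create the stack programmatically rather than through the console. The sketch below is illustrative only: the stack name, template file name, and parameter keys are hypothetical placeholders (use the actual keys shown on the template's Parameters page), and the API call runs only when EXECUTE=1 is set.

```python
import os

# Hypothetical parameter keys and placeholder values for illustration;
# substitute the real keys from the template's Parameters page and the
# values you noted in Step 2 and the prerequisites.
stack_params = [
    {'ParameterKey': 'ProwlerImageURI',
     'ParameterValue': '12345678910.dkr.ecr.us-east-1.amazonaws.com/my-repo'},
    {'ParameterKey': 'SubnetIds', 'ParameterValue': 'subnet-aaaa,subnet-bbbb'},
    {'ParameterKey': 'SecurityGroupId', 'ParameterValue': 'sg-0123456789abcdef0'},
]

if os.environ.get('EXECUTE') == '1':
    import boto3
    cfn = boto3.client('cloudformation', region_name='us-east-1')
    cfn.create_stack(
        StackName='prowler-securityhub',            # hypothetical stack name
        TemplateBody=open('prowler-template.yaml').read(),  # downloaded template
        Capabilities=['CAPABILITY_NAMED_IAM'],      # needed if the template creates IAM resources
        Parameters=stack_params,
    )
```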

You’ll need the values you noted in Step 2 and during the “Solution overview” prerequisites. The description of each parameter is provided on the Parameters page of the CloudFormation deployment (see Figure 4).
 

Figure 4: CloudFormation Parameters

After the CloudFormation stack finishes deploying, select the Resources tab to find your task definition (called ProwlerECSTaskDefinition). You’ll need this during Step 4.
 

Figure 5: CloudFormation Resources

Step 4: Manually run ECS task

In this step, you’ll run your ECS task manually to verify the integration works. (After you’ve tested it, subsequent runs will start automatically on the schedule defined by the CloudWatch Events rule.)

  1. Navigate to the Amazon ECS console and from the navigation pane select Task Definitions.
  2. As shown in Figure 6, select the check box for the task definition you deployed via CloudFormation, then select the Actions dropdown menu and choose Run Task.
     
    Figure 6: ECS Run Task

  3. Configure the following settings (shown in Figure 7), then select Run Task:
    1. Launch Type: FARGATE
    2. Platform Version: Latest
    3. Cluster: Select the cluster deployed by CloudFormation
    4. Number of tasks: 1
    5. Cluster VPC: Enter the VPC of the subnets you provided as CloudFormation parameters
    6. Subnets: Select 1 or more subnets in the VPC
    7. Security groups: Enter the same security group you provided as a CloudFormation parameter
    8. Auto-assign public IP: ENABLED
       
      Figure 7: ECS Task Settings

  4. Depending on the size of your account and the resources within it, your task can take up to an hour to complete. To follow its progress, select your task and look at the Logs tab within the Task view (Figure 8). The stdout from Prowler will appear in the logs.

    Note: Once the task has completed, it will automatically delete itself. You do not need to take additional action for this to happen during this or subsequent runs.

     

    Figure 8: ECS Task Logs

  5. Under the Details tab, monitor the status. When the status reads Stopped, navigate to the DynamoDB console.
  6. Select your table, then select the Items tab. Your findings will be indexed under the primary key NOTES, as shown in Figure 9. From here, the Lambda function will trigger each time new items are written into the table from Fargate and will load them into Security Hub.
     
    Figure 9: DynamoDB Items

  7. Finally, navigate to the Security Hub console, select the Findings menu, and wait for findings from Prowler to arrive in the dashboard, as shown in Figure 10.
     
    Figure 10: Prowler Findings in Security Hub
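
The console settings above can also be expressed as a single RunTask API call. This is a sketch with placeholder values: the cluster name, task definition name, subnets, and security group shown here are assumptions standing in for the resources from your CloudFormation stack, and the call runs only when EXECUTE=1 is set.

```python
import os

# Mirrors the console settings in Figure 7; cluster, task definition,
# subnet, and security group values are placeholders for the resources
# created by (or supplied to) the CloudFormation stack.
run_task_args = {
    'cluster': 'prowler-cluster',
    'taskDefinition': 'ProwlerECSTaskDefinition',
    'launchType': 'FARGATE',
    'platformVersion': 'LATEST',
    'count': 1,
    'networkConfiguration': {
        'awsvpcConfiguration': {
            'subnets': ['subnet-aaaa', 'subnet-bbbb'],
            'securityGroups': ['sg-0123456789abcdef0'],
            'assignPublicIp': 'ENABLED',
        }
    },
}

if os.environ.get('EXECUTE') == '1':
    import boto3
    ecs = boto3.client('ecs', region_name='us-east-1')
    print(ecs.run_task(**run_task_args))
```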

If you run into errors when running your Fargate task, refer to the Amazon ECS Troubleshooting guide. Log errors commonly come from missing permissions or disabled Regions; refer back to the Prowler GitHub page for troubleshooting information.

Conclusion

In this post, I showed you how to containerize Prowler, run it manually, create a schedule with CloudWatch Events, and use custom Python scripts along with DynamoDB streams and Lambda functions to load Prowler findings into Security Hub. By using Security Hub, you can centralize and aggregate security configuration information from Prowler alongside findings from AWS and partner services.

From Security Hub, you can use custom actions to send one or a group of findings from Prowler to downstream services such as ticketing systems or to take custom remediation actions. You can also use Security Hub custom insights to create saved searches from your Prowler findings. Lastly, you can use Security Hub in a master-member format to aggregate findings across multiple accounts for centralized reporting.

If you have feedback about this blog post, submit comments in the Comments section below. If you have questions about this blog post, start a new thread on the AWS Security Hub forum.

Want more AWS Security news? Follow us on Twitter.

Jonathan Rau

Jonathan is the Senior TPM for AWS Security Hub. He holds an AWS Certified Security - Specialty certification and is extremely passionate about cybersecurity, data privacy, and new emerging technologies, such as blockchain. He devotes personal time to research and advocacy about those same topics.