AWS Security Blog

How to Facilitate Data Analysis and Fulfill Security Requirements by Using Centralized Flow Log Data

I am an AWS Professional Services consultant, which means I work directly with AWS customers on a daily basis. One of my customers recently asked me to provide a solution to help them fulfill their security requirements by having the flow log data from VPC Flow Logs sent to a central AWS account. This is a common requirement at companies that need their logs available in a single place for in-depth analysis. In addition, my customers regularly request a simple, scalable, and serverless solution that doesn’t require them to create and maintain custom code.

In this blog post, I demonstrate how to configure your AWS accounts to send flow log data from VPC Flow Logs to an Amazon S3 bucket located in a central AWS account by using only fully managed AWS services. The benefit of using fully managed services is that you can lower or even completely eliminate operational costs because AWS manages the resources and scales the resources automatically.

Solution overview

The solution in this post uses VPC Flow Logs, which is configured in a source account to send flow logs to an Amazon CloudWatch Logs log group. To receive the logs from multiple accounts, this solution uses a CloudWatch Logs destination in the central account. Finally, the solution uses fully managed Amazon Kinesis Firehose, which delivers streaming data to scalable and durable S3 object storage automatically, without the need to write custom applications or manage resources. After the logs are processed and stored in an S3 bucket, they can be tiered automatically into a lower cost, long-term storage solution (such as Amazon Glacier) to help meet any company-specific or industry-specific requirements for data retention.
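
Each flow log record that moves through this pipeline is a single space-separated line of text. For reference, the default record format captures the version, account ID, network interface ID, source and destination addresses and ports, protocol, packet and byte counts, capture window start and end times, action, and log status. An accepted SSH connection therefore looks roughly like the following record (the account and interface IDs here are placeholders):

2 123456789012 eni-1235b8ca 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK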

The following diagram illustrates the process and components of the solution described in this post.

Diagram illustrating the process and components of the solution described in this post

As numbered in the preceding diagram, these are the high-level steps for implementing this solution:

  1. Create an S3 bucket in the central account.
  2. Create IAM roles and IAM policies for Kinesis Firehose and the CloudWatch Logs destination in the central account.
  3. Create a Kinesis Firehose delivery stream and configure it to send data to the S3 bucket in the central account.
  4. Create a CloudWatch Logs destination and configure it to send logs to Kinesis Firehose in the central account.
  5. Create an IAM role for VPC Flow Logs in the source accounts.
  6. Enable VPC Flow Logs to send data to the CloudWatch Logs log group in the source accounts.
  7. Finally, in the source accounts, set a subscription filter on the CloudWatch Logs log group to send data to the CloudWatch Logs destination.

Configure the solution by using AWS CloudFormation and the AWS CLI

Now that I have explained the solution, its benefits, and the components involved, I will show how to configure a source account by using the AWS CLI and the central account using a CloudFormation template. To implement this, you need two separate AWS accounts. If you need to set up a new account, navigate to the AWS home page, choose Create an AWS Account, and follow the instructions. Alternatively, you can use AWS Organizations to create your accounts. See AWS Organizations – Policy-Based Management for Multiple AWS Accounts for more details and some step-by-step instructions about how to use this service.

Note your source and target account IDs, the VPC ID in the source account, and the region in the target account where you will create your resources with the CloudFormation template. You will need these values as part of implementing this solution.

Gather the required configuration information

To find your AWS account ID number in the AWS Management Console, choose Support in the navigation bar, and then choose Support Center. The 12-digit ID of the account you are currently signed in to appears in the upper-right corner, under the Support menu. Sign in to both the source and target accounts and take note of their respective account numbers.

Screenshot showing where you can find your AWS account ID number
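
If you have the AWS CLI configured, you can also retrieve the ID of the account your current credentials belong to without opening the console:

aws sts get-caller-identity --query Account --output text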

To find your VPC ID number in the AWS Management Console, sign in to your source account and choose Services. In the search box, type VPC and then choose VPC (Isolated Cloud Resources).

Screenshot showing how to search for your VPC ID number

In the VPC console, choose Your VPCs, as shown in the following screenshot.

Screenshot of Your VPCs in the VPC console

Note your VPC ID.

Screenshot showing where to find your VPC ID number
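
Alternatively, the following AWS CLI command lists the IDs of all VPCs in your source account:

aws ec2 describe-vpcs --query "Vpcs[].VpcId" --output text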

Finally, take note of the region in which you are creating your resources in the target account. The region name is shown in the region selector in the navigation bar of the AWS Management Console (see the following screenshot), and the region code is shown in your browser’s address bar. Sign in to your target account, choose Services, and type CloudFormation. In the following screenshot, the region name is Ireland and the region code is eu-west-1.

Screenshot showing where to find the region name and region
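
If you plan to create the stack in the default region of your AWS CLI configuration, you can also confirm that region from the command line:

aws configure get region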

Create the resources in the central account

In this example, I use a CloudFormation template to create all related resources in the central account. You can use CloudFormation to create and manage a collection of AWS resources called a stack. CloudFormation also takes care of the resource provisioning for you.

The provided CloudFormation template creates an S3 bucket that will store all the VPC Flow Logs, IAM roles with associated IAM policies used by Kinesis Firehose and the CloudWatch Logs destination, a Kinesis Firehose delivery stream, and a CloudWatch Logs destination.

To configure the resources in the central account:

  1. Sign in to the AWS Management Console, navigate to CloudFormation, and then choose Create Stack. On the Select Template page, type the URL of the CloudFormation template (https://s3.amazonaws.com/awsiammedia/public/sample/VPCFlowLogsCentralAccount/targetaccount.template) for the target account and choose Next.
    Screenshot showing how to specify an S3 template URL

The CloudFormation template defines a parameter, paramDestinationPolicy, which sets the IAM policy on the CloudWatch Logs destination. This policy governs which AWS accounts can create subscription filters against this destination.

  2. Change the Principal to your SourceAccountID, and in the Resource section, change TargetAccountRegion and TargetAccountID to the values you noted in the previous section.
{
  "Version" : "2012-10-17",
  "Statement" : [
        {"Effect" : "Allow",
            "Principal" : {"AWS" : "SourceAccountID"},
            "Action" : "logs:PutSubscriptionFilter",
            "Resource" : "arn:aws:logs:TargetAccountRegion:TargetAccountID:destination:VPCFlowLogsDestination"}]
}
  3. After you have updated the policy, choose Next. On the Options page, choose Next.
    Screenshot of supplying the CloudWatch destination policy
  4. On the Review page, scroll down and choose the I acknowledge that AWS CloudFormation might create IAM resources with custom names check box. Finally, choose Create to create the stack.
  5. After the stack creation is complete, verify the status of the resources. Select the check box next to the Stack Name and choose the Resources tab, where you can see a list of all the resources created with this template and their status.
    Screenshot of all resources created by the template and their status

This completes the configuration of the central account.
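
If you prefer to script this part instead of using the console, commands along the following lines create the same stack and confirm that the CloudWatch Logs destination exists. This is only a sketch: the stack name is arbitrary, and parameters.json is a file you would create yourself that sets the paramDestinationPolicy parameter to the destination policy JSON shown earlier, with your own account ID and region substituted.

aws cloudformation create-stack \
      --stack-name central-flow-logs \
      --template-url https://s3.amazonaws.com/awsiammedia/public/sample/VPCFlowLogsCentralAccount/targetaccount.template \
      --capabilities CAPABILITY_NAMED_IAM \
      --parameters file://parameters.json

aws cloudformation wait stack-create-complete --stack-name central-flow-logs

aws logs describe-destinations --destination-name-prefix VPCFlowLogsDestination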

Configure a source account

Now that you have created all the necessary resources in the central account, I will show you how to configure a source account by using the AWS CLI, a tool for managing your AWS resources. For more information about installing and configuring the AWS CLI, see Configuring the AWS CLI.
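
Because the rest of the walkthrough switches between the source account and the central account, it can be convenient to configure a named AWS CLI profile for each and append --profile to the commands that follow. The profile names here are only examples:

aws configure --profile source-account
aws configure --profile central-account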

To configure a source account:

  1. Create the IAM role that will grant VPC Flow Logs the permissions to send data to the CloudWatch Logs log group. To start, create a trust policy in a file named TrustPolicyForCWL.json by using the following policy document.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "vpc-flow-logs.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

Use the following command to create the IAM role, specifying the trust policy file you just created. Note the returned Arn value because you will pass it to VPC Flow Logs later.

aws iam create-role \
      --role-name PublishFlowLogs \
      --assume-role-policy-document file://~/TrustPolicyForCWL.json

The following output shows the Arn value.

{
    "Role": {
        "AssumeRolePolicyDocument": {
            "Statement": {
                "Action": "sts:AssumeRole",
                "Effect": "Allow",
                "Principal": {
                    "Service": "vpc-flow-logs.amazonaws.com"
                }
            }
        },
        "RoleId": "AAOIIAH450GAB4HC5F431",
        "CreateDate": "2017-05-29T14:42:19.121Z",
        "RoleName": "PublishFlowLogs",
        "Path": "/",
        "Arn": "arn:aws:iam::123456789012:role/PublishFlowLogs"
    }
}
  2. Now, create a permissions policy to define which actions VPC Flow Logs can perform in the source account. Start by creating the permissions policy in a file named PermissionsForVPCFlowLogs.json. The following set of permissions (in the Action element) is the minimum required for VPC Flow Logs to be able to send data to the CloudWatch Logs log group.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}   

Now, associate the permissions policy with the IAM role by running the following command in the AWS CLI.

aws iam put-role-policy --role-name PublishFlowLogs --policy-name Permissions-Policy-For-VPCFlowLogs --policy-document file://~/PermissionsForVPCFlowLogs.json
  3. Run the following command to create the CloudWatch Logs log group to which VPC Flow Logs will send its data.
aws logs create-log-group --log-group-name vpc-flow-logs
  4. Now that you have created an IAM role with the required permissions and a CloudWatch Logs log group, replace the placeholder VPC ID and role ARN in the following command with your own values and run it to enable VPC Flow Logs.
aws ec2 create-flow-logs --resource-type VPC --resource-ids vpc-12345678 --traffic-type ALL --log-group-name vpc-flow-logs --deliver-logs-permission-arn arn:aws:iam::123456789012:role/PublishFlowLogs
  5. Finally, change the destination ARN in the following command to reflect your TargetAccountRegion and TargetAccountID, and run the command to subscribe your CloudWatch Logs log group to the CloudWatch Logs destination in the central account. Optional commands to verify the configuration follow below.
aws logs put-subscription-filter --log-group-name "vpc-flow-logs" --filter-name "AllTraffic" --filter-pattern "" --destination-arn "arn:aws:logs:targetaccountregion:targetaccountID:destination:VPCFlowLogsDestination"
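
Optionally, you can verify the source account configuration from the AWS CLI, and you can set a retention period on the log group so that raw log events do not accumulate indefinitely in CloudWatch Logs after they have been forwarded. The 14-day retention period below is only an example:

aws ec2 describe-flow-logs
aws logs describe-subscription-filters --log-group-name "vpc-flow-logs"
aws logs put-retention-policy --log-group-name "vpc-flow-logs" --retention-in-days 14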

This completes the configuration of this central logging solution. Within a few minutes, you should start seeing compressed logs sent from your source account to the S3 bucket in your central account (see the following screenshot). You can process and analyze these logs with the analytics tool of your choice. In addition, you can tier the data into a lower cost, long-term storage solution such as Amazon Glacier and archive the data to meet the data retention requirements of your company.

Screenshot of logs sent from the source account to the S3 bucket in the central account
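
To tier the data as described above, a lifecycle configuration such as the following transitions the delivered log objects to Amazon Glacier after 30 days. This is only a sketch: the bucket name is a placeholder for the bucket created by the CloudFormation template, and you would choose the transition period to match your own retention requirements.

aws s3api put-bucket-lifecycle-configuration \
      --bucket central-flow-logs-bucket \
      --lifecycle-configuration '{"Rules": [{"ID": "ArchiveFlowLogs", "Status": "Enabled", "Filter": {"Prefix": ""}, "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}]}]}'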

Summary

In this post, I demonstrated how to configure AWS accounts to send VPC Flow Logs to an S3 bucket located in a central AWS account by using only fully managed AWS services. If you have comments about this post, submit them in the “Comments” section below. If you have questions about implementing this solution, start a new thread on the CloudWatch forum.

– Tomasz