Push VPC flow logs to Splunk using Amazon Kinesis Data Firehose


I'm installing Splunk heavy forwarders to analyze my Amazon Virtual Private Cloud (Amazon VPC) data. I'm pushing data from AWS sources to Splunk clusters for processing, but it takes multiple steps. How can I better integrate my AWS data with Splunk?

Instead of using heavy forwarders, you can use Splunk's HTTP Event Collector (HEC) and Amazon Kinesis Data Firehose to send data and application events to Splunk clusters. You can do this by:

  1. Creating a Data Firehose delivery stream
  2. Configuring AWS Lambda for record transformation
  3. Configuring VPC flow logs
  4. Creating a CloudWatch Logs subscription to your stream

Before you begin, be sure to set up a Splunk HTTP Event Collector (HEC) endpoint and authentication token in your Splunk deployment. The delivery stream sends data to this endpoint.
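
(Optional) You can verify that the HEC endpoint and token work before you build the delivery stream. This is a minimal sketch, assuming the standard HEC event endpoint on port 8088; replace the hostname and YOUR-HEC-TOKEN with your own values.

$ curl -k "https://YOUR-ENDPOINT.splunk.com:8088/services/collector/event" \
    -H "Authorization: Splunk YOUR-HEC-TOKEN" \
    -d '{"event": "HEC connectivity test"}'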

Start creating the Data Firehose delivery stream

1.    Create your delivery stream. For Source, choose Direct PUT or other sources.

2.    Choose Next.

Configure record transformation with AWS Lambda

1.    Configure record transformation.
Note: Be sure to choose Enabled for Record transformation under Transform source records with AWS Lambda. You must enable this option because CloudWatch Logs delivers the log data as compressed (gzip) payloads, and Kinesis Data Firehose must decompress and extract the log events before they can be sent to Splunk.

2.    For Lambda function, select Create new.

3.    On the Choose Lambda blueprint pop-up window that appears, for Lambda blueprint, choose Kinesis Firehose CloudWatch Logs Processor.

4.    Select the new tab that opens in your browser to create the new Lambda function.
For Name, enter a name for the Lambda function.
For Role, select Create a custom role.

5.    Switch to the new tab that opens in your browser to create a new AWS Identity and Access Management (IAM) role.
For Role Name, be sure that the name is lambda_basic_execution.

6.    Choose Allow to create the role and return to the Lambda function configuration page.

7.    Choose Create function, and then wait for the function to be created.

8.    Increase the Timeout from the default of 3 seconds to 1 minute to prevent the function from timing out. (If you prefer to set this with the AWS CLI, see the sketch after this list.)

9.    Choose Save.
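
(Optional) If you prefer to set the timeout from the command line, the following is a minimal AWS CLI sketch. YOUR-LAMBDA-FUNCTION-NAME is a placeholder for whatever you named the function in step 4.

$ aws lambda update-function-configuration \
    --function-name YOUR-LAMBDA-FUNCTION-NAME \
    --timeout 60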

Finish creating the Data Firehose delivery stream

1.    Sign in to the Amazon Kinesis console.

2.    In the navigation pane, select Data Firehose.

3.    For your delivery stream, under Lambda function, choose the name of your new AWS Lambda function from the drop-down menu.
For Destination, choose Splunk.
Enter the Splunk HEC details, including the Splunk HEC endpoint that you created before. The Splunk HEC endpoint must be terminated with a valid SSL certificate. Use the matching DNS hostname to connect to your HEC endpoint. The format for the cluster endpoint is https://YOUR-ENDPOINT.splunk.com:8088.
For Splunk endpoint type, choose Raw endpoint, and then enter the authentication token.

4.    Choose Next.

5.    (Optional) Create an S3 backup for failed events or all events by choosing an existing bucket or creating a new bucket. Be sure to configure S3-related settings such as buffer conditions, compression and encryption settings, and error logging options in the delivery stream wizard.

6.    Under IAM role, choose Create New.

7.    In the tab that opens, enter a Role name, and then choose Allow.

8.    Choose Next.

9.    Choose Create delivery stream.
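
(Optional) After the stream is created, you can confirm that it reports an ACTIVE status with the AWS CLI; FirehoseSplunkDeliveryStream is the example stream name used later in this article.

$ aws firehose describe-delivery-stream \
    --delivery-stream-name FirehoseSplunkDeliveryStream \
    --query "DeliveryStreamDescription.DeliveryStreamStatus"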

Configure VPC flow logs

If you already have a VPC flow log that you want to use, skip to the next section.

1.    Sign in to the CloudWatch console.

2.    In the navigation pane, choose Logs.

3.    For Actions, select Create log group.

4.    Enter a Log Group Name.

5.    Choose Create log group.

6.    Sign in to the Amazon VPC console.

7.    In the navigation pane under Virtual Private Cloud, choose Your VPCs.

8.    In the content pane, select your VPC.

9.    Choose the Flow logs view.

10.   Choose Create flow log.
For Filter, select All.
For Destination log group, select the log group that you just created.
For IAM role, select an IAM role that allows your VPC to publish logs to CloudWatch.
Note: If you don't have an appropriate IAM role, choose Set Up Permissions under IAM role, and then choose Create a new IAM role. Leave the default settings selected, and then choose Allow to create the role VPCFlowLogs and associate it with the destination log group.

11.   Choose Create to create your VPC flow log. (A CLI sketch of steps 1-11 follows this list.)

12.   Establish a real-time feed from the log group to your delivery stream.
For AWS Lambda instructions, see Accessing Amazon CloudWatch Logs for AWS Lambda.
For Amazon Elasticsearch Service (Amazon ES) instructions, see Streaming CloudWatch Logs Data to Amazon Elasticsearch Service.
For Kinesis Data Firehose, create a CloudWatch Logs subscription in the AWS Command Line Interface (AWS CLI) using the following instructions.
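
(Optional) If you prefer to script steps 1-11 of this section instead of using the console, the following is a rough AWS CLI sketch. The log group name /vpc/flowlog/FirehoseSplunk matches the example used later in this article; the VPC ID and the IAM role that allows flow log delivery to CloudWatch are placeholders that you must replace with your own values.

$ aws logs create-log-group --log-group-name "/vpc/flowlog/FirehoseSplunk"

$ aws ec2 create-flow-logs \
    --resource-type VPC \
    --resource-ids vpc-0abc123example \
    --traffic-type ALL \
    --log-group-name "/vpc/flowlog/FirehoseSplunk" \
    --deliver-logs-permission-arn "arn:aws:iam::YOUR-AWS-ACCT-NUM:role/VPCFlowLogs"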

Create a CloudWatch Logs subscription

1.    Grant CloudWatch Logs permission to publish to your Kinesis Data Firehose stream by creating an IAM role with the correct permissions, as described in the following steps.

2.    Open a terminal where the AWS CLI is installed and configured with credentials for your account.

3.    Create your trust policy (such as TrustPolicyForCWLToFireHose.json) using the following example JSON file. Be sure to replace YOUR-RESOURCE-REGION with your resource's AWS Region.

{
  "Statement": {
    "Effect": "Allow",
    "Principal": { "Service": "logs.region.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }
}

4.    Create the role with permissions from the trust policy using the following example command:

$ aws iam create-role --role-name CWLtoKinesisFirehoseRole --assume-role-policy-document file://TrustPolicyForCWLToFireHose.json

5.    Create your IAM policy (such as PermissionPolicyForCWLToFireHose.json) using the following example JSON file. Be sure to replace:
YOUR-AWS-ACCT-NUM with your AWS account number,
YOUR-RESOURCE-REGION with your resource's Region, and
FirehoseSplunkDeliveryStream with your stream's name.

{
    "Statement":[
      {
        "Effect":"Allow",
        "Action":["firehose:*"],
        "Resource":["arn:aws:firehose:YOUR-RESOURCE-REGION:YOUR-AWS-ACCT-NUM:deliverystream/FirehoseSplunkDeliveryStream"]
      },
      {
        "Effect":"Allow",
        "Action":["iam:PassRole"],
        "Resource":["arn:aws:iam::YOUR-AWS-ACCT-NUM:role/CWLtoKinesisFirehoseRole"]
      }
    ]
}

6.    Attach the IAM policy to the newly created role using the following example command:

$ aws iam put-role-policy \
    --role-name CWLtoKinesisFirehoseRole \
    --policy-name Permissions-Policy-For-CWL \
    --policy-document file://PermissionPolicyForCWLToFireHose.json

7.    Create a subscription filter using the following example command. Be sure to replace YOUR-AWS-ACCT-NUM with your AWS account number, YOUR-RESOURCE-REGION with your resource's Region, FirehoseSplunkDeliveryStream with your stream's name, and /vpc/flowlog/FirehoseSplunk with the name of your log group.

$ aws logs put-subscription-filter \
   --log-group-name "/vpc/flowlog/FirehoseSplunk" \
   --filter-name "Destination" \
   --filter-pattern "" \
   --destination-arn "arn:aws:firehose:YOUR-RESOURCE-REGION:YOUR-AWS-ACCT-NUM:deliverystream/FirehoseSplunkDeliveryStream" \
   --role-arn "arn:aws:iam::YOUR-AWS-ACCT-NUM:role/CWLtoKinesisFirehoseRole"


Published: 2018-11-08