Business Productivity
Cross-application audit log analysis with AWS AppFabric
Organizations are adopting increasing numbers of cloud-based software-as-a-service (SaaS) applications to support remote work, collaboration, and employee productivity. But using a variety of SaaS applications means that audit logs are kept in multiple systems with different schemas. AWS AppFabric is a service that quickly connects multiple applications, giving security professionals a comprehensive view of user activity and enabling cross-application audit log analysis.
In this post, read how to configure AppFabric to send application audit logs to Amazon OpenSearch Serverless through Amazon Kinesis Data Firehose, and then how to use OpenSearch Serverless to analyze and visualize the aggregated, normalized audit logs.
The security challenges
Some security incidents may only be detectable when the audit logs of multiple SaaS applications are correlated. The evidence for identifying data exfiltration, compromised accounts, and privilege abuse may be spread across multiple applications used for file management, instant messaging, email, and more. Security and IT teams must monitor each application and break down the data silos that fragment incident investigations.
Correlating disparate logs across multiple applications is more difficult when they share no common identifiers or schema. Each application uses different user IDs (for example, 15432 versus name@example.com), making it hard for security teams to attribute activity across multiple SaaS apps to the same individual. Applications also structure and label their log data differently: the location of the responsible IP address varies between logs, and an “export” in one application may be a “download” in another.
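To illustrate this divergence, consider how two applications might record the same user downloading the same file. This is a hypothetical sketch; all field names, IDs, and values below are invented for illustration:

```python
# Two hypothetical raw audit records for the same action by the same person.
# Field names and values are invented for illustration only.
file_app_event = {
    "user_id": 15432,              # numeric, application-internal ID
    "action": "export",            # this app calls a download an "export"
    "client_ip": "203.0.113.10",
    "ts": "2024-05-01T12:00:00Z",
}
chat_app_event = {
    "actor": "name@example.com",   # email address instead of a numeric ID
    "event_type": "download",      # same concept, different label
    "source": {"ip": "203.0.113.10"},
    "occurred_at": 1714564800,     # epoch seconds instead of ISO 8601
}

# Without a shared schema, correlating these records requires per-application
# mapping logic, such as a directory that resolves internal IDs to emails.
user_directory = {15432: "name@example.com"}

same_user = user_directory[file_app_event["user_id"]] == chat_app_event["actor"]
print(same_user)  # True, but only resolvable with the extra lookup
```

Multiply this lookup and field-mapping logic by every connected application, and the maintenance burden becomes clear.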
Security professionals with specialized development skills can build systems to aggregate, normalize, and enrich each application’s logs. However, this work consumes security engineering resources that could otherwise contribute to more strategic security initiatives.
The AWS AppFabric service
AppFabric addresses these challenges and makes security professionals more productive. The service ingests audit log data from each application, converts it into the schema defined by the Open Cybersecurity Schema Framework (OCSF) project, and then enriches the data by translating application-specific user IDs into email addresses.
AppFabric stores the aggregated logs in Amazon Simple Storage Service (Amazon S3) or sends them to Kinesis Data Firehose. You can then use a variety of compatible security tools with the normalized data, without building and maintaining custom integrations.
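To make the normalization concrete, here is a sketch of what a record might look like after AppFabric’s processing. The exact OCSF fields vary by event class; only time and device.ip are relied on later in this post, and the remaining field names here are illustrative, not an authoritative OCSF layout:

```python
from datetime import datetime, timezone

# Illustrative sketch of a normalized, OCSF-shaped audit record.
# Only "time" and "device.ip" are referenced elsewhere in this post;
# the other fields are examples of the normalized shape, not a spec.
record = {
    "time": "2024-05-01T12:00:00Z",                          # normalized timestamp
    "activity_name": "download",                             # normalized activity label
    "actor": {"user": {"email_addr": "name@example.com"}},   # enriched identity
    "device": {"ip": "203.0.113.10"},                        # responsible IP, one place
    "metadata": {"product": {"name": "ExampleFileApp"}},
}

# Because events from every application share one shape, a single
# extraction function works across all sources.
def summarize(event: dict) -> tuple:
    ts = datetime.fromisoformat(event["time"].replace("Z", "+00:00"))
    return (ts, event["actor"]["user"]["email_addr"], event["device"]["ip"])

print(summarize(record))
```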
Overview of solution
I illustrate how to configure OpenSearch Serverless to receive the AppFabric logs through Kinesis Data Firehose. Many customers use OpenSearch for log analytics, and the serverless option lets you get started without managing a cluster.
Image 1: The audit logs from AppFabric flow through Kinesis Data Firehose to OpenSearch Serverless
Walkthrough
In this walkthrough, I configure AppFabric, OpenSearch Serverless, and Kinesis Data Firehose.
Prerequisites
For this walkthrough, you must have the following prerequisites:
- An AWS account.
- Access to the AWS console with a role that can administer AppFabric, Kinesis Data Firehose, and OpenSearch Serverless.
- Administrative access to one or more of the applications supported by AppFabric as detailed in the AppFabric Administration Guide. This walkthrough requires that you connect one of your organization’s SaaS applications to collect audit logs.
Note: If you are not the administrator of your organization’s SaaS applications, you can ask the administrator to perform the appropriate steps in the AppFabric Administration Guide and then provide you the needed authorization data.
Choose a region
AWS AppFabric is available in the US East (N. Virginia), Asia Pacific (Tokyo), and Europe (Ireland) regions. You may select any of these regions when following the steps below, but all services must be configured in the same region.
Configuring OpenSearch Serverless and Kinesis Data Firehose
Before configuring any AppFabric log ingestions, I prepare the data destination.
To create an OpenSearch Serverless collection, follow these steps:
- Sign in to the AWS Management Console and open the Amazon OpenSearch Service console at https://console.aws.amazon.com/aos/home.
- In the navigation pane, under Serverless, choose Collections, and then choose Create collection.
- Enter a Collection name and select the Time series collection type.
- In the Security section, choose Easy create and then choose Next.
- On the review screen, choose Submit and wait for the collection to be created.
Image 2: Beginning to configure the OpenSearch collection settings
To allow public network access to the OpenSearch Dashboards, follow these steps:
- In the collection overview pane, in the Network section, choose the policy link below Associated Policy.
- In the network policy details screen, choose Edit.
- In the first rule, select Enable access to OpenSearch Dashboards then provide your collection name in the field below.
- Choose Update and then close the browser tab.
Before leaving the OpenSearch Serverless configuration, I copy the new collection’s Amazon Resource Name (ARN) to use in the next step.
To create the Kinesis Data Firehose delivery stream, follow these steps:
- Open the Kinesis Data Firehose console at https://console.aws.amazon.com/firehose/home and choose Create delivery stream.
- For Source, choose Direct PUT.
- For Destination, choose Amazon OpenSearch Serverless.
- In the Destination settings section, for OpenSearch Serverless collection, choose Browse and then choose the collection that was created above.
- For Index, enter appfabric.
- In the Backup settings section, specify a bucket for the backup of data that fails record processing. (Create a new bucket in the region, if needed.)
- Choose Create delivery stream and wait for the stream to be created.
Image 3: Creating the Kinesis Firehose delivery stream
Creating the delivery stream also created a new IAM role for it. Now I make two policy changes to enable delivery to OpenSearch.
Allowing Kinesis Data Firehose to put data into OpenSearch Serverless
To modify the Data Firehose role’s policy, follow these steps:
- On the details page for the delivery stream created above, choose the Configuration tab.
- In the Service access section, choose the role link below IAM role.
- In the Permissions policies section, for Add permissions, choose Create inline policy.
- Under Select a service, search for OpenSearch Serverless and choose the result.
- Under Actions allowed, expand Write, and select APIAccessAll.
- Under Resources, choose Add Arn.
- Choose Text, paste the ARN of the OpenSearch Serverless collection created above, and choose Add ARNs.
- Choose Next, enter OpenSearch-API-Access for the Policy name, and choose Create policy.
If you inspect the new inline policy added to the role, it should be similar to this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "aoss:APIAccessAll",
      "Resource": "arn:aws:aoss:eu-west-1:444455556666:collection/abcdef01234567890"
    }
  ]
}
Before leaving the IAM role configuration, I copy the delivery stream role’s ARN to use it in the next section.
To modify the collection’s data access policy, follow these steps:
- Return to the Amazon OpenSearch Service console and view the details of the serverless collection created above.
- In the Data access section, choose the policy link below Associated policy.
- Choose Edit.
- In the section for Rule 1, under Select principals, for Add principals, choose IAM users and roles.
- In the text box, paste the delivery stream role’s ARN or use the search box to find the role by name.
- Choose Save, and then choose Save again to confirm. You may close the Data access policies browser tab.
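For reference, a data access policy that grants the delivery stream role index permissions might resemble the following. The account ID, role name, and collection name are placeholders; confirm the exact permissions required against the OpenSearch Serverless documentation:

```json
[
  {
    "Rules": [
      {
        "ResourceType": "index",
        "Resource": ["index/my-appfabric-collection/*"],
        "Permission": [
          "aoss:CreateIndex",
          "aoss:WriteDocument",
          "aoss:UpdateIndex"
        ]
      }
    ],
    "Principal": [
      "arn:aws:iam::444455556666:role/service-role/KinesisFirehoseServiceRole-example"
    ]
  }
]
```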
Creating an index template
When the logs are delivered to OpenSearch, I want to ensure that the time and IP address in the log are indexed properly. I create an index template to define those properties. To avoid the need to reindex data later, it is important to create this index template prior to sending any logs to OpenSearch.
To create an index template, follow these steps:
- Return to the Amazon OpenSearch Service console and view the details of the serverless collection created above.
- Choose the link for the OpenSearch Dashboards URL.
- From the dashboard home page, choose Dev tools.
- In the Dev Tools console, replace the existing text with:
PUT _index_template/appfabric_template
{
  "index_patterns": ["appfabric"],
  "template": {
    "mappings": {
      "properties": {
        "time": {
          "type": "date"
        },
        "device": {
          "properties": {
            "ip": {
              "type": "ip"
            }
          }
        }
      }
    }
  }
}
In the template above, the value in “index_patterns” (“appfabric”) must match the index name that you specified earlier when creating the Kinesis Data Firehose delivery stream.
- Choose the play button to send the request. The result should be 200 – OK.
Image 4: Creating the OpenSearch index template
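The two explicit mappings matter because, without them, OpenSearch might infer the time field as text and the IP as a plain keyword, which would prevent time-range filtering and IP-based queries. As a quick local sanity check, a sample record should satisfy both declared types (a sketch; the field values are illustrative):

```python
import ipaddress
from datetime import datetime

# A sample record shaped like the two fields the index template maps explicitly.
sample = {"time": "2024-05-01T12:00:00Z", "device": {"ip": "203.0.113.10"}}

# "time" must parse as a timestamp for the "date" mapping to accept it...
parsed_time = datetime.fromisoformat(sample["time"].replace("Z", "+00:00"))

# ...and "device.ip" must be a valid address for the "ip" mapping.
parsed_ip = ipaddress.ip_address(sample["device"]["ip"])

print(parsed_time.year, parsed_ip.version)  # 2024 4
```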
Now that OpenSearch Serverless is ready to receive data from Kinesis Data Firehose, I configure AWS AppFabric to aggregate the logs from the application sources.
Configuring AWS AppFabric
I now create AppFabric log ingestions for each of the supported applications that my organization uses. Each ingestion uses the same Kinesis Data Firehose destination, illustrating how additional applications can be aggregated with your existing data at any time.
For more details, review the Getting started section of the Administration Guide. The post, New AWS AppFabric Improves Application Observability for SaaS Applications, also provides a detailed walkthrough of AppFabric configuration.
How to create an App bundle
An app bundle stores all of your AppFabric configuration. You can create one app bundle per AWS region and specify if AppFabric should use an AWS-owned Key Management Service (KMS) key or a customer-managed key.
I now configure AppFabric to use an AWS-owned key for encryption. If you would like to use a key that you manage, make the appropriate changes in step 3.
- Sign in to the AWS Management Console and open the AWS AppFabric console at https://console.aws.amazon.com/appfabric/home.
- In the navigation pane, choose App bundle, and then choose Create app bundle.
- In the Encryption section, choose AWS owned, and then choose Create app bundle.
How to create the App authorizations
Complete these steps for each application to which AppFabric will connect. These steps require administrative access to the SaaS application.
- In the AWS AppFabric console navigation pane, choose App authorizations.
- Choose Create app authorization.
- In the App authorization section, for Application, choose the application for which you would like to retrieve audit logs.
The steps for completing the App authorization are unique to each SaaS application. Consult the supported applications section of the AppFabric Administration Guide for the necessary details. Repeat the App authorization steps for each supported application that you manage.
When you are finished configuring the authorizations, you should see that the status of each is Connected.
Image 5: AWS AppFabric app authorizations
How to create the ingestion
- In the AWS AppFabric console navigation pane, choose Ingestions.
- Choose Create ingestion.
- For App authorization, select the first of the applications you configured above.
- For Destination, choose Amazon Kinesis Data Firehose.
- For Firehose delivery stream name, choose the delivery stream created earlier.
- For Schema & Format, choose OCSF – JSON.
- Choose Create ingestion.
Repeat these steps to create the ingestion for each application that you have authorized. Data ingestion will begin immediately. If desired, you can monitor the ingestion in the Kinesis Data Firehose console where delivery stream metrics are shown.
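Each ingested event arrives in the delivery stream as an OCSF-formatted JSON document. The sketch below uses invented events to show the newline-delimited JSON form you would typically see in batched Firehose output (for example, in the S3 backup bucket configured earlier) and how to parse it back into individual documents:

```python
import json

# Invented OCSF-shaped events for illustration; real field sets vary by app.
events = [
    {"time": "2024-05-01T12:00:00Z", "activity_name": "download"},
    {"time": "2024-05-01T12:05:00Z", "activity_name": "export"},
]

# Serialize each event as one JSON object per line, a common shape for
# batched Firehose deliveries.
batch = "\n".join(json.dumps(e) for e in events)

# Parsing the batch back recovers one document per line.
parsed = [json.loads(line) for line in batch.splitlines()]
print(len(parsed), parsed[0]["activity_name"])  # 2 download
```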
Viewing the data in OpenSearch
Before your logs can be viewed for the first time, an index pattern must be created in OpenSearch.
How to create the index pattern
- Return to the Amazon OpenSearch Service console and view the details of the serverless collection created earlier.
- Choose the link for the OpenSearch Dashboards URL.
- Choose Manage to go to Stack Management.
- Choose Index Patterns and then choose Create index pattern.
- For Index pattern name, type “appfabric” and then choose Next step.
- For Time field, select “time” and then choose Create index pattern.
Image 6: Creating an OpenSearch index pattern
Using the OpenSearch Dashboard
Now that audit logs from your SaaS application are flowing into OpenSearch, you can use the features of OpenSearch to discover and visualize the logs. A sample dashboard is shown below. Consult the OpenSearch Dashboards documentation for more details.
Image 7: An example OpenSearch dashboard of the AppFabric audit logs
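Because the ingested records share one normalized shape, a single query can pull every activity attributed to one person across all connected applications. Here is a sketch of such a query body, built in Python; the field paths are illustrative and should be adjusted to the fields present in your ingested OCSF events:

```python
import json

# Query body for the "appfabric" index: all events attributed to one email
# address within the last seven days, newest first. Field paths are
# illustrative; adjust them to match your ingested events.
query = {
    "query": {
        "bool": {
            "must": [
                {"match": {"actor.user.email_addr": "name@example.com"}},
                {"range": {"time": {"gte": "now-7d/d"}}},
            ]
        }
    },
    "sort": [{"time": {"order": "desc"}}],
}

# The JSON below can be pasted into the Dev Tools console under:
#   GET appfabric/_search
print(json.dumps(query, indent=2))
```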
Cleaning up
To avoid incurring future charges, delete the following resources:
- The AppFabric ingestions and app authorizations. As described in the documentation, to clean up your AppFabric resources, you must delete them in the reverse order in which you created them.
- The Kinesis Data Firehose delivery stream.
- The OpenSearch Serverless collection.
Conclusion
This post showed how AppFabric aggregates normalized and enriched audit logs from supported SaaS applications, and you learned how to send those logs to OpenSearch Serverless for analysis. With this foundation, you can build the queries and visualizations that support your business objectives and security investigation needs.
OpenSearch is one option for log analysis. You can also use AppFabric with the security tool of your choice. Get started with AppFabric today to correlate audit logs across applications and become more effective in securing your organization.