AWS Database Blog

Security auditing in Amazon RDS for Oracle: Part 2

Security auditing is an effective method of enforcing strong internal controls. It enables you to monitor business operations, find activities that may deviate from company policy, and meet various regulatory compliance requirements. Security auditing allows you to record database activity for future examination and is one part of an overall database security strategy.

In this series of posts, we look at the security auditing options available for Amazon Relational Database Service (Amazon RDS) for Oracle. In Part 1, we explained the relevance of security auditing and how to use Amazon RDS for Oracle security auditing features. This post takes a closer look at how Database Activity Streams (DAS) for Amazon RDS for Oracle allows you to stream your security audit data.

What is DAS and what does it provide?

Database Activity Streams is a feature that provides a near-real-time stream of all database activity. When combined with partner applications, it helps you monitor, audit, and protect your databases from unauthorized access as well as meet compliance and regulatory requirements. It reduces the work required to satisfy compliance goals and facilitates migration to managed database services in AWS. The data provided by DAS can be integrated into your existing monitoring and alerting infrastructure, allowing you to use your existing processes, tools, reports, and know-how.

Beyond external security threats, managed databases need to provide protection against insider attacks. Database administrators (DBAs) typically have elevated levels of access, and auditing inappropriate or unapproved actions of DBAs is a necessity—and, for many, a regulatory requirement. DAS protects you from these internal threats by implementing a protection model that DBAs can’t disable. Database activity is securely transmitted outside the scope of access by internal users, thereby guarding the data from internal threats.

When you enable Database Activity Streams, an agent process is installed on the database host machine. All SQL statements that run in the database engine are sent to this agent process. The SQL statements are sent in encrypted form via Amazon Kinesis Data Streams to partner products, consumed in your own applications, or captured with Amazon Kinesis Data Firehose for real-time analytics. DAS supports all instance types supported by Amazon RDS, with the exception of T2 and T3 micro and small instances. Note that the memory optimized db.r5 classes that use the naming pattern db.r5.<instance_size>.tpc<threads_per_core>.mem<ratio> (for example, db.r5.xlarge.tpc2.mem2x) aren't supported.

Benefits and uses

Database Activity Streams provides a near-real-time stream of all SELECT, DML, DDL, DCL, and TCL statements run in your DB instance. The audit data is collected via unified database auditing (as discussed in Part 1). Your database audit records are encrypted and asynchronously pushed to a Kinesis data stream provisioned on behalf of your RDS for Oracle DB instance.

You can manage the permissions to enable and disable DAS via AWS Identity and Access Management (IAM). This enables the separation of duties between security and compliance personnel and DBAs. You can also integrate with third-party database activity monitoring tools for a more complete, centralized audit solution. DAS is available for Oracle Database 19c Standard Edition 2 and Enterprise Edition.
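
For example, the following is a minimal sketch, using the AWS SDK for Python (Boto3), of an IAM policy that grants a security administrator only the activity stream actions; the policy name, account ID, and instance ARN are placeholders:

import json

import boto3

iam = boto3.client('iam')

# Allow starting and stopping activity streams on one DB instance only; the
# ARN and policy name below are hypothetical placeholders.
das_admin_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["rds:StartActivityStream", "rds:StopActivityStream"],
        "Resource": "arn:aws:rds:eu-west-1:123456789012:db:my-oracle-db"
    }]
}

iam.create_policy(PolicyName='das-administrator',
                  PolicyDocument=json.dumps(das_admin_policy))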

The Kinesis data stream is created automatically when this feature is enabled via the AWS Command Line Interface (AWS CLI), API, or the AWS Management Console. You can also configure AWS services such as Kinesis Data Firehose and AWS Lambda to consume the stream and store the data. Database Activity Streams is a free feature, but Kinesis Data Streams charges for the data streams it provisions. As of this writing, enabling DAS creates a new data stream; reusing an existing data stream isn't supported. If you have hundreds of RDS instances and want to consolidate their audit data, we recommend delivering each data stream to a Kinesis Data Firehose delivery stream and tagging each stream with an instance identifier.
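
After DAS is enabled, Amazon RDS records the stream details on the DB instance itself. As a quick check, you can read them back with the AWS SDK for Python (Boto3); the instance identifier below is a placeholder:

import boto3

rds = boto3.client('rds', region_name='eu-west-1')

# Look up the activity stream attributes of the instance.
instance = rds.describe_db_instances(DBInstanceIdentifier='my-oracle-db')['DBInstances'][0]
print(instance['ActivityStreamStatus'])             # for example, 'started'
print(instance['ActivityStreamKinesisStreamName'])  # for example, 'aws-rds-das-db-ABCD123456'
print(instance['ActivityStreamKmsKeyId'])           # the KMS key that encrypts the stream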

With DAS, the collecting, searching, and processing of the database audit logs takes place outside of the source database. This has the benefit of moving security audit analysis away from the source database. You still experience some resource consumption from enabling the native Oracle feature unified auditing.

We suggest implementing this solution for the following use cases:

  • Auditing for compliance
  • Building a data lake of audit records from various sources, including relational databases, thereby optimizing and streamlining the long-term storage of audit data
  • Integrating with security products for near-real-time incident monitoring
  • Capturing all changes to a particular table or group of tables and streaming the data to a data warehouse, although DAS isn’t the most efficient option for this
  • Protecting against ransomware by streaming audit data across Regions or accounts

Operating model: DBA and DAS administrator

Because DAS is a service external to the Oracle instance, you can choose to separate the audit administration duties: the DAS administrator can be a different individual from the RDS database administrator. You might consider this to minimize the probability of fraudulent activity. Database engineers can be responsible for managing the activities related to the smooth functioning of the database to serve the application, while the DAS administrator monitors all database and data activities in a secure fashion, without the risk of audited events being changed.

Prerequisites and costs

DAS requires the use of Oracle unified auditing, a feature included in both Standard Edition 2 and Enterprise Edition. No extra Oracle licensed option is required to use DAS. Although DAS is free to use, you may incur costs from the following:

  • Kinesis data streams (one per database)
  • A customer managed key in AWS Key Management Service (AWS KMS), as sketched after this list
  • Additional storage space required by the services that retain the audit data
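
If you don't already have a suitable KMS key, the following minimal sketch creates a customer managed key for DAS with the AWS SDK for Python (Boto3); the alias name is a placeholder:

import boto3

kms = boto3.client('kms', region_name='eu-west-1')

# Create a dedicated customer managed key and give it a friendly alias.
key = kms.create_key(Description='Key for RDS for Oracle Database Activity Streams')
kms.create_alias(AliasName='alias/rds-das-key',    # hypothetical alias
                 TargetKeyId=key['KeyMetadata']['KeyId'])
print(key['KeyMetadata']['Arn'])                   # pass this ARN when starting DAS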

Enable Database Activity Streams

You can use DAS in several different setups:

  • Grant partner applications access to the Kinesis data stream and the KMS key so they can monitor database activity
  • Connect Kinesis Data Streams to Kinesis Data Firehose to save activities to Amazon Simple Storage Service (Amazon S3) for long-term retention, as sketched after this list
  • Connect to Lambda to analyze or monitor your database activities
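
As an example of the second setup, the following sketch wires the DAS data stream into a Firehose delivery stream that lands records in Amazon S3. The stream name, ARNs, bucket, and IAM role are placeholders; the role must allow Firehose to read from the data stream and write to the bucket:

import boto3

firehose = boto3.client('firehose', region_name='eu-west-1')

# Create a delivery stream that reads the DAS Kinesis data stream and writes
# the records to an S3 bucket for long-term retention.
firehose.create_delivery_stream(
    DeliveryStreamName='das-to-s3',                # hypothetical name
    DeliveryStreamType='KinesisStreamAsSource',
    KinesisStreamSourceConfiguration={
        'KinesisStreamARN': 'arn:aws:kinesis:eu-west-1:123456789012:stream/aws-rds-das-db-ABCD123456',
        'RoleARN': 'arn:aws:iam::123456789012:role/das-firehose-role'
    },
    ExtendedS3DestinationConfiguration={
        'BucketARN': 'arn:aws:s3:::my-das-audit-archive',
        'RoleARN': 'arn:aws:iam::123456789012:role/das-firehose-role',
        'Prefix': 'das/my-oracle-db/'              # tag the output by instance
    }
)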

When enabling DAS, be mindful of the following:

  • DAS is currently supported for Oracle Database 19c using version 19.0.0.0.ru-2019-07.rur-2019-07.r1 and higher.
  • Oracle Multitenant is not supported.
  • Amazon RDS for Oracle Read Replicas are not supported.
  • In an RDS for Oracle DB instance, you create and manage audit policies yourself. Unlike Amazon Aurora, Amazon RDS for Oracle doesn’t capture database activities by default.

You should also consider the performance impact of enabling DAS. Unified auditing has a low overhead, as stated by Oracle:

“For typical use cases of auditing privileged users or auditing key database operations, the performance impact is so low that it cannot even be measured due to low audit volume spread throughout the week. You could begin to see performance impact of 1% when the audit load increases to a few hundred audit events/second. For most use cases, you are not going to see overhead beyond this, but for cases where organizations want to audit application usage, it is best to tune the audit policies.”

You can enable DAS at database creation time, or for an existing database via the Actions menu on the Amazon RDS console.

Because the audit stream is encrypted, you have to specify a KMS key upon starting DAS. Starting DAS clears existing audit data. Make sure you have this data backed up if needed. The permissions to modify unified auditing are locked down when DAS is started, so you can’t purge unified audit trail records or change unified auditing privileges.
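
You can also enable DAS programmatically. The following is a minimal sketch using the AWS SDK for Python (Boto3) with placeholder ARNs; RDS for Oracle streams audit data asynchronously, so the mode is async:

import boto3

rds = boto3.client('rds', region_name='eu-west-1')

# Start the activity stream on an existing instance; the instance and key ARNs
# are placeholders.
response = rds.start_activity_stream(
    ResourceArn='arn:aws:rds:eu-west-1:123456789012:db:my-oracle-db',
    Mode='async',
    KmsKeyId='arn:aws:kms:eu-west-1:123456789012:key/11111111-2222-3333-4444-555555555555',
    ApplyImmediately=True
)
print(response['KinesisStreamName'])   # the stream to consume, aws-rds-das-<resource-id>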

When you enable DAS, a Kinesis data stream is automatically created, and Oracle unified auditing events from the database are encrypted and sent asynchronously to that stream. DAS manages the unified auditing configuration for you with a policy called RDS_DAS_PROTECTION_POLICY. This policy makes sure that modification of the audit trail and use of audit commands are always audited. The system privileges audited are as follows:

POLICY_NAME                     AUDIT_OPTION                       CONDITION 
------------------------------ ----------------------------------- --------- 
RDS_DAS_PROTECTION_POLICY       AUDIT ANY                           NONE 
RDS_DAS_PROTECTION_POLICY       AUDIT SYSTEM                        NONE 
RDS_DAS_PROTECTION_POLICY       CREATE AUDIT POLICY                 NONE 
RDS_DAS_PROTECTION_POLICY       ALTER AUDIT POLICY                  NONE 
RDS_DAS_PROTECTION_POLICY       DROP AUDIT POLICY                   NONE 
RDS_DAS_PROTECTION_POLICY       AUDIT                               NONE 
RDS_DAS_PROTECTION_POLICY       NOAUDIT                             NONE 
RDS_DAS_PROTECTION_POLICY       AUDIT_ADMIN                         NONE

These audited events can then be consumed by targets such as Kinesis Data Firehose or Amazon S3. For more information, see Monitoring Amazon RDS for Oracle using Database Activity Streams.

Processing

You can process the database activity stream with the AWS SDK. In the following example, we use the AWS SDK for Python (Boto3) to take the encrypted stream and decrypt it to display in JSON format:

import base64
import json
import zlib
import aws_encryption_sdk
from aws_encryption_sdk import CommitmentPolicy
from aws_encryption_sdk.internal.crypto import WrappingKey
from aws_encryption_sdk.key_providers.raw import RawMasterKeyProvider
from aws_encryption_sdk.identifiers import WrappingAlgorithm, EncryptionKeyType
import boto3
REGION_NAME = 'eu-west-1'                    # the AWS Region of your DB instance
RESOURCE_ID = 'db-ABCDEFGHIJKLMNOPQ'         # the resource ID (DbiResourceId) of your DB instance
STREAM_NAME = 'aws-rds-das-' + RESOURCE_ID   # for example, aws-rds-das-db-ABCDEFGHIJKLMNOPQ

enc_client = aws_encryption_sdk.EncryptionSDKClient(commitment_policy=CommitmentPolicy.REQUIRE_ENCRYPT_ALLOW_DECRYPT)

# Master key provider that wraps the plaintext data key returned by AWS KMS, so
# the AWS Encryption SDK can decrypt the activity event payloads.
class MyRawMasterKeyProvider(RawMasterKeyProvider):
    provider_id = "BC"

    def __new__(cls, *args, **kwargs):
        obj = super(RawMasterKeyProvider, cls).__new__(cls)
        return obj

    def __init__(self, plain_key):
        RawMasterKeyProvider.__init__(self)
        self.wrapping_key = WrappingKey(wrapping_algorithm=WrappingAlgorithm.AES_256_GCM_IV12_TAG16_NO_PADDING,
                                        wrapping_key=plain_key, wrapping_key_type=EncryptionKeyType.SYMMETRIC)

    def _get_raw_key(self, key_id):
        return self.wrapping_key


# Decrypt one activity events payload with the plaintext data key.
def decrypt_payload(payload, data_key):
    my_key_provider = MyRawMasterKeyProvider(data_key)
    my_key_provider.add_master_key("DataKey")
    decrypted_plaintext, header = enc_client.decrypt(
        source=payload,
        materials_manager=aws_encryption_sdk.materials_managers.default.DefaultCryptoMaterialsManager(master_key_provider=my_key_provider))
    return decrypted_plaintext


# The decrypted payload is gzip-compressed; MAX_WBITS + 16 tells zlib to expect
# a gzip wrapper.
def decrypt_decompress(payload, key):
    decrypted = decrypt_payload(payload, key)
    return zlib.decompress(decrypted, zlib.MAX_WBITS + 16)


def main():
    session = boto3.session.Session()
    kms = session.client('kms', region_name=REGION_NAME)
    kinesis = session.client('kinesis', region_name=REGION_NAME)

    response = kinesis.describe_stream(StreamName=STREAM_NAME)
    shard_iters = []
    for shard in response['StreamDescription']['Shards']:
        shard_iter_response = kinesis.get_shard_iterator(StreamName=STREAM_NAME, ShardId=shard['ShardId'],
                                                         ShardIteratorType='LATEST')
        shard_iters.append(shard_iter_response['ShardIterator'])

    while len(shard_iters) > 0:
        next_shard_iters = []
        for shard_iter in shard_iters:
            response = kinesis.get_records(ShardIterator=shard_iter, Limit=10000)
            for record in response['Records']:
                record_data = json.loads(record['Data'])
                # Each record carries a base64-encoded encrypted payload plus the
                # data key that encrypts it; decrypt the data key with KMS first.
                payload_decoded = base64.b64decode(record_data['databaseActivityEvents'])
                data_key_decoded = base64.b64decode(record_data['key'])
                data_key_decrypt_result = kms.decrypt(CiphertextBlob=data_key_decoded,
                                                      EncryptionContext={'aws:rds:db-id': RESOURCE_ID})
                print(decrypt_decompress(payload_decoded, data_key_decrypt_result['Plaintext']))
            if 'NextShardIterator' in response:
                next_shard_iters.append(response['NextShardIterator'])
        shard_iters = next_shard_iters


if __name__ == '__main__':
    main()

The output from the stream contains some internal messages, including heartbeats and messages generated by DAS's management of the unified audit trail. You should strip out these messages when looking at the stream for database commands run on the database by the application. For more information about the format of the messages, see Examples of an audit log for an activity stream.
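
As a sketch, assuming the decrypted payload follows the documented schema in which each entry in databaseActivityEventList carries a type of either record or heartbeat, a consumer can filter the messages as follows:

import json

def application_events(decrypted_payload):
    # Yield only genuine activity events, dropping heartbeat messages. Assumes
    # the documented DAS schema, where each entry in databaseActivityEventList
    # has a 'type' of 'record' or 'heartbeat'.
    events = json.loads(decrypted_payload)
    for event in events.get('databaseActivityEventList', []):
        if event.get('type') != 'heartbeat':
            yield event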

If there is a problem with Kinesis Data Streams, the audit data is buffered on disk until disk usage reaches 90%. Beyond that point, DAS can't track changes; the only indication is a gap in the timestamps of the audit stream. When Kinesis Data Streams becomes available again, DAS replays the buffered data.

Slow performance in Kinesis Data Streams may also affect DAS. DAS uses the disk as a buffer and activates a feature in Kinesis Data Streams to read data from disk at its upload speed, so that data doesn't accumulate on the RDS for Oracle source database.

The following diagram illustrates the processing architecture.

The stream doesn’t separate out the different messages by type. If this is a requirement, you can use Lambda with the SDK for Python. For more information, see Filter Amazon Aurora database activity stream data for segregation and monitoring.
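
The following is a minimal sketch of such a Lambda handler. It assumes the function is triggered by the Kinesis data stream and that the KMS and Encryption SDK steps from the earlier example are wrapped in a hypothetical decrypt_activity_events helper:

import base64
import json

def handler(event, context):
    # Separate DAS messages by type; heartbeats are counted and dropped.
    records, heartbeats = [], []
    for kinesis_record in event['Records']:
        # Each Kinesis record holds a JSON envelope with the encrypted payload.
        outer = json.loads(base64.b64decode(kinesis_record['kinesis']['data']))
        decrypted = decrypt_activity_events(outer)  # hypothetical wrapper around the earlier helpers
        for activity_event in json.loads(decrypted).get('databaseActivityEventList', []):
            (heartbeats if activity_event.get('type') == 'heartbeat' else records).append(activity_event)
    # Forward 'records' to your monitoring target; heartbeats are only counted.
    return {'records': len(records), 'heartbeats': len(heartbeats)}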

Summary

In this post, we took a deep dive into monitoring Amazon RDS for Oracle using Database Activity Streams. For an overall look at security auditing on Amazon RDS for Oracle, refer to Part 1 of this series. We welcome your comments.


About the Authors

Tom Harper is the Manager of EMEA Relational Databases Specialist Team, based in Manchester, UK. He works in the Enterprise space to support innovation in the database domain. His team helps customers adopt the best migration strategy and design database architectures on AWS with relational managed solutions.

Sundar Raghavan is a Principal Database Specialist Solutions Architect at Amazon Web Services (AWS) and specializes in relational databases. His customers include clients across multiple industries in support of Oracle, PostgreSQL, and migration from Oracle to PostgreSQL on AWS. Previously, Sundar served as a database and data platform architect at Oracle and Cloudera/Hortonworks. He enjoys reading, watching movies, playing chess, and being outdoors when he is not working.