Why are my CloudWatch logs failing to export to S3 buckets?

Last updated: 2020-05-26

I want to export my Amazon CloudWatch log data to Amazon Simple Storage Service (Amazon S3) buckets. However, the export task failed. Why are my CloudWatch logs failing to export to S3 buckets?

Resolution

To troubleshoot tasks that failed during creation, check the following settings:

  • Region – Confirm that your CloudWatch Logs log streams and S3 buckets are in the same Region.
  • S3 bucket policies – By default, all S3 buckets and objects are private, and only the resource owner (the AWS account that created the bucket) can access the bucket and the objects it contains. Use a bucket policy to grant CloudWatch Logs permission to write objects to the bucket (a policy sketch follows this list).
  • S3 bucket prefixes – When you set the S3 bucket policy, it's a best practice to include a randomly generated string as the object prefix for the bucket. If you use a prefix, you must also specify that same string in the S3 bucket prefix settings when you create the export task. Otherwise, the export task creation fails.
  • AWS Identity and Access Management (IAM) policies – Confirm that the IAM user or role that created the export task has full access to Amazon S3 and CloudWatch Logs.
  • Resource quotas – CloudWatch Logs service quotas restrict the number of running or pending export tasks per account per Region. Be sure that you're operating within the allowed quotas.
  • Type of server-side encryption – Be sure that you're using a supported type of server-side encryption. Exporting to S3 buckets encrypted with SSE-KMS is not supported; exporting to S3 buckets encrypted with AES-256 is supported. A check for both of these settings is sketched after this list.
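
To grant that access, the following is a minimal sketch in Python (boto3) of the bucket policy pattern described in the CloudWatch Logs export documentation: the CloudWatch Logs service principal for your Region needs s3:GetBucketAcl on the bucket and s3:PutObject under the chosen prefix. The bucket name, Region, and random prefix below are placeholder assumptions; substitute your own values.

```python
import json
import boto3

# Placeholder values -- replace with your own bucket, Region, and random prefix.
BUCKET = "my-exported-logs"   # hypothetical destination bucket
REGION = "us-east-1"          # must be the Region of the log data and the bucket
PREFIX = "a1b2c3d4"           # randomly generated string used as the object prefix

# CloudWatch Logs needs GetBucketAcl on the bucket and PutObject under the prefix.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": f"logs.{REGION}.amazonaws.com"},
            "Action": "s3:GetBucketAcl",
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
        {
            "Effect": "Allow",
            "Principal": {"Service": f"logs.{REGION}.amazonaws.com"},
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/{PREFIX}/*",
            "Condition": {
                "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
            },
        },
    ],
}

s3 = boto3.client("s3", region_name=REGION)
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```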

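A rough sketch of checking the last two items in that list (how many export tasks are already running or pending, and which default encryption the destination bucket uses), again with boto3 and the same placeholder bucket name:

```python
import boto3

BUCKET = "my-exported-logs"   # placeholder -- same hypothetical bucket as above

logs = boto3.client("logs")
s3 = boto3.client("s3")

# Export tasks in these states count against the per-account, per-Region quota.
for status in ("RUNNING", "PENDING"):
    tasks = logs.describe_export_tasks(statusCode=status).get("exportTasks", [])
    print(f"{status}: {len(tasks)} export task(s)")

# Default bucket encryption: AES-256 (AES256) is supported for export targets,
# SSE-KMS (aws:kms) is not. Buckets with no default encryption raise an error here.
try:
    config = s3.get_bucket_encryption(Bucket=BUCKET)
    for rule in config["ServerSideEncryptionConfiguration"]["Rules"]:
        print("Default encryption:", rule["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])
except s3.exceptions.ClientError as err:
    print("No default encryption configuration found:", err.response["Error"]["Code"])
```
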
To troubleshoot tasks that failed after creation, check the Time Range setting. If you export log streams with large amounts of data and specify a long time range, the export task might fail. In this case, specify a shorter time range.
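
As a sketch of that retry, the time range is part of the CreateExportTask call: the fromTime and to parameters are epoch timestamps in milliseconds, so a shorter range simply means a smaller window between them. The log group, bucket, and prefix names below are the same placeholders used above.

```python
import time
import boto3

LOG_GROUP = "/my/app/log-group"   # placeholder log group name
BUCKET = "my-exported-logs"       # placeholder destination bucket
PREFIX = "a1b2c3d4"               # must match the prefix used in the bucket policy

logs = boto3.client("logs")

# fromTime and to are in milliseconds since the epoch. Keep the window short, and
# remember that the newest log data can take up to 12 hours to become exportable.
HOUR_MS = 60 * 60 * 1000
end_ms = int(time.time() * 1000) - 12 * HOUR_MS   # end the range 12 hours in the past
start_ms = end_ms - HOUR_MS                       # export a one-hour window

response = logs.create_export_task(
    taskName="short-range-export",
    logGroupName=LOG_GROUP,
    fromTime=start_ms,
    to=end_ms,
    destination=BUCKET,
    destinationPrefix=PREFIX,
)
print("Started export task:", response["taskId"])
```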

Note: Log data can take up to 12 hours to become available for export, and the export task itself can take some time to complete. For real-time processing or for continuously archiving new data to S3, use subscription filters to stream the logs to Amazon Kinesis Data Firehose with Amazon S3 as the destination. Export tasks are better suited to archiving historical data to Amazon S3.
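
For the streaming path, a minimal sketch of attaching a subscription filter with boto3. It assumes a Kinesis Data Firehose delivery stream that already delivers to your S3 bucket and an IAM role that CloudWatch Logs can assume to put records into that stream; both ARNs below are placeholders.

```python
import boto3

LOG_GROUP = "/my/app/log-group"   # placeholder log group name
FIREHOSE_ARN = (                  # placeholder delivery stream that already targets S3
    "arn:aws:firehose:us-east-1:111122223333:deliverystream/my-s3-delivery-stream"
)
ROLE_ARN = (                      # placeholder role that CloudWatch Logs assumes
    "arn:aws:iam::111122223333:role/CWLtoKinesisFirehoseRole"
)

logs = boto3.client("logs")

# Stream every new log event from the log group to the Firehose delivery stream,
# which in turn writes the data to Amazon S3.
logs.put_subscription_filter(
    logGroupName=LOG_GROUP,
    filterName="stream-to-s3",
    filterPattern="",   # an empty pattern matches all log events
    destinationArn=FIREHOSE_ARN,
    roleArn=ROLE_ARN,
)
```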