AWS Cloud Operations & Migrations Blog

Using AWS Cost Explorer and cost allocation tags to view Amazon S3 costs by bucket

AWS customers often have many users and groups within their organization using Amazon Simple Storage Service (Amazon S3) buckets. These customers often need a way to accurately understand costs on a per-bucket basis for cost observability and chargeback mechanisms. This is also important if a customer is entering the AWS Migration Acceleration Program (MAP), where the S3 baseline cost per existing bucket is used to allocate credits. Instructions exist for managing this through the AWS Management Console, but that approach may not be viable when you have a large number of S3 buckets. In this post, we provide steps for automating the creation and assignment of cost allocation tags so that they can be analyzed in AWS Cost Explorer.

Solution Overview

The setup consists of the following configuration steps:

Step 1: Programmatically add cost allocation tag to S3 buckets

In this step we use a small Python script built on the AWS SDK for Python (Boto3). The script programmatically adds a tag “cost:bucketName” to each bucket, with the value set to the bucket’s name. When implementing a tagging strategy, be sure to follow AWS tagging best practices.

The user executing these steps needs to have the following permissions:

s3:ListAllMyBuckets
s3:GetBucketTagging
s3:PutBucketTagging
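
These permissions can be granted with an identity-based IAM policy along the following lines. This is a sketch, not a definitive policy: s3:ListAllMyBuckets only accepts a wildcard resource, but you can scope the tagging actions down to specific bucket ARNs if the run should be limited.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "TagAllBuckets",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:GetBucketTagging",
        "s3:PutBucketTagging"
      ],
      "Resource": "*"
    }
  ]
}
```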

  1. In the AWS Management Console, launch AWS CloudShell, a browser-based, pre-authenticated shell that you can start directly from the console.

Figure 1. Open AWS CloudShell window ready for user input

  2. Save the script below as add_tag_to_buckets.py and upload it to CloudShell
#!/usr/bin/env python3
"""Add or update a "cost:bucketName" tag on every S3 bucket in the account,
with the tag value set to the bucket's own name."""
import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')
s3_res = boto3.resource('s3')

buckets = s3_client.list_buckets()['Buckets']

print('Tagging all S3 buckets with new cost:bucketName tag')

for bucket in buckets:
    print('tagging: ' + bucket['Name'])
    bucket_tagging = s3_res.Bucket(bucket['Name']).Tagging()
    try:
        tags = bucket_tagging.tag_set
    except ClientError as e:
        if e.response['Error']['Code'] == 'NoSuchTagSet':
            tags = []  # bucket has no tags yet
        else:
            print('ERROR:', e, 'occurred retrieving existing tags. Skipping...')
            continue

    # Upsert cost:bucketName: update the tag in place if it already exists,
    # otherwise append it, preserving all other existing tags.
    for tag in tags:
        if tag['Key'] == 'cost:bucketName':
            tag['Value'] = bucket['Name']
            break
    else:
        tags.append({'Key': 'cost:bucketName', 'Value': bucket['Name']})

    try:
        bucket_tagging.put(Tagging={'TagSet': tags})
    except ClientError as e:
        print('ERROR:', e, 'occurred setting TagSet. Skipping...')
        continue
  3. Run the script
    $ python3 add_tag_to_buckets.py
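
The upsert step inside the script's loop can be isolated as a pure function to make its behavior explicit: existing tags survive, and cost:bucketName is updated in place when already present. The function name upsert_tag is ours, for illustration; no AWS calls are involved, so it can be checked locally.

```python
# Pure-Python sketch of the tag "upsert" performed in add_tag_to_buckets.py,
# with no AWS calls, so the merge behavior can be verified offline.
def upsert_tag(tags, key, value):
    """Return a copy of the tag set with key set to value."""
    updated = [dict(t) for t in tags]  # copy so the input is not mutated
    for tag in updated:
        if tag['Key'] == key:
            tag['Value'] = value  # tag already present: update in place
            return updated
    updated.append({'Key': key, 'Value': value})  # tag absent: append
    return updated

# Existing, unrelated tags survive the update:
existing = [{'Key': 'team', 'Value': 'analytics'}]
print(upsert_tag(existing, 'cost:bucketName', 'my-bucket'))
```

Because put on a bucket's Tagging resource replaces the whole tag set, this merge step is what keeps any pre-existing bucket tags intact.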

Step 2: Activate user-defined cost allocation tags

After you create and apply user-defined tags to your resources, it can take up to 24 hours for the tags to appear on your cost allocation tags page for activation. After you select your tags for activation, it can take up to another 24 hours for tags to activate and be available for use in Cost Explorer.

  1. Open the Billing and Cost Management console and from the left navigation pane, choose Cost allocation tags.

Figure 2. Billing and Cost Management Dashboard

  2. Under User-defined cost allocation tags, search for the tag key `cost:bucketName`, select it, and activate it. This ensures that Cost Explorer and your AWS Cost and Usage Reports include the tag.

Figure 3. Cost allocation tags
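
If clicking through the console is impractical, activation can also be scripted. The sketch below assumes Cost Explorer's UpdateCostAllocationTagsStatus API, exposed in Boto3 as update_cost_allocation_tags_status on the ce client; a stub client is used here so the sketch runs offline, and in practice you would pass boto3.client('ce') instead (which requires the ce:UpdateCostAllocationTagsStatus permission).

```python
def activate_cost_allocation_tag(ce_client, tag_key):
    """Request that a user-defined cost allocation tag be marked Active."""
    return ce_client.update_cost_allocation_tags_status(
        CostAllocationTagsStatus=[{'TagKey': tag_key, 'Status': 'Active'}]
    )

class StubCostExplorer:
    """Offline stand-in for boto3.client('ce'), for illustration only."""
    def update_cost_allocation_tags_status(self, CostAllocationTagsStatus):
        return {'Errors': []}

print(activate_cost_allocation_tag(StubCostExplorer(), 'cost:bucketName'))
```

Note that the tag must already have appeared on the cost allocation tags page (up to 24 hours after tagging) before it can be activated, whether through the console or the API.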

Step 3: Confirm Amazon S3 costs in AWS Cost Explorer

Customers can then use the filters in AWS Cost Explorer to view S3 charges grouped by the cost allocation tag, giving a per-bucket view.

  1. Add a new filter, setting Service to S3 (Simple Storage Service)
  2. Change the Group by dimension to Tag: cost:bucketName

Figure 4. Group by cost allocation tag
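
The same grouping can be retrieved programmatically. Below is a sketch of a request body for Cost Explorer's GetCostAndUsage API; the dates are placeholders, and in practice the dictionary would be passed to boto3.client('ce').get_cost_and_usage(**request).

```python
# Hypothetical GetCostAndUsage request mirroring the console view:
# filter to Amazon S3 and group monthly costs by the cost:bucketName tag.
request = {
    'TimePeriod': {'Start': '2023-01-01', 'End': '2023-02-01'},  # placeholders
    'Granularity': 'MONTHLY',
    'Metrics': ['UnblendedCost'],
    'Filter': {
        'Dimensions': {
            'Key': 'SERVICE',
            'Values': ['Amazon Simple Storage Service'],
        }
    },
    'GroupBy': [{'Type': 'TAG', 'Key': 'cost:bucketName'}],
}
# In practice: results = boto3.client('ce').get_cost_and_usage(**request)
print(request['GroupBy'][0]['Key'])
```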

Additionally, you can use Cost Explorer to filter your cost data further. In Figure 5, we exclude untagged buckets and exclude Data Transfer costs to provide the information needed for S3 baselining in the Migration Acceleration Program (MAP).

Figure 5. MAP S3 baselining: filters include S3 (Simple Storage Service) and exclude AP-DataTransfer-Out-Bytes usage and items with no cost:bucketName tag key

In addition to being available in Cost Explorer, the tag can be used in AWS Cost and Usage Reports for more detailed analysis.

Conclusion

In this blog post I’ve shown how AWS Cost Explorer and cost allocation tags can provide a solution for AWS customers who need to view their Amazon S3 costs on a per-bucket basis. The setup process is simple and can be automated, making it manageable even for large numbers of S3 buckets. By programmatically adding a cost allocation tag to each bucket, activating the user-defined tag, and analyzing the costs in AWS Cost Explorer, customers gain greater visibility and control over their S3 costs, making it easier to allocate spend and make informed decisions about their usage of the service.

About the author:

Bryn Price

Bryn Price is a technologist, paragliding pilot, and Senior Specialist Solutions Architect at AWS. With more than 20 years of experience in the technology industry, from telecommunications, banking to software companies. He is now focusing on helping customers modernize their technology and transform their business to SaaS. He loves defying gravity and discussion of any topics from migration to microservices.