Guidance for Implementing Google Privacy Sandbox Key/Value Service on AWS

Allows demand-side and sell-side platforms to query real-time data that informs bidding or auction logic through the Chrome Protected Audience API (PAAPI)

Overview

This Guidance demonstrates how to deploy the Google Chrome Privacy Sandbox Key/Value service within a trusted execution environment (TEE) on AWS. The Key/Value service allows implementers to fetch real-time signals to inform remarketing to custom audiences through the Protected Audience API (PAAPI). This real-time data helps ad buyers determine how to bid and helps sellers pick winning bids in a privacy-enhanced way. This Guidance is intended to simplify the implementation of the Key/Value service while optimizing for cost and latency.
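As a minimal sketch of how a buyer- or seller-side component might fetch real-time signals, the snippet below issues an HTTP GET against the service's query endpoint. The host name, endpoint path, query parameter, and key names are illustrative assumptions, not the definitive API contract of the deployed service; consult the Key/Value service documentation for the exact request format.

```python
# Minimal sketch: query a deployed Key/Value service for real-time signals.
# The host name, path, and key names below are illustrative assumptions.
import json
import urllib.parse
import urllib.request

KV_ENDPOINT = "https://kv.example-adtech.com/v1/getvalues"  # hypothetical endpoint


def get_bidding_signals(keys):
    """Fetch real-time signals for the given keys from the Key/Value service."""
    query = urllib.parse.urlencode({"keys": ",".join(keys)})
    with urllib.request.urlopen(f"{KV_ENDPOINT}?{query}", timeout=2) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    signals = get_bidding_signals(["campaign-123", "creative-456"])
    print(signals)
```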

How it works

Overview

This architecture diagram shows an overview of how to deploy Privacy Sandbox’s Protected Audience API Key/Value service. For the data loading component, see the Data loading section below.

Architecture diagram giving an overview of the integration between AWS and the Google Privacy Sandbox Key/Value service. Shows the flow involving Android and Chrome browser users, advertising technology users, AWS WAF, Elastic Load Balancing, AWS Nitro Enclaves, Amazon SNS, Amazon SQS, Amazon S3, NAT gateway, and Google or third-party KMS.

Data loading

This architecture diagram shows the data loading component and demonstrates patterns for ingesting the first-party data needed for real-time auction and bidding. For an overview, see the diagram above.

Architecture diagram illustrating an AWS Cloud solution for an advertising technology company, showing the data flow from first-party data sources through Amazon SQS, Amazon Kinesis Data Streams, Amazon S3, AWS Lambda, Amazon ECS, Amazon EventBridge, Amazon SNS, and AWS Nitro Enclaves for Google Privacy Sandbox key/value service data loading.
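To make the two ingestion paths concrete, the sketch below stages a batch delta file in Amazon S3 and pushes a low-latency update through Amazon SNS using boto3. The bucket name, topic ARN, and object naming are illustrative assumptions rather than the exact resources created by this Guidance.

```python
# Minimal sketch of the two data loading paths described above, using boto3.
# Bucket name, topic ARN, and object keys are illustrative assumptions.
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

DELTA_BUCKET = "my-kv-delta-bucket"  # hypothetical staging bucket
UPDATE_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:kv-realtime-updates"  # hypothetical topic


def upload_delta_file(local_path: str, object_key: str):
    """Batch path: stage a delta file in S3 for the Key/Value service to ingest."""
    s3.upload_file(local_path, DELTA_BUCKET, object_key)


def publish_realtime_update(message: str):
    """Low-latency path: push an update notification through SNS toward SQS."""
    sns.publish(TopicArn=UPDATE_TOPIC_ARN, Message=message)
```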

Well-Architected Pillars

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.

Amazon CloudWatch provides logging and tracing (through AWS X-Ray) for the Key/Value service running in Nitro Enclaves and Amazon ECS, helping you understand its performance characteristics. Additionally, Amazon Elastic Container Registry (Amazon ECR) lets you store and version containers for deployment to Amazon ECS, and Amazon EC2 Auto Scaling manages the scaling and compute capacity of the Key/Value service running in Nitro Enclaves.
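As a minimal sketch of how you might surface operational signals from the service, the snippet below publishes a custom latency metric to CloudWatch so dashboards and alarms can track it. The namespace and metric name are illustrative assumptions.

```python
# Minimal sketch: emit a custom operational metric for the Key/Value service
# to Amazon CloudWatch. Namespace and metric name are illustrative assumptions.
import boto3

cloudwatch = boto3.client("cloudwatch")


def record_lookup_latency(milliseconds: float):
    """Publish a lookup-latency data point for dashboards and alarms."""
    cloudwatch.put_metric_data(
        Namespace="KeyValueService",  # hypothetical namespace
        MetricData=[{
            "MetricName": "LookupLatency",
            "Value": milliseconds,
            "Unit": "Milliseconds",
        }],
    )
```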

Read the Operational Excellence whitepaper 

AWS Identity and Access Management (IAM) policies enable you to scope services and resources to the minimum privilege required to operate. Amazon S3 encrypts data at rest using server-side encryption with Amazon S3 managed keys. AWS WAF provides distributed denial of service (DDoS) protection for the public endpoint, and you can create security rules to control bot traffic. Additionally, this Guidance uses Amazon VPC endpoints to enable the services running in Nitro Enclaves and Amazon ECS to privately connect to supported AWS services.
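The sketch below shows one way to express least privilege in code: an IAM policy that only allows reading delta files from a single S3 bucket. The bucket name and policy name are illustrative assumptions, not the exact policies used by this Guidance.

```python
# Minimal sketch: create a least-privilege IAM policy that only allows reading
# staged delta files from one S3 bucket. Names are illustrative assumptions.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::my-kv-delta-bucket",    # hypothetical bucket
            "arn:aws:s3:::my-kv-delta-bucket/*",
        ],
    }],
}

iam.create_policy(
    PolicyName="kv-service-delta-read-only",  # hypothetical policy name
    PolicyDocument=json.dumps(policy_document),
)
```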

Read the Security whitepaper 

This Guidance stores data in Amazon S3 before uploading it into the Key/Value service, so you can use S3 Versioning to preserve, replicate, retrieve, and restore every version of an object. This enables you to recover from unintended user actions and application failures. Additionally, this Guidance uses Auto Scaling groups to spread compute across Availability Zones, and Elastic Load Balancing distributes traffic to healthy Amazon EC2 instances. Finally, Amazon SNS and Amazon SQS, which serve as an endpoint for low-latency data updates, provide a push-and-pull-based system to deliver data and facilitate data persistence before it is processed downstream.
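If you manage the staging bucket yourself, the sketch below shows how S3 Versioning could be enabled so every object version remains recoverable. The bucket name is an illustrative assumption.

```python
# Minimal sketch: enable S3 Versioning on the staging bucket so every object
# version can be retrieved or restored. Bucket name is an illustrative assumption.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_versioning(
    Bucket="my-kv-delta-bucket",  # hypothetical bucket
    VersioningConfiguration={"Status": "Enabled"},
)
```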

Read the Reliability whitepaper 

Amazon ECS reduces the complexity of scaling containerized workloads on AWS. By using Amazon ECS containers that run on AWS Graviton processors, you can achieve higher throughput and lower latency for requests. Additionally, Amazon SNS can publish messages within milliseconds, so upstream applications can send time-critical messages to the Key/Value service through a push mechanism.
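The sketch below registers an ECS task definition that targets the ARM64 (Graviton) architecture for a data-loading container. The task family, container image, and sizing values are illustrative assumptions.

```python
# Minimal sketch: register an ECS task definition that runs on AWS Graviton
# (ARM64). Family, image, and sizing values are illustrative assumptions.
import boto3

ecs = boto3.client("ecs")

ecs.register_task_definition(
    family="kv-data-loader",            # hypothetical task family
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="512",
    memory="1024",
    runtimePlatform={
        "cpuArchitecture": "ARM64",     # run on Graviton
        "operatingSystemFamily": "LINUX",
    },
    containerDefinitions=[{
        "name": "data-loader",
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/kv-data-loader:latest",  # hypothetical image
        "essential": True,
    }],
)
```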

Read the Performance Efficiency whitepaper 

As a serverless service, Amazon S3 automatically scales your data storage, enabling you to optimize your costs based on the storage you actually use, and you can avoid the costs of provisioning and managing physical infrastructure. Additionally, Amazon S3 provides different storage classes that help you further optimize storage and reduce costs, and you can set up lifecycle rules to delete data after a period of time. Amazon ECS also helps optimize costs by dynamically scaling compute resources up and down based on demand, so you only pay for the resources used. Finally, by reducing the number of empty responses returned when the queue has no messages, Amazon SQS long polling enhances the efficiency of processing inventory updates, further optimizing costs.
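As one example of a lifecycle rule, the sketch below expires staged delta files after 30 days. The bucket name, key prefix, and retention period are illustrative assumptions you would tune to your own retention requirements.

```python
# Minimal sketch: lifecycle rule that expires staged delta files after 30 days.
# Bucket name, prefix, and retention period are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-kv-delta-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-old-delta-files",
            "Status": "Enabled",
            "Filter": {"Prefix": "deltas/"},  # hypothetical key prefix
            "Expiration": {"Days": 30},
        }],
    },
)
```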

Read the Cost Optimization whitepaper 

Instances powered by AWS Graviton processors enable you to serve more requests at lower latency and with up to 60 percent less energy than comparable Amazon EC2 instances, helping you lower the carbon footprint of your workloads. Additionally, Amazon SQS long polling improves resource efficiency by reducing API requests, minimizing network traffic, and optimizing resource utilization.
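The sketch below shows SQS long polling on the consumer side: the receive call waits up to 20 seconds for messages instead of returning immediately, which reduces empty responses and the API traffic behind them. The queue URL is an illustrative assumption.

```python
# Minimal sketch: consume update messages with SQS long polling to minimize
# empty responses and API calls. Queue URL is an illustrative assumption.
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/kv-realtime-updates"  # hypothetical queue


def poll_updates():
    """Wait up to 20 seconds for messages instead of returning immediately."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling
    )
    for message in response.get("Messages", []):
        print(message["Body"])
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```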

Read the Sustainability whitepaper 

Deploy with confidence

Ready to deploy? Review the sample code on GitHub for detailed deployment instructions, and deploy it as-is or customize it to fit your needs.

Go to sample code

Disclaimer

The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
