This Guidance helps you centralize operations and set up seamless workflows among agency, advertiser, and publisher teams. It addresses key topics, including audience analysis, identity resolution, campaign insights visualization, personalized customer experiences, and media attribution. By following the best practices outlined, you can drive greater return on ad spend through improved collaboration between media planning, buying, analytics, and creative execution teams. The Guidance showcases how to leverage AWS for audience enrichment, data hygiene, democratizing performance data, and hyper-personalization.


Architecture Diagram

Download the architecture diagram PDF 
  • Data Flow
  • This diagram shows the data flow process for advertising agency planning management. For a detailed reference architecture, see the Detailed Architecture Diagram tab.

  • Detailed Architecture Diagram
  • This architecture diagram shows in detail how to modernize advertising planning management. For the data flow, see the Data Flow tab.

Well-Architected Pillars

The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, follow as many of these best practices as possible.

  • The Guidance uses Step Functions for workflow orchestration, helping you anticipate failures, identify their sources, and mitigate them. The core services (Amazon Bedrock, AWS Clean Rooms, and AWS Entity Resolution) are fully managed, reducing operational burden. Step Functions interacts with these services through direct integrations to perform business operations, monitor data flow, and anticipate failures.

    Read the Operational Excellence whitepaper 
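As a sketch of what such a direct integration might look like, the following Python builds a minimal Amazon States Language (ASL) definition that starts an AWS Entity Resolution matching job without an intermediate Lambda function. The workflow name and state names are illustrative assumptions, not part of the Guidance's published code.

```python
import json

def build_orchestration_definition(workflow_name: str) -> dict:
    """Build a minimal ASL definition that starts an AWS Entity Resolution
    matching job via a Step Functions AWS SDK service integration.
    The workflow name is a hypothetical placeholder."""
    return {
        "Comment": "Sketch: orchestrate an Entity Resolution matching job",
        "StartAt": "StartMatchingJob",
        "States": {
            "StartMatchingJob": {
                "Type": "Task",
                # Direct AWS SDK integration -- no Lambda function required.
                "Resource": "arn:aws:states:::aws-sdk:entityresolution:startMatchingJob",
                "Parameters": {"WorkflowName": workflow_name},
                "End": True,
            }
        },
    }

definition = build_orchestration_definition("audience-matching-workflow")
print(json.dumps(definition, indent=2))
```

At runtime, this definition would be passed to the Step Functions CreateStateMachine API; the JSON shape is what matters here.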
  • Amazon DataZone streamlines data discovery and sharing while maintaining appropriate access levels. This service creates and manages IAM roles between data producers and consumers, granting or revoking Lake Formation permissions for data sharing. By using IAM, you can help ensure that policies have minimum required permissions to limit resource access, reducing unauthorized access risks.

    Read the Security whitepaper 
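To illustrate the least-privilege principle described above, the following Python builds a minimal IAM policy document that grants read-only access to a single shared S3 prefix. The bucket name and prefix are hypothetical placeholders; in the Guidance itself, Amazon DataZone creates and manages these roles for you.

```python
import json

def least_privilege_read_policy(bucket: str, prefix: str) -> dict:
    """Sketch of a minimum-permission IAM policy document granting a data
    consumer read-only access to one S3 prefix. Bucket and prefix are
    hypothetical placeholders."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Only the action the consumer actually needs.
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/{prefix}/*"],
            }
        ],
    }

policy = least_privilege_read_policy("example-campaign-data", "shared/consumer-a")
print(json.dumps(policy, indent=2))
```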
  • Step Functions orchestrates workflows by monitoring AWS Entity Resolution workflow status and through direct service integration with Amazon Bedrock. Step Functions monitors workflows and automatically handles errors and exceptions with built-in try/catch and retry. It also automatically scales the operations and underlying compute that run the workflow steps in response to increases in request volume.

    Read the Reliability whitepaper 
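The built-in try/catch and retry behavior mentioned above is expressed in ASL with Retry and Catch fields on a Task state. The following Python sketches such a state with exponential backoff and a fallback route; the resource ARN and the "HandleFailure" state name are illustrative assumptions.

```python
def task_with_retry_and_catch(resource_arn: str) -> dict:
    """Sketch of an ASL Task state using built-in Retry (exponential
    backoff) and Catch (route unhandled errors to a recovery state).
    The recovery state name is a hypothetical placeholder."""
    return {
        "Type": "Task",
        "Resource": resource_arn,
        "Retry": [
            {
                # Retry transient task failures up to 3 times, doubling
                # the wait interval each attempt (2s, 4s, 8s).
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 2,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,
            }
        ],
        "Catch": [
            # Any error not resolved by retries falls through here.
            {"ErrorEquals": ["States.ALL"], "Next": "HandleFailure"}
        ],
        "End": True,
    }

state = task_with_retry_and_catch("arn:aws:states:::aws-sdk:bedrockruntime:invokeModel")
```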
  • You can achieve business use cases in near real-time by invoking large language models (LLMs) through API calls to Amazon Bedrock. AWS Entity Resolution matches records using rule-based or machine learning (ML) models, on demand or automatically. As fully managed services, Amazon Bedrock and AWS Entity Resolution reduce the overhead of managing underlying resources, enhancing performance efficiency.

    Read the Performance Efficiency whitepaper 
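As a sketch of such an API call, the following Python assembles keyword arguments for a bedrock-runtime InvokeModel request using the Anthropic Claude Messages API body shape. The model ID and prompt are illustrative assumptions; at runtime you would pass these kwargs to boto3.client("bedrock-runtime").invoke_model(**kwargs).

```python
import json

def build_invoke_request(prompt: str) -> dict:
    """Build kwargs for a bedrock-runtime InvokeModel call. The model ID
    is an illustrative assumption, not prescribed by the Guidance."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    return {
        "modelId": "anthropic.claude-3-haiku-20240307-v1:0",
        "contentType": "application/json",
        "accept": "application/json",
        # InvokeModel expects the model-specific body as a JSON string.
        "body": json.dumps(body),
    }

kwargs = build_invoke_request("Summarize last week's campaign performance.")
```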
  • S3 buckets for the campaign analytics module use the S3 Intelligent-Tiering storage class, which automatically moves objects between access tiers based on access patterns, reducing storage costs.

    You can review QuickSight author and reader account activity to identify and remove inactive accounts. Removing inactive QuickSight accounts minimizes the number of required subscriptions, further optimizing costs.

    Read the Cost Optimization whitepaper 
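To illustrate opting into Intelligent-Tiering at write time, the following Python builds the kwargs for an s3.put_object call with the INTELLIGENT_TIERING storage class. The bucket and key names are hypothetical placeholders; a bucket-level lifecycle rule is an equally valid way to apply the storage class.

```python
def build_put_object_args(bucket: str, key: str, body: bytes) -> dict:
    """Build kwargs for s3.put_object that store the object in the
    S3 Intelligent-Tiering storage class. Bucket and key are
    hypothetical placeholders."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # S3 moves the object between access tiers automatically.
        "StorageClass": "INTELLIGENT_TIERING",
    }

args = build_put_object_args(
    "example-campaign-analytics", "reports/2024/summary.parquet", b"..."
)
```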
  • Athena's query result reuse feature returns cached results for identical SQL queries run on large datasets within a configurable time window, rather than rescanning the data. This minimizes redundant compute resource usage, supporting sustainability efforts.

    Read the Sustainability whitepaper 
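As a sketch of enabling result reuse, the following Python builds kwargs for an athena.start_query_execution call with a 60-minute reuse window. The SQL, table, and workgroup names are illustrative assumptions.

```python
def build_query_args(sql: str, workgroup: str, max_age_minutes: int = 60) -> dict:
    """Build kwargs for athena.start_query_execution with query result
    reuse enabled, so identical queries within the window return cached
    results instead of rescanning data. Workgroup is a placeholder."""
    return {
        "QueryString": sql,
        "WorkGroup": workgroup,
        "ResultReuseConfiguration": {
            "ResultReuseByAgeConfiguration": {
                "Enabled": True,
                "MaxAgeInMinutes": max_age_minutes,
            }
        },
    }

args = build_query_args(
    "SELECT campaign_id, SUM(spend) FROM impressions GROUP BY campaign_id",
    "primary",
)
```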

Implementation Resources

A detailed guide is provided for you to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup.

The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.


The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.

References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.
