Analyze customer behavior to create a personalized customer experience
This Guidance helps you improve customer retention by performing data collection and analysis on customer demographics, behavior, and preferences. You can achieve data optimization by building a modern customer data platform and a data analytics pipeline that generates actionable data insights about your customers. With a modern data architecture on AWS, you can use purpose-built data services to rapidly build scalable data lakes, ensure compliance, and easily share data across organizational boundaries.
Architecture Diagram
Step 1
Data is collected from multiple data sources across the enterprise, including software-as-a-service (SaaS) applications, edge devices, logs, streaming media, and social networks.
Online web activity comes from websites, social media platforms, emails, and online campaigns. Offline sources include purchase history and subscriptions, primarily from customer relationship management (CRM) systems and third-party data.
Step 2
Based on the type of data source, you can ingest the data into a data lake in AWS by using AWS Database Migration Service (AWS DMS), AWS DataSync, Amazon Kinesis, Amazon Managed Streaming for Apache Kafka (Amazon MSK), or Amazon AppFlow.
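For example, streaming web activity can be pushed into the pipeline with a few lines of code against the Kinesis Data Streams API. The sketch below is a minimal illustration; the stream name and event fields are hypothetical placeholders, and the stream is assumed to exist already.

    import json
    import boto3

    kinesis = boto3.client("kinesis")

    def ingest_click_event(event: dict) -> None:
        # Write one web-activity record to a Kinesis data stream (hypothetical stream name).
        kinesis.put_record(
            StreamName="customer-web-activity",
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=event["customer_id"],  # keeps a customer's events ordered within a shard
        )

    ingest_click_event({"customer_id": "c-123", "page": "/checkout", "ts": "2024-01-01T12:00:00Z"})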
Step 3
AWS Data Exchange can be used to integrate third-party data into the data lake.
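As a rough sketch, a subscribed AWS Data Exchange revision can be exported into the data lake bucket through the Data Exchange API. The data set ID, revision ID, and bucket name below are placeholders, and the export job type shown is one possible choice depending on your entitlement.

    import boto3

    dx = boto3.client("dataexchange")

    # Export a subscribed revision into the data lake bucket (all identifiers are placeholders).
    job = dx.create_job(
        Type="EXPORT_REVISIONS_TO_S3",
        Details={
            "ExportRevisionsToS3": {
                "DataSetId": "<data-set-id>",
                "RevisionDestinations": [
                    {"RevisionId": "<revision-id>", "Bucket": "customer-data-lake-raw"}
                ],
            }
        },
    )
    dx.start_job(JobId=job["Id"])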
Step 4
Build a scalable data lake by using AWS Lake Formation, and use Amazon Simple Storage Service (Amazon S3) for data lake storage.
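A minimal sketch of registering an S3 location with Lake Formation so it can be governed as data lake storage; the bucket name is a placeholder, and in practice you would also define databases and tables in the Data Catalog.

    import boto3

    s3 = boto3.client("s3")
    lakeformation = boto3.client("lakeformation")

    BUCKET = "customer-data-lake-curated"  # hypothetical data lake bucket

    # Create the bucket (add a CreateBucketConfiguration when outside us-east-1).
    s3.create_bucket(Bucket=BUCKET)

    # Register the location so Lake Formation governs access to data stored there.
    lakeformation.register_resource(
        ResourceArn=f"arn:aws:s3:::{BUCKET}",
        UseServiceLinkedRole=True,
    )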
Step 5
You can also use Lake Formation to enable unified governance, which helps you centrally manage security, access control (table-, row-, or column-level security), and audit trails. It also enables automatic schema discovery and conversion to required formats.
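For example, column-level access can be granted centrally through Lake Formation rather than separately in each analytics service. The sketch below grants a hypothetical analyst role SELECT on a subset of columns; the database, table, column, and role names are assumptions.

    import boto3

    lakeformation = boto3.client("lakeformation")

    # Grant SELECT on only the non-sensitive columns of a customer table (all names hypothetical).
    lakeformation.grant_permissions(
        Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/AnalystRole"},
        Resource={
            "TableWithColumns": {
                "DatabaseName": "customer_360",
                "Name": "customer_profiles",
                "ColumnNames": ["customer_id", "segment", "lifetime_value"],
            }
        },
        Permissions=["SELECT"],
    )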
Step 6
AWS Glue extracts, transforms, catalogs, and ingests data across multiple data stores. Use Glue DataBrew for visual data preparation and AWS Lambda for enrichment and validation.
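As one possible shape for the enrichment and validation step, a Lambda function can check and normalize incoming records before they are cataloged. The field names and rules below are illustrative assumptions, not part of the Guidance.

    REQUIRED_FIELDS = {"customer_id", "event_type", "ts"}  # assumed minimum schema

    def handler(event, context):
        # Validate and enrich a batch of customer records passed to the function.
        valid, rejected = [], []
        for record in event.get("records", []):
            if REQUIRED_FIELDS.issubset(record):
                record["email"] = record.get("email", "").strip().lower()  # simple normalization
                record["source"] = record.get("source", "unknown")
                valid.append(record)
            else:
                rejected.append(record)
        return {"valid": valid, "rejected": rejected}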
Step 7
Amazon QuickSight provides machine learning (ML) powered business intelligence. Amazon Redshift is used as a cloud data warehouse. Amazon SageMaker and AWS ML services can be used to build, train, and deploy ML models, and add intelligence to your applications.
Amazon Redshift Spectrum and Amazon Athena provide interactive querying, analysis, and processing capabilities. Amazon Managed Service for Apache Flink transforms and analyzes streaming data in real time.
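For interactive analysis, a query can be submitted to Athena against the cataloged data lake tables. The sketch below assumes a hypothetical customer_360 database, a customer_profiles table, and an S3 location for query results.

    import boto3

    athena = boto3.client("athena")

    # Run an ad hoc query against the data lake (database, table, and output location are placeholders).
    response = athena.start_query_execution(
        QueryString="""
            SELECT segment, COUNT(*) AS customers
            FROM customer_profiles
            GROUP BY segment
        """,
        QueryExecutionContext={"Database": "customer_360"},
        ResultConfiguration={"OutputLocation": "s3://customer-data-lake-athena-results/"},
    )
    print(response["QueryExecutionId"])  # poll get_query_execution / get_query_results for output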
Step 8
Store unified customer profile information in Amazon OpenSearch Service.
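A minimal sketch of writing a unified profile document into an OpenSearch Service domain with the opensearch-py client; the domain endpoint, index name, and document fields are assumptions, and authentication is simplified to basic credentials for brevity.

    from opensearchpy import OpenSearch

    # Connect to a hypothetical OpenSearch Service domain endpoint.
    client = OpenSearch(
        hosts=[{"host": "search-customer-profiles.us-east-1.es.amazonaws.com", "port": 443}],
        http_auth=("admin", "<password>"),  # use IAM/SigV4 authentication in practice
        use_ssl=True,
    )

    # Index one unified customer profile document, keyed by customer ID.
    client.index(
        index="customer-profiles",
        id="c-123",
        body={
            "customer_id": "c-123",
            "segment": "high-value",
            "last_purchase": "2024-01-01",
            "channels": ["web", "email"],
        },
    )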
Step 9
Build a single customer profile view using identity resolution data from Amazon Neptune.
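Identity resolution in Neptune typically links identifiers (email, device ID, loyalty number) that belong to the same person. The sketch below queries such a graph with gremlinpython; the cluster endpoint, vertex labels, and edge labels are hypothetical and depend on how you model identities.

    from gremlin_python.driver import client

    # Connect to a hypothetical Neptune cluster endpoint over WebSocket.
    gremlin = client.Client(
        "wss://customer-identity-graph.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/gremlin",
        "g",
    )

    # Find every identifier vertex linked to the same resolved person as a given email address.
    query = (
        "g.V().has('identifier', 'value', 'jane@example.com')"
        ".both('resolves_to').both('resolves_to').dedup().valueMap()"
    )
    linked_ids = gremlin.submit(query).all().result()
    gremlin.close()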
Step 10
Use Amazon API Gateway to expose the developed APIs as microservices.
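One common pattern is to place a Lambda function behind an API Gateway proxy integration so that the unified profile can be served as a microservice. The handler below is a sketch; the path parameter name and the lookup function are hypothetical.

    import json

    def get_profile(customer_id: str) -> dict:
        # Hypothetical lookup against the profile store (for example, the OpenSearch index above).
        return {"customer_id": customer_id, "segment": "high-value"}

    def handler(event, context):
        # API Gateway Lambda proxy integration: path parameters arrive in event["pathParameters"].
        customer_id = event["pathParameters"]["customerId"]
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(get_profile(customer_id)),
        }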
Step 11
Activate the unified customer data and send it to internal and external parties.
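The activation mechanism is not prescribed by this Guidance; as one possible approach, profile-change events can be published to Amazon EventBridge, where rules route them to internal consumers or partner destinations. The bus name and event shape below are assumptions.

    import json
    import boto3

    events = boto3.client("events")

    # Publish a profile-updated event to a hypothetical custom event bus for downstream activation.
    events.put_events(
        Entries=[
            {
                "EventBusName": "customer-activation",
                "Source": "cdap.profiles",
                "DetailType": "CustomerProfileUpdated",
                "Detail": json.dumps({"customer_id": "c-123", "segment": "high-value"}),
            }
        ]
    )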
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
The Customer Data Analytics Platform (CDAP) reference architecture is fully serverless. You can deploy the solution with infrastructure as code and automation for fast iteration and consistent deployments. Use Amazon CloudWatch for application and infrastructure monitoring.
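For example, a CloudWatch alarm on pipeline errors gives the operations team a concrete signal to act on; the function name and threshold below are placeholders.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm when the (hypothetical) enrichment Lambda function reports any errors over five minutes.
    cloudwatch.put_metric_alarm(
        AlarmName="cdap-enrichment-errors",
        Namespace="AWS/Lambda",
        MetricName="Errors",
        Dimensions=[{"Name": "FunctionName", "Value": "cdap-enrichment"}],
        Statistic="Sum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=1,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
    )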
Security
Use Lake Formation for unified governance to centrally manage security, access control (at the table, row, or column level), and audit trails. It also enables automatic schema discovery and conversion to required formats. API Gateway enforces policies that control security aspects such as authentication, authorization, and traffic management.
Reliability
The serverless architecture enables the solution to scale automatically, remain available, and be deployed across multiple Availability Zones.
Performance Efficiency
By using serverless technologies, you provision only the resources you need. To maximize the performance of the CDAP solution, test with multiple instance types. Use API Gateway edge-optimized endpoints for geographically dispersed customers, and Regional endpoints for customers in the same Region (and when the APIs are called from other AWS services within that Region).
Cost Optimization
By using serverless technologies and automatically scaling, you only pay for the resources you use. Serverless services don’t cost anything while they’re idle.
Sustainability
Minimize your environmental impact. The data lake automatically moves infrequently accessed data to cold storage with Amazon S3 Lifecycle configurations. By extensively using managed services and dynamic scaling, this architecture minimizes the environmental impact of the backend services.
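A minimal sketch of such a lifecycle configuration, transitioning objects to colder storage classes as they age; the bucket name, prefix, and transition windows are assumptions to adjust to your access patterns.

    import boto3

    s3 = boto3.client("s3")

    # Move raw zone objects to colder storage tiers as they age (bucket, prefix, and days are placeholders).
    s3.put_bucket_lifecycle_configuration(
        Bucket="customer-data-lake-raw",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-raw-data",
                    "Status": "Enabled",
                    "Filter": {"Prefix": "raw/"},
                    "Transitions": [
                        {"Days": 90, "StorageClass": "STANDARD_IA"},
                        {"Days": 365, "StorageClass": "GLACIER"},
                    ],
                }
            ]
        },
    )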
Implementation Resources
A detailed guide is provided so that you can experiment with this Guidance in your own AWS account. Each stage, including deployment, usage, and cleanup, is walked through to prepare the Guidance for deployment in your environment.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.