This Guidance applies the MACH principles of Microservices, API-first, Cloud-native SaaS, and Headless applications to integrate multiple systems on AWS. Unified Commerce encompasses all customer-facing touchpoints and delivers a consistent experience regardless of channel. It breaks down the silos of a multi-channel approach and brings marketing and operations together, delighting your customer with coherent brand engagement that increases advocacy.
- Frontend applications, or heads, use a common set of microservices and other applications that are abstracted behind an API layer such as AWS AppSync, creating headless applications.
- Common microservices, backed by purpose-built data stores such as Amazon DynamoDB and Amazon Neptune, provide application logic and data to power the frontend experience applications. These services typically differentiate the retailer's offer from that of its competitors.
- Software-as-a-service (SaaS) applications are used where possible to provide mature, evergreen application logic, especially where the service is undifferentiated for the retailer.
- Traditional commercial off-the-shelf (COTS) applications can also be deployed on AWS services such as Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Relational Database Service (Amazon RDS) to provide application services that are not available as SaaS or have not yet been decomposed into microservices.
- Existing systems of record or location-based systems, such as on-premises warehouse management systems and enterprise resource planning (ERP) or finance software, are also integrated behind the aggregation API.
- All microservices and applications produce events that are published to Amazon EventBridge custom event buses and consumed by decoupled applications through rules.
- Application data and events are streamed into a data platform built on Amazon Simple Storage Service (Amazon S3) and queried with services such as Amazon Athena for real-time and historical analysis and reporting.
- Personalization for dynamic content and marketing offers is based on real-time events and pushed to the customer on their chosen engagement channels.
- Machine learning uses the data layer as the source for generating forecasts and intelligent insights.
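The event flow described above can be sketched with the AWS SDK for Python (boto3). This is a minimal, illustrative example: the bus name `unified-commerce`, the source `commerce.orders`, and the event payload are assumptions for this sketch, not names defined by this Guidance.

```python
import json

def build_order_event(order_id: str, total: float) -> dict:
    """Build a PutEvents entry for a custom EventBridge bus.

    The bus name, source, and detail-type below are illustrative
    assumptions; adapt them to your own event schema.
    """
    return {
        "EventBusName": "unified-commerce",   # assumed custom event bus
        "Source": "commerce.orders",          # assumed event source
        "DetailType": "OrderPlaced",
        "Detail": json.dumps({"orderId": order_id, "total": total}),
    }

entry = build_order_event("ord-123", 42.50)

# Publishing requires AWS credentials and an existing custom bus:
# import boto3
# boto3.client("events").put_events(Entries=[entry])
```

Decoupled consumers then receive this event through EventBridge rules that match on `Source` and `DetailType`, so producers never need to know which applications subscribe.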
The proposed architecture is capable of running at scale because it uses managed services where possible. The traditional COTS applications can use Amazon EC2 instance metrics with Amazon CloudWatch alarms and logs, while Auto Scaling groups and managed Amazon RDS recover from failure.
The architecture uses managed services where possible, so a large portion of the security responsibility falls to AWS. Security best practices are followed, including encrypting data in Amazon S3, scoping down IAM roles, and enabling Amazon DynamoDB encryption at rest. Strong identity is enforced for consumers through Amazon Cognito and for operators through IAM roles. Amazon CloudWatch Logs and AWS CloudTrail provide traceability and can be used with organization-wide capabilities such as Amazon GuardDuty, AWS Security Hub, and a central SIEM.
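As an illustration of the scoped-down IAM roles mentioned above, the following sketch builds a least-privilege policy document that grants only read access to a single DynamoDB table. The table name and account ID in the ARN are hypothetical.

```python
import json

def read_only_dynamodb_policy(table_arn: str) -> str:
    """Return an IAM policy document (JSON) allowing only read
    operations on one DynamoDB table -- no writes or deletes."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Read-only DynamoDB actions only.
                "Action": [
                    "dynamodb:GetItem",
                    "dynamodb:Query",
                    "dynamodb:BatchGetItem",
                ],
                "Resource": table_arn,
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Hypothetical table ARN for illustration:
doc = read_only_dynamodb_policy(
    "arn:aws:dynamodb:us-east-1:111122223333:table/Orders")
```

Attaching a policy like this to a microservice's execution role, rather than a broad `dynamodb:*` grant, limits the blast radius if that service's credentials are compromised.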
Reliability is achieved largely by default through managed services. Amazon S3 and Amazon DynamoDB provide redundant storage, and Amazon SageMaker, Amazon Redshift, Amazon Athena, Amazon Forecast, Amazon Pinpoint, Amazon Personalize, AWS AppSync, and Amazon EventBridge are highly available by design. If issues occur, data can be replayed from raw events on Amazon S3 using the same pipeline, and events can also be replayed by using the Amazon EventBridge archive and replay functionality. The container architecture scales horizontally on a choice of Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS) running on AWS Fargate and dynamically adapts to capacity demands.
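The EventBridge archive-and-replay path mentioned above boils down to a call to the StartReplay API. The sketch below builds the request arguments; the replay name, ARNs, and time window are assumptions for illustration.

```python
from datetime import datetime, timezone

def build_replay_request(archive_arn: str, bus_arn: str) -> dict:
    """Build keyword arguments for the EventBridge StartReplay API.

    Replays archived events from a one-hour window back onto the
    event bus, where the existing rules route them to consumers
    exactly as they did the first time."""
    # Example one-hour recovery window (assumed for illustration).
    start = datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc)
    end = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
    return {
        "ReplayName": "orders-replay",   # assumed replay name
        "EventSourceArn": archive_arn,   # archive to replay from
        "EventStartTime": start,
        "EventEndTime": end,
        "Destination": {"Arn": bus_arn}, # target event bus
    }

# With credentials and an existing archive:
# import boto3
# boto3.client("events").start_replay(**build_replay_request(archive_arn, bus_arn))
```

Because consumers are idempotent-by-rule rather than point-to-point, a replay re-drives every subscriber without any producer involvement.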
The use of managed and serverless services minimizes the cost of the architecture, because these services charge only when in use.
The proposed architecture uses managed and serverless services where possible for a sustainable approach, running only when needed. The AWS Customer Carbon Footprint Tool can be used to obtain total impact figures.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.