[SEO Subhead]
This Guidance shows how your travel and hospitality (T&H) business can effectively detect fraudulent transactions and gain fraud prediction insights on AWS. Amazon SageMaker processes point-of-sale (POS) terminal data and detects anomalies, then uses ML to predict fraud and quickly alert fraud prevention teams. Additionally, Amazon Bedrock enables natural language querying of fraud prediction datasets so that your leadership team can gain insights through a chatbot UI using plain English. By using this Guidance, you can better detect and prevent fraud in real time while garnering insights to support decisions that protect your business assets.
Note: [Disclaimer]
Architecture Diagram

[Architecture diagram description]
Step 1
Ingest POS data to AWS using various data transfer services, including Amazon API Gateway, AWS DataSync, Amazon Kinesis Data Streams, or Amazon Managed Streaming for Apache Kafka (Amazon MSK).
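If you ingest through Amazon Kinesis Data Streams, each POS transaction is packaged as a record with a partition key. The sketch below builds a `PutRecords` batch using only the standard library; the stream name, terminal IDs, and event fields are hypothetical, and the actual send (shown commented) would use boto3 with valid AWS credentials.

```python
import json
from datetime import datetime, timezone

def build_kinesis_records(transactions):
    """Package POS transactions as Kinesis PutRecords entries.
    Partitioning by terminal ID keeps records from one terminal ordered."""
    return [
        {
            "Data": json.dumps(txn).encode("utf-8"),
            "PartitionKey": txn["terminal_id"],
        }
        for txn in transactions
    ]

# Hypothetical POS events; field names are illustrative, not a required schema.
sample = [
    {"terminal_id": "POS-001", "amount": 182.50, "currency": "USD",
     "timestamp": datetime.now(timezone.utc).isoformat()},
    {"terminal_id": "POS-002", "amount": 74.99, "currency": "USD",
     "timestamp": datetime.now(timezone.utc).isoformat()},
]

records = build_kinesis_records(sample)

# In practice, send the batch with boto3 (requires AWS credentials):
# import boto3
# kinesis = boto3.client("kinesis")
# kinesis.put_records(StreamName="pos-transactions", Records=records)
```

Using the terminal ID as the partition key is one reasonable choice: it preserves per-terminal ordering, which downstream anomaly detection can rely on, while still spreading load across shards.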
Get Started

Well-Architected Pillars

The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
-
Operational Excellence
SageMaker simplifies building, training, and deploying ML models, and QuickSight enables you to create interactive dashboards and visualizations without specialized skills or complex data engineering. As fully managed services, SageMaker and QuickSight handle infrastructure scaling, maintenance, and automation. This frees you to focus on optimizing ML models and gaining insights from data visualizations, rather than managing underlying complexities.
-
Security
This Guidance uses AWS Identity and Access Management (IAM) to support least privilege access and role separation, reducing the risk of unauthorized access or actions. Additionally, Amazon S3 bucket policies support bucket access control, encryption at rest, versioning, object locking, and access logging. These policies help you ensure that only authorized roles, users, or services can read or write data to the buckets. You can also use the log data for security analysis, auditing, and compliance purposes. Finally, QuickSight supports row-level security, enabling you to control access to specific data rows based on user or group membership, preventing unauthorized data exposure.
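A least-privilege bucket policy for this pattern might grant read access only to the analytics role and reject any request that arrives without TLS. The policy below is a minimal sketch; the bucket name, account ID, and role name are placeholders you would replace with your own.

```python
import json

# Hypothetical bucket and role names; adjust to your account and environment.
BUCKET = "th-fraud-predictions"
ANALYTICS_ROLE = "arn:aws:iam::111122223333:role/FraudAnalyticsRole"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Only the analytics role may read prediction objects.
            "Sid": "AllowAnalyticsRoleRead",
            "Effect": "Allow",
            "Principal": {"AWS": ANALYTICS_ROLE},
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            # Deny all access over plain HTTP, enforcing encryption in transit.
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [f"arn:aws:s3:::{BUCKET}",
                         f"arn:aws:s3:::{BUCKET}/*"],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
    ],
}

# Applying the policy requires boto3 and AWS credentials:
# import boto3
# boto3.client("s3").put_bucket_policy(
#     Bucket=BUCKET, Policy=json.dumps(bucket_policy))
```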
-
Reliability
SageMaker provides automatic scaling, fault tolerance, and model validation features to support reliable and accurate ML model deployment. It automatically distributes training data and models across multiple Availability Zones (AZs) and enables automated retraining to maintain model accuracy over time. Additionally, Amazon S3 provides reliable and durable data storage capabilities through its built-in high-availability features. For example, it replicates data across multiple AZs and redundant storage facilities, minimizing data loss and supporting consistent data access.
-
Performance Efficiency
SageMaker offers optimized hardware instances, distributed training, and automatic model tuning to improve the performance and efficiency of your ML workloads. For example, you can choose optimal instance types, and your workload scales based on demand, resulting in efficient utilization of compute and memory resources. Additionally, Amazon S3 provides highly scalable and low-latency data storage, enabling efficient data ingestion and retrieval for training and deploying ML models.
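Automatic model tuning is configured by declaring an objective metric, resource limits, and the hyperparameter ranges to search. The configuration below is a sketch for a hypothetical XGBoost-style fraud model; the metric name and parameter ranges are assumptions, and a real `create_hyper_parameter_tuning_job` call would also need a full `TrainingJobDefinition` (algorithm image, IAM role, and data channels).

```python
# Sketch of a SageMaker automatic model tuning configuration.
# Metric and hyperparameter names assume an XGBoost-style fraud classifier.
tuning_config = {
    "Strategy": "Bayesian",
    "HyperParameterTuningJobObjective": {
        "Type": "Maximize",
        "MetricName": "validation:auc",  # assumed fraud-model metric
    },
    "ResourceLimits": {
        "MaxNumberOfTrainingJobs": 20,  # total trials across the search
        "MaxParallelTrainingJobs": 2,   # trials running at once
    },
    "ParameterRanges": {
        "ContinuousParameterRanges": [
            {"Name": "eta", "MinValue": "0.01", "MaxValue": "0.3"},
        ],
        "IntegerParameterRanges": [
            {"Name": "max_depth", "MinValue": "3", "MaxValue": "10"},
        ],
    },
}

# Launching the job requires boto3, credentials, and a training definition:
# import boto3
# boto3.client("sagemaker").create_hyper_parameter_tuning_job(
#     HyperParameterTuningJobName="fraud-model-tuning",
#     HyperParameterTuningJobConfig=tuning_config,
#     TrainingJobDefinition={...},  # algorithm image, role, data channels
# )
```

Capping `MaxParallelTrainingJobs` below the total trial count lets the Bayesian strategy learn from completed trials before launching new ones, which typically finds a good model with fewer total training jobs.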
-
Cost Optimization
Amazon S3 offers lifecycle policies and storage classes that optimize storage and costs. For example, Amazon S3 Intelligent-Tiering automatically moves objects to the most cost-effective storage tier based on access patterns. Additionally, SageMaker offers per-second billing for training instances, so you only pay for the resources you use, and it scales automatically to optimize costs. And as a managed service, it removes the overhead of provisioning and managing your own ML infrastructure.
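A lifecycle configuration for this workload might transition scored results to S3 Intelligent-Tiering after 30 days and expire raw staging data after a year. The rules below are a sketch; the bucket name and prefixes are hypothetical, and the retention periods should follow your own data governance requirements.

```python
# Hypothetical S3 lifecycle rules for a fraud-prediction data bucket.
lifecycle_config = {
    "Rules": [
        {
            # Let S3 pick the cheapest tier for older fraud-score archives.
            "ID": "TierFraudArchives",
            "Status": "Enabled",
            "Filter": {"Prefix": "fraud-scores/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"},
            ],
        },
        {
            # Delete raw staging data once it is no longer needed.
            "ID": "ExpireRawStaging",
            "Status": "Enabled",
            "Filter": {"Prefix": "raw-staging/"},
            "Expiration": {"Days": 365},
        },
    ]
}

# Applied with boto3 (requires credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="th-fraud-predictions",
#     LifecycleConfiguration=lifecycle_config)
```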
-
Sustainability
Amazon Bedrock provides a sustainable foundation for running data-intensive workloads. By scaling automatically, it optimizes resource usage and avoids the environmental impact of overprovisioned or inefficient infrastructure. Additionally, Amazon S3 provides highly durable and secure object storage that protects data integrity, minimizing the need for redundant backups. It also supports efficient data storage and management practices, such as S3 Lifecycle rules for object expiration and the S3 Intelligent-Tiering storage class. Using these management capabilities, you can minimize the storage resources required for your workloads and reduce the energy consumption associated with storing and processing data.
Related Content

[Title]
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.