Transform your customer experience with a modernized banking architecture
This Guidance helps financial institutions build a modern core banking system using native AWS services. Banks traditionally have legacy core banking applications, which are monolithic and lack open architecture. With a modern cloud-based core, banks can be more agile and innovate to better serve their financial services customers by adding new functionalities and releasing features quickly.
Architecture Diagram
Step 1
The API layer interfaces the core platform with the upstream applications, exposing operations such as creating and managing accounts.
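To make the API layer concrete, the following is a minimal sketch of an AWS Lambda handler that could sit behind Amazon API Gateway to accept an account-creation request. The request fields and response shape are assumptions for illustration, not part of the Guidance code.

    import json

    def create_account_handler(event, context):
        """Lambda handler behind API Gateway (proxy integration) for POST /accounts.
        The request shape and field names are hypothetical."""
        body = json.loads(event.get("body") or "{}")
        account = {
            "customerId": body["customerId"],
            "currency": body.get("currency", "USD"),
            "status": "ACTIVE",
        }
        # A core microservice would persist the account in the ledger database (see Step 3);
        # this sketch only validates the request and returns the new account representation.
        return {
            "statusCode": 201,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(account),
        }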
Step 2
Real-time transactions come in from the payment networks.
Step 3
An Amazon Virtual Private Cloud (Amazon VPC) dedicated to transaction data registers all transactions and hosts the microservices that manage the ledger database.
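A ledger microservice in this VPC could record a transaction with the Amazon QLDB driver for Python (pyqldb), roughly as sketched below. The ledger name, table name, and document fields are assumed for illustration.

    from pyqldb.driver.qldb_driver import QldbDriver

    # Hypothetical ledger name -- substitute your own.
    driver = QldbDriver(ledger_name="core-banking-ledger")

    def record_transaction(account_id, amount, currency):
        """Insert a transaction document inside a QLDB transaction (retried automatically by the driver)."""
        doc = {"accountId": account_id, "amount": amount, "currency": currency, "type": "CARD_AUTH"}
        driver.execute_lambda(
            lambda executor: executor.execute_statement("INSERT INTO Transactions ?", doc)
        )

    record_transaction("acct-123", 42.50, "USD")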
Step 4
Data is replicated in real time from Amazon Quantum Ledger Database (Amazon QLDB) to a secondary database that performs better with query patterns such as scanning or searching data.
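One common way to implement this replication is to feed a QLDB stream into Amazon Kinesis and project the revisions into the secondary store with a Lambda function. The sketch below assumes the secondary database is Amazon DynamoDB, uses hypothetical table and field names, and simplifies the Ion-to-DynamoDB type conversion that real code would need.

    import base64
    import boto3
    import amazon.ion.simpleion as ion

    # Hypothetical table name for the query-optimized copy of the ledger data.
    table = boto3.resource("dynamodb").Table("transactions-view")

    def stream_handler(event, context):
        """Lambda handler for the Kinesis stream fed by QLDB streams.
        QLDB delivers Amazon Ion records; only revision details carry document data."""
        for record in event["Records"]:
            payload = ion.loads(base64.b64decode(record["kinesis"]["data"]))
            if payload.get("recordType") != "REVISION_DETAILS":
                continue
            revision = payload["payload"]["revision"]
            data = revision.get("data")
            if data is None:  # deleted revisions carry no data section
                continue
            # Simplified: a production handler would convert Ion types (decimals, timestamps)
            # into DynamoDB-compatible types before writing.
            table.put_item(Item={k: v for k, v in data.items()})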
Step 5
Data from the secondary database is captured in Amazon Managed Streaming for Apache Kafka (Amazon MSK) using Kafka Connect, so that downstream applications can be built and scaled independently.
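Kafka Connect connectors are registered through the Connect REST API. The sketch below assumes a JDBC-compatible secondary database and that the Confluent JDBC source connector plugin is installed on the Connect workers; the endpoint, connector class, connection URL, and table names are all assumptions.

    import requests

    # Hypothetical Kafka Connect REST endpoint.
    CONNECT_URL = "http://kafka-connect.internal:8083/connectors"

    connector = {
        "name": "secondary-db-source",
        "config": {
            # Assumes the Confluent JDBC source connector plugin is available on the workers.
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "connection.url": "jdbc:postgresql://secondary-db.internal:5432/core",
            "mode": "timestamp",
            "timestamp.column.name": "updated_at",
            "table.whitelist": "transactions",
            "topic.prefix": "core.",
            "tasks.max": "2",
        },
    }

    resp = requests.post(CONNECT_URL, json=connector, timeout=10)
    resp.raise_for_status()
    print("Connector registered:", resp.json()["name"])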
Step 6
Downstream applications and microservices consume from Amazon MSK and scale independently of each other.
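A downstream microservice might consume the transaction topic with a standard Kafka client, as in this sketch using kafka-python. Broker addresses, topic, and consumer group are placeholders, and MSK authentication (TLS or IAM) is omitted for brevity.

    import json
    from kafka import KafkaConsumer  # kafka-python

    consumer = KafkaConsumer(
        "core.transactions",
        bootstrap_servers=["b-1.msk.example.internal:9092"],
        group_id="notifications-service",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        txn = message.value
        # Each downstream service keeps its own consumer group, so it scales and fails independently.
        print(f"notify customer for account {txn.get('accountId')}, amount {txn.get('amount')}")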
Step 7
Data from Amazon MSK is consumed to perform both real-time and batch data analytics.
Step 8
Batch files come in from the acquiring banks and are processed by the issuing bank. Transaction values are updated in the ledger database.
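Settlement processing could look roughly like the sketch below, which assumes a simple CSV batch-file layout and reuses the QLDB driver to post settled amounts to the ledger. The file format, field names, ledger name, and file path are all assumptions.

    import csv
    from pyqldb.driver.qldb_driver import QldbDriver

    driver = QldbDriver(ledger_name="core-banking-ledger")  # hypothetical ledger name

    def process_settlement_file(path):
        """Apply each settled transaction from an acquirer batch file to the ledger.
        The CSV layout (accountId, amount, currency columns) is an assumption."""
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                doc = {
                    "accountId": row["accountId"],
                    "amount": float(row["amount"]),
                    "currency": row["currency"],
                    "type": "SETTLEMENT",
                }
                driver.execute_lambda(
                    lambda executor, d=doc: executor.execute_statement("INSERT INTO Transactions ?", d)
                )

    # Batch files would typically be staged on the shared file system (see Performance Efficiency).
    process_settlement_file("/fsx/incoming/acquirer-settlement.csv")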
Step 9
The issuing bank’s data center is connected to the AWS environment using AWS Direct Connect, which offers reliable connectivity to the network routers and payment hardware security module (HSM) devices.
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
The platform is built using native AWS services, which integrate with AWS CloudTrail and Amazon CloudWatch for monitoring, logging, and auditing. The applications are built as microservices and scale independently of each other using an event-driven architecture.
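As one illustration of this integration, a microservice could publish its own custom metric to CloudWatch alongside the metrics the managed services emit automatically; the namespace and dimension names below are hypothetical.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def publish_transaction_latency(milliseconds):
        """Publish a custom latency metric; namespace and dimension names are hypothetical."""
        cloudwatch.put_metric_data(
            Namespace="CoreBanking/Transactions",
            MetricData=[{
                "MetricName": "WriteLatency",
                "Dimensions": [{"Name": "Service", "Value": "ledger-writer"}],
                "Value": milliseconds,
                "Unit": "Milliseconds",
            }],
        )

    publish_transaction_latency(145.0)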
Security
Amazon API Gateway and AWS WAF protect all API requests coming into the platform. Resources are also logically isolated from each other using VPCs.
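For example, a regional AWS WAF web ACL can be associated with an API Gateway stage through the WAFv2 API, as in the sketch below; both ARNs are placeholders.

    import boto3

    wafv2 = boto3.client("wafv2")

    # Placeholder ARNs; the web ACL must be REGIONAL to attach to an API Gateway stage.
    web_acl_arn = "arn:aws:wafv2:us-east-1:111122223333:regional/webacl/core-banking-acl/EXAMPLE"
    stage_arn = "arn:aws:apigateway:us-east-1::/restapis/abc123/stages/prod"

    wafv2.associate_web_acl(WebACLArn=web_acl_arn, ResourceArn=stage_arn)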
Reliability
All services scale across multiple Availability Zones (AZs) within the Region to provide high resiliency. Reliability is also improved by using Amazon MSK to capture data and build an event-driven platform.
Performance Efficiency
Amazon FSx for Lustre provides a shared file system suited to batch processing requirements where jobs must finish within a defined timeframe. In addition, real-time transactions need to be written to the database and the response returned within about 200 ms. This is achieved by using AWS Direct Connect to the bank's data center for network connectivity and by keeping the number of hops to the ledger database as small as possible.
Cost Optimization
Amazon QLDB is a serverless database, so the customer pays only for what they use. Amazon Elastic Kubernetes Service (Amazon EKS) also allows customers to build a microservices platform and scale services as needs change.
Sustainability
Using native AWS services and serverless technologies such as Amazon QLDB, API Gateway, Amazon Simple Storage Service (Amazon S3), and Amazon DynamoDB helps build a platform that scales with business growth and avoids keeping over-provisioned resources.
Implementation Resources
A detailed guide is provided for you to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup, so you can prepare it for deployment in your environment.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.