AWS for Industries

Banking Apps Built on AWS: A Deep Dive into SmartStream’s SaaS Architecture

The number of software companies building applications for the financial services industry (FSI) continues to grow. These fintech companies face challenges such as ensuring that their applications stay compliant with banking regulations while rapidly delivering new features for their clients. FSI software vendors have therefore been moving to the cloud to meet their agility, security, and cost objectives. In this blog post, we describe how SmartStream migrated its banking application to AWS, and how the company leveraged AWS security services to satisfy its clients’ compliance requirements. We focus, in particular, on key management systems and database audit trails on AWS.

About SmartStream

SmartStream Technologies Ltd. provides software and managed services for banks, asset managers, and broker-dealers worldwide. Its software automates financial transactions and post-trade processing for 70 of the world’s top 100 banks. SmartStream offers its Transaction Lifecycle Management (TLM) OnDemand application as a SaaS solution.

TLM OnDemand includes industry-leading solutions, such as reconciliations, corporate actions, collateral management, cash and liquidity management, fees and expense management, and reference data. Beyond TLM OnDemand, SmartStream also provides a Business Process Outsourcing service, which includes IT application support and business operations for financial institutions.

TLM Application Architecture

TLM was originally built as a Java application with an Oracle database, deployed at each financial institution’s data center. For security reasons, it is a single-tenant application, with each bank having its own database.

Every day, TLM ingests files of completed financial transactions via Secure FTP (SFTP) from its client banks. The TLM application processes and reconciles these transactions, and stores the results in the database. Personnel at each bank can use a web browser to log in to TLM to view the post-trade results and run reports.

The following diagram shows the final application architecture, planned in phases, after the migration to AWS.

Figure 1 – SmartStream TLM OnDemand Architecture on AWS

File ingestion is handled by AWS Transfer for SFTP, which is a fully managed service that eliminates the need to maintain your own SFTP server. The bank uses its regular SFTP client to transfer the files, and AWS Transfer automatically stores each file in an Amazon S3 bucket. The SFTP protocol encrypts the files in transit. To comply with the bank’s requirements, all files are also encrypted at rest. Files remain in the S3 bucket for 92 days, after which they are automatically archived to Amazon S3 Glacier by using S3 Lifecycle rules.
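The lifecycle policy described above can be sketched as an S3 Lifecycle configuration. This is a minimal illustration of the 92-day rule; the rule ID and key prefix are placeholders, not SmartStream's actual values.

```python
# Sketch of an S3 Lifecycle rule matching the policy above: objects stay
# in S3 for 92 days, then transition to S3 Glacier. Rule ID and prefix
# are hypothetical.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-transaction-files",   # hypothetical rule ID
            "Status": "Enabled",
            "Filter": {"Prefix": "incoming/"},   # hypothetical key prefix
            "Transitions": [
                {"Days": 92, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

# With boto3 this would be applied via:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="example-tlm-ingest-bucket",   # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

Because AWS Transfer writes each incoming file directly to the bucket, this single rule covers every ingested file with no application code changes.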

TLM is a Java-based web application that runs in an Auto Scaling group of Amazon EC2 instances spread across multiple Availability Zones. The EC2 instances sit behind an Application Load Balancer (ALB) with AWS WAF (Web Application Firewall) attached. To access TLM, bank personnel must first log in to their VPN; AWS WAF allows traffic only from the bank’s IP addresses and blocks all others.
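The allow-only-the-bank pattern can be sketched as a WAFv2 web ACL whose default action blocks everything, with a single rule allowing an IP set containing the bank's VPN egress addresses. The IP set ARN, names, and account ID below are placeholders.

```python
# Sketch of a WAFv2 web ACL: block by default, allow only the bank's
# VPN address range (referenced via an IP set). All names/ARNs are
# hypothetical, not values from the actual deployment.
BANK_IP_SET_ARN = (
    "arn:aws:wafv2:us-east-1:111122223333:regional/ipset/bank-vpn/EXAMPLE"
)  # hypothetical

web_acl = {
    "Name": "tlm-bank-allowlist",     # hypothetical web ACL name
    "DefaultAction": {"Block": {}},   # everything not matched is blocked
    "Rules": [
        {
            "Name": "allow-bank-vpn",
            "Priority": 0,
            "Statement": {
                "IPSetReferenceStatement": {"ARN": BANK_IP_SET_ARN}
            },
            "Action": {"Allow": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "AllowBankVPN",
            },
        }
    ],
}
```

Keeping the default action as Block means any new client network must be explicitly added to the IP set before its traffic reaches the ALB.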

TLM processes each transaction file, performs the reconciliation and post-trade processing, and stores the results in an RDS Oracle database. RDS Multi-AZ keeps the database highly available, with a primary instance in one Availability Zone and a standby instance in another.

Both the TLM application and RDS log errors and noteworthy events to Amazon CloudWatch, a centralized logging service. CloudWatch can be configured to raise alarms when certain errors or keywords appear in the logs. IT personnel at the bank and at SmartStream can be notified via email, text message, or a Slack channel when an alarm fires.

Compliance Considerations for Banks

Within the global banking industry, SaaS providers have to follow cloud hygiene guidelines and align with the expectations of regulatory authorities such as the SEC, the European Banking Authority, and the Monetary Authority of Singapore. For example, the SEC has a rule requiring that certain trade records be immutable; that is, the data cannot be changed or deleted after it has been written. This is also known as WORM (Write Once Read Many) compliance, and it can be implemented using S3 Object Lock, as described in this blog post.
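A WORM setup with S3 Object Lock can be sketched as the configuration below. COMPLIANCE mode prevents any user, including the root account, from overwriting or deleting locked object versions during the retention period. The bucket name and 7-year period are illustrative; the actual retention depends on the applicable regulation.

```python
# Sketch of an S3 Object Lock default retention rule for WORM compliance.
# COMPLIANCE mode cannot be shortened or removed once applied, even by
# the root account. Retention period here is illustrative.
object_lock_configuration = {
    "ObjectLockEnabled": "Enabled",
    "Rule": {
        "DefaultRetention": {
            "Mode": "COMPLIANCE",  # vs. GOVERNANCE, which privileged users can override
            "Years": 7,            # hypothetical retention period
        }
    },
}

# With boto3 (the bucket must have been created with Object Lock enabled):
# s3.put_object_lock_configuration(
#     Bucket="example-trade-records-bucket",   # hypothetical bucket
#     ObjectLockConfiguration=object_lock_configuration,
# )
```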

For regulatory reasons, many FSI companies additionally require that their data be encrypted at rest. They also want to own and manage the keys used to encrypt that data. In cases where a SaaS provider hosts their data, banks want an audit trail that logs all events related to accessing sensitive information, such as credit card numbers. They also want to be alerted whenever permissions related to such access are changed. The next sections describe how SmartStream implemented these requirements.

Using AWS KMS with Cross-account Access

To let the client bank own the encryption key while the SaaS provider uses it to encrypt the bank’s data, we need a mechanism to share the key between the bank’s AWS account and the SaaS provider’s account. To do this, we can use AWS Key Management Service (AWS KMS) with a Customer Master Key (CMK) that is managed by the bank. The bank then sets up cross-account permissions that allow the SaaS provider to use the CMK.

Figure 2 – Using AWS KMS to encrypt data in Amazon S3 and Amazon RDS

This approach is described in this AWS KMS blog post. The bank first creates a CMK in AWS KMS. During the key creation process, the bank adds the SaaS provider’s AWS account as an external account with permission to use the key. After these steps are completed, the bank provides the key’s ARN to the SaaS provider.
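The cross-account permissions can be sketched as key policy statements attached to the bank's CMK: the bank retains administration, while the provider's account gets use-only permissions plus the ability to create grants for AWS services such as RDS and S3. Account IDs below are placeholders.

```python
# Sketch of a KMS key policy granting a SaaS provider's account use of
# (but not administration over) the bank's CMK. Account IDs are
# hypothetical placeholders.
BANK_ACCOUNT = "111122223333"   # hypothetical bank account
SAAS_ACCOUNT = "444455556666"   # hypothetical SaaS provider account

key_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # The bank keeps full administrative control of its key.
            "Sid": "EnableBankKeyAdministration",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{BANK_ACCOUNT}:root"},
            "Action": "kms:*",
            "Resource": "*",
        },
        {   # The provider may use the key for cryptographic operations only.
            "Sid": "AllowSaaSProviderUseOfTheKey",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{SAAS_ACCOUNT}:root"},
            "Action": [
                "kms:Encrypt",
                "kms:Decrypt",
                "kms:ReEncrypt*",
                "kms:GenerateDataKey*",
                "kms:DescribeKey",
            ],
            "Resource": "*",
        },
        {   # Grants let AWS services (RDS, S3) use the key on the provider's behalf.
            "Sid": "AllowSaaSProviderUseOfGrants",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{SAAS_ACCOUNT}:root"},
            "Action": ["kms:CreateGrant", "kms:ListGrants", "kms:RevokeGrant"],
            "Resource": "*",
            "Condition": {"Bool": {"kms:GrantIsForAWSResource": "true"}},
        },
    ],
}
```

Because administration and use are split into separate statements, the bank can revoke the provider's access at any time without touching its own permissions.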

The SaaS provider can then encrypt the data in its RDS Oracle database using that CMK. The steps to do this are described in this RDS and AWS KMS blog post. Another CMK can be used to encrypt the bank’s data in Amazon S3, as described in this Amazon S3 documentation. Note that the SaaS application remains single tenant – each bank client is hosted in a separate AWS account and uses its own RDS Oracle database.
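Once the bank has shared the key ARN, encryption at rest for the RDS instance comes down to two parameters at creation time. The sketch below shows illustrative `create_db_instance` parameters as a plain dict; the identifiers, instance class, and key ARN are placeholders.

```python
# Sketch of RDS creation parameters enabling encryption at rest with
# the bank-owned CMK. All values are hypothetical placeholders.
create_db_instance_params = {
    "DBInstanceIdentifier": "tlm-bank-a",   # hypothetical, one per bank client
    "Engine": "oracle-ee",
    "DBInstanceClass": "db.m5.xlarge",      # illustrative size
    "AllocatedStorage": 500,
    "MultiAZ": True,                        # primary + standby across AZs
    "StorageEncrypted": True,
    # ARN of the bank's CMK, shared via cross-account permissions:
    "KmsKeyId": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
    "MasterUsername": "admin",
    "MasterUserPassword": "REPLACE_ME",     # fetch from a secrets store in practice
}

# With boto3: rds.create_db_instance(**create_db_instance_params)
```

Encryption must be chosen when the instance is created; an existing unencrypted instance would have to be migrated via an encrypted snapshot copy.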

Using AWS KMS Bring Your Own Key (BYOK)

If the bank does not want to manage the encryption keys, the SaaS provider can create the keys for RDS and Amazon S3 in its own AWS KMS and not grant access to any other party. This is the simplest solution for the provider to implement.

However, some banks want the ability to bring their own key material that is imported into the CMK. In that case, the SaaS provider can use AWS KMS’s Bring Your Own Key (BYOK) feature.

Figure 3 – Using AWS KMS’s Bring Your Own Key feature

The steps to import the key material are given in this AWS documentation.
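The BYOK flow can be sketched as three AWS KMS request shapes, shown here as plain dicts rather than live API calls. Key IDs and the byte placeholders are hypothetical; in a real import the wrapped key material comes from the bank.

```python
# Sketch of the BYOK request shapes, not live calls. Key IDs and byte
# values are placeholders.

# 1. Create a CMK with no key material; Origin=EXTERNAL marks it for import.
create_key_request = {
    "Origin": "EXTERNAL",
    "Description": "CMK with imported bank-supplied key material",  # hypothetical
}

# 2. Download a public wrapping key and an import token from AWS KMS.
get_parameters_request = {
    "KeyId": "EXAMPLE-KEY-ID",                    # placeholder key ID
    "WrappingAlgorithm": "RSAES_OAEP_SHA_256",
    "WrappingKeySpec": "RSA_2048",
}

# 3. The bank encrypts its key material with the wrapping key offline,
#    then the wrapped material is imported into the CMK.
import_key_material_request = {
    "KeyId": "EXAMPLE-KEY-ID",
    "ImportToken": b"<import token from step 2>",        # placeholder
    "EncryptedKeyMaterial": b"<wrapped key material>",   # placeholder
    "ExpirationModel": "KEY_MATERIAL_DOES_NOT_EXPIRE",
}

# With boto3 these would go to kms.create_key, kms.get_parameters_for_import,
# and kms.import_key_material respectively.
```

A key point of this model is that the bank keeps the original key material; if it deletes the imported material from the CMK, data encrypted under that key becomes unreadable to the provider.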

RDS Oracle Fine Grained Auditing and CloudWatch Logs

In order to meet bank regulations for auditing access to sensitive data, SmartStream used RDS Oracle’s Data Redaction and Fine Grained Auditing (FGA) features. RDS can send FGA events to Amazon CloudWatch Logs, and CloudWatch alarms can then notify bank personnel of any unauthorized access or changes.

Oracle’s Data Redaction feature can mask sensitive information such as credit card numbers. For example, a Social Security number or credit card number can be partially redacted to show only the last four digits, with the rest displayed as asterisks. In any given table, certain columns can be designated as containing redacted or masked data, and permissions can then be granted so that only specific Oracle users can read or write those columns.

Oracle’s FGA feature can be used to create audit policies that track access to such columns or changes in permissions. These policies generate events that can be sent to an audit trail. In order for the audit trail to be published to CloudWatch Logs, the audit events must be logged to an XML file (not to a database table). To generate audit logs in XML, SmartStream used the audit_trail parameter in the ADD_POLICY procedure, as described here.

The Oracle log files, including the audit file, can be published to CloudWatch Logs. The section entitled Publishing Oracle logs to Amazon CloudWatch Logs explains how to do this. The logs appear as a separate log group in CloudWatch. CloudWatch alarms can then be created based on keywords or error messages that appear in the logs, as described in this blog post.

Conclusion

As a fintech solution provider, SmartStream has achieved its agility, security, and compliance objectives by migrating to AWS. SmartStream estimates that its Business Process Outsourcing and OnDemand solutions have reduced clients’ reconciliation-related costs by 30–40%, leveraging managed services such as RDS, CloudWatch Logs, and AWS KMS.

To learn more about building Financial Services applications on AWS, please visit https://aws.amazon.com/financial-services/.

Rana Dutt


Rana Dutt is a Senior Solutions Architect at Amazon Web Services. He has a background in architecting scalable software platforms for healthcare, financial services and telecom companies, and is passionate about helping customers build on AWS.

Peter Hainz


Peter Hainz is Global Head of Product Management for Managed Services at SmartStream. In this role, he establishes cloud security, architectural designs, and outsourcing solutions for top financial institutions around the globe. Peter is passionate about teaching cloud and treasury workflows, on which he has lectured at three top universities in Vienna.

Saloni Shah


Saloni Shah is a Technical Account Manager at Amazon Web Services. She has a passion for databases and security, and focuses on helping customers keep their AWS environments operationally healthy and secure.