AWS for Industries

How cloud increases flexibility of trading risk infrastructure for FRTB compliance

The Basel Regulatory Framework strives for a strict boundary between financial institutions’ trading books and banking books to prevent regulatory arbitrage. The Trading Book element of these regulations (the Fundamental Review of the Trading Book, or FRTB) seeks, in exchange for more rigorous risk modeling, to strengthen trading risk management and allow institutions to optimize the level of reserve capital. FRTB requires institutions to pursue risk analysis via one of two well-prescribed mechanisms: the Internal Model Approach (IMA), which minimizes capital reserve requirements, or the easier-to-implement Standardized Approach (SA). The Basel III Monitoring Report (October 2019) indicates that the overall minimum required capital (MRC) could rise by as much as 18.6% for large European banks, with the impact on an individual bank depending on whether it chooses to calculate risk capital requirements using the SA or the IMA. Another limited impact study concluded that a capital efficiency saving of up to 10% could be achieved by implementing IMA in preference to SA. However, contrary to expectations, market analysis indicates a decrease in the adoption of IMA. Relative to SA, IMA is perceived to be significantly more complex and costly to implement, and few banks have done sufficient research into the IMA calculation method to quantify the benefit. Because of the potential systemic risks should all banks adopt SA, regulators are asking some global systemically important banks (G-SIBs) to implement IMA. In this post, we discuss why risk managers and heads of trading desks, irrespective of whether they implement IMA or SA, must look beyond legacy IT systems to address these challenges.

What are the challenges associated with FRTB? Compute workloads are expected to increase between three- and tenfold. Leveraging specialized processing hardware (for example, FPGAs and GPUs) and/or more sophisticated distributed architectures can deliver corresponding increases in compute performance. However, to achieve this, a bank’s risk algorithms would require significant refactoring or replacement. Some institutions may need to pursue this modernization approach, while others will, at least initially, scale existing algorithms as cost-effectively as possible.

Meanwhile, from a data perspective, FRTB increases the volume of data needed tenfold. This is a consequence of the new regulations requiring model stress-testing look-back periods to be extended from one to ten years. This historical data must also pass several data quality checks, and those checks must be evidenced. Larger volumes of high-quality market data with known lineage are necessary to provide sufficient context to meet the Risk Factor Eligibility Test (RFET). If a risk factor cannot meet the RFET data requirements, it must be treated as a non-modelable risk factor (NMRF). The resulting model predictions must also be consistent with the trading desks’ actual daily P&L positions. If the IMA method fails to align, trading desks must revert to the SA until a modified IMA model is validated, which, in turn, directly impacts the profitability of that trading desk.
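
To make the RFET more concrete, the minimal Python sketch below illustrates one way a modelability check could be expressed, using the commonly cited criteria of at least 24 real price observations over the preceding 12 months with no 90-day window containing fewer than four, or at least 100 observations in total. The function and thresholds are illustrative only and should be validated against the applicable regulatory text.

```python
from datetime import date, timedelta

def passes_rfet(observation_dates, as_of):
    """Illustrative RFET check: treat a risk factor as modelable if, over the
    preceding 12 months, it has at least 100 real price observations, or at
    least 24 observations with every 90-day window containing at least four.
    Thresholds follow the commonly cited Basel criteria and should be
    confirmed against the applicable rules."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)

    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False

    # Slide a 90-day window across the year; each must contain >= 4 observations.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True

# Example: a factor with roughly weekly committed quotes passes the test.
weekly = [date(2020, 1, 1) + timedelta(weeks=i) for i in range(52)]
print(passes_rfet(weekly, date(2020, 12, 31)))  # True
```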

Whether migrating to SA as the final destination, or as a temporary stepping stone before moving on to fully leverage the benefits of IMA, banks must address several technology challenges:

  1. How do you optimally deliver the increased computational and storage requirements?
  2. How do you address FRTB’s significant data management, quality, and lineage challenges?
  3. How do you ensure solution flexibility to enable trading businesses to adapt to a continuously shifting regulatory landscape?

In addition, recent periods of extreme market volatility have demonstrated the fragility of existing on-premises risk systems, with rapid intraday risk recalculations stretching some systems beyond their breaking point.

The need for on-demand, highly elastic, and cost-efficient compute resources has never been more pressing. Indeed, McKinsey suggests that “what is needed is nothing less than a fundamental overhaul” of trading-risk infrastructure.

Data – Enabling cost-effective storage, ensuring quality, governance, and lineage at scale

A robust data strategy is at the heart of a successful FRTB implementation, but how can the required approach be rapidly and cost-effectively implemented?

Financial services industry (FSI) organizations already store and analyze large quantities of data in the cloud. FINRA, a not-for-profit organization authorized by the US Congress to monitor market activity and protect America’s investors, stores up to 155 billion market events per day in Amazon Simple Storage Service (Amazon S3), which represents a daily new data volume of up to 7 terabytes. In aggregate, FINRA stores over 37 petabytes of data and can query more than 67 trillion records online. Euronext, looking to establish a fully GDPR-compliant and encrypted data solution, moved its data lake for post-trade data, comprising 400 billion records under management, to AWS, and is adding 1.5 billion trading-related messages a day. Meanwhile, by leveraging AWS data infrastructure, DTCC turned a potential $4 million capital expense into an operational cost of $903 per month.

However, securing robust, cost-effective, and scalable data infrastructure is only the start of the journey. FRTB requires banks to establish demonstrable firm-wide data governance and lineage. Variance in data quality, availability (across jurisdictions and products), and changes must be actively monitored and securely managed. In response, AWS customers are building data pools using Amazon S3 to collect and curate the required organizational data and metadata, such as sources, timestamps, and owners. These data pools can then be monitored and managed through AWS data management services, which enforce fine-grained, auditable data access controls for groups or other services. Downstream applications can extract, transform, and load (ETL) this data using AWS Glue, which also provides schema version history and enables data changes to be tracked over time. Meanwhile, the AWS Transfer Family, AWS Storage Gateway, or, where the quantity of data to be ingested is sufficiently large, a member of the AWS Snow Family provide the mechanisms through which structured or unstructured data can be ingested at scale from a customer’s on-premises data stores and archives.
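
As an illustration of this pattern, the following sketch shows how a curated market-data file might be landed in an S3-based data pool with lineage metadata and then cataloged with AWS Glue. The bucket, key, crawler name, and metadata fields are hypothetical placeholders rather than a prescribed design.

```python
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Hypothetical bucket, object key, and crawler name used for illustration only.
BUCKET = "example-frtb-data-pool"
KEY = "market-data/eod/2020-06-30/eur_swap_curve.parquet"

# Land a curated market-data file in the data pool, attaching lineage
# metadata (source, owner, ingestion timestamp) as S3 object metadata.
with open("eur_swap_curve.parquet", "rb") as f:
    s3.put_object(
        Bucket=BUCKET,
        Key=KEY,
        Body=f,
        Metadata={
            "source": "vendor-feed-x",
            "owner": "market-risk-data-team",
            "ingested-at": "2020-06-30T18:05:00Z",
        },
    )

# Trigger an AWS Glue crawler so the new data (and any schema change)
# is reflected in the Glue Data Catalog, where schema versions are retained.
glue.start_crawler(Name="frtb-data-pool-crawler")
```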

Financial services institutions are increasingly ingesting generic news, trade, and market information from third-party data service providers. Reuters, using AWS Data Exchange, provides a 30-day News Archive that includes breaking news in the financial services industry. Bloomberg Market Data Feed (B-PIPE) offers programmatic access to Bloomberg’s complete catalogue of content, covering the same asset classes as the Bloomberg Terminal. Data services from other third-party data vendors and exchanges include offerings from CME Group, FactSet, ICE Data Services, Morningstar, Refinitiv, and Xignite. Driven by FRTB, interbank collaborations are also emerging. These collaborations pool data to broaden coverage and enable more assets to be modeled, thereby reducing the number of NMRFs and minimizing the capital requirements for each participant.

For example, in February 2019 CanDeal announced DataVault Innovations, a collaboration with TickSmith and the six largest banks in Canada (BMO Nesbitt Burns Inc., CIBC World Markets, National Bank Financial Inc., RBC Capital Markets, Scotia Capital, and TD Securities). DataVault is the world’s first production-grade, multi-party FRTB data pool and has delivered a +400% increase in FRTB modelability for its participants. Maintaining data lineage and control in such multi-party systems is a challenge. AWS Data Exchange addresses this by enabling a data producer to design curated datasets and securely offer them to multiple data consumers. Subsequent changes to the contents of a dataset are captured as a new revision, creating a new version of that dataset. The AWS Data Exchange producer can then publish the latest revision, making it available to authorized consumers. These AWS Data Exchange mechanisms give data consumers full control over which revision of the data they use, thereby addressing FRTB’s strict data lineage requirements.
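
The sketch below outlines how a data producer might publish an update as a new, finalized revision through the AWS Data Exchange API. The data set ID, S3 locations, and comment are placeholders, and a production workflow would also wait for the import job to complete before finalizing the revision.

```python
import boto3

dx = boto3.client("dataexchange")

DATA_SET_ID = "example-data-set-id"  # placeholder for a data set the producer owns

# Each update is captured as a new, immutable revision of the data set.
revision = dx.create_revision(
    DataSetId=DATA_SET_ID,
    Comment="June 2020 committed quotes and trades",
)

# Import the curated files for this revision from S3.
job = dx.create_job(
    Type="IMPORT_ASSETS_FROM_S3",
    Details={
        "ImportAssetsFromS3": {
            "DataSetId": DATA_SET_ID,
            "RevisionId": revision["Id"],
            "AssetSources": [
                {"Bucket": "example-frtb-data-pool", "Key": "exports/2020-06/quotes.csv"}
            ],
        }
    },
)
dx.start_job(JobId=job["Id"])

# Finalizing makes the revision immutable and available to entitled consumers,
# preserving the lineage of exactly which data each consumer used.
# (In practice, wait for the import job to finish before this call.)
dx.update_revision(DataSetId=DATA_SET_ID, RevisionId=revision["Id"], Finalized=True)
```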

Highly elastic and diverse computational capabilities

AWS provides highly scalable and cost-effective cloud compute to customers. For example, a US-based global systemically important bank (G-SIB) runs models for its Comprehensive Capital Analysis and Review (CCAR) reports with peak usage of 100,000 Amazon Elastic Compute Cloud (Amazon EC2) cores on AWS. The bank reduced its compute time per job from days to minutes and can run models that were previously not possible on premises. By better managing its capital, the bank was able to redeploy millions of dollars. Other customers scale even further: Western Digital runs over 2.5 million simulation jobs on a single high performance computing (HPC) cluster of 1 million virtual CPUs. Such compute scale and savings are highly relevant when considering FRTB’s requirements.

Moreover, building on AWS can help banks decrease the need for complex trade-offs by enabling optionality. At each point in the FRTB journey, a bank can optimize performance and cost by selecting from 14 different families of Amazon EC2 instances and 175 instance types optimized for compute-intensive, memory-intensive, or AI/ML workloads. A bank might pragmatically decide to initially scale out existing risk algorithms across a population of cost-effective computing resources and evaluate optimization or replacement alternatives only at a later time. To enable this flexibility, banks can draw on several AWS services: AWS Batch, which provides a traditional scale-out HPC grid solution familiar to most banks; Amazon EMR for data-intensive workloads; and Amazon SageMaker for industry-leading machine learning at scale.
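
As a simple illustration of the scale-out option, the following sketch submits an existing, containerized risk engine as an AWS Batch array job. The job queue, job definition, and environment variables are hypothetical and stand in for whatever a bank’s current risk analytics expect.

```python
import boto3

batch = boto3.client("batch")

# Hypothetical job queue and job definition wrapping an existing,
# containerized risk engine; the names are placeholders.
response = batch.submit_job(
    jobName="frtb-es-recalculation-2020-06-30",
    jobQueue="risk-spot-queue",
    jobDefinition="legacy-risk-engine:4",
    # Fan the portfolio out across 1,000 child jobs; each child selects its
    # slice of trades using the AWS_BATCH_JOB_ARRAY_INDEX environment variable.
    arrayProperties={"size": 1000},
    containerOverrides={
        "environment": [
            {"name": "VALUATION_DATE", "value": "2020-06-30"},
            {"name": "MEASURE", "value": "EXPECTED_SHORTFALL"},
        ]
    },
)
print("Submitted array job:", response["jobId"])
```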

For example, FINRA deploys 50,000 compute nodes supporting AWS analytics and AI/ML services to run half a trillion validation checks per day on trillions of records. Meanwhile, to aid risk algorithm modernization or replacement strategies, the AWS Partner Network offers services to help customers explore different modeling approaches and benchmark both performance and veracity against the baseline provided by the customer’s existing systems. Finally, for banks that want to shorten implementation times and avoid in-house development, the following FRTB solutions (IMA and SA) are available from the AWS Partner Network: IHS Markit FRTB Solution Suite, Murex MX.3, Calypso, and SimplexFX.

Cloud agility enabling business sustainability

By converting large upfront capital expenditure into ongoing operational expenditure, where customers pay only for what they use, AWS enables financial institutions to pursue sustainable FRTB IMA strategies even under the most challenging of economic conditions. AWS infrastructure allows such solutions to be scaled globally without the heavy lifting usually involved, such as expensive networking hardware or complicated wide area networks. G-SIBs can cost-effectively deploy FRTB SA or IMA solutions to each location as mandated by the local regulator and then rapidly adapt these solutions when local regulations subsequently change. Such agility is essential not only when addressing large-scale, regulation-driven transformations, but also for commonplace day-to-day operational challenges. Risk managers and heads of trading desks will be familiar with the following scenario:

The local regulator challenges a bank about the appropriateness of the risk models used. Resolving this potential issue requires ongoing interaction between the regulator and the bank’s team of quantitative analysts. Once the modified risk algorithms address the regulator’s concerns, the regulator requests that daily capital tests be re-run using the new models over an extended historical period (years) and that discrepancies relative to previous results be reported.

Banks with traditional on-premises IT may have no alternative but to cannibalize existing development and UAT resources, thereby freezing new developments for many months. In contrast, banks with AWS-based risk systems may simply and rapidly create one or more “back-testing” environments with no impact on “business-as-usual” activities, the continuous integration/continuous delivery (CI/CD) pipeline enabling the bank’s quantitative team to rapidly iterate and converge risk models towards the regulator’s requirements.
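
A minimal sketch of that pattern, assuming the risk grid is already defined as an infrastructure-as-code template, might create and later remove a dedicated back-testing stack. The stack name, template URL, and parameters shown here are placeholders.

```python
import boto3

cfn = boto3.client("cloudformation")

# Create an isolated back-testing environment from the same template that
# defines the production risk grid; names and parameters are placeholders.
cfn.create_stack(
    StackName="risk-grid-backtest-regulator-review",
    TemplateURL="https://example-bucket.s3.amazonaws.com/templates/risk-grid.yaml",
    Parameters=[
        {"ParameterKey": "Environment", "ParameterValue": "backtest"},
        {"ParameterKey": "HistoryYears", "ParameterValue": "10"},
    ],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)

# Tear the environment down once the regulator's questions are resolved,
# so costs accrue only while the back-testing work is active.
# cfn.delete_stack(StackName="risk-grid-backtest-regulator-review")
```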

To conclude, compliance with FRTB regulations presents a challenge, especially in the current economic environment. It should be noted that Basel’s banking book regulations pose an identical set of technology challenges for a bank’s middle office. Yet such challenges also provide strategic opportunities. As financial institutions modernize and evolve their IT systems, they create benefits that extend far beyond the initial transformational drivers. The agility and innovation enabled by the cloud are multiplicative, acting as a catalyst for organizational change, driving operational efficiency, and opening the door to accelerated innovation on behalf of customers.

For readers interested in technical details, the post How to improve FRTB’s Internal Model Approach implementation using Apache Spark and Amazon EMR compares and contrasts the use of classic Black-Scholes and Longstaff-Schwartz (LS) models for FRTB option pricing, measuring performance across a range of Amazon EC2 instance types.
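
For orientation, a closed-form Black-Scholes European call pricer, of the kind benchmarked in that post, can be written in a few lines of Python. This is a generic textbook sketch rather than the implementation used in the companion post.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(spot, strike, rate, vol, maturity):
    """Closed-form Black-Scholes price of a European call.
    spot: current underlying price; strike: option strike;
    rate: continuously compounded risk-free rate;
    vol: annualized volatility; maturity: time to expiry in years."""
    n = NormalDist().cdf
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    return spot * n(d1) - strike * exp(-rate * maturity) * n(d2)

# Example: price a 1-year at/near-the-money call.
print(round(black_scholes_call(100, 105, 0.01, 0.2, 1.0), 4))
```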

Richard Nicholson

Richard is a Principal Solutions Architect in the Amazon Web Services (AWS) Financial Services EMEA business and market development team. Richard works on areas as diverse as front-office risk system architectures and back-office core mainframe migration. Prior to AWS, Richard spent 18 years in his own company, focused on the development and use of runtime self-adaptive software systems across a diverse range of industries, including financial services and industrial IoT. An astrophysicist by training, Richard entered the financial services industry in 1995 as an infrastructure systems administrator for Salomon Brothers.

Stephan Schmidt-Tank

Stephan leads Amazon Web Services’ (AWS) specialist team for the Financial Services industry in the EMEA region. In this role, he is responsible for leading the development and execution of AWS’s strategic initiatives in the financial services industry in the UK/Ireland, Europe, the Middle East, and Africa. He works with customers across banking, payments, capital markets, and insurance to help them transform their existing businesses and to bring new, innovative solutions to market by leveraging AWS services. Stephan has more than 19 years of experience driving transformational change at a variety of organizations. Before joining AWS, he served as the Chief Operating Officer for Operations & Technology and the Head of Structural Reform at Barclays Investment Bank in London. He also served as Chief of Staff at Barclays Africa Group in Johannesburg and led the Strategy Team for Barclays Group. Prior to joining Barclays, Stephan advised financial institutions as a consultant at McKinsey & Company for more than eight years.