AWS HPC Blog

Real-time quant trading on AWS

This post was contributed by Boris Litvin, Financial Services SA; Sam Farber, Startup SA; Ronny Rodriguez, Senior Financial Services TPM; Adeleke Coker, Global Solutions Architect

A variety of vendors already provide real-time quant trading systems, so the question is: “do we need another one?” We believe the answer is “yes.” Quant trading is a never-ending arms race to extract alpha – the excess return a trader is able to attain relative to the market. This race is fueled by a flywheel of new compute capabilities, data analytics, and AI/ML, which have made short-term (intraday) and mid-term (intraday to one week) alphas not only easier to discover, but also more feasible to execute.

As a result, we see increased demand for a cloud-native, real-time quant trading system to research and execute short- and mid-term alpha – a popular strategy due to its higher risk-adjusted returns compared to traditional ones.

In this post, we’ll show you an open-source solution for a real-time quant trading system that you can deploy on AWS. We’ll go over the challenges brought on by monitoring portfolios, the solution, and its components. We’ll finish with the installation and configuration process and show you how to use it.

These short- to mid-term trading strategies are highly elastic by nature, which makes the timing of trading signals unpredictable. Portfolios change often, frequently overlap, and often contain hundreds to thousands of different positions. This in turn generates unpredictable and uneven demand for the compute capacity needed to manage these short-term portfolios throughout their lifecycle in real time. Moreover, cost optimization is an important factor influencing the feasibility of specific opportunities – in other words, price/performance matters.

The solution to the challenges described above is an AWS-native trading system capable of scaling up or down with near-zero operational overhead.

Figure 1 – This architecture describes the one-time installation process of the solution. It first details the use of the AWS Cloud Development Kit (AWS CDK) to deploy the solution into your AWS account. It then looks at the method of uploading certain values like API keys in AWS Secrets Manager. The diagram also shows the different application stacks that are created through AWS CloudFormation, like AWS Batch, different databases, as well as AWS Lambda. Keep in mind that for any specific customizations, stacks will need to be redeployed for application coding changes.

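To give a sense of how the one-time installation is wired together, here is a minimal, hypothetical sketch of a CDK application entry point in Python that composes separate stacks, using a DynamoDB portfolio table as an example. The stack, construct, and attribute names are illustrative, not necessarily those used in the quant-trading repository.

# Illustrative CDK entry point; stack and attribute names are hypothetical,
# not the ones defined in the aws-samples/quant-trading repository.
import aws_cdk as cdk
from aws_cdk import Stack, aws_dynamodb as dynamodb
from constructs import Construct

class DatabaseStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Portfolio table; its stream feeds the Lambda that launches Batch jobs.
        self.portfolio_table = dynamodb.Table(
            self, "PortfolioTable",
            partition_key=dynamodb.Attribute(
                name="portfolio_id", type=dynamodb.AttributeType.STRING),
            stream=dynamodb.StreamViewType.NEW_AND_OLD_IMAGES,
            removal_policy=cdk.RemovalPolicy.DESTROY,
        )

app = cdk.App()
DatabaseStack(app, "DbStack")
# Additional stacks (AWS Batch compute, Lambda triggers, Timestream, Grafana)
# would be instantiated here in the same way.
app.synth()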

Figure 2 – This architecture describes the operational aspect of the solution and first shows the process of inserting the portfolio into Amazon DynamoDB. It then looks at the AWS Batch job that is triggered through AWS Lambda whenever there is a change in the portfolio. Finally, we have AWS Batch writing market data to Amazon Timestream as well as the use of Amazon Managed Grafana to produce real-time visualizations. Amazon EventBridge is also utilized to automatically trigger events based on trading hours.

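The portfolio-change path in Figure 2 can be pictured as a small Lambda function that reacts to DynamoDB Streams records and submits an AWS Batch job for each changed portfolio. The sketch below is only an illustration of that pattern; the job queue, job definition, and key attribute names are assumptions rather than the names used in the repository.

# Illustrative Lambda handler for the DynamoDB Streams -> AWS Batch path in
# Figure 2. The job queue, job definition, and key attribute are assumptions.
import boto3

batch = boto3.client("batch")

def handler(event, context):
    for record in event.get("Records", []):
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue
        keys = record["dynamodb"]["Keys"]
        portfolio_id = keys["portfolio_id"]["S"]  # assumed key attribute
        # Launch a monitoring job for the portfolio that changed.
        batch.submit_job(
            jobName=f"monitor-{portfolio_id}",
            jobQueue="portfolio-monitoring-queue",        # hypothetical
            jobDefinition="portfolio-monitoring-jobdef",  # hypothetical
            parameters={"portfolioId": portfolio_id},
        )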

AWS Batch is our mechanism for achieving system elasticity. It offers the flexibility to run on any Amazon Elastic Compute Cloud (Amazon EC2), Amazon Elastic Container Service (Amazon ECS), or Amazon Elastic Kubernetes Service (Amazon EKS) compute fleet, including EC2 Spot Instances, without operational and DevOps overhead. The event-driven design addresses the unpredictable nature of information arrival (i.e., signal generation, portfolio creation, news).
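
Figure 2 also mentions Amazon EventBridge triggering events based on trading hours. A minimal sketch of such a schedule, assuming a hypothetical rule name and target Lambda ARN, could look like the following (the cron expression corresponds to 9:30 AM Eastern expressed as 13:30 UTC and ignores daylight saving time):

# Illustrative EventBridge rule that fires at US market open on weekdays.
# Rule name and target ARN are assumptions; DST handling is omitted.
import boto3

events = boto3.client("events")

events.put_rule(
    Name="market-open-trigger",                      # hypothetical
    ScheduleExpression="cron(30 13 ? * MON-FRI *)",  # 13:30 UTC ~ 9:30 ET
    State="ENABLED",
)
events.put_targets(
    Rule="market-open-trigger",
    Targets=[{
        "Id": "start-monitoring",
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:start-monitoring",  # placeholder
    }],
)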

Amazon Timestream enables developers to achieve better productivity by eliminating undifferentiated heavy lifting – most notably through a schema-less design with automatic duplicate detection on inserts. In addition, the service comes with time series functionality such as interpolation and smoothing. Amazon Managed Grafana automatically connects to Amazon Timestream and other data sources for both real-time and historical visualizations and dashboards. Finally, we did the work to assemble these disparate components into a single, one-click, deployable stack to ensure smooth initial installation and an ongoing SDLC.
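
To illustrate the schema-less write path, here is roughly what writing a single price tick to Timestream with boto3 looks like; the database, table, dimension, and measure names are assumptions for this sketch, not the names the stack actually creates:

# Illustrative Timestream write of a single price tick. Database, table,
# dimension, and measure names are assumptions for this sketch.
import time
import boto3

tsw = boto3.client("timestream-write")

tsw.write_records(
    DatabaseName="market_data",      # hypothetical
    TableName="ticks",               # hypothetical
    Records=[{
        "Dimensions": [{"Name": "symbol", "Value": "SPY"}],
        "MeasureName": "price",
        "MeasureValue": "452.17",
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),
        "TimeUnit": "MILLISECONDS",
    }],
)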

Deploying the solution

These instructions will guide you through setting up a real-time market portfolio application on AWS using the AWS CDK. The deployed CDK infrastructure comes with an example portfolio of the S&P 500 based on Market Intraday Momentum. The intraday momentum pattern says that the market’s first half-hour returns since the previous day’s close predict its last half-hour returns. This predictability is typically stronger on more volatile days, higher volume days, recession days, and days with major macroeconomic news releases.
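
To make the pattern concrete, here is a small, self-contained sketch (not the repository’s implementation) that computes the first half-hour return relative to the prior close and uses its sign as the signal for the last half-hour:

# Illustrative intraday momentum signal, not the repository's implementation.
# prices: minute-bar closes for one trading day, indexed by timestamp.
import pandas as pd

def intraday_momentum_signal(prices: pd.Series, prev_close: float) -> int:
    """Return +1/-1/0 based on the first half-hour return since the prior close."""
    first_half_hour = prices.between_time("09:30", "10:00")
    r_first = first_half_hour.iloc[-1] / prev_close - 1.0
    # Go long (short) into the last half-hour if the open was up (down).
    return 1 if r_first > 0 else (-1 if r_first < 0 else 0)

def last_half_hour_return(prices: pd.Series) -> float:
    last = prices.between_time("15:30", "16:00")
    return last.iloc[-1] / last.iloc[0] - 1.0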

Note: You will need a subscription and an API key to a market data feed like B-PIPE or IEX for this solution to fully work.

Initial Setup
You will use AWS Cloud9 as the IDE to set up the code and deploy the CDK environment. You can also use a different IDE if you’d prefer.

  1. Navigate to the AWS Cloud9 console and press Create environment.
  2. Enter a name – MarketPortfolioEnv.
  3. Use a t2.micro instance type.
  4. Leave all other settings as default and choose Create.
  5. After a few minutes, the environment should be created. Under Cloud9 IDE, press Open.
  6. In the command line at the bottom, clone the Git repository using the following command:
git clone https://github.com/aws-samples/quant-trading.git

CDK Deployment

Now that the environment is set up, let’s deploy the application. You’ll run a few commands that install dependencies and spin up the entire application through the CDK.

In the Cloud9 CLI, navigate to the repository by entering the following command:

cd quant-trading

Then, type in the following command to install the necessary dependencies, bootstrap the environment, and deploy the application using the CDK code:

./deployment.sh

If the Docker build fails with a “no space left on device” error, run this command:

chmod +x aws-quant-infra/src/utils/resize_root.sh &&
aws-quant-infra/src/utils/resize_root.sh 50

If you get an error creating the DynamoDB replica in the DB stack, delete the replica from the DynamoDB console, delete the DB stack, and then redeploy the CDK stack.

Adding an API key to Begin Data Flow

You can have data come in from either IEX or B-PIPE (Bloomberg Market Data Feed). In this section, you’ll enter the API key in AWS Secrets Manager, which enables the Intraday Momentum application to start working. (A scripted alternative is sketched after these steps.)

  1. Navigate to the AWS Secrets Manager console.
  2. You should see two secrets created: api_token_pk_sandbox and api_token_pk.

Figure 3 – Using AWS Secrets Manager to store your API keys.

  3. Select api_token_pk.
  4. Scroll down to the Secret value section and, towards the right, select Retrieve secret value.
  5. Then, choose Edit and paste in your IEX or B-PIPE API key.
  6. Press Save.
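
If you prefer to script this step instead of using the console, an equivalent call with boto3 against the api_token_pk secret created by the stack could look like this (the secret string below is a placeholder for your real key):

# Store the market data API key programmatically instead of via the console.
# The secret name comes from the stack; the value below is a placeholder.
import boto3

sm = boto3.client("secretsmanager")
sm.put_secret_value(
    SecretId="api_token_pk",
    SecretString="YOUR_IEX_OR_BPIPE_API_KEY",
)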

Looking at the Results

You can view the results of the Intraday Momentum application after the end of the trading day by going to the DynamoDB table.

  1. Navigate to the Amazon DynamoDB console.
  2. On the left, select Tables and then choose the table called MvpPortfolioMonitoringPortfolioTable.
  3. Then, press the orange button in the top right that says Explore table items.

Figure 4 – Using the Amazon DynamoDB console to look at your results.

You should then see data populated at the bottom under Items returned.

Note: If you don’t see any data, select the orange Run button to scan the table and retrieve the data.

If you’d like to analyze this data further, you can download it in CSV format by selecting Actions, then Download results to CSV. You can also use Amazon Managed Grafana to visualize the results.
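
If you’d rather pull the items programmatically than through the console, a short boto3 sketch like the one below works; note that the attribute names in your items depend on the portfolio definition, so inspect a few items before relying on them:

# Illustrative scan of the results table; item attribute names depend on
# your portfolio definition, so inspect a few items before relying on them.
import csv
import boto3

table = boto3.resource("dynamodb").Table("MvpPortfolioMonitoringPortfolioTable")

items, response = [], table.scan()
items.extend(response["Items"])
while "LastEvaluatedKey" in response:          # paginate through the full table
    response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
    items.extend(response["Items"])

if items:
    fieldnames = sorted({k for item in items for k in item})
    with open("portfolio_results.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(items)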

Clean Up

You can delete the entire stack using the CDK.

Using the CLI where you deployed the stack, enter the following command:

cdk destroy --all

Conclusion

Quant trading, given its scale, short-term portfolio lifetimes, and the unpredictable nature of information arrival, uniquely benefits from cloud elasticity. However, designing a system to harvest that elasticity requires expert-level knowledge of both trading and the cloud, and is resource intensive.

In this article, we described the system we built to jump-start quant trading on AWS and how to get set up by deploying the stack from the GitHub repository.

Sam Farber

Sam Farber is a Startup Solutions Architect working with FinTech Startup companies. His role involves coming up with practical solutions to problems that FinTech startups face. He is a former Software Engineer and Technical Trainer with hobbies that include snowboarding, golf, and traveling.

Adeleke Coker

Adeleke Coker is a Global Solutions Architect with AWS. He works with customers globally to provide guidance and technical assistance in deploying production workloads at scale on AWS. In his spare time, he enjoys learning, reading, gaming and watching sport events.

Boris Litvin

Boris Litvin is a Principal Solutions Architect responsible for Financial Services industry innovation. He is a former quant and FinTech founder, passionate about quantitative trading and data science.

Ronny Rodriguez

Ronny Rodriguez is a Senior Technical Program Manager for the Financial Services Industry, passionate about containers and serverless technology. He is a former Lead Cloud Engineer at a large financial corporation and has helped other companies migrate to and succeed in the public cloud.