AWS Partner Network (APN) Blog

Accelerate Migration and Modernization with a Reusable Solution to Deploy Container Applications on AWS

By Artem Kobrin, AWS Ambassador and Head of Cloud Practice – Neurons Lab
By Phurba Dorjee (PD) Sherpa, Sr. Migration & Modernization Architect – AWS

Neurons Lab

One of the key practices promoted by DevOps is continuous integration and continuous delivery (CI/CD), which plays a pivotal role in reducing the time required to release software updates. Organizations have a variety of tools at their disposal for implementing CI/CD, and Amazon Web Services (AWS) offers a suite of its own tools while also allowing seamless integration with third-party solutions.

In this post, we will explore a step-by-step deployment guide to create a reusable solution to deploy container applications on AWS, accelerating the migration journey by leveraging the AWS Cloud Development Kit (AWS CDK) and GitHub Actions.

This approach involves using AWS CDK’s built-in capabilities to create the Docker container for the application, upload it to Amazon Elastic Container Registry (Amazon ECR), and smoothly deploy it to an Amazon Elastic Container Service (Amazon ECS) cluster. This method simplifies the deployment process, making it efficient and effective.

Neurons Lab is an AWS Specialization Partner and AWS Marketplace Seller with Competencies in Machine Learning and Healthcare Consulting. An artificial intelligence (AI) consultancy that provides end-to-end services—from identifying high-impact AI applications to integrating and scaling the technology—Neurons Lab empowers companies to capitalize on AI’s capabilities.

Neurons Lab’s global team is made up of data scientists, domain experts, cloud specialists, user design experts, and business strategists, all supported by an extensive talent pool of 500+ experts. This expertise allows Neurons Lab to solve the most complex AI challenges, mobilizing and delivering with outstanding speed while supporting customers’ evolving long-term needs.

An example project is Neurons Lab’s collaboration with iPlena, which provides AI-Physiotherapy as a service and needed to build a mobile application with an AI and machine learning (ML) component that could recommend a set of therapeutic exercises to compensate for users’ physiological problems. For this, Neurons Lab created a secure and scalable foundation on AWS, and to accelerate the journey also developed MLOps pipelines and leveraged its reusable solution to deploy applications on the AWS Cloud.

Solution Overview

This solution leverages a combination of key services to streamline the process:

  • GitHub Actions: Serves as a workflow orchestration tool, hosting the pipeline for deployment.
  • AWS CDK: This powerful framework is instrumental in constructing and managing the infrastructure required for deployment.
  • Amazon ECS on AWS Fargate: This service enables container orchestration, allowing the solution to run containers without managing the underlying infrastructure.
  • AWS Key Management Service (AWS KMS): Ensures secure key storage and management for encryption needs.
  • Amazon Relational Database Service (Amazon RDS): Provides a managed database solution to support the application.
  • AWS IAM OpenID Connect Identity Provider: This service facilitates federated authentication, establishing trust between GitHub and AWS. It enables GitHub Actions to deploy on AWS without the need to store AWS secrets and credentials, enhancing security and convenience.

The following diagram represents the architectural outcome from the deployment of the code.

Figure 1 – Architecture representation.


Prerequisites

Before you embark, make sure you’ve covered the essential prerequisites:

  • AWS account: You must have an AWS account with sufficient permissions to create the necessary resources.
  • GitHub account: Ensure you have a GitHub account with the appropriate permissions to configure GitHub repositories, create workflows, and manage GitHub secrets.
  • Git client: Have a Git client ready to clone the provided source code. You can use the following command to clone the repository:
      • git clone
  • Python and Pip: Ensure you have Python and Pip installed on your system.
  • Python dependencies: Install the required Python dependencies using the following command:
      • pip install -r requirements.txt
  • AWS CDK: Familiarize yourself with AWS CDK by referring to the official documentation.
  • Docker: Have Docker installed and running on your system.
  • AWS credentials: Configure your AWS credentials to ensure seamless interaction with AWS services.
  • Amazon Route 53 public hosted zone: Create a public hosted zone in Amazon Route 53.
  • Environment variables: Create a .env file in the deployment directory based on the .env.sample file. Configure the necessary variables to tailor the deployment to your specific requirements.
  • AWS CDK installation: Install AWS CDK using the following command:
      • npm install -g aws-cdk
  • AWS CDK bootstrap: If not already bootstrapped, bootstrap AWS CDK using the command:
      • cd secure-container-accelerator/deploy
      • cdk bootstrap
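As a minimal example, a .env file for this deployment might look like the following. The variable names are taken from the os.getenv calls in the accelerator’s CDK script; the values shown are placeholders you must replace with your own:

```
AWS_ACCOUNT_ID=123456789012
AWS_DEFAULT_REGION=us-east-1
```

Refer to the .env.sample file in the repository for the full list of supported variables.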

With these prerequisites in place, you’ll be well-prepared to embark on your deployment journey.

Step 1: Automate Deployment with GitHub Actions

Integrating GitHub Actions into your project revolutionizes the way you build, test, and deploy your applications. Leverage the power of GitHub Actions by creating a new workflow file in your repository at .github/workflows/cdk-deploy.yml.

Below is a GitHub Actions workflow that deploys your application using the accelerator:

name: CDK Deploy

on:
  push:
    branches:
      - main

permissions:
  id-token: write   # This is required for requesting the JWT
  contents: read    # This is required for actions/checkout

env:
  AWS_ACCOUNT_ID: ${{ vars.AWS_ACCOUNT_ID }}
  AWS_REGION: ${{ vars.AWS_REGION }}

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: configure aws credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: arn:aws:iam::${{ env.AWS_ACCOUNT_ID }}:role/GitHubActionCDKRole
          role-session-name: samplerolesession
          aws-region: ${{ env.AWS_REGION }}
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: |
          npm install -g aws-cdk
          pip install -r deploy/requirements.txt
      - name: Bootstrap
        run: |
          cd deploy
          cdk bootstrap aws://${{ env.AWS_ACCOUNT_ID }}/${{ env.AWS_REGION }}
      - name: Build
        run: |
          cd deploy
          cdk deploy --require-approval never

The workflow will trigger every time you push changes to the main branch. It automates the process of configuring AWS credentials, installing dependencies, bootstrapping, and deploying your application with AWS CDK.

To ensure secure integration between GitHub and AWS, we’ve also incorporated OIDC (OpenID Connect) configuration. This establishes a trusted connection, enhancing security by verifying the identity of GitHub Actions and reducing the risks associated with storing long-lived AWS access keys in the GitHub repository.

The configure aws credentials step is an essential security measure in the workflow. It uses the aws-actions/configure-aws-credentials action to securely assume an AWS Identity and Access Management (IAM) role, which further enhances the security posture of your deployment process.
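For reference, the trust relationship behind such a role can be sketched in Python. This is an illustrative sketch, not the accelerator’s actual policy: the account ID, organization, and repository names are placeholders, while the claim keys are the standard GitHub OIDC claims used with the token.actions.githubusercontent.com identity provider.

```python
import json

def github_oidc_trust_policy(account_id: str, org: str, repo: str) -> dict:
    """Build an IAM trust policy that lets GitHub Actions workflows from one
    repository assume a role through the token.actions.githubusercontent.com
    OIDC identity provider (no long-lived AWS keys needed)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {
                "Federated": (
                    f"arn:aws:iam::{account_id}:oidc-provider/"
                    "token.actions.githubusercontent.com"
                )
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                # Standard GitHub OIDC claims: the audience must be AWS STS,
                # and the subject restricts which repository (and refs) may
                # assume the role.
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                },
                "StringLike": {
                    "token.actions.githubusercontent.com:sub": f"repo:{org}/{repo}:*"
                }
            }
        }]
    }

# Placeholder values for illustration only
print(json.dumps(github_oidc_trust_policy("123456789012", "my-org", "my-repo"), indent=2))
```

Narrowing the sub claim (for example, to a single branch) further limits which workflows can assume the role.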

Step 2: Construct with AWS CDK

Leverage AWS CDK to create and maintain the infrastructure required to deploy your containerized application to AWS. By using the AWS CDK, you can take advantage of your existing programming skills and tools to create maintainable and readable infrastructure code.

The following code sample demonstrates a comprehensive approach to defining your AWS infrastructure using AWS CDK constructs.


Figure 2 – AWS CDK sample code.

In this example, the AWS CDK code creates an application stack that includes an Amazon Virtual Private Cloud (Amazon VPC), AWS Fargate, Application Load Balancer, security configurations, public and private subnets, and VPC flow logs encrypted with AWS KMS keys for monitoring and auditing purposes.

By using AWS CDK constructs, you can easily define your infrastructure and deploy your containerized application to AWS. The high-level constructs and reusable patterns provided by AWS CDK make it an ideal solution for creating and maintaining cloud resources, allowing you to focus on what matters most: building and improving your application.

The full code can be downloaded from GitHub.

Step 3: Test AWS CDK Infrastructure with cfn-nag

cfn-nag is an open-source command-line tool that provides static code analysis for AWS CloudFormation templates. For AWS CDK applications, the companion cdk-nag library applies the same style of security checks directly to your infrastructure code during synthesis, making it easier to identify and fix potential issues during development.

The cdk-nag library provides various rule packs to help ensure your infrastructure complies with different security and compliance requirements. Here are some of the available rule packs:

  • AWS Solutions: This rule pack helps ensure your AWS infrastructure follows best practices. By applying these rules, you can optimize your infrastructure for cost, performance, security, and fault tolerance.
  • HIPAA Security: The Health Insurance Portability and Accountability Act (HIPAA) security rule pack checks your infrastructure for compliance with HIPAA security requirements. This is particularly useful for organizations in the healthcare industry that handle protected health information (PHI).
  • NIST 800–53 rev 4: This rule pack checks your infrastructure for compliance with the National Institute of Standards and Technology (NIST) Special Publication (SP) 800–53 Revision 4, Security and Privacy Controls for Federal Information Systems and Organizations. It helps ensure your infrastructure adheres to the security controls and guidelines specified by NIST.
  • NIST 800–53 rev 5: Similar to the NIST 800–53 rev 4 rule pack, this rule pack checks your infrastructure for compliance with the updated NIST SP 800–53 Revision 5, Security and Privacy Controls for Information Systems and Organizations. It provides an updated set of security controls and guidelines.
  • PCI DSS 3.2.1: The Payment Card Industry Data Security Standard (PCI DSS) 3.2.1 rule pack checks your infrastructure for compliance with the PCI DSS requirements. This is essential for organizations that handle, store, or process credit card data, as it helps ensure the security and protection of cardholder data.

For more information on the available rule packs, refer to the cdk-nag rules documentation.

Step 4: Integrate cfn-nag and Rule Packs with AWS CDK

To integrate the HIPAA Security checks and AWS Solutions checks with AWS CDK, you need to add the cdk-nag library to your project and modify your CDK application script. The sample code provided below demonstrates how to do this:

"""AWS CDK Test Script"""
import os
import sys
from aws_cdk import (
from stack.app_stack import AppStack
from cdk_nag import AwsSolutionsChecks, HIPAASecurityChecks
from dotenv import load_dotenv

# App
app = App()

# Stack
   app, 'ContainerAcceleratorStack',
       account=os.getenv('AWS_ACCOUNT_ID', os.getenv('CDK_DEFAULT_ACCOUNT')),
       region=os.getenv('AWS_DEFAULT_REGION', os.getenv('CDK_DEFAULT_REGION'))

# Synth

In this example, the HIPAASecurityChecks and AwsSolutionsChecks classes from the cdk-nag library are imported and added to the application using the Aspects.of(app).add() method. This enables cdk-nag to perform compliance checks on your AWS CDK infrastructure code during synthesis.

The verbose and reports options can be set to True to generate detailed output and reports on the compliance checks performed.

Figure 3 – cfn-nag test output.

By integrating cfn-nag, HIPAA Security checks, and AWS Solutions checks with your AWS CDK infrastructure code, you can ensure your application follows best practices and complies with relevant security and compliance requirements, making it more robust and reliable.

Step 5: Build and Deploy a Docker Application

In this section, we’ll demonstrate how to build a Docker image of a Streamlit application and deploy it using AWS CDK. Streamlit is a popular open-source library for building interactive web applications. We’ll integrate the Streamlit app with AWS CDK infrastructure for seamless deployment.

To get started, create a simple Streamlit application in the app directory. You can refer to the official Streamlit documentation and example projects to understand how to structure your app effectively. Additionally, consider any specific requirements or libraries your application may need, and ensure they are listed in the application’s requirements.txt or equivalent file.

Put a Dockerfile in the project’s `./app/` directory to guide AWS CDK on how to build the Docker image for the application. This Dockerfile should specify the base image, install the dependencies, copy your Streamlit app code into the image, and define the command to run the Streamlit app. Creating an efficient Dockerfile is a crucial step, as it ensures your application runs smoothly within a containerized environment.

It’s important to note that while this guide provides a high-level overview of the process, creating a robust Streamlit application, optimizing the Dockerfile, and ensuring compatibility with AWS CDK may require additional development and testing. You may want to explore Streamlit’s customization options and AWS CDK constructs to tailor the deployment to your specific needs.

The following Dockerfile is used to build the Docker image with the Streamlit application:

FROM python:3.8-slim

WORKDIR /usr/app/src
COPY requirements.txt requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Copy the application code into the image
COPY . .
EXPOSE 80
# app.py and the 0.0.0.0 server address are assumed values; adjust them
# to your application's entry point and configuration
CMD ["streamlit", "run", \
    "--browser.serverAddress", "0.0.0.0", \
    "--server.enableCORS", "False", \
    "--server.port", "80", \
    "app.py"]
With the Streamlit application in place, we can now integrate it into AWS CDK infrastructure. We’ll use the ecr_assets.DockerImageAsset construct to build the Docker image of our Streamlit application:

app_path = os.environ.get('APP_PATH', '../app')

# Docker Image
image = ecr_assets.DockerImageAsset(
   self, 'AppImage', directory=app_path)

Next, we’ll create an Amazon ECS cluster and deploy the Streamlit application using the ecs_patterns.ApplicationLoadBalancedFargateService construct:

cluster = ecs.Cluster(self, 'AppCluster', vpc=vpc)
service = ecs_patterns.ApplicationLoadBalancedFargateService(
    self, 'AppService', cluster=cluster,
    task_image_options=ecs_patterns.ApplicationLoadBalancedTaskImageOptions(
        image=ecs.ContainerImage.from_docker_image_asset(image)))

# Export the load balancer DNS name as a stack output
CfnOutput(
    self, 'AppServiceAlbUrl', description='App Service ALB URL',
    export_name='appServiceAlbUrl', value=service.load_balancer.load_balancer_dns_name)

This code snippet creates an Amazon ECS cluster and deploys the Streamlit application as a container using AWS Fargate. The application will be accessible via an Application Load Balancer.

By following these steps, you can easily build and deploy a Streamlit application using AWS CDK. The integration with AWS CDK infrastructure allows for seamless updates and deployments as your application evolves.
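As an optional sanity check after deployment, you can verify that the service responds over HTTP. The helper below is a hypothetical sketch, not part of the accelerator; substitute the appServiceAlbUrl stack output for the placeholder DNS name:

```python
import urllib.request

def check_service(alb_dns_name: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code served by the Application Load Balancer."""
    url = f'http://{alb_dns_name}/'
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return response.status

# Example (replace with the appServiceAlbUrl output of your stack):
# check_service('my-alb-1234567890.us-east-1.elb.amazonaws.com')
```

A 200 response indicates the Streamlit container is reachable through the load balancer.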

Cleanup (Optional)

If you wish to remove the resources created during this deployment, follow these steps:

  • Navigate to the deploy folder of the cloned repository:
    • cd secure-container-accelerator/deploy
  • Use AWS CDK to destroy the resources:
    • cdk destroy
  • Confirm the resource deletion when prompted.
  • You can also manually clean up any additional resources created outside of AWS CDK, such as Docker containers, images, or any other environment-specific resources.

Please note that cleaning up resources is optional, and you should exercise caution when doing so to avoid unintended data loss or disruption. Be sure to confirm the deletion of resources during the process.


Conclusion

By leveraging AWS CDK constructs, GitHub Actions, and integration with cfn-nag for compliance testing, this reusable solution streamlines the infrastructure setup and deployment process, allowing developers to focus on delivering business value.

If you’re ready to take your application migration and modernization to the next level, contact Neurons Lab. You can also learn more about Neurons Lab in AWS Marketplace.

Sample code, software libraries, command line tools, proofs of concept, templates, or other related technology are provided as AWS Content or Third-Party Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content or Third-Party Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content or Third-Party Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content or Third-Party Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.


Neurons Lab – AWS Partner Spotlight

Neurons Lab is an AWS Specialization Partner and AI consultancy that provides end-to-end services—from identifying high-impact AI applications to integrating and scaling the technology—empowering companies to capitalize on AI’s capabilities.

Contact Neurons Lab | Partner Overview | AWS Marketplace | Case Studies