AWS Cloud Operations & Migrations Blog

Standardizing infrastructure delivery in distributed environments using AWS Service Catalog

A common security design pattern and best practice among many of our enterprise customers is to provide application isolation through the adoption of a multi-account strategy. Many customers choose to create individual AWS accounts based on software development lifecycle (SDLC) phases such as Development (Dev), Quality Assurance (QA), and Production (Prod), to provide complete separation across environments. However, if application requirements aren’t fully understood at the time of account creation, it can be difficult to provision the necessary infrastructure components. Additionally, as more accounts are created, customers need a way to enforce infrastructure compliance and consistency across them.

AWS Service Catalog can help address these challenges and enable developers to quickly, securely, and easily deploy infrastructure components in any environment. The following diagram illustrates this workflow. Here, AWS Service Catalog is used to share both production and non-production infrastructure components with application Dev/QA/Production accounts.

While many customers are familiar with the benefits of using AWS Service Catalog as a “single pane of glass” to provision infrastructure, the deployment of products can also be automated. In the diagram above, products shared with application accounts can be deployed directly from their respective continuous integration/continuous delivery (CI/CD) pipelines. This creates an environment where developers can tightly couple code and infrastructure dependencies, with ownership of individual components distributed across separate teams.

This model provides two key benefits:

  1. It allows a centralized team to enforce compliance and standardization by defining and sharing approved versions of infrastructure.
  2. It creates a self-service environment, where application owners can decide for themselves which infrastructure components to use.

The following diagram describes this process in more detail. Here, the definition of infrastructure is handled by the Shared Services team, which creates a catalog of network and compute resources that are shared with application accounts. Multiple versions can be shared, with the application owner taking responsibility for determining which component best meets their requirements. These products can then be deployed as part of application CI/CD processes using an AWS service like AWS CodePipeline. Deploying infrastructure in this manner also has a security advantage: application pipeline permissions can be scoped to the AWS Service Catalog portfolio rather than the underlying AWS services, ensuring least privilege.
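To illustrate that least-privilege point, the pipeline role only needs AWS Service Catalog provisioning permissions rather than permissions on the underlying services such as Amazon VPC or Amazon EC2. The following is a minimal CloudFormation sketch of such a role; the role and policy names are hypothetical, and in practice the Resource element would be scoped to the specific portfolio and product ARNs.

Resources:
  # Hypothetical pipeline role limited to AWS Service Catalog provisioning actions
  AppPipelineProvisioningRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: codepipeline.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: ScopedServiceCatalogAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - servicecatalog:ProvisionProduct
                  - servicecatalog:UpdateProvisionedProduct
                  - servicecatalog:TerminateProvisionedProduct
                  - servicecatalog:DescribeRecord
                Resource: '*'   # scope to specific portfolio/product ARNs in practice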

Automation for building out a CI/CD pipeline that leverages AWS Service Catalog cross-account portfolio sharing is available in the following GitHub repository under the Amazon Software License:

GitHub repository –  https://github.com/aws-samples/aws-service-catalog-reference-architectures/tree/master/labs/xacct-pipeline

The repository contains the AWS CloudFormation templates required to create a CI/CD pipeline using AWS CodePipeline, along with a README guide that provides step-by-step instructions.

About the AWS Service Catalog CI/CD pipeline solution

In this blog post, I’ll show you how to build a CI/CD pipeline that consumes infrastructure delivered through AWS Service Catalog. In this scenario, a shared services team has defined a virtual private cloud (VPC) landing zone and wants to make this infrastructure available to application teams for use in their CI/CD pipelines.

This demonstration will cover the following areas:

  1. Provisioning an AWS Service Catalog portfolio that hosts an approved VPC template in a shared services account.
  2. Sharing the VPC product with our application account.
  3. Creating a CI/CD pipeline with AWS CodePipeline.
  4. Deploying the VPC product and an Amazon EC2 instance into our newly built VPC.

The process is described in the following architecture diagram:

How do I deploy the AWS Service Catalog CI/CD pipeline solution?

The GitHub repository contains all the necessary AWS CloudFormation templates and a README guide. Separate CloudFormation templates are provided to configure the required AWS Service Catalog infrastructure in the shared services and application accounts, along with the sample CI/CD pipeline. Detailed instructions on how and where to execute these CloudFormation templates are available in the README guide.

The guide takes the user through the following process to build out the reference architecture:

In the shared services account (a sample template sketch follows this list):

  1. Upload the VPC CloudFormation template to a local Amazon S3 bucket.
  2. Create a master AWS Service Catalog portfolio to host our production infrastructure.
  3. Create an AWS Service Catalog product for our VPC infrastructure.
    • Provide product, support, and version details.
    • Reference the VPC template stored in our Amazon S3 bucket.
  4. Share the master AWS Service Catalog portfolio with the application account.
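
As a rough sketch, the shared services account steps above map to CloudFormation resources along these lines; the display names, support email, template URL, and application account ID are placeholders.

Resources:
  # Master portfolio hosting approved production infrastructure
  MasterPortfolio:
    Type: AWS::ServiceCatalog::Portfolio
    Properties:
      DisplayName: SharedServices-Production-Portfolio
      ProviderName: Shared Services Team
      Description: Approved production infrastructure components

  # VPC product with product, support, and version details,
  # referencing the template uploaded to the local S3 bucket
  VPCProduct:
    Type: AWS::ServiceCatalog::CloudFormationProduct
    Properties:
      Name: Approved-VPC
      Owner: Shared Services Team
      SupportEmail: sharedservices@example.com         # placeholder
      ProvisioningArtifactParameters:
        - Name: v1.0
          Description: Initial approved VPC template
          Info:
            LoadTemplateFromURL: https://s3.amazonaws.com/my-template-bucket/vpc.yaml   # placeholder

  # Add the VPC product to the master portfolio
  VPCProductAssociation:
    Type: AWS::ServiceCatalog::PortfolioProductAssociation
    Properties:
      PortfolioId: !Ref MasterPortfolio
      ProductId: !Ref VPCProduct

  # Share the master portfolio with the application account
  ApplicationAccountShare:
    Type: AWS::ServiceCatalog::PortfolioShare
    Properties:
      PortfolioId: !Ref MasterPortfolio
      AccountId: '111111111111'                        # placeholder application account ID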

In the application account (sketches of the Service Catalog setup and the pipeline follow this list):

  1. Accept the AWS Service Catalog portfolio share from the shared services account.
  2. Create the following AWS Identity and Access Management (IAM) roles:
    • An IAM role (e.g., PipelineRole) providing the CodePipeline service with the ability to launch our various pipeline steps.
    • An IAM role (e.g., CFNLaunchRole) providing the CloudFormation service with the ability to launch our AWS Service Catalog products.
    • An IAM role (e.g., ProductLaunchRole) providing the AWS Service Catalog service with the ability to deploy products into our local AWS account.
  3. Create a local AWS Service Catalog portfolio for our VPC infrastructure.
    • Associate the IAM CFNLaunchRole with the portfolio, allowing CloudFormation to launch products.
    • Add the VPC product from the shared services portfolio to our local portfolio.
    • Create a launch constraint and assign the IAM ProductLaunchRole to the portfolio, allowing AWS Service Catalog to deploy products into the local application account.
  4. Create a local Amazon S3 bucket that will act as the source location for our CodePipeline artifacts.
  5. Create a CodePipeline workflow with the following stages:
    • An Amazon S3 source stage that triggers when a file is uploaded to the S3 bucket created earlier.
    • A VPC deployment stage that deploys the AWS Service Catalog VPC product.
      • Reference the VPC template from within the Amazon S3 artifact.
      • Use the IAM CFNLaunchRole to launch the VPC CloudFormation template.
    • An Amazon EC2 deployment stage that deploys an EC2 instance into our newly built VPC:
      • Reference the Amazon EC2 template from within the S3 artifact above.
      • Use the IAM CFNLaunchRole to launch the Amazon EC2 CloudFormation template.
  6. Test the AWS Service Catalog automated pipeline.
    • Upload our source code artifact to the S3 bucket created in step 4. This triggers the pipeline:
      • A VPC is deployed through AWS Service Catalog.
      • An EC2 instance is launched within this VPC.
    • Wait for the pipeline to finish deploying our network and compute resources.
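
A condensed sketch of the application account Service Catalog setup (steps 1 through 3), assuming the IAM roles above already exist; the shared portfolio ID, product ID, and role names are placeholders.

Resources:
  # Accept the portfolio share from the shared services account
  AcceptedShare:
    Type: AWS::ServiceCatalog::AcceptedPortfolioShare
    Properties:
      PortfolioId: port-sharedexample                  # placeholder: shared portfolio ID

  # Local portfolio for the application's VPC infrastructure
  LocalPortfolio:
    Type: AWS::ServiceCatalog::Portfolio
    Properties:
      DisplayName: Application-VPC-Portfolio
      ProviderName: Application Team

  # Allow CFNLaunchRole to launch products from the local portfolio
  CFNLaunchRoleAssociation:
    Type: AWS::ServiceCatalog::PortfolioPrincipalAssociation
    Properties:
      PortfolioId: !Ref LocalPortfolio
      PrincipalARN: !Sub arn:aws:iam::${AWS::AccountId}:role/CFNLaunchRole      # hypothetical role name
      PrincipalType: IAM

  # Add the shared VPC product to the local portfolio
  VPCProductAssociation:
    Type: AWS::ServiceCatalog::PortfolioProductAssociation
    DependsOn: AcceptedShare
    Properties:
      PortfolioId: !Ref LocalPortfolio
      ProductId: prod-sharedexample                    # placeholder: shared VPC product ID
      SourcePortfolioId: port-sharedexample            # placeholder

  # Launch constraint so AWS Service Catalog deploys products with ProductLaunchRole
  VPCLaunchConstraint:
    Type: AWS::ServiceCatalog::LaunchRoleConstraint
    Properties:
      PortfolioId: !Ref LocalPortfolio
      ProductId: prod-sharedexample                    # placeholder
      RoleArn: !Sub arn:aws:iam::${AWS::AccountId}:role/ProductLaunchRole       # hypothetical role name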
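The pipeline itself (step 5) could be declared roughly as follows. Only the source and VPC deployment stages are shown; the bucket name, object key, template path, and role names are placeholders, and the EC2 stage would mirror the VPC stage with its own template.

Resources:
  AppPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !Sub arn:aws:iam::${AWS::AccountId}:role/PipelineRole            # hypothetical role name
      ArtifactStore:
        Type: S3
        Location: my-artifact-bucket                   # placeholder bucket from step 4
      Stages:
        # Source stage: triggers when the source artifact is uploaded to S3
        - Name: Source
          Actions:
            - Name: S3Source
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: S3
                Version: '1'
              Configuration:
                S3Bucket: my-artifact-bucket           # placeholder
                S3ObjectKey: source/app-source.zip     # placeholder
              OutputArtifacts:
                - Name: SourceOutput
        # VPC deployment stage: launches the Service Catalog VPC product via CloudFormation
        - Name: DeployVPC
          Actions:
            - Name: DeployVPCProduct
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: '1'
              Configuration:
                ActionMode: CREATE_UPDATE
                StackName: app-vpc
                TemplatePath: SourceOutput::sc-vpc-product.yaml   # placeholder path inside the source artifact
                RoleArn: !Sub arn:aws:iam::${AWS::AccountId}:role/CFNLaunchRole  # hypothetical role name
                Capabilities: CAPABILITY_NAMED_IAM
              InputArtifacts:
                - Name: SourceOutput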

Conclusion

The architecture in this blog post demonstrates how to standardize infrastructure delivery by invoking shared AWS Service Catalog products from within a simple CI/CD pipeline. Customers will likely choose to enhance this solution to support more complex scenarios, such as populating the master catalog with additional infrastructure components or compiling and deploying custom application code using services such as AWS CodeBuild and AWS CodeDeploy. As this blog post has shown, AWS Service Catalog provides customers with a secure and flexible infrastructure delivery framework.

About the Author

Kristopher Lippe is a Boston-based Enterprise Solutions Architect for AWS. He is a technology enthusiast who enjoys helping customers find innovative solutions to complex business challenges. His core areas of focus are Storage, Networking, and Security. When he’s not working with customers on their journey to the cloud, he enjoys reading, golf, and home renovation projects.