AWS Cloud Operations & Migrations Blog

Perform continuous cookbook integration testing and delivery for AWS OpsWorks for Chef Automate

Any Chef server should be a hub of tested and trusted cookbooks that can be added to node run lists easily. However, the testing and delivery of cookbooks to the server itself can be an arduous task. To simplify and expedite this necessary process, we’ve leveraged AWS technologies to create a pipeline that executes integration testing and delivers the cookbook to the Chef server. This allows for the automation of a regular and essential part of cookbook development. It’s worth noting that there are other CI/CD solutions available, and that a similar pipeline could be created with tools outside of AWS.

Overview

This solution shows how you can continuously upload verified cookbooks to a Chef server. When changes are committed to an AWS CodeCommit repository, a pipeline in AWS CodePipeline is triggered. The pipeline has two AWS CodeBuild stages. The first stage tests the cookbook by converging it on a test instance, and it also performs general linting, style, and syntax checks. We use Test Kitchen to run Bats tests that confirm the cookbook passes application-specific tests before it is uploaded to the Chef server. The second CodeBuild stage uploads the cookbook to the Chef server if the previous stage executed successfully.

Setup

To provision the resources needed here, we use AWS CloudFormation. We’ll walk through each resource as we work through the stages of this pipeline. Note that before working through this solution, a fully functional OpsWorks for Chef Automate server must be running in your AWS account. You’ll also want to ensure that you have access to a workstation with an updated and configured AWS CLI. This setup also requires that the starter kit obtained when creating the OpsWorks for Chef Automate server be uploaded to an Amazon S3 bucket. The starter kit zip file contains the files that allow us to perform the cookbook upload after the tests pass: the last CodeBuild stage extracts the .chef directory, which holds the necessary certificates and knife.rb file, from the starter kit.
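
For example, the starter kit can be uploaded with the AWS CLI. In this sketch, the bucket name and starter kit file name are placeholders; substitute the values from your own account:

# Create a bucket to hold the starter kit (bucket name is a placeholder and must be globally unique)
aws s3 mb s3://my-chef-starter-kit-bucket

# Upload the starter kit zip that was downloaded when the Chef Automate server was created
aws s3 cp starter_kit.zip s3://my-chef-starter-kit-bucket/starter_kit.zip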

To continuously test and deliver cookbooks to the OpsWorks for Chef Automate server, we can leverage a combination of AWS resources. For our cookbook repository, we use CodeCommit, which is a Git-based source control tool. Pushes to the master branch of this repository trigger CodePipeline, which is a continuous delivery service that can model, visualize, and automate the steps required to release software. Within the pipeline there are two CodeBuild stages. CodeBuild can compile source code, run unit tests, and produce artifacts that are ready to deploy. These resources can all be provisioned using CloudFormation, the AWS Management Console, or the AWS API directly.

Components

CodeCommit

The first item in this solution is a CodeCommit repository, which we create using CloudFormation:

"CookbookRepository": {
       "Type": "AWS::CodeCommit::Repository",
       "Properties": {
         "RepositoryDescription": "Chef cookbook repository",
         "RepositoryName": "aws_chef_cookbook_repository"
       }
     }
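
With the repository in place, the cookbook can be pushed to it from your workstation. As a minimal sketch, the region and clone URL below are assumptions; use the clone URL shown for your repository in the CodeCommit console, and make sure your Git credentials or the CodeCommit credential helper are configured:

# Clone the new repository (URL and region are placeholders)
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/aws_chef_cookbook_repository
cd aws_chef_cookbook_repository

# Add the cookbook source, then push to master to trigger the pipeline
git add .
git commit -m "Add cookbook"
git push origin master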

We’ll use this repository as the source stage of our pipeline, so when changes are pushed to the master branch of this repository, the pipeline will execute. To set up that link, the CodePipeline resource specifies the CodeCommit resource as the source in the “Stages” section:


         "Stages": [
           {
             "Name": "Source",
             "Actions": [
               {
                 "InputArtifacts": [],
                 "Name": "Source",
                 "ActionTypeId": {
                   "Category": "Source",
                   "Owner": "AWS",
                   "Version": "1",
                   "Provider": "CodeCommit"
                 },
                 "OutputArtifacts": [
                   {
                     "Name": "MyApp"
                   }
                 ],
                 "Configuration": {
                   "RepositoryName": {
                     "Fn::GetAtt": [
                       "CookbookRepository",
                       "Name"
                     ]
                   },
                   "BranchName": "master",
                   "PollForSourceChanges": "True"
                 },
                 "RunOrder": 1
               }
             ]
           },       

From there, CodePipeline stores the cookbook in the CodePipeline S3 bucket, which allows us to pass the cookbook between pipeline stages. Each CodeBuild project uses its own buildspec.yml file, which gives a set of instructions for what to execute during that stage. For this first stage, we want to run our integration tests using Test Kitchen. We also run Chefstyle and Foodcritic to check for proper syntax and adherence to best practices. If the cookbook fails any of these tests, the pipeline stops executing, and information about the failure can be viewed by choosing the “Details” link on the failed stage in the CodePipeline console.

The following buildspec.yml will be used during the first CodeBuild stage of the pipeline:

version: 0.2

phases:
  build:
    commands:
      - chefstyle .
      - foodcritic . -t ~FC069 -t ~FC071 -t ~FC078
      - kitchen test

You’ll notice that we’re running Foodcritic and choosing not to run certain rules. You can read more about those rules at foodcritic.io. The first CodeBuild stage itself is defined by the following CloudFormation resource:

"CodeBuildProject": {
       "Type": "AWS::CodeBuild::Project",
       "Properties": {
         "Artifacts": {
           "Type": "CODEPIPELINE"
         },
         "Description": "AWS CodeBuild Project for Chef cookbook testing",
         "Environment": {
           "Type": "LINUX_CONTAINER",
           "ComputeType": "BUILD_GENERAL1_SMALL",
           "Image": "chef/chefdk"
         },
         "Name": "Chef-Cookbook-Testing",
         "ServiceRole": {
           "Ref": "CodeBuildRole"
         },
         "Source": {
           "Type": "CODEPIPELINE"
         },
         "TimeoutInMinutes": 10
       }
     },

What is Test Kitchen?

Test Kitchen is a testing harness we can use to automatically test cookbooks. A .kitchen.yml file in the cookbook specifies which platforms to converge and which test suites to run. The tests themselves can be written in a variety of frameworks: Bats, Minitest, RSpec, Serverspec, and more. In this example, we use Bats, which is the default framework for Test Kitchen. More information on Test Kitchen can be found in Chef’s documentation and in Test Kitchen’s GitHub repository.
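
As a rough sketch only, a .kitchen.yml might look like the following. The driver, platform, cookbook, and suite names here are assumptions and are not part of this solution’s templates; the kitchen-ec2 driver is one way to converge the cookbook on a test instance from CodeBuild, and it needs additional networking and AMI settings that are omitted here:

---
driver:
  name: ec2                # kitchen-ec2 is an assumption; supply your own subnet, security group, and key settings
  instance_type: t2.micro
  region: us-east-1

provisioner:
  name: chef_zero

verifier:
  name: busser             # Busser discovers Bats tests under test/integration/<suite>/bats/

platforms:
  - name: ubuntu-16.04

suites:
  - name: default
    run_list:
      - recipe[my_cookbook::default]   # my_cookbook is a placeholder cookbook name

A matching Bats test, also a placeholder, would live at test/integration/default/bats/default.bats:

#!/usr/bin/env bats

# Placeholder test: checks that the cookbook installed the nginx package
@test "nginx package is installed" {
  dpkg -s nginx
}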

After the first CodeBuild stage completes, an optional manual approval stage can be used to confirm that the cookbook should be uploaded to the server. This approval stage causes the pipeline to pause so that someone can manually approve or reject the action. We could also use Amazon SNS to send a notification when an approval is ready. If you are confident in the tests, this manual approval stage may not be needed, which would make the process fully automated. If a manual stage is not used, the two buildspec.yml files can be combined into a single CodeBuild project that performs the testing and upload in one step.
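
As a sketch written in the same style as the pipeline’s other stages, an approval stage like the following could be added to the pipeline’s “Stages” list between the two build stages. The stage name, the ApprovalTopic SNS resource referenced here, and the message text are assumptions and are not part of the template shown earlier:

{
  "Name": "Approval",
  "Actions": [
    {
      "Name": "ManualApproval",
      "ActionTypeId": {
        "Category": "Approval",
        "Owner": "AWS",
        "Version": "1",
        "Provider": "Manual"
      },
      "Configuration": {
        "NotificationArn": { "Ref": "ApprovalTopic" },
        "CustomData": "Approve to upload the tested cookbook to the Chef server"
      },
      "RunOrder": 1
    }
  ]
}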

The final CodeBuild stage of this pipeline uploads the cookbook, which has now passed the previous tests, to the Chef server. To accomplish this, we leverage the AWS CLI to pull the starter kit from S3 and unzip its contents, and then use the credentials it contains to upload the cookbook to the Chef server via Berkshelf.
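
For berks install and berks upload to succeed, the cookbook needs a Berksfile at its root. A minimal sketch, assuming dependencies are resolved from the public Chef Supermarket and from the cookbook’s own metadata.rb:

# Berksfile (minimal sketch; add other sources or pinned dependencies as your cookbook requires)
source 'https://supermarket.chef.io'

metadata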

The following buildspec-upload.yml is used by the second CodeBuild stage of the pipeline:

version: 0.2

phases:
  build:
    commands:
      - apt-get update -y && apt-get install python-pip unzip -y
      - pip install awscli
      - aws s3 cp s3://[STARTER_KIT_BUCKET]/starter_kit.zip .
      - unzip *starter_kit.zip -d starter_kit
      - find starter_kit -type d -name '.chef' -exec cp -r {} . \;
      - berks install
      - berks upload

The second CodeBuild project references this buildspec file explicitly through its BuildSpec property:

"CodeBuildProjectUpload": {
       "Type": "AWS::CodeBuild::Project",
       "Properties": {
         "Artifacts": {
           "Type": "CODEPIPELINE"
         },
         "Description": "AWS CodeBuild Project for Chef cookbook upload",
         "Environment": {
           "Type": "LINUX_CONTAINER",
           "ComputeType": "BUILD_GENERAL1_SMALL",
           "Image": "chef/chefdk",
           "EnvironmentVariables": [
             {
               "Name": "PATHTOSTARTERKIT",
               "Value": {"Ref": "PathToStarterKit"}
             }
           ]
         },
         "Name": "Chef-Cookbook-Upload",
         "ServiceRole": {
           "Ref": "CodeBuildRole"
         },
         "Source": {
           "Type": "CODEPIPELINE",
           "BuildSpec": "buildspec-upload.yml"
         },
         "TimeoutInMinutes": 10
       }
     },

After this stage is executed, the pipeline will complete, and the cookbook will be on the Chef server, tested and ready to be deployed to nodes.
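
If you want to verify the result from your workstation, one option is to list the cookbooks on the server using the knife configuration from the starter kit. The cookbook name and version in the example output are placeholders:

# Run from the directory that contains the starter kit's .chef directory
knife cookbook list

# Example output (placeholder name and version)
# my_cookbook   0.1.0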

Summary

This blog post shows how you can implement continuous integration testing and delivery of cookbooks to your OpsWorks for Chef Automate server. By using CodePipeline, a simple push to the master branch of a CodeCommit repository can trigger testing via the first CodeBuild stage and delivery via the second CodeBuild stage. An approval stage between those build stages allows administrators to intervene in the pipeline process, if needed. However, this can also be a fully automated process that ensures complete integration testing before cookbooks arrive at the Chef server.

About the Authors

Ted Neykov is a Cloud Support Engineer who supports customers using Amazon ECS, AWS CloudFormation, and AWS OpsWorks, among other AWS DevOps services. Outside of work, Ted attends local DevOps tech meetups.
Maggie O’Toole has been a Cloud Support Engineer at AWS since 2017. She focuses on supporting customers in using DevOps technologies, specializes in containers and configuration management, and enjoys building out infrastructure.