AWS DevOps Blog

How to Create an Automated Database Continuous Integration and Release Management Workflow with Datical and AWS

Editor's note: This blog post is out of date. For an up-to-date post on how to implement CI/CD for your database, see "Deploy, track, and roll back RDS database code changes using open source tools Liquibase and Jenkins."

 

Thank you to my colleague Erin McGill for reviewing and providing valuable feedback on this blog.

Why create a database continuous integration and release management workflow?

Cloud adoption among large enterprises is at record levels after nearly a decade of growing investment, and AWS is a market leader in serving the growing enterprise appetite for cloud. Enterprises look to AWS for three key benefits: cost reduction, flexibility, and availability. These benefits stem from the cloud delivery model: hardware is consumed on demand, infrastructure can be provisioned and scaled up or down almost instantly, and business continuity is easier to guarantee because services can be replicated across the globe.

As cloud footprints grow, organizations are looking beyond basic infrastructure as a service and want to get more out of their cloud investments by taking advantage of platform services that enable serverless architectures, machine learning, and intelligent networking such as automatic load balancing. However, as cloud footprints mature, organizations are being held back by a slow, manual, and opaque database release process that threatens all the benefits of the cloud delivery model. In response, enterprises are realizing the need to make database code deployment as fast, transparent, and automated as application releases.

To streamline software delivery from the cloud, organizations are looking to bring continuous integration to their databases in order to:

  • Discover mistakes quickly.
  • Deliver updates faster and more frequently.
  • Help developers write better code.
  • Automate the database release management process.

The end goal is to make sure database code can be promoted automatically and in lockstep with application code changes. With database continuous integration and delivery, software development teams can deliver smaller, less risky deployments, making it possible to respond more quickly to business or customer needs.

In our previous post, Building End-to-End Continuous Delivery and Deployment Pipelines in AWS, we walked through steps for implementing a continuous deployment and automated delivery pipeline for your application.

In this follow-up post, we walk through steps for building a continuous deployment workflow for databases using AWS CodePipeline (a fully managed continuous delivery service) and Datical (a database release automation application) to demonstrate complete database change management from check-in to final deployment. While the example covered in this post uses AWS CodeCommit for source code control and Amazon RDS for database hosting, Datical integrates just as easily with other services, such as git for source code control and Amazon EC2 for database hosting to bring database continuous integration to a variety of AWS-based delivery models.

 

Prerequisites

You’ll need an AWS account, an Amazon EC2 key pair, and administrator-level permissions for AWS Identity and Access Management (IAM), AWS CodePipeline, AWS CodeCommit, Amazon RDS, Amazon EC2, and Amazon S3.

From Datical, you’ll need access to Datical software. You can request a custom demo of Datical here.

Overview

Here are the steps:

  1. Install and configure Datical DB.
  2. Create a client in Datical Keycloak.
  3. Create an RDS database instance running the Oracle database engine.
  4. Set up database version control using AWS CodeCommit.
  5. Create a Lambda function to call Datical from AWS CodePipeline.
  6. Create a Datical project.
  7. Enable automated release management for your database through AWS CodePipeline.
  8. Best practices for continuous delivery.

1. Install and configure Datical DB

Datical 5 is a long-running server process built on a microservices architecture, so you will need to install it on an Amazon EC2 instance. This guide assumes you will install on Ubuntu Server 16.04 LTS (HVM), SSD Volume Type. Datical 5 requires at least 4 cores and 8 GB of RAM; for this example, we use an m5.2xlarge instance with 100 GB of storage. Make certain that your VPC configuration allows SSH, HTTP, and HTTPS access to the instance.
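If you prefer the AWS CLI to the console, a launch command along these lines creates a matching instance. This is only a sketch: the AMI ID, key pair, and security group are placeholders you must replace with your own values.

$ aws ec2 run-instances \
    --image-id <ubuntu-16.04-ami-id> \
    --instance-type m5.2xlarge \
    --key-name <your-key-pair> \
    --security-group-ids <sg-allowing-ssh-http-https> \
    --block-device-mappings '[{"DeviceName":"/dev/sda1","Ebs":{"VolumeSize":100}}]'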

The external hostname of your EC2 instance needs to be discoverable from your local machine. Look up the hostname with "nslookup <external IP address>", then set it on the instance with "sudo hostname <hostname>".
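For example, assuming the instance's external IP resolves to a public DNS name:

$ nslookup <external-ip-address>
$ sudo hostname <hostname-returned-by-nslookup>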

Note: On Windows machines, Datical recommends using a Cygwin shell instead of cmd.exe to access the instance.

Step 1: Set up User and Path

On the system where you are going to install Datical, do the following:

Create the datical user, set a password, and add the user to the sudo and docker groups.

$ sudo useradd -m datical
$ sudo passwd datical
$ sudo usermod -aG sudo datical
$ sudo groupadd docker
$ sudo usermod -aG docker datical

Step 2: Set up an installation repository

Log in as the datical user you created. All the remaining steps must be run as that user.

$ su - datical

Open a command shell window on the server where you are installing Datical. Run the following command to download a repository from Artifactory for the installation script to use.

$ curl -s https://customer:anonymous@artifactory.datical.net/artifactory/debian/install/stable/xenial/datical-server-install.sh | sudo sh

Note: The repository is created in /opt.

When the step completes successfully, you get the following message:

--------------------------------------------------------
| A new version of datical-server has been installed.
| You must execute '/opt/datical-server/bin/datical-upgrade' to apply the changes
--------------------------------------------------------

Step 3: Run the installation script

At the command prompt in your SSH tool, run the following script:

$ /opt/datical-server/bin/datical-upgrade 

After it runs for a moment, you are prompted for Artifactory login credentials. Use the account credentials you obtained from Datical.

Note:

  • If you mistype the Artifactory repository user name or password, the installation will fail later. Restart the script to try again.
  • Depending on your system, you may be prompted for a sudo password. Enter the password for the datical user.

The script downloads the Docker images for the services and starts Datical. Wait until all services come up before going to the next step.

Run the following command to check service status:

$ datical-health

Note: If you encounter the error '-su: datical-health: command not found', set the path explicitly as follows and rerun the command.

$ PATH=/opt/datical/bin:$PATH

All services start with status critical and move to passing. Wait for all services to show as passing.

Step 4: Include the databases image

Run the following command. It pulls a Docker image that contains preconfigured databases.

$ sudo apt-get install -y datical-services-demo

Note: If you encounter a 'Permission denied' error, run the following command and then rerun the step.

$ sudo chmod 755 /opt/datical/bin/datical-*

Step 5: Install the internal application database

This step installs the demo PostgreSQL database.

At the command prompt in your SSH tool, run the following command:

$ datical-demo-postgres up -d

You get output from the command as it downloads and installs packages and starts the service. You can track the log with this command:

$ datical-demo-postgres logs -f

The last message before the prompt returns is:

Creating daticalpostgres_postgres_1 ... done

Run the top command to confirm that the pull and startup are complete before proceeding to the next step.

Step 6: Install a packaged audit database

Note: This step uses the same PostgreSQL database image that you set up for the application database.

Start a psql command-line session. When you are prompted for the admin password, enter demo. The password does not echo as you type.

$ datical-demo-postgres exec postgres psql -h postgres -U admin -d postgres
Password for user admin: demo

At the postgres-# prompt, do the following:

  1. Create a user as shown. It shows CREATE ROLE when successful.
  2. Create a database as shown. It shows CREATE DATABASE when successful.
  3. Exit psql.

postgres-# create user auditdb password 'demo';
CREATE ROLE
postgres-# create database auditdb owner auditdb;
CREATE DATABASE
postgres-# \q

Step 7: Run the initialization wizard

In your browser, go to https://<external_url>, where <external_url> is the public DNS name or IP address of your EC2 instance. You can look it up in the Amazon EC2 console: https://console.aws.amazon.com/ec2
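Alternatively, you can look up the public DNS name and IP address with the AWS CLI; <your-instance-id> is a placeholder for your instance ID:

$ aws ec2 describe-instances --instance-ids <your-instance-id> \
    --query 'Reservations[0].Instances[0].[PublicDnsName,PublicIpAddress]' \
    --output text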

Note:

If your browser warns about the site's security certificate, choose to proceed anyway.

The wizard starts with a Welcome page. Click Next to continue.

Step 8: Application DB Connection

  1. Enter the required information. The values are for the demo version of PostgreSQL you installed:
    • Hostname or IP of the database: postgres
    • Database Administrator Username: admin
    • Database Administrator Password: demo
  2. Click Test Connection.
  3. When the connection test passes, click Save to continue.

The wizard automatically creates an individual database for each of its microservices and creates a user with a random password for each.

Step 9: Audit DB Setup

  1. Enter the required information. Use values for your external Audit DB if you are using one. For PostgreSQL, the database type is Postgres/EnterpriseDB. (This setup assumes you want to use the same PostgreSQL instance as the Application DB.)
    • JDBC URL: jdbc:postgresql://postgres/auditdb
    • Database User Name: auditdb
    • Database Password: demo
  2. Click Test Connection.
  3. When the connection test passes, click Save to continue.

Step 10: User Setup

  1. Click Open Keycloak to open the Keycloak admin console.

Note: The Keycloak login page opens in a new tab. Do NOT close the wizard tab. You go back to it after you are done in Keycloak; at that time, you will need to enter the Username of Initial System Administrator.

2. Log in.

  • User Name: admin
  • Password: datical

When prompted, enter and confirm a new password.

To add an individual user not managed by an authentication service, in the Keycloak application, click Users (under Manage in the left panel).

3. Click Add user. Fill in the following fields, then click Save:

  • Username
  • Email
  • First Name
  • Last Name

4. Click the Credentials tab. In the Manage Password section, fill out the following information, then click Reset Password:

  • New Password – enter a password
  • Password Confirmation – enter the password again to confirm
  • Set Temporary to OFF

5. Click the browser tab for the Datical Initialization Wizard.

Fill in Username of Initial System Administrator.

  • If you created the user manually, enter the name of a user you created in Keycloak.
  • If you are going to connect to an authentication service (LDAP or Active Directory), enter the name of an existing user who you want to have administrative privileges.

Confirm the user name to use as the initial Datical System Administrator, then click Check Keycloak Setup.

If the check is successful, you see the Datical login screen.

6. Log in as the Datical System Administrator user you configured. Review the user information.

  • Click My Account to see user details, roles, and groups.
  • Click Admin > Users to see a list of all users. If you configured one user manually, you see that information here. If you configured user federation, you see a list of users.

7. Add the 'ProjectCreator' role to the user.

2. Create a client in Datical Keycloak

  1. Visit https://<hostname>/auth and log in with the admin user and password you set in the previous step.
  2. Navigate to Clients and create a new client with a Client ID of datical-aws. Select Save.
  3. Verify that the Access Type is set to public.

3. Create an RDS instance running the Oracle database engine

Datical supports database engines such as Oracle, MySQL, Microsoft SQL Server, PostgreSQL, and IBM DB2. The example in this post uses a DB instance running Oracle. To create a DB instance running Oracle, follow the steps in the Amazon RDS User Guide.

Make sure that you can access the Oracle port (1521) from the location where you will be using Datical. Just like SQL*Plus or other database management tools, Datical must be able to connect to the Oracle listener. When you configure the security group for your RDS instance, make sure that port 1521 is reachable from your location.
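If you prefer the AWS CLI, the following sketch creates an Oracle DB instance and opens port 1521. The instance identifier, instance class, storage size, and CIDR range are illustrative values, not requirements:

$ aws rds create-db-instance \
    --db-instance-identifier datical-demo-oracle \
    --db-instance-class db.m4.large \
    --engine oracle-se2 \
    --license-model license-included \
    --master-username <UserName> \
    --master-user-password <Password> \
    --allocated-storage 20 \
    --vpc-security-group-ids <sg-id>
$ aws ec2 authorize-security-group-ingress \
    --group-id <sg-id> --protocol tcp --port 1521 --cidr <your-ip-address>/32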

4. Set up database version control using AWS CodeCommit

1. Configure HTTPS Git credentials for AWS CodeCommit. Visit the IAM console at https://console.aws.amazon.com/iam/.

2. Select Users from the side menu.

3. Select a valid user, then choose the Security Credentials tab.

4. Scroll down to HTTPS Git credentials for AWS CodeCommit and choose Generate.

5. Store the generated Git credentials. We will use them in later steps.

6. To create a new CodeCommit repository, visit https://console.aws.amazon.com/codecommit.

7. Click Create repository.

8. Enter 'AWS-Sample-Project' for the repository name, and click Create repository.

9. Copy the Git URL for the repository.
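If you prefer scripting this step, an equivalent AWS CLI sketch follows; the clone URL it returns reflects your configured region:

$ aws codecommit create-repository --repository-name AWS-Sample-Project
$ aws codecommit get-repository --repository-name AWS-Sample-Project \
    --query 'repositoryMetadata.cloneUrlHttp' --output text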

 

5. Create a Lambda function to call Datical from AWS CodePipeline

Step 1: Create a service role for Lambda

  1. Sign in to the AWS Management Console and open the IAM console at https://console.aws.amazon.com/iam/.
  2. In the navigation pane of the IAM console, choose Roles. Then choose Create role.
  3. Choose the AWS Service role type, and then choose the Lambda service. Then choose Next: Permissions.
  4. Select the AWSLambdaBasicExecutionRole and AWSCodePipelineCustomActionAccess policies. Then choose Next: Review.
  5. Enter 'datical-lambda-role' for the role name, then choose Create role.
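For reference, an equivalent CLI sketch (the trust policy file name is arbitrary):

$ cat > lambda-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": "lambda.amazonaws.com"},
    "Action": "sts:AssumeRole"
  }]
}
EOF
$ aws iam create-role --role-name datical-lambda-role \
    --assume-role-policy-document file://lambda-trust.json
$ aws iam attach-role-policy --role-name datical-lambda-role \
    --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
$ aws iam attach-role-policy --role-name datical-lambda-role \
    --policy-arn arn:aws:iam::aws:policy/AWSCodePipelineCustomActionAccess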

Step 2: Create the Lambda function for Datical integration

  1. Sign in to the AWS Management Console and open the AWS Lambda console at https://console.aws.amazon.com/lambda/
  2. On the Lambda console page, choose Create a function.
  3. On the Blueprints page, choose Author from scratch.
  4. In the Basic information section:
    1. For Name*, type a name for your Lambda function (for example, DaticalFunctionForAWSCodePipeline).
    2. For Role*, select Choose an existing role, and then select datical-lambda-role.
  5. Click Create function.
  6. For Runtime, choose Node.js 6.10 if it is not already selected.
  7. Download the Datical AWS CodePipeline Lambda function from https://raw.githubusercontent.com/Datical/Datical4AWSCodePipeline/master/DaticalAWSCodePipelineLambdaFunction.js, then copy the code into the Lambda function code box and save it.
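If you would rather script this step, a CLI sketch follows. The handler name assumes the downloaded file exports a function named handler; check the source and adjust if it differs.

$ curl -o DaticalAWSCodePipelineLambdaFunction.js \
    https://raw.githubusercontent.com/Datical/Datical4AWSCodePipeline/master/DaticalAWSCodePipelineLambdaFunction.js
$ zip function.zip DaticalAWSCodePipelineLambdaFunction.js
$ aws lambda create-function \
    --function-name DaticalFunctionForAWSCodePipeline \
    --runtime nodejs6.10 \
    --role arn:aws:iam::<YOUR-AWS-ACCOUNT-ID>:role/datical-lambda-role \
    --handler DaticalAWSCodePipelineLambdaFunction.handler \
    --zip-file fileb://function.zip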

6. Create a Datical Project

To create a new project, visit https://<hostname>/ as a user with the System Administrator role.

What You Need First

You need the following information from previous steps:

  • AWS CodeCommit repository URL and Git credentials
  • Oracle RDS host name and user credentials

Step 1: Create the Project

1. Select Projects > New Project.

2. On the Project/Schema Settings tab, enter the following:

  • Name: AWS-Sample-Project
  • Plan Type (default Single-Schema)
  • Project Key – A unique identifier for the project. Optional but recommended. Use only lowercase letters, numbers, '_', and '-'. It is read-only once the project is created.
  • Schema Naming (default Deployment Step)

3. Click Save.

Step 2: Create a Pipeline

In Project Settings > Pipelines:

1. Click Create Pipeline.

2. Enter information for the pipeline:

  • Pipeline Name (required): My Demo Datical Pipeline
  • Pipeline Description
  • Pipeline Key – A unique identifier for the pipeline, optional but recommended, made up of lowercase letters, numbers, '_', and '-': my-demo-datical-pipeline

3. Click Save.

Step 3: Create Steps and Database Connections

In Projects > Project > Pipelines > Pipeline > Settings

1. Click Create Step. You are prompted to create a connection for the step.

2. Click Create New Connection. This is required if you have not created a connection before.

3. Enter the connection information.

Note: You need the Oracle RDS information from step 3.

  • Connection Name (required): My Dev DB
  • Connection Key – A unique identifier for the connection, optional but recommended, made up of lowercase letters, numbers, '_', and '-': my-dev-db
  • DB Platform (required, the database type): ORACLE
  • Hostname (required): <RDS-DNS-ENDPOINT> created in step 3
  • Port: 1521
  • Database Name (may be required, depending on database type):
  • Username: <UserName> used to create the Oracle RDS instance in step 3
  • Password: <Password> used to create the Oracle RDS instance in step 3
  • For SID, enter ORCL

4. Click Save to continue entering step information.

 

5. Enter the step information.

  • Connection (required) – select the connection you created.
  • Step Name (required): dev
  • Step Type: DEV
  • Step Key – A unique identifier for the step, optional but recommended, made up of lowercase letters, numbers, '_', and '-': dev-01
  • Enable Row Count

6. Click Save to continue.

Step 4: Create a Connection to the SCM Repository

In Projects > Project > Project Settings > SCM Settings

1. Click the Jenkins link. The Jenkins administrator console opens in a new tab.

Note: If you are prompted for credentials, use the same user ID you use to log in to Datical.

2. Click Credentials, then click System under it.

3. Click Global credentials (unrestricted).

4. For Kind, select Username with password.

  • Provide the Git credentials you generated in section 4.
  • Set the ID to a string. For example, you could use gitID<projname> to represent the credentials used to log in to Git for this project.

5. Click OK.

6. To verify, click Credentials.

Step 5: Complete SCM Connection Information

Go back to the Datical tab and enter the additional information:

  • Enter the Credential ID created in the previous step
  • For SCM Type, select GIT
  • For URL, enter the Git URL from CodeCommit
  • For Branch, enter master
  • Select Create Directories to create standard directories in the repository

Review your CodeCommit console to verify the created directories.

Update the sample file Changelog/changelog.xml with the content from https://raw.githubusercontent.com/Datical/Datical4AWSCodePipeline/master/changelog.xml

Note: You can use the CodeCommit console or a git client to edit the file.
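With a git client, the update might look like this sketch (substitute your repository's Git URL; you are prompted for the Git credentials from section 4):

$ git clone <your-CodeCommit-git-url> AWS-Sample-Project
$ cd AWS-Sample-Project
$ curl -o Changelog/changelog.xml \
    https://raw.githubusercontent.com/Datical/Datical4AWSCodePipeline/master/changelog.xml
$ git add Changelog/changelog.xml
$ git commit -m "Update sample changelog"
$ git push origin master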

Step 6: Add Deployment Settings

In Projects > Project > Project Settings > Deployment Settings

1. In Project Settings, click the Deployment Settings tab.

2. Select values for the desired deployment behavior for this project.

3. Click Save.

 

7. Enable automated release management for your database through AWS CodePipeline

In this step, we use the AWS CLI to create the pipeline.

  1. Create a versioned S3 bucket for CodePipeline to use as its artifact store, as sketched below. (CodePipeline requires versioning to be enabled on the artifact bucket.)
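A minimal sketch, assuming the us-east-1 region (other regions also require --create-bucket-configuration LocationConstraint=<region>):

aws s3api create-bucket --bucket <<YOUR-S3-BUCKET>>
aws s3api put-bucket-versioning --bucket <<YOUR-S3-BUCKET>> --versioning-configuration Status=Enabled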

2. Copy the following JSON to a file named 'datical-pipeline.json', and replace the attributes enclosed in <<>> with your values.

{
	"pipeline": {
		"roleArn": "arn:aws:iam::<<YOUR-AWS-ACCOUNT-ID>>:role/AWS-CodePipeline-Service",
		"stages": [{
				"name": "Source",
				"actions": [{
					"inputArtifacts": [],
					"name": "Source",
					"actionTypeId": {
						"category": "Source",
						"owner": "AWS",
						"version": "1",
						"provider": "CodeCommit"
					},
					"outputArtifacts": [{
						"name": "MyApp"
					}],
					"configuration": {
						"PollForSourceChanges": "false",
						"BranchName": "master",
						"RepositoryName": "AWS-Sample-Project"
					},
					"runOrder": 1
				}]
			},
			{
				"name": "Staging",
				"actions": [{
					"inputArtifacts": [{
						"name": "MyApp"
					}],
					"name": "DBDeployment",
					"actionTypeId": {
						"category": "Invoke",
						"owner": "AWS",
						"version": "1",
						"provider": "Lambda"
					},
					"outputArtifacts": [],
					"configuration": {
						"FunctionName": "DaticalFunctionForAWSCodePipeline",
						"UserParameters": "{\"userName\":\"<<YOUR-DATICAL-USER-NAME>>\",\"password\":\"<<YOUR-DATICAL-PASSWORD>>\",\"stepId\":\"1\",\"daticalUrl\":\"<<YOUR-DATICAL-EC2-DNS>>\",\"label\":\"$all\"}"
					},
					"runOrder": 1
				}]
			}
		],
		"artifactStore": {
			"type": "S3",
			"location": "<<YOUR-S3-BUCKET>>"
		},
		"name": "DatabasePipeline"
	}
}

3. Execute the following command.

aws codepipeline create-pipeline --cli-input-json file://datical-pipeline.json

Note: To install the AWS CLI, follow the steps at https://docs.aws.amazon.com/cli/latest/userguide/installing.html

4. Upon successful execution, visit the CodePipeline console at https://console.aws.amazon.com/codepipeline

The CodePipeline console shows the successful execution of the database changes.
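You can also check the pipeline's state from the CLI:

aws codepipeline get-pipeline-state --name DatabasePipeline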

You can also review your deployment in the Datical console.

8. Best practices for continuous delivery

Changesets are stored in an XML file inside the Datical project. The file, changelog.xml, is stored inside the Changelog folder. (In the Datical UI, it is called Change Log.)

Just like any other files stored in your source code repository, the Datical DB change log can be branched and merged to support agile software development, where individual workspaces are isolated until changes are merged into the parent branch.

To implement this best practice, your Datical project should be checked into the same location as your application source code. That way, branches and merges will be applied to your Datical project automatically. Use unique change set IDs to avoid collisions with other scrum teams.
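Datical's changelog.xml follows the Liquibase changelog format, so a changeset with a team-prefixed ID looks roughly like the following sketch (the ID, author, and table definition are illustrative, not taken from the sample project):

<changeSet id="team-a-001-create-customer" author="jdoe">
  <createTable tableName="customer">
    <column name="id" type="number"/>
    <column name="name" type="varchar2(100)"/>
  </createTable>
</changeSet>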

Summary

In this post, you learned how to build a modern database continuous integration and automated release management workflow on AWS. You also saw how Datical integrates with AWS services to enable database release automation while reducing the risks that cause application downtime and data security vulnerabilities. A fully automated delivery mechanism for databases helps organizations deploy software rapidly and reliably while improving productivity, performance, compliance, and auditability, and increasing data security. These methodologies reduce process overhead and make it possible for organizations to serve their customers efficiently and compete more effectively in the market.

I hope you found this post useful as you think through building a continuous release process on AWS for your entire software stack. If you have any questions or feedback, please leave a comment below.


About the Authors

 

Balaji Iyer

Balaji Iyer is an Enterprise Consultant for the Professional Services Team at Amazon Web Services. In this role, he has helped several customers successfully navigate their journey to AWS. His specialties include architecting and implementing highly scalable distributed systems, serverless architectures, large scale migrations, operational security, and leading strategic AWS initiatives. Before he joined Amazon, Balaji spent more than a decade building operating systems, big data analytics solutions, mobile services, and web applications. In his spare time, he enjoys experiencing the great outdoors and spending time with his family.

Karthik Thirugnanasambandam is a Partner Solutions Architect at Amazon Web Services. He works with large Global System Integrators on AWS cloud adoption. He is a DevOps and Serverless enthusiast. When not working, he enjoys reading and spending time with his family.

Robert Reeves is a Co-Founder & Chief Technology Officer at Datical. In this role, he advocates for Datical's customers and provides technical architecture leadership. Prior to cofounding Datical, Robert was a Director at the Austin Technology Incubator. At ATI, he provided real-world entrepreneurial expertise to ATI member companies to aid in market validation, product development, and fundraising efforts. Robert cofounded Phurnace Software in 2005. He invented and created the flagship product, Phurnace Deliver, which provides middleware infrastructure management to multiple Fortune 500 companies. As Chief Technology Officer for Phurnace, he led technical evangelism efforts, product vision, and large account technical sales efforts. After BMC Software acquired Phurnace in 2009, Robert served as Chief Architect and led worldwide technology evangelism.