AWS for SAP
Automating SAP installation with open-source tools
Introduction
We’ve already demonstrated in our first blog post how to provision the infrastructure for SAP applications using Terraform, and in our second blog post we added automation of the SAP software installation using Systems Manager. Now it is time to go deeper with common open-source tools like Jenkins and Ansible and bring the SAP installation into a single, comprehensible pipeline. This approach adds a few benefits on top of the other alternatives:
- Helps customer teams stay compliant with auditable configuration-as-code policies, since in this blog post we automate the entire SAP software installation.
- Turns the SAP installation into a repeatable process, making the quality of the outcome easier to improve, since it can be simulated and run several times using the same source of information.
Another good option to deploy SAP is AWS Launch Wizard. Customer teams can rapidly build SAP systems that align with AWS best practices from the AWS Console, following a guided experience designed for SAP administrators.
To help achieve goals such as increasing deployment efficiency and quality, many customers are automating as many repeatable processes as they can. Jenkins is an industry-standard orchestrator that helps put all the required pieces together; it runs the same commands we would otherwise run manually in Bash on Linux.
At the end of this article, you’ll have a Jenkins pipeline like the one shown in the image below:
The above pipeline builds all the needed infrastructure and installs the actual software for a non-HA deployment of the SAP Primary Application Server (PAS), SAP HANA database, and ABAP SAP Central Services (ASCS).
To help you use Jenkins and Ansible to fully automate your SAP software installation, we’ve open sourced the code for this installation automation in a GitHub repository. It works together with the GitHub repository we created for provisioning your infrastructure with Terraform, which was explained in our first blog post.
Understanding the pipeline steps
- Checkout SCM – this is when Jenkins looks for the code on GitHub
- Prepare – Jenkins checks that all the required variables for the run are present (the variables are described in the “Setting up the parameters” section)
- Check ENV states – checks whether there is an S3 bucket available for storing the final Terraform state file, and whether there is already an environment up from this automation. IMPORTANT: this step is going to create one bucket in the account you provide. The bucket name will be “sap-install-bucket-” followed by a random number. Terraform will store its state file in this bucket.
- Create ENV – The infrastructure automation based on Terraform creates all the needed infrastructure for this installation. To understand what’s going to be created, review our first blog post.
- Install Hana and ASCS – this is a placeholder, meaning that the next two steps (Install Hana and Install ASCS) run in parallel.
- Install Hana – installs HANA on the instance created by Terraform.
- Install ASCS – installs ASCS on its own instance.
- Install PAS – installs PAS on its dedicated instance after HANA and ASCS are finished.
- Notify – a simple terminal notification stating the end of processing.
- Post actions – Jenkins auto-generated step stating the end of the whole pipeline.
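The actual pipeline definition lives in the repo; as a rough mental model, though, the stages above boil down to commands like the Bash sketch below (the playbook names, directory layout, and inventory file are illustrative assumptions, not the repo’s real file names):

#!/usr/bin/env bash
# Rough, illustrative equivalent of the pipeline stages (file names are hypothetical).
set -euo pipefail

# Create ENV: provision the infrastructure with Terraform.
cd terraform
terraform init
terraform apply -auto-approve
cd ..

# Install Hana and ASCS: the two installations run in parallel.
ansible-playbook -i inventory install-hana.yml &
ansible-playbook -i inventory install-ascs.yml &
wait

# Install PAS: runs only after HANA and ASCS have finished.
ansible-playbook -i inventory install-pas.yml

# Notify: simple terminal notification at the end.
echo "SAP installation finished"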
Why Ansible vs regular Bash script?
Ansible is an automation tool that uses a declarative language to configure the operating system, and it is far more robust than a regular Bash script. Its main benefit is this: you state the desired outcome in one task, and Python code runs behind the scenes to reach that desired state. Let’s take a look at one example in the main repository:
- name: Create directories if they don't exist
  file:
    path: "{{ item }}"
    state: directory
    mode: '0755'
  loop: "{{ folders_to_create }}"
This single Ansible task states several things:
- path – states the directory path I want to ensure is created.
- state – tells the command to create directories instead of files.
- mode – the permissions I want those folders to have.
- loop – repeats the task once for each value inside the variable “folders_to_create”, and is also what makes the “{{ item }}” reference in path work.
The most useful thing about Ansible is this state declaration. You just declare the state you want to reach, and Ansible takes care of checking and performing the steps necessary to get there. Let’s say that one of the three folders in the variable “folders_to_create” already exists. There’s no issue: Ansible will create the remaining two, and also fix the permissions on all three if it has to.
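For comparison, here is roughly what the equivalent plain Bash would look like; it works, but you own the error handling and idempotency yourself (the folder list below is an illustrative assumption):

#!/usr/bin/env bash
# Imperative Bash version of the Ansible task above (illustrative folder list).
folders_to_create="/example/dir1 /example/dir2 /example/dir3"

for item in ${folders_to_create}; do
  mkdir -p "${item}"      # -p avoids an error if the directory already exists
  chmod 0755 "${item}"    # permissions still have to be enforced explicitly
done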
How to run the code
The installation automation repository brings together, in separate folders, at least 12 components that could each be a repo of their own; they are kept separate to make the project easier to understand. Check the README file in each of the 12 folders mentioned in the main README to understand how each of them works.
1. Setting up the pre requisites
- Have access to a terminal on a Linux or Mac computer.
- Install Vagrant and VirtualBox on your computer.
- Have your SAP installation media files in an S3 bucket in your AWS account. Follow AWS Launch Wizard’s guidelines on how to organize the files in the bucket (the sketch after this list shows one way to verify the layout).
- For now, only S/4HANA 1909 is fully tested for this scenario. You can use a different version as well, but keep in mind that you might have to tweak the code a bit for it to work.
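If you want to confirm the media files are where the automation will look for them, a quick listing with the AWS CLI is enough (the bucket and prefix below are the example values used later in this post; replace them with your own):

# List the SAP media under the bucket/prefix you will pass to the pipeline.
aws s3 ls s3://my-media-bucket/S4H1909/ --recursive --human-readable --summarize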
2. Setting up Jenkins
- After cloning the installation automation repo, open a terminal, go to the folder “jenkins-as-code”, and run “sudo vagrant up” (the commands are sketched right after this list). Wait for this to complete; it might take around 10 minutes depending on your internet speed.
- When it’s done, open a browser window and type in “localhost:5555” and you will have your own Jenkins. Log in to it using the default user/password: admin/my_secret_pass_from_vault
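In terminal form, the setup amounts to something like this (the clone URL and folder name are placeholders; use the actual installation automation repo):

# Clone the installation automation repo and bring up the local Jenkins with Vagrant.
git clone <installation-automation-repo-url>
cd <repo-folder>/jenkins-as-code
sudo vagrant up        # takes roughly 10 minutes depending on your connection

# Once it finishes, Jenkins is reachable at http://localhost:5555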
3. Setting up the parameters
After logging in to Jenkins, go to Manage Jenkins > Manage Credentials. Here you will have to fill in values for all the REQUIRED parameters; there are also some optional parameters you can take a look at. A few of these values can be looked up or created with the AWS CLI, as sketched after this list.
- AWS_ACCOUNT_CREDENTIALS – The AWS access key ID and secret access key for the IAM User you will use with Jenkins. Make sure you have a separate account for this demo only and provide administrator privileges to this user to avoid errors due to insufficient permissions.
- Example access key ID AKIA3EEGHLDKU6NTJYNZ and secret access key: nSrpAhTsPL81jVmFYjlYRtIVsKTHlFN82wyONh7X
- AMI_ID – Look for the AMI ID of the image named “Red Hat Enterprise Linux for SAP with HA and Update Services 8.2” on AWS Marketplace for the region you want to use (AMI IDs are specific to each AWS region). Subscribe to it and find the AMI ID by clicking the “Launch new instance” button.
- Example: ami-0e459d519030c2bd7
- KMS_KEY_ARN – Create a customer managed key in AWS Key Management Service (KMS) and note down its ARN.
- Example: arn:aws:kms:us-east-1:764948313645:key/09fb3dfd-e0fa-4a78-aa12-8d69d96fce1e
- SSH_KEYPAIR_NAME – the name of the key pair you use to SSH into AWS instances. You may create a new one if necessary through the AWS CLI, or in the AWS Console under EC2 > Key Pairs. IMPORTANT: do not add “.pem” at the end; use just the key pair name (the part before the dot).
- mykeypair
- SSH_KEYPAIR_FILE – the actual private key (.pem) file. Upload it to Jenkins.
- The “mykeypair.pem” file itself
- S3_ROOT_FOLDER_INSTALL_FILES – the S3 bucket (and folder, if applicable) containing all your SAP media files. Follow AWS Launch Wizard’s folder hierarchy for S/4HANA, as described in the Launch Wizard documentation.
- Example: s3://my-media-bucket/S4H1909
- PRIVATE_DNS_ZONE_NAME – a private DNS zone name from Route53 for your SAP installation.
- Example: myprivatecompanyurl.net
- VPC_ID – the ID of the VPC where the infrastructure will be created.
- Example: vpc-b2fa0ddf
- SUBNET_IDS – two PUBLIC subnet IDs must be provided here (two because of future HA capabilities). We use public subnets only to keep the demo simple; they are not advised for real scenarios, where you should use private subnets with a bastion host or another strategy for reaching them to increase security.
- Example: subnet-fec01a12,subnet-a615b465
- SECURITY_GROUP_ID – an already existing security group. IMPORTANT: make sure you add your own IP as the source CIDR in a rule allowing access on port 22 (SSH) to this security group.
- Example: sg-831778bb
- You are welcome to take a look at the other possible parameters. You can change the SIDs of the systems, the default password, names, tags, and other settings that matter for your installation.
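For convenience, some of the values above can be gathered from the AWS CLI rather than the console. Treat the commands below as a sketch: the Marketplace name filter is only an example pattern and may need adjusting for your region and subscription.

# AMI_ID: search for the RHEL for SAP with HA and Update Services 8.2 image
# (requires an active Marketplace subscription; adjust the name filter as needed).
aws ec2 describe-images --owners aws-marketplace \
  --filters "Name=name,Values=*RHEL*SAP*8.2*" \
  --query 'Images[].{Id:ImageId,Name:Name}' --output table

# KMS_KEY_ARN: create a customer managed key and print its ARN.
aws kms create-key --description "SAP installation demo" \
  --query KeyMetadata.Arn --output text

# SSH_KEYPAIR_NAME / SSH_KEYPAIR_FILE: create a key pair and save the .pem file.
aws ec2 create-key-pair --key-name mykeypair \
  --query KeyMaterial --output text > mykeypair.pem
chmod 400 mykeypair.pem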
4. Running the installation
Go back to Jenkins home, select “SAP Hana+ASCS+PAS 3 Instances” > “Spin up and install”, and then “Build now”. This process takes almost 2 hours to complete, and at the end you will have three EC2 instances in your AWS account, with software installed so that the first runs PAS, the second ASCS, and the third your HANA database. The final output will be the pipeline view shown in the introduction of this post.
As a last step of the installation, all three instances (PAS, ASCS, and HANA) run health checks to verify whether the installation finished successfully. You can also check manually by SSHing into the instances and running “sapcontrol -nr 00 -function GetProcessList” as the <SID>adm user (ad0adm if you’re using the default SID) from a terminal, as sketched below.
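For example, the manual check would look roughly like this (the key file name is the example from this post, and the login user is an assumption based on the RHEL AMI):

# SSH into one of the instances (example key file; replace the host with your instance address).
ssh -i mykeypair.pem ec2-user@<instance-address>

# On the instance, switch to the <SID>adm user and list the SAP processes.
sudo su - ad0adm
sapcontrol -nr 00 -function GetProcessList   # all processes should report GREEN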
To make it easier to test spinning your SAP landscape up and down, there is also the pipeline “SAP Hana+ASCS+PAS 3 Instances” > “Destroy env”. Once you trigger it, Jenkins looks for the current Terraform state file and deletes everything the previous execution created.
Next steps
Ready to get started? Head straight to the installation automation repo and start testing on your environment.
Once your tests are finished, you are welcome to customize the repo to meet your specific needs. The repo’s folders have READMEs with more instructions on how each of them works to put all the pieces together and get SAP running in the end.
If you are looking for expert guidance and project support as you move your SAP systems to a DevOps model, the AWS Professional Services Global SAP Specialty Practice can help. Increasingly, SAP on AWS customers—including CHS and Phillips 66—are investing in engagements with our team to accelerate their SAP transformation. Please contact our AWS Professional Services team if you would like to learn more about how we can help.