AWS Storage Blog

Extending SAP workloads with AWS Transfer Family

Transfer protocols such as Secure Shell (SSH) File Transfer Protocol (SFTP), File Transfer Protocol Secure (FTPS), and File Transfer Protocol (FTP) are essential for corporations migrating file transfer workflows, because they integrate with existing authentication systems. These protocols are deeply embedded in business processes across many industries, such as financial services, healthcare, telecommunications, and retail. Companies use these protocols to securely transfer files like stock transactions, medical records, invoices, software artifacts, employee records, and more.

At T-Systems (an AWS Partner), we had a customer from the telecommunication enterprise segment that needed a highly available and secure SFTP service to act as a central interface between their SAP Business Technology Platform (SAP BTP) and external resources such as banks or other cash partners. This is a common scenario for enterprises trying to move their business-to-business (B2B) data securely in SAP BTP. The SAP BTP Integration Suite provides an SFTP connector, but does not offer SFTP server functionality. This means that customers must install, configure, and operate an SFTP service using either complex open source solutions or expensive third-party options. The lack of an available and secure SFTP server solution that could be set up independently of their own IT infrastructure presented a unique challenge: it impeded seamless integration between SAP BTP and their external resources, thereby hindering efficient and secure financial operations.

In our earlier post, Accelerate SAP workload migrations with AWS Transfer Family, we focused on how we helped a customer migrate their SAP workloads to AWS by taking over the specified data transfer protocols via FTP while providing a secure and highly available solution. In this post, we focus on how we leveraged SFTP from Transfer Family to extend the functionality of SAP Business Technology Platform (SAP BTP), providing a simple and extensible solution to invoke SAP workloads running on AWS and enhancing the BTP functionality with AWS services. This allows you to manage file transfers and modernize transfer workflows within hours and securely scale recurring business-to-business file transfers to AWS Storage services.

Solution overview

For this scenario, no internal connection to the client network is required. This means that the solution can be set up completely detached from the customer’s IT infrastructure. The only requirement is that the solution must be publicly accessible for all stakeholders.

AWS Transfer Family SFTP solution architecture

In this implementation, our customer:

  1. Created an AWS Transfer Family server
  2. Created an IAM role for Amazon S3 access
  3. Created an SFTP user
  4. Configured the SAP BTP Integration Suite and connected it with the SFTP endpoint in Transfer Family
  5. Tested the integration flow

The customer had a public AWS account with no connection to the on-premises world, and their bank and cash partners had their own subscription to the SAP Integration Suite in their BTP account. They leveraged a connection via SFTP to the public network using AWS Transfer Family, which authenticated the respective access via SSH keys. A separate key was configured for each SFTP user and respective party, in addition to a corresponding AWS Identity and Access Management (IAM) role with an IAM policy that distinguished between read and write access. Transfer Family endpoints were distributed across two Availability Zones, protected by security groups that restrict access to the appropriate IPs. The target storage is an Amazon Simple Storage Service (Amazon S3) bucket, which is protected by policies from external access.

We created an AWS Transfer Family SFTP server, along with an IAM role and IAM policy to ensure least-privilege access to the S3 bucket. We then created an SFTP user, which played an important role in configuring access on the SAP BTP side. Last but not least, we created a connection to the SFTP server from the SAP BTP side.

1. Creating an AWS Transfer Family server

First, we started with the configuration of the SFTP Transfer Family Server. We navigated to AWS Transfer Family, and selected Create Server.

Next, we selected SFTP (SSH File Transfer Protocol).

We chose the identity provider type Service managed, which requires less effort for user management. Users can authenticate themselves using SSH keys.

For the endpoint configuration, complete the following steps:

1. Choose a VPC hosted endpoint configuration for an additional management layer.

2. Restrict access to a selected group of users using security groups that filter the IP addresses allowed to access the SFTP server.

3. Ensure encrypted communication and authentication keys are in place.

4. Switch to Internet-facing endpoint for public network access.

5. Assign Elastic IP addresses (EIP) to individual endpoints in the chosen Availability Zone (AZ).

6. Select Amazon S3 as the domain.

7. Fill in the additional configuration details as required, otherwise leave everything as is, and complete the creation after the final review.
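
For readers who script their infrastructure, the console steps above can be sketched with the AWS SDK for Python (boto3). This is a minimal sketch of the same configuration, not the exact setup from the engagement; the VPC, subnet, Elastic IP allocation, and security group IDs are hypothetical placeholders.

```python
def build_server_params(vpc_id, subnet_ids, allocation_ids, security_group_ids):
    """Build the CreateServer request for an internet-facing, VPC-hosted
    SFTP endpoint with service-managed users and an Amazon S3 domain."""
    return {
        "Protocols": ["SFTP"],
        "IdentityProviderType": "SERVICE_MANAGED",   # users authenticate with SSH keys
        "Domain": "S3",
        "EndpointType": "VPC",
        "EndpointDetails": {
            "VpcId": vpc_id,
            "SubnetIds": subnet_ids,                 # one subnet per Availability Zone
            "AddressAllocationIds": allocation_ids,  # Elastic IPs make it internet-facing
            "SecurityGroupIds": security_group_ids,  # restrict inbound access by IP
        },
    }

params = build_server_params(
    vpc_id="vpc-0123456789abcdef0",
    subnet_ids=["subnet-aaaa1111", "subnet-bbbb2222"],
    allocation_ids=["eipalloc-aaaa1111", "eipalloc-bbbb2222"],
    security_group_ids=["sg-0123456789abcdef0"],
)

# With AWS credentials configured, the server would be created like this:
# import boto3
# server_id = boto3.client("transfer").create_server(**params)["ServerId"]
```

Keeping the request in a builder function keeps the console choices reviewable in code and makes it easy to reproduce the endpoint in another account or Region.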

2. Create IAM role for S3 access

For users to connect to the S3 bucket, we need to create an IAM role. To do this, go to the IAM dashboard, choose Roles under Access management in the left navigation, and click on the Create role button.

1. For the next step, we need to select the trusted entity for the role. Select AWS Service as the trusted entity type, choose Transfer from the dropdown as the use case, and click the Next button.

2. After that, we need to add permissions to the role by creating a policy in a new tab.

This is an example policy for the role granting user access to your S3 bucket.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowListingOfUserFolder",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::yourbucketname"
            ]
        },
        {
            "Sid": "AllowObjectAccess",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::yourbucketname/*"
        }
    ]
}
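
Since the bucket name appears in two places, it can help to render this policy programmatically rather than editing the JSON by hand. A small sketch of the same policy as code (the bucket name is a placeholder):

```python
import json

def render_s3_access_policy(bucket_name):
    """Render the example least-privilege policy above for a given bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowListingOfUserFolder",
                "Effect": "Allow",
                # Bucket-level actions target the bucket ARN itself
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": [f"arn:aws:s3:::{bucket_name}"],
            },
            {
                "Sid": "AllowObjectAccess",
                "Effect": "Allow",
                # Object-level actions target every key in the bucket
                "Action": [
                    "s3:PutObject",
                    "s3:GetObject",
                    "s3:DeleteObjectVersion",
                    "s3:DeleteObject",
                    "s3:GetObjectVersion",
                ],
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            },
        ],
    }

# JSON string ready to paste into the IAM policy editor
policy_json = json.dumps(render_s3_access_policy("yourbucketname"), indent=4)
```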

3. We use the policy template above and paste it into the JSON editor. Please remember to change the bucket name.

4. We will add a tag to identify the purpose of this role.

5. Finish the policy creation by giving it a name.

6. Navigate back to the role creation tab. Then choose the created policy by searching for it in the search bar and selecting the check box.

7. In the final step, we give the role a name and finish the role creation by clicking on the Create role button.

If you are using S3 encryption, you need to add to the role the permission to use the AWS Key Management Service (AWS KMS) key to encrypt and decrypt objects in your bucket:

        {
            "Sid": "AllowKMSEncryption",
            "Effect": "Allow",
            "Action": [
                "kms:Encrypt",
                "kms:Decrypt",
                "kms:ReEncrypt*",
                "kms:GenerateDataKey",
                "kms:DescribeKey"
            ],
            "Resource": "yourbucketkmskeyarn"
        }
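
The KMS statement simply gets appended to the policy's Statement list. A sketch of that merge, assuming a policy dictionary shaped like the one shown earlier (the key ARN is a placeholder):

```python
def add_kms_statement(policy, kms_key_arn):
    """Return a copy of an IAM policy with the KMS statement appended."""
    kms_statement = {
        "Sid": "AllowKMSEncryption",
        "Effect": "Allow",
        "Action": [
            "kms:Encrypt",
            "kms:Decrypt",
            "kms:ReEncrypt*",
            "kms:GenerateDataKey",
            "kms:DescribeKey",
        ],
        "Resource": kms_key_arn,  # the KMS key that encrypts the bucket
    }
    return {**policy, "Statement": policy["Statement"] + [kms_statement]}

# Stand-in for the S3 policy from the previous step
base_policy = {"Version": "2012-10-17", "Statement": []}
encrypted_policy = add_kms_statement(
    base_policy, "arn:aws:kms:eu-central-1:111122223333:key/example"
)
```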

3. Create an SFTP user

Now we can create the SFTP user. To do this, we need to select the role we created from the dropdown. In addition to the restrictions in the IAM role policy, we can further restrict access to the user’s home folder via a directly attached IAM policy.

To authenticate the user, you need SSH keys. The public key is stored when the user is created. When the user connects to the AWS Transfer Family SFTP server with the corresponding private key, they authenticate themselves. How to create an SSH key is described in the AWS Transfer Family documentation.
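
Assuming a key pair already exists (for example, one generated with ssh-keygen), the user creation can be sketched with boto3. The server ID, role ARN, user name, and key below are hypothetical placeholders; the logical home directory keeps each party inside its own folder.

```python
def build_sftp_user_params(server_id, user_name, role_arn, bucket_name, public_key_body):
    """Build the CreateUser request: the public SSH key is stored at creation
    time, and a logical home directory maps '/' to the user's own folder."""
    return {
        "ServerId": server_id,
        "UserName": user_name,
        "Role": role_arn,  # the IAM role created in the previous section
        "HomeDirectoryType": "LOGICAL",
        "HomeDirectoryMappings": [
            {"Entry": "/", "Target": f"/{bucket_name}/{user_name}"},
        ],
        "SshPublicKeyBody": public_key_body,
    }

user_params = build_sftp_user_params(
    server_id="s-0123456789abcdef0",
    user_name="btp-integration",
    role_arn="arn:aws:iam::111122223333:role/sftp-s3-access",
    bucket_name="yourbucketname",
    public_key_body="ssh-rsa AAAAB3NzaC1yc2EXAMPLE",
)

# With AWS credentials configured:
# import boto3
# boto3.client("transfer").create_user(**user_params)
```

The logical mapping mirrors the per-party folder restriction described above: each external partner sees only its own directory.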

Secure access to IP addresses for the AWS Transfer Family Server

We recommend limiting access to the appropriate IP addresses in the AWS Transfer Family server security group. If the number of IP addresses exceeds 1000, we recommend using AWS Network Firewall. It is also possible to add AWS WAF (web application firewall) as additional protection against attacks.
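
As a sketch, the allow-list can be expressed as a single security group ingress rule for TCP port 22; the CIDR ranges below are hypothetical partner addresses.

```python
def build_sftp_ingress(allowed_cidrs):
    """EC2 IpPermissions entry that allows SFTP (TCP 22) only from the
    listed partner IP ranges; all other sources stay implicitly denied."""
    return {
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [
            {"CidrIp": cidr, "Description": "SFTP partner"} for cidr in allowed_cidrs
        ],
    }

rule = build_sftp_ingress(["198.51.100.0/24", "203.0.113.10/32"])

# With AWS credentials configured:
# import boto3
# boto3.client("ec2").authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0", IpPermissions=[rule]
# )
```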

Configurations on the SAP BTP Integration Suite

1. Log in to your SAP BTP Global Account, navigate to the corresponding subaccount, and click on Create.

2. Create a subscription to the Integration Suite.

3. Next, switch to Integration Suite → Manage Capabilities → Cloud Integration.

4. Navigate to Monitor (the bar graph symbol in the left bar), then Integrations → Keystore (in the Manage Security section).

5. Create a Key Pair. The private key resides in BTP. The public key can be downloaded for use on the AWS server.

6. Navigate into the created key and download the Public OpenSSH Key.

7. Navigate back to the overview and choose Connectivity Tests and test the Connection. If all is fine, you’ll receive a confirmation.

4. Connecting SAP BTP with the SFTP endpoint in Transfer Family

In this section, we will walk through how to configure the host key and create the integration flow.

Configure the Host Key

1. Click the Copy Host Key button located on the right side to copy the known host information. Paste this into a text editor and save it without any modification. In the following step, you'll upload this known hosts text file. The known hosts file is used by the client to verify authorized servers, and contains the hostname (SSH server) as well as its public key.
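
The known hosts file follows the standard OpenSSH format: one line containing the hostname, the key type, and the base64-encoded public key. A small sketch with a hypothetical Transfer Family endpoint and a truncated example key:

```python
def known_hosts_line(hostname, key_type, public_key_b64):
    """Build one OpenSSH known_hosts entry: '<host> <key-type> <base64-key>'."""
    return f"{hostname} {key_type} {public_key_b64}"

line = known_hosts_line(
    "s-0123456789abcdef0.server.transfer.eu-central-1.amazonaws.com",
    "ssh-rsa",
    "AAAAB3NzaC1yc2EXAMPLE",
)

# Save the entry to the file that gets uploaded to SAP BTP
with open("known_hosts.txt", "w") as f:
    f.write(line + "\n")
```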

2. Navigate back to the overview and select Security Material.

3. Upload the Known hosts (SSH) text file.

4. Navigate to Design via the pencil icon on the left bar and create a package and click on Save.

Create an Integration Flow

1. Within the Package, navigate to Artifacts, and click on edit.

2. Create a new Integration Flow (IFlow)

3. Add Integration flow and click ok.

4. Click on Edit in the top right corner.

5. Drag the sender node to the start node and select SFTP.

6. Insert the host endpoint of the AWS Transfer Family server, add the alias of the saved SSH private key under the Source tab, provide the username, and save.

5. Testing the Integration Flow

1. Double-click the SFTP arrow and click on Maximize in the lower right corner to fill in the SFTP connection and processing details. Then fill in the Processing and Scheduler tabs based on your preferences.

2. Add any other processing steps to the IFlow to suit the use case. In our use case, we added a Groovy script to convert the incoming tab-separated file to an XML format.

3. Log in to your AWS account and search for S3 in the search bar. Navigate to the bucket and the user's required folder, then upload the file to be processed. The file will get sent as a part of the batch input.
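
Instead of the console, the test file can also be dropped into the user's folder programmatically; the bucket, user, and file names below are hypothetical placeholders.

```python
def build_test_upload(bucket_name, user_name, file_name, body):
    """S3 put_object parameters that place a test file in the user's folder,
    where the BTP integration flow polls for it."""
    return {
        "Bucket": bucket_name,
        "Key": f"{user_name}/{file_name}",  # matches the user's home folder
        "Body": body,
    }

upload = build_test_upload(
    "yourbucketname",
    "btp-integration",
    "payments.tsv",
    b"id\tamount\n1\t100\n",  # tab-separated test payload
)

# With AWS credentials configured:
# import boto3
# boto3.client("s3").put_object(**upload)
```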

4. If the flow succeeds, the IFlow displays it in Completed Messages under Monitor → Monitor Message Processing.

5. If a Data Store was added as a step in the IFlow, the file displays under Monitor → Data Store under the name of the IFlow and can be downloaded.

Cleaning up

To prevent additional AWS charges, delete the Transfer Family server and any associated objects stored in Amazon S3. Please refer to your SAP documentation for guidance on any cleanup for SAP.

Conclusion

In this post, we demonstrated how to leverage Transfer Family SFTP to extend the benefits of SAP Business Technology Platform (SAP BTP), by invoking SAP workloads running on AWS.

AWS Transfer Family continues to provide immense value to SAP workloads. By utilizing the SFTP service, you can now enjoy the added benefit of freeing up valuable time and resources by eliminating the management overhead of a traditional SFTP server and expanding SAP BTP functionality by utilizing Transfer Family SFTP. This allows you to focus on actual business improvements, leading to increased productivity and success.

Additionally, our step-by-step guide in the blog is a resource for navigating through the configuration process of SAP BTP. With the use of AWS Transfer Family, we were able to provide a simple and scalable solution to launch SAP workloads on AWS, ensuring a highly available environment for our customers in both use cases.

For more information, visit AWS Transfer Family or our previous blog post.

The content and opinions in this post are those of the third-party author and AWS is not responsible for the content or accuracy of this post.

Artur Schneider

Artur Schneider was born in 1989 and lives in the south of Germany near Ulm. He is currently a Senior Cloud Consultant at T-Systems. In addition to his education as an IT specialist for system integration, he is also a trained bank clerk. He started his IT career as a system engineer in Microsoft environments, specializing in, among other things, virtualization, backup, and monitoring of infrastructures. Since 2016, he has been involved in cloud topics, including migrations of complete infrastructures to cloud platforms and automation of new services for migration to cloud platforms. In addition, he had a leading role in building a cloud automation team focused on the AWS cloud platform. Since then, he has been working on numerous AWS projects as a Senior Cloud Consultant for enterprise customers.

Ruchir Saxena

Ruchir is based in Munich, Germany. After completing his education in Computer Science Engineering, he started working as an SAP Consultant in 2014 and is currently working as an Application Development Specialist. He has led multiple projects in the areas of SAP ABAP, SAP HANA, SAP BTP, and Python. Lately, he has also been working on AWS developments in parallel.