AWS for Industries

How to Integrate Your AWS Cloud-Native Dynamic Supply Chain Application With External Systems

In our previous blog, How to build a dynamic supply chain platform: a primer, we discussed how companies can use Amazon Web Services (AWS) to build a dynamic supply chain that reacts proactively to events and provides complete visibility, communication, and data transfer. By collecting and analyzing near-real-time supply chain data, companies can gain better insight into and control over their supply chains. To gather that data and communicate changes across the supply chain, however, you need to integrate your AWS cloud-native dynamic supply chain application with bespoke applications and third-party systems (3PSs).

In this post, we will dive deeper into five different mechanisms for integrating an AWS cloud-native supply chain application with 3PSs, such as a transportation management system (TMS), a warehouse management system (WMS), a freight forwarding system (FFS), or an inventory management system (IMS). For each of the five mechanisms, we provide a reference architecture, a brief description of the approach, use cases, and benefits.

1. Amazon SQS

Figure 1 illustrates the reference architecture for integrating an AWS dynamic supply chain application with a 3PS using Amazon Simple Queue Service (Amazon SQS), a fully managed message queuing service that transmits data across systems using messages.

Figure 1. Integration between AWS applications and a third-party system using Amazon SQS
When changes occur in the 3PS that need to be communicated to the AWS dynamic supply chain application, the data is encapsulated in a message that is inserted into a message queue on Amazon SQS. Amazon SQS can authenticate the 3PS using the approaches outlined in this blog, Approaches for authenticating external applications in a machine-to-machine scenario. Amazon SQS offers two types of message queues: 1) standard and 2) first-in, first-out (FIFO). We recommend using Amazon SQS FIFO queues for supply chain integrations so that messages are processed exactly once and in the exact order in which they are sent. The cloud-native dynamic supply chain application polls the Amazon SQS FIFO queue, retrieves messages for processing, and deletes them from the queue. Any changes in the dynamic supply chain application that need to be communicated back to the 3PS are inserted into another Amazon SQS FIFO queue. Similarly, the 3PS polls the queue, retrieves messages for processing, and deletes them from the queue.
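The send-poll-delete flow above can be sketched in Python with boto3. The queue URL, message group naming, and payload fields below are illustrative assumptions, not values from a real integration:

```python
import json
import uuid

def build_status_message(order_id, status):
    """Build the parameters for an SQS FIFO send_message call.

    MessageGroupId keeps all updates for one order in strict order, and
    MessageDeduplicationId guards against accidental resends. The field
    names in the payload are illustrative, not a prescribed schema.
    """
    return {
        "MessageBody": json.dumps({"orderId": order_id, "status": status}),
        "MessageGroupId": f"order-{order_id}",
        "MessageDeduplicationId": str(uuid.uuid4()),
    }

def poll_once(queue_url):
    """Receive, process, and delete one batch of messages.

    Requires boto3 and configured AWS credentials; boto3 is imported
    here so the helper above stays dependency-free.
    """
    import boto3
    sqs = boto3.client("sqs")
    resp = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling reduces empty responses
    )
    for msg in resp.get("Messages", []):
        payload = json.loads(msg["Body"])
        print("processing", payload)  # replace with real handling
        sqs.delete_message(QueueUrl=queue_url,
                           ReceiptHandle=msg["ReceiptHandle"])
```

Deleting a message only after it has been processed is what makes the pattern safe: if the consumer crashes mid-processing, the message reappears on the queue after its visibility timeout.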

When to use

This approach is ideal for custom or bespoke supply chain applications that need to directly integrate with the AWS cloud-native supply chain application. For example, you may have a supply chain order management application that needs to communicate status changes to your AWS supply chain control tower application that provides visibility across the supply chain. In this case, you don’t have to manage any downstream dependencies across systems. Each application in the supply chain simply hooks into the control tower using this mechanism, so the control tower can achieve near-real-time visibility across all applications.

Benefits

  • You can decouple the systems. Say, for example, a 3PS is experiencing issues and cannot process the messages. The cloud-native dynamic supply chain application can continue to send messages to the Amazon SQS FIFO queue. Once the 3PS issues are resolved, the application can start processing messages again.
  • Amazon SQS FIFO queues are highly scalable and durable, and they support high throughput of up to 3,000 messages per second with batching.

2. Amazon EventBridge

Figure 2 illustrates the reference architecture of integrating an AWS cloud-native dynamic supply chain application with a 3PS using Amazon EventBridge, a serverless event bus that makes it easier to build event-driven applications at scale using events generated from your application.

Figure 2. Integration between AWS applications and a third-party system using Amazon EventBridge
Events are changes in the 3PS that need to be communicated to the AWS cloud-native dynamic supply chain application. Event data is encapsulated in a message that is inserted into an event bus on Amazon EventBridge. Amazon EventBridge can authenticate the 3PS using the approaches outlined in the blog mentioned earlier, Approaches for authenticating external applications in a machine-to-machine scenario.

If the 3PS is a software-as-a-service (SaaS) application supported by Amazon EventBridge, the message can be inserted directly into a partner event bus. However, some SaaS applications, such as Salesforce, may require an integration using Amazon AppFlow, a fully managed service that helps users to securely transfer data between supported SaaS applications. You can find a detailed technical discussion of these integrations in Building Salesforce integrations with Amazon EventBridge and Amazon AppFlow. If the SaaS application is not supported by Amazon EventBridge, you can insert the message into a custom event bus using a REST API hosted on Amazon API Gateway, a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs. For a detailed technical discussion, please refer to this blog. The custom event bus is configured with rules that can transform the message as required and send the message to the cloud-native dynamic supply chain application.

Meanwhile, any changes in the dynamic supply chain application that need to be communicated back to the 3PS are also considered events that can be directly inserted into the custom event bus. The custom event bus is configured with rules that can transform the message as required and send it to an Amazon SQS queue. The 3PS retrieves the messages from the Amazon SQS queue and deletes those messages from the queue after processing. As we mentioned previously, Amazon SQS can authenticate the 3PS using the approaches outlined in this blog.
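Publishing a supply chain change onto the custom event bus boils down to a PutEvents call. The following sketch assumes boto3 and illustrative names for the source, detail type, and bus, none of which come from the original post:

```python
import json

def build_event_entry(detail,
                      source="supplychain.controltower",
                      detail_type="OrderStatusChanged",
                      bus_name="supply-chain-bus"):
    """Build one PutEvents entry for a custom EventBridge event bus.

    The source, detail type, and bus name are illustrative placeholders.
    EventBridge rules match on these fields, so pick stable values.
    """
    return {
        "Source": source,
        "DetailType": detail_type,
        "Detail": json.dumps(detail),  # Detail must be a JSON string
        "EventBusName": bus_name,
    }

def publish(detail):
    """Send the event to EventBridge (requires boto3 and AWS credentials)."""
    import boto3
    events = boto3.client("events")
    resp = events.put_events(Entries=[build_event_entry(detail)])
    if resp["FailedEntryCount"]:
        raise RuntimeError(f"failed entries: {resp['Entries']}")
```

A call such as `publish({"orderId": "PO-1001", "status": "DELAYED"})` would then be routed by the bus rules to the configured targets, for example the Amazon SQS queue the 3PS polls.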

When to use

It’s best to use this approach when several custom or bespoke supply chain applications need to work in tandem or in near real time with the AWS dynamic supply chain application. For example, consider a case when multiple 3PS systems, such as a TMS, WMS, and FFS, need to coordinate with an AWS supply chain control tower to complete a transaction. In this case, the control tower acts as an orchestrator and provides near-real-time visibility across the supply chain.

Benefits

  • This approach allows a single message to be processed by multiple downstream systems (targets).
  • You can apply rules and invoke different targets based on values in fields in a message.
  • The event bus can transform a message according to the requirements of different target systems.
  • You can archive and replay messages.
  • You can use custom schemas to define the structure of events.
  • You can download code bindings for popular programming languages to speed development.
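As an example of rule-based routing, the sketch below registers a rule that matches only shipment-delay events so that a notification target fires for exceptions alone. The source, detail-type, and status values are illustrative assumptions:

```python
import json

def build_delay_rule_pattern(statuses=("DELAYED", "EXCEPTION")):
    """EventBridge event pattern matching only events whose detail.status
    is one of the given values (field names are illustrative)."""
    return {
        "source": ["supplychain.tms"],
        "detail-type": ["ShipmentStatusChanged"],
        "detail": {"status": list(statuses)},
    }

def create_delay_rule(rule_name="shipment-delays",
                      bus_name="supply-chain-bus"):
    """Register the rule on the custom bus (requires boto3/credentials).

    Targets (e.g., an SQS queue or Lambda function) would be attached
    afterward with events.put_targets.
    """
    import boto3
    events = boto3.client("events")
    events.put_rule(Name=rule_name,
                    EventBusName=bus_name,
                    EventPattern=json.dumps(build_delay_rule_pattern()))
```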

3. AWS Glue

Figure 3 illustrates the reference architecture of integrating an AWS cloud-native dynamic supply chain application with a 3PS using AWS Glue, a serverless data integration service.

Figure 3. Integration between AWS applications and a third-party system using AWS Glue
First, set up connections between AWS Glue and the 3PS database, and between AWS Glue and the AWS cloud-native dynamic supply chain application databases. We highly recommend using AWS Glue Studio, a visual graphical interface, to set up these connections. You can connect the 3PS databases to AWS-native databases, such as Amazon Redshift, which uses SQL to analyze structured and semistructured data; Amazon DynamoDB, a fully managed, serverless, key-value NoSQL database; and Amazon Relational Database Service (Amazon RDS), a collection of managed relational database services. You can also connect 3PS databases to a variety of JDBC data stores, such as PostgreSQL, MySQL, Oracle, Microsoft SQL Server, and MariaDB. The 3PS databases and the AWS dynamic supply chain application databases can also be connected to any proprietary databases using connectors available on AWS Marketplace, where you can find, test, buy, and deploy software that runs on AWS, or through a custom connection using custom code.

Once the connections are set up, AWS Glue Studio can create, run, and monitor extract, transform, load (ETL) jobs that extract and transform the required data from the 3PS database at the desired frequency and store the extracts in Amazon Simple Storage Service (Amazon S3), an object storage service built to retrieve any amount of data from anywhere. Data from Amazon S3 can then be loaded into the AWS data stores used by the dynamic supply chain application. Reverse the process to extract data from the AWS cloud-native dynamic supply chain application database and load it into the 3PS databases.
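Once a job has been authored in AWS Glue Studio, it can be kicked off programmatically at the desired frequency. The sketch below assumes boto3 and a job already created in Glue Studio; the job name, argument names, and S3 path are illustrative (Glue passes custom job arguments to the script with a leading `--`):

```python
import datetime

def build_job_arguments(source_table, target_path):
    """Custom arguments for a Glue ETL job run.

    Glue's convention is that custom job arguments are prefixed with
    '--'; the argument names and date-partitioned layout here are
    illustrative, not from the original post.
    """
    run_date = datetime.date.today().isoformat()
    return {
        "--source_table": source_table,
        "--target_path": f"{target_path}/run_date={run_date}",
    }

def start_daily_extract(job_name="tms-orders-extract"):
    """Start the job run (requires boto3 and AWS credentials)."""
    import boto3
    glue = boto3.client("glue")
    resp = glue.start_job_run(
        JobName=job_name,
        Arguments=build_job_arguments("orders", "s3://my-bucket/extracts"),
    )
    return resp["JobRunId"]
```

Partitioning the Amazon S3 output by run date keeps each batch separate, so the dynamic supply chain application can load exactly one day's extract.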

When to use

You should use this approach when you do not need near-real-time integration with a 3PS—for example, if you have an AWS supply chain planning application that needs to collect information from various third-party supply chain systems. In general, supply chain planning requires a larger window of time, and therefore, you can migrate data in batches.

Benefits

  • This approach gives you flexibility to connect to a wide range of data sources, and you can use this approach with structured and semistructured data.
  • The easy-to-use graphical interface speeds the process of authoring, running, and monitoring ETL jobs.
  • You can reduce custom coding by reusing common transformations, such as “removal of duplicates.”
  • It runs in a serverless environment, which means you don’t have to manage infrastructure, and AWS Glue provisions, configures, and scales the resources required to run your ETL jobs.

4. Amazon AppFlow for SaaS applications

Figure 4 illustrates the reference architecture of integrating an AWS dynamic supply chain application with a 3PS using Amazon AppFlow.

Figure 4. Integration between AWS applications and a third-party system using Amazon AppFlow
First, configure the data transfer connection and data transfer frequency between the 3PS and Amazon AppFlow using the SaaS provider credentials. Then, configure Amazon AppFlow to extract data and save it in Amazon S3, so that the AWS dynamic supply chain application can process the data as needed. For an example of how to set up Amazon AppFlow for commonly used SaaS supply chain management applications, such as SAP, refer to this blog. You should reverse the process when the AWS dynamic supply chain application needs to send data back to the 3PS.
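The data transfer frequency mentioned above is expressed as a trigger configuration when the flow is created. The sketch below, assuming boto3 and Amazon AppFlow's rate-expression convention, builds a scheduled trigger and starts an on-demand run; the flow name is a placeholder:

```python
def build_schedule_trigger(minutes):
    """TriggerConfig for a scheduled Amazon AppFlow flow.

    The rate expression format (e.g., 'rate(15minutes)') follows the
    AppFlow scheduleExpression convention; the interval here is an
    illustrative choice.
    """
    return {
        "triggerType": "Scheduled",
        "triggerProperties": {
            "Scheduled": {"scheduleExpression": f"rate({minutes}minutes)"}
        },
    }

def run_flow_now(flow_name="sap-inventory-to-s3"):
    """Trigger an on-demand run of an existing flow.

    Requires boto3, AWS credentials, and a flow already configured
    against the SaaS connector; the flow name is hypothetical.
    """
    import boto3
    appflow = boto3.client("appflow")
    return appflow.start_flow(flowName=flow_name)
```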

When to use

We recommend using this approach if your 3PS is a SaaS application that supports bidirectional transfer using Amazon AppFlow. Depending on the type of 3PS that you are integrating your AWS dynamic supply chain application with, you can also use this approach with the use cases in option 1 (Amazon SQS) and option 3 (AWS Glue).

Benefits

  • Features like data pagination, error logging, and network connection retries are included by default.
  • You can move large volumes of data without breaking it into multiple batches and without planning or provisioning resources.
  • The AWS Management Console offers an integrated user interface (UI) that makes it easy to configure the connection between Amazon AppFlow and your 3PSs.
  • You can run the workflow on demand, on a schedule, or in response to business events for certain applications.

5. AWS Transfer Family

Figure 5 illustrates the reference architecture of integrating an AWS dynamic supply chain application with a 3PS using the AWS Transfer Family, a fully managed service that transfers files over SFTP, FTPS, and FTP directly into and out of Amazon S3 or Amazon Elastic File System (Amazon EFS), a simple, serverless, set-and-forget, elastic file system.

Figure 5. Integration between AWS applications and a third-party application using AWS Transfer Family
In this approach, you set up an SFTP server that connects to AWS Transfer Family using the built-in authentication manager (service managed), Microsoft Active Directory (AD), or your own or a third-party identity provider, such as Okta or Microsoft Azure AD (BYO authentication). Then, you use an ETL process to extract data at the desired frequency and transfer it to the SFTP server. The SFTP server uses AWS Transfer Family to transfer the data to Amazon S3, and the AWS dynamic supply chain application imports the data from Amazon S3. The process works in reverse for virtually any data you need to transmit to the 3PS: the AWS dynamic supply chain application sends the data to Amazon S3, the data is transferred to the SFTP server using AWS Transfer Family, and the ETL process transforms and loads the data into the 3PS database.
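Because an AWS Transfer Family SFTP server can be backed by Amazon S3, the reverse path is just a dated object upload that the 3PS then picks up over SFTP. The bucket, key layout, and system names below are illustrative assumptions:

```python
import datetime

def extract_key(system, dataset, run_time=None):
    """S3 key for an outbound extract, partitioned by date so each
    day's drop is easy to find (the layout is illustrative)."""
    run_time = run_time or datetime.datetime.now(datetime.timezone.utc)
    return f"outbound/{system}/{dataset}/{run_time:%Y/%m/%d}/{dataset}.csv"

def upload_extract(bucket, local_path, system, dataset):
    """Put the file where the Transfer Family SFTP server exposes it.

    Requires boto3 and AWS credentials; the bucket must be the one
    configured as the server's storage backend.
    """
    import boto3
    key = extract_key(system, dataset)
    boto3.client("s3").upload_file(local_path, bucket, key)
    return key
```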

When to use

We recommend using this approach if you do not need near-real-time integration and you can’t use AWS Glue in option 3 or Amazon AppFlow for SaaS applications in option 4.

Benefits

  • This approach provides the most flexibility and can be used in almost any situation that doesn’t require near-real-time integration.
  • With AWS Transfer Family, you don’t need to host and manage your own file transfer service.
  • This scenario gives you a fully managed, high-availability file transfer service with auto-scaling capabilities, and you don’t have to maintain your own code or infrastructure.
  • You can maintain your existing client-side configurations for authentication, access, and firewalls, which simplifies management overall.

Get started today

In this blog post, we discussed various options and mechanisms for integrating an AWS dynamic supply chain application with third-party supply chain systems. Contact your AWS account team today to get started on a proof of concept or implementation project using these architectures. Visit the AWS website to learn more about other AWS cloud-based solutions that can transform your CPG supply chain or your retail supply chain.

Stay tuned for the next installment in this blog series to learn about AWS innovations across the supply chain to meet the demands of our dynamic global economy.

Sanjeev Pulapaka
Sanjeev Pulapaka is a Senior Solutions Architect in the US Fed Civilian SA team at AWS. He works closely with customers in building and architecting mission-critical solutions. Sanjeev has extensive experience in leading, architecting, and implementing high-impact technology solutions that address diverse business needs in multiple sectors, including commercial, federal, state, and local governments. He has an undergraduate degree in engineering from the Indian Institute of Technology and an MBA from the University of Notre Dame.

Dnyanesh Patkar
Dnyanesh Patkar is the Head of Transportation and Logistics in AWS Industries. In his role, he works with customers to envision, develop, and execute transformational business and operating model strategies in this space. He is a seasoned executive, and his background includes leading business transformation programs, serving in general management roles with P&L responsibility, and creating high-performance teams. He has over 25 years of experience at companies such as Schneider National, DiamondCluster, and National Semiconductor. Dnyanesh has his MBA from the Wharton School of Business and a Master of Engineering from Cornell University. Dnyanesh is based out of the AWS Atlanta office.

Rajdeep Saha
Rajdeep Saha is a specialist solutions architect for serverless and containers at AWS. He helps customers design scalable, secure applications with AWS services. Rajdeep is passionate about teaching newcomers about cloud computing. He is based out of New York City and uses Twitter sparingly.