AWS Architecture Blog

Build Chatbots using Serverless Bot Framework with Salesforce Integration

Conversational interfaces have become increasingly popular, both on web and mobile. Businesses have realized that these interactions resolve customer concerns more quickly than the traditional approach of agent interactions. An intelligent chatbot on top of customer-facing platforms comes with inherent benefits. Among these are 24/7 customer support with no agent wait times, improved operational excellence, self-learning intelligence, efficient use of Customer Service Representative (CSR) agents, and more. Chatbots can help businesses save annual support costs by over 50% and reduce customer issue resolution wait times by up to 90%.

In this blog post, we’ll explore a pre-packaged, architecture-vetted AWS solution called the Serverless Bot Framework. This is an AWS Solutions Implementation that can be used to add sophisticated conversational chatbots to an existing web or mobile application. It can also be extended to expand the scope of customer support by integrating with an existing Salesforce ecosystem. This chatbot can be customized to enrich customer support with intelligent, self-service, and life-like experiences. In addition, it alleviates the need to engage a CSR.

The Serverless Bot Framework solution uses AWS CloudFormation to automate the deployment of the framework architecture into the AWS Cloud, as shown in Figure 1.


Figure 1. Serverless Bot Framework architecture

Since the latest release of this framework (v1.5.0), the solution has integrated Amazon Lex V2 as an option for the brain module of the chatbot. This new integration supports multiple languages in a single bot, the addition of new languages to an existing bot, and other features. The solution also provides a sample web application hosted on Amazon Simple Storage Service (Amazon S3) and accessed through Amazon CloudFront. Every interaction comes in via Amazon API Gateway and is processed by a core AWS Lambda function, which determines which bot microservice can answer the customer’s question. We call this core process the Intent Resolution Engine. The solution also provides sample microservices for weather forecasting, pizza ordering, and appointment scheduling. These microservices showcase the ease of integrating both native AWS services and external API-dependent architectures.
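To make this request flow more concrete, here is a minimal sketch (not the solution’s actual code) of what a core routing Lambda function behind API Gateway could look like when Amazon Lex V2 serves as the brain module. The bot ID, alias ID, and environment variable names are illustrative placeholders.

```python
import json
import os

import boto3

# Hypothetical configuration; in the real solution these values are wired up
# by the CloudFormation deployment.
lex = boto3.client("lexv2-runtime")
BOT_ID = os.environ.get("BOT_ID", "REPLACE_ME")
BOT_ALIAS_ID = os.environ.get("BOT_ALIAS_ID", "REPLACE_ME")
LOCALE_ID = os.environ.get("LOCALE_ID", "en_US")


def handler(event, context):
    """Receive a chat message from API Gateway and let Lex V2 resolve the intent."""
    body = json.loads(event.get("body") or "{}")
    text = body.get("text", "")
    session_id = body.get("sessionId", "anonymous")

    # Ask Lex V2 to interpret the utterance (weather, pizza order, appointment, ...).
    lex_response = lex.recognize_text(
        botId=BOT_ID,
        botAliasId=BOT_ALIAS_ID,
        localeId=LOCALE_ID,
        sessionId=session_id,
        text=text,
    )

    # Relay the bot's reply back to the web client.
    messages = [m.get("content", "") for m in lex_response.get("messages", [])]
    return {"statusCode": 200, "body": json.dumps({"messages": messages})}
```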

Use Case

Here’s a fictional use case to showcase the value of our solution. Let’s assume there is a government agency that requires its citizens to submit pension/retirement benefit claims using a customer-facing web/mobile application. Say the application stack is built using open-source technologies and persists claim submission information in its own database. It is common for large organizations to have multiple disparate systems that assist in serving their customers. Let’s extend our use case with enterprise-wide data points centralized on Salesforce. It is this internal Salesforce platform that powers the enterprise contact center solution operated by human CSR agents. How can this agency build an interactive chatbot feature on top of its existing pension/retirement application stack? How can it extend that chatbot experience to provide information from other disparate systems? Let’s unpack the Serverless Bot Framework from the AWS Solutions Implementation library and apply it to this use case.

To provide an intelligent self-service experience to their customers, this agency can use the Serverless Bot Framework’s CloudFormation template to deploy the framework architecture. The API Gateway endpoint can hook this bot architecture into their existing website/mobile application. The framework comes with built-in Amazon Cognito integration, which can be used for API authentication purposes. For the backend, the existing Amazon Lex V2 integration can hook into a custom Lambda function that pulls data from the existing web application database. This framework deployment has built-in code hooks for web-to-API-Gateway and Lambda-to-DB, which can be used as models when tailoring the agency’s implementation. By integrating the frontend hook of API Gateway and the backend hook of a custom Lambda function into the database, the agency can immediately start to leverage a sophisticated conversational chatbot. This bot can engage in a life-like chat with their customers. It can respond to questions about pension status, claim history, and any relevant claim-based information.
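As a rough illustration of that backend code hook, the following sketch assumes claim records live in a DynamoDB table (named PensionClaims purely for illustration) and that the bot’s intent defines a ClaimId slot; none of these names come from the solution itself.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
claims_table = dynamodb.Table("PensionClaims")  # hypothetical table name


def close(intent, message):
    """Build a Lex V2 'Close' response carrying a plain-text message."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent["name"], "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }


def handler(event, context):
    """Lex V2 fulfillment code hook: look up the status of a pension claim."""
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}

    # 'ClaimId' is an illustrative slot name; Lex elicits it before fulfillment runs.
    claim_id = slots["ClaimId"]["value"]["interpretedValue"]

    item = claims_table.get_item(Key={"ClaimId": claim_id}).get("Item")
    if not item:
        return close(intent, f"I could not find a claim with ID {claim_id}.")

    return close(
        intent,
        f"Claim {claim_id} is currently '{item['Status']}', "
        f"last updated on {item['LastUpdated']}.",
    )
```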

During these interactions with our chatbot, the customer may ask about information external to the pension/retirement application, but relevant to the government agency. Possible questions are: “What are my unemployment benefits?” or “What are my government-sponsored health benefits?” Assuming this extended information is available in the Salesforce environment, there are two potential options:

  • Option 1: We can route the chatbot conversation transcript to a live agent who has access to this extended information on Salesforce and can address customer questions.
  • Option 2: We can extend our chatbot architecture to dip into the Salesforce data repository and pull relevant information to address customer questions. This is made possible by exposing service API(s) from Salesforce for bot consumption.

Obviously, Option 1 has challenges associated with a traditional contact center, such as agent wait times, limited availability, and more. Alternatively, Option 2 is much more scalable and efficient. Let’s talk about high-level implementation choices for Option 2.

Considerations

Here are a few architectural considerations this customer can explore to establish connectivity between the AWS infrastructure and Salesforce. The customer can build a dedicated SaaS Integration Service VPC on the AWS side, as described in our Connecting AWS and Salesforce blog post. This architecture uses AWS PrivateLink and establishes an authorization model using the OAuth 2.0 framework (leveraging the Node.js wrapper nforce). This secures the access/refresh token that can be used for the chatbot Lambda’s interactions with the Salesforce API. The other architectural option is SSL certificate-based integration between AWS Lambda and a Salesforce Connected App. This uses an OAuth 2.0 JSON Web Token (JWT) (leveraging the new-salesforce-jwt library) and is detailed in the AWSPlatformEventsJWT article by Salesforce developers.
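The libraries referenced above (nforce, new-salesforce-jwt) are Node.js wrappers around these OAuth flows. Purely to illustrate the underlying protocol of the second option, here is a hedged Python sketch of the OAuth 2.0 JWT bearer exchange against a Salesforce Connected App; the consumer key, username, and login URL are placeholders, and the private key is the one matching the certificate uploaded to the Connected App.

```python
import time

import jwt       # PyJWT, used to sign the bearer assertion
import requests

SF_LOGIN_URL = "https://login.salesforce.com"  # or your My Domain / sandbox URL
CONSUMER_KEY = "REPLACE_WITH_CONNECTED_APP_CONSUMER_KEY"
SF_USERNAME = "integration.user@example.com"   # placeholder integration user


def get_salesforce_token(private_key_pem: str) -> dict:
    """Exchange a signed JWT for a Salesforce access token (JWT bearer flow)."""
    assertion = jwt.encode(
        {
            "iss": CONSUMER_KEY,            # Connected App consumer key
            "sub": SF_USERNAME,             # pre-authorized Salesforce user
            "aud": SF_LOGIN_URL,            # token audience
            "exp": int(time.time()) + 300,  # short-lived assertion
        },
        private_key_pem,
        algorithm="RS256",                  # signed with the Connected App certificate key
    )

    response = requests.post(
        f"{SF_LOGIN_URL}/services/oauth2/token",
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
            "assertion": assertion,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # contains access_token and instance_url
```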

Once the connectivity mechanism is determined between the AWS and Salesforce environments, the lex-sfdc-dip Lambda function can be programmed to make web service calls to Salesforce API endpoints, as shown in Figure 2. This retrieves the extended information needed to intelligently respond to customers’ questions in real time. After this customization is in place, our chatbot will be able to provide a level of support on par with a human CSR.


Figure 2. Notional architecture for Serverless Bot Framework & Salesforce Integration
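With a token in hand, the lex-sfdc-dip function is essentially a thin client over the Salesforce REST API. The sketch below is one possible shape for it, assuming the access token is made available to the function (for example, via Secrets Manager) and using a hypothetical custom object (Benefit__c) to hold the extended benefit information.

```python
import os

import requests

API_VERSION = "v57.0"  # any recent Salesforce REST API version


def query_salesforce(instance_url: str, access_token: str, soql: str) -> list:
    """Run a SOQL query against the Salesforce REST API and return the records."""
    response = requests.get(
        f"{instance_url}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {access_token}"},
        params={"q": soql},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("records", [])


def handler(event, context):
    """Answer a benefits question by dipping into Salesforce data."""
    # Assumes an access token obtained (and cached) via the JWT bearer flow above,
    # surfaced to the function through environment variables or Secrets Manager.
    instance_url = os.environ["SF_INSTANCE_URL"]
    access_token = os.environ["SF_ACCESS_TOKEN"]

    # Hypothetical custom object and fields holding benefit information.
    records = query_salesforce(
        instance_url,
        access_token,
        "SELECT Name, Status__c FROM Benefit__c WHERE Citizen_Id__c = 'C-12345'",
    )
    if not records:
        return "No benefit records were found."

    # In practice this summary would be wrapped in a Lex V2 response, as shown earlier.
    return ", ".join(f"{r['Name']}: {r['Status__c']}" for r in records)
```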

As you can see, our Serverless Bot Framework can be deployed quickly and is ready to integrate into existing architectures. Although we presented Salesforce integration here, the same conceptual implementation can be applied to other heterogeneous platforms. And although the use case we chose relates to a government agency, this API-based framework solution can bring instant, intelligent chatbot results to any industry application.

The serverless nature of the Serverless Bot Framework makes this solution cost-effective as well. Customers only pay for what they use, and it auto-scales to handle demand spikes. We ran estimates for multiple example scenarios with notional requirements, such as monthly active users and number of API requests per month, and present a cost breakdown of this solution deployment in our Implementation Guide.

Conclusion

In this post, we explored how to bring an interactive, life-like experience to customers of an existing web/mobile application using the ready-to-deploy Serverless Bot Framework. We also talked about extending this chatbot architecture to provide enterprise-level, intelligent self-service features to customers by integrating with a Salesforce environment. Our goal with this blog post is to spark ideas around unique architectural patterns that can be built using the Serverless Bot Framework. We want to show businesses how to tap into richer customer support without the need for any major upgrades to their existing systems.

Visit the Serverless Bot Framework webpage and Implementation Guide for details on the solution’s latest updates.

Shyam Namavaram

Shyam Namavaram is a Solutions Architect at AWS working with Public Sector Partners. He passionately works with customers supporting their digital transformation by providing technical guidance and helping them innovate & build secure cloud solutions on AWS. He specializes in AI/ML, Containers and Analytics technologies. He is the AWS Solutions Influencer for AI/ML oriented solution implementations and helps evangelize AWS Solutions to customers. Outside of work, he loves playing sports.

Mohsen Ansari

Mohsen Ansari is a Solutions Builder in the AWS AI Solutions Lab where he works with AWS customers to identify and build ML solutions to address the customer’s highest return-on-investment ML opportunities.

Johny Duval

Johny Duval is the AI/ML Product Manager for the AWS Solutions team. He leads a team of Solution Builders in bringing to market Well-Architected end-to-end products leveraging AI & ML services and technologies. He has worked many years leading startup teams of various sizes through different stages, integrating mobile technology to higher education institutions, launching analytics products to enterprise clients, and operationalizing AI technology to Fortune 500 companies.