Posted On: Jan 22, 2021

In a conversation, we often pause to look up information, or interrupt a speaker with an answer before they finish asking a question. Today we are launching streaming conversation APIs, so you can easily configure your bot to handle such pauses and interruptions. With streaming capabilities, bot builders can quickly enhance the capabilities of virtual contact center agents and smart assistants.

Previously, you had to configure client attributes and implement Lambda code to handle the pauses and interruptions that occur in a conversation. With the streaming conversation APIs, all user input across multiple turns is processed as one API call. The bot continuously listens and can be designed to respond proactively. For example, a caller may ask to pause the conversation or "hold" the line while they find information such as credit card details. Now you can add "wait" and "continue" responses to handle this dialog: the bot waits for input while the caller retrieves the information, and then continues the conversation based on pre-configured cues. Similarly, interruptions can be configured as attributes of a prompt. Overall, the design and implementation of the conversation is simplified and easier to manage.
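As an illustration, the "wait" and "continue" responses and the interruption attribute can be attached to a slot prompt through the Amazon Lex V2 model-building API. The following sketch shows the general shape of a slot's `valueElicitationSetting` with a `waitAndContinueSpecification` and an `allowInterrupt` attribute; the exact field names and message text here are illustrative, so consult the Amazon Lex V2 API reference for the authoritative schema:

```json
{
  "valueElicitationSetting": {
    "slotConstraint": "Required",
    "promptSpecification": {
      "maxRetries": 2,
      "allowInterrupt": true,
      "messageGroups": [
        {
          "message": {
            "plainTextMessage": { "value": "What is your credit card number?" }
          }
        }
      ]
    },
    "waitAndContinueSpecification": {
      "waitingResponse": {
        "messageGroups": [
          { "message": { "plainTextMessage": { "value": "Sure, take your time." } } }
        ]
      },
      "continueResponse": {
        "messageGroups": [
          { "message": { "plainTextMessage": { "value": "Great, let's continue." } } }
        ]
      }
    }
  }
}
```

In this sketch, the waiting response is played while the caller pauses to retrieve information, the continue response is played when the caller resumes, and `allowInterrupt` lets the caller barge in while the prompt is still playing.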

Starting today, you can use the streaming conversation APIs in all the AWS Regions where Amazon Lex is available. To learn more, see the following list of resources: