IVR Migration

Easily migrate interactive voice response (IVR) flows to Amazon Lex and deliver sophisticated conversational experiences

Amazon Lex enables sophisticated, natural conversational experiences so you can optimize customer service delivery models, increase customer satisfaction, and scale your operations. Amazon Lex is a fully managed service built on AWS infrastructure, so you can achieve greater agility, security, and reliability. Conversational experiences scale with demand without impacting latency, so you can support engaging real-time customer interactions.

By migrating your IVR flows to Amazon Lex, you can expedite delivery of customer service solutions while optimizing resources, freeing your teams to focus on complex problems and build deeper customer relationships. Amazon Lex provides automated migration tools, pre-built bots, industry grammars, and a partner network to drive innovation, increase agility, and achieve cost savings. Migrate your IVR flows to Amazon Lex and deliver a rich end-to-end experience.

Why migrate IVRs to Amazon Lex?

Transform customer service

Amazon Lex provides automatic speech recognition and natural language understanding technologies to transcribe input, identify the user's intent, and fluidly manage the conversation. You can deliver personalized experiences by managing context and orchestrating dialog based on business knowledge. You can use Amazon Kendra, an easy-to-use intelligent search service, to answer natural language questions in a customer service call. Native integration with Amazon Polly allows you to select a voice suited to your brand from a range of neural text-to-speech (NTTS) voices that offer richer, more lifelike speech quality. Overall, you can enable richer, more lifelike conversations and transform the customer service experience.
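If you manage bots with the AWS SDK for Python (boto3), the Polly voice is chosen when a bot locale is created. The following is a minimal sketch, assuming an existing Amazon Lex V2 bot; the bot ID is a hypothetical placeholder.

    import boto3

    lex = boto3.client("lexv2-models")

    # Configure the en_US locale of an existing bot with a neural Polly voice.
    # "EXAMPLEBOTID" is a hypothetical placeholder for your bot's ID.
    lex.create_bot_locale(
        botId="EXAMPLEBOTID",
        botVersion="DRAFT",
        localeId="en_US",
        nluIntentConfidenceThreshold=0.4,
        voiceSettings={
            "voiceId": "Joanna",   # any Amazon Polly voice suited to your brand
            "engine": "neural",    # request the neural text-to-speech engine
        },
    )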

Reduce implementation effort

Amazon Lex can help you drive customer engagement with minimal effort. Pre-built bots include dialog orchestration, context management, and business logic execution so you can quickly automate customer service interactions. You can easily publish bots to a voice modality (e.g., Amazon Connect) or a social media platform (e.g., Facebook Messenger) directly from the Amazon Lex console, reducing duplicative development tasks. Easy integration with other AWS machine learning services helps improve workflows such as document processing for increased efficiency. You can create flows for common tasks (e.g., making a payment) or conversation patterns (e.g., transaction progress updates), expediting delivery of customer service solutions.
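Outside the console, publishing typically means building a locale, snapshotting it into a numbered version, and pointing an alias at that version; the alias is what a channel such as Amazon Connect is then associated with. The following is a hedged boto3 sketch of that workflow, assuming an existing bot with an en_US locale (IDs are hypothetical placeholders).

    import boto3

    lex = boto3.client("lexv2-models")
    BOT_ID = "EXAMPLEBOTID"  # hypothetical placeholder

    # Build the DRAFT locale so the latest intents and slots are ready to serve.
    # The build is asynchronous; in practice, poll describe_bot_locale until the
    # locale status is "Built" before creating a version (polling omitted here).
    lex.build_bot_locale(botId=BOT_ID, botVersion="DRAFT", localeId="en_US")

    # Snapshot the draft into a numbered version.
    version = lex.create_bot_version(
        botId=BOT_ID,
        botVersionLocaleSpecification={"en_US": {"sourceBotVersion": "DRAFT"}},
    )["botVersion"]

    # Point a production alias at that version; channels reference the alias.
    lex.create_bot_alias(
        botId=BOT_ID,
        botAliasName="prod",
        botVersion=version,
    )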

Secure and scalable infrastructure

Amazon Lex uses AWS' high-performance infrastructure so you can achieve greater agility, security, and reliability. Amazon Lex is a fully managed service that automatically scales with usage, without impacting latency, so you can support engaging real-time conversations. Amazon Lex is PCI, SOC, and ISO compliant. You can easily incorporate your authentication and verification logic into your conversation flows. You can obfuscate selected information in conversation logs and opt out of having your content stored. AWS core infrastructure meets the most stringent security and data requirements, so you can build with confidence and deliver secure experiences.
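Obfuscation is configured per slot. The following is a hedged boto3 sketch that masks a captured value in conversation logs, assuming an existing bot and intent (the IDs and names below are hypothetical placeholders).

    import boto3

    lex = boto3.client("lexv2-models")

    # Create a slot whose captured value is obscured in conversation logs.
    lex.create_slot(
        botId="EXAMPLEBOTID",
        botVersion="DRAFT",
        localeId="en_US",
        intentId="EXAMPLEINTENT",
        slotName="AccountNumber",
        slotTypeId="AMAZON.AlphaNumeric",  # built-in slot type, referenced by name
        obfuscationSetting={"obfuscationSettingType": "DefaultObfuscation"},
        valueElicitationSetting={
            "slotConstraint": "Required",
            "promptSpecification": {
                "messageGroups": [
                    {"message": {"plainTextMessage": {"value": "What is your account number?"}}}
                ],
                "maxRetries": 2,
            },
        },
    )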

Common migration scenarios

Maintain current IVR experience

In this scenario, the source and target conversation flows remain the same, so you maintain the end-user experience. You can use the same dialog prompts and updated grammars to enable the conversation. An IVR migration tool is available so you can easily convert IVR flows, including grammars, to a bot definition. Grammars are specified in the Speech Recognition Grammar Specification (SRGS) format and include ECMAScript tags for semantic interpretation. With this migration approach you have the flexibility to adopt a conversational experience at your own pace.
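The snippet below is not the migration tool itself; it is a minimal Python sketch of one early step such a conversion could involve, listing the rules defined in an SRGS grammar file so they can be mapped to Lex slot types (the file name is hypothetical).

    import xml.etree.ElementTree as ET

    # SRGS XML grammars use this namespace for the grammar and rule elements.
    SRGS_NS = "{http://www.w3.org/2001/06/grammar}"

    def list_grammar_rules(path):
        grammar = ET.parse(path).getroot()
        root_rule = grammar.get("root")  # rule used as the grammar's entry point
        rules = [rule.get("id") for rule in grammar.iter(SRGS_NS + "rule")]
        print("root rule:", root_rule, "rules:", rules)
        return rules

    # list_grammar_rules("confirmation_code.grxml")  # hypothetical grammar file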

Build a new conversational experience

You can migrate to a new experience and enable sophisticated natural language conversations for your customer service function. The target conversation flows in this case are built from the ground up to incorporate conversational AI innovations. To build new experiences, you can start with pre-built solutions for domains such as financial services, insurance, and telecom. Alternatively, you can use the automated chatbot designer to design the bot from conversation transcripts. Designing a conversational experience from the ground up gives you an opportunity to drive better customer engagement.
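The automated chatbot designer can also be started programmatically from Lex-format conversation transcripts stored in Amazon S3. The following is a hedged boto3 sketch; the bot ID and bucket name are hypothetical placeholders.

    import boto3

    lex = boto3.client("lexv2-models")

    # Kick off the automated chatbot designer on transcripts in S3.
    response = lex.start_bot_recommendation(
        botId="EXAMPLEBOTID",
        botVersion="DRAFT",
        localeId="en_US",
        transcriptSourceSetting={
            "s3BucketTranscriptSource": {
                "s3BucketName": "my-contact-center-transcripts",
                "transcriptFormat": "Lex",
            }
        },
    )

    # Poll describe_bot_recommendation with this ID to review the suggested
    # intents and slot types once the analysis completes.
    print(response["botRecommendationId"])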

Manage a hybrid experience

A hybrid experience includes some IVR flows migrated as-is and others that are created new after the migration. For the IVR flows that need to be carried over, you can incorporate available industry-specific grammars to preserve the existing experience. As you create new flows, you can use pre-defined resources such as built-in intents and slot types. A hybrid approach provides a path forward for IVR experiences in which a small portion of the flows cannot be redesigned due to business logic or architecture constraints.
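The following is a hedged boto3 sketch of both kinds of reuse, adding a built-in intent and a slot backed by a built-in slot type to an existing bot (IDs and names are hypothetical placeholders).

    import boto3

    lex = boto3.client("lexv2-models")
    common = dict(botId="EXAMPLEBOTID", botVersion="DRAFT", localeId="en_US")

    # Reuse a built-in intent rather than authoring help handling from scratch.
    lex.create_intent(
        intentName="GetHelp",
        parentIntentSignature="AMAZON.HelpIntent",
        **common,
    )

    # New flows can lean on built-in slot types (e.g., AMAZON.Date) instead of
    # defining custom slot values for common data.
    payment_intent = lex.create_intent(intentName="SchedulePayment", **common)
    lex.create_slot(
        intentId=payment_intent["intentId"],
        slotName="PaymentDate",
        slotTypeId="AMAZON.Date",
        valueElicitationSetting={
            "slotConstraint": "Required",
            "promptSpecification": {
                "messageGroups": [
                    {"message": {"plainTextMessage": {"value": "When would you like to pay?"}}}
                ],
                "maxRetries": 2,
            },
        },
        **common,
    )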

Industry grammars

Amazon Lex provides grammar support so you can collect information specific to an industry (e.g., an itinerary confirmation code) during a customer service interaction. The industry grammars are a set of XML files that you can use to deliver a consistent end-user experience as you migrate IVR flows to Amazon Lex. You can choose from a range of grammars across domains such as financial services, insurance, and telecom. For example, for financial services interactions you can use the grammars to elicit an account ID, routing number, and contact details. Similarly, for telecom customer support you can use grammars to collect SIM number and ZIP code details. The industry grammars allow you to maintain the user experience and enable seamless IVR migrations.
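Industry grammars are attached to a bot as grammar slot types that reference the SRGS XML file staged in Amazon S3. The following is a hedged boto3 sketch for a financial services grammar used to capture a routing number; the bucket, object key, and bot ID are hypothetical placeholders.

    import boto3

    lex = boto3.client("lexv2-models")

    # Register an SRGS grammar file in S3 as a grammar slot type.
    lex.create_slot_type(
        botId="EXAMPLEBOTID",
        botVersion="DRAFT",
        localeId="en_US",
        slotTypeName="RoutingNumber",
        externalSourceSetting={
            "grammarSlotTypeSetting": {
                "source": {
                    "s3BucketName": "my-grammar-bucket",
                    "s3ObjectKey": "financial/routing_number.grxml",
                }
            }
        },
    )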

Partners

NeuraFlash

Blogs

Easily migrate your IVR flows to Amazon Lex using the IVR migration tool
Mar 22, 2022

Expedite IVR development with industry grammars on Amazon Lex
Mar 22, 2022