AWS for Industries

How Datalex enhances developer experience using Amazon Bedrock

Datalex, a leading provider of omnichannel airline retail solutions, has worked with Amazon Web Services (AWS) as both a customer and an AWS Travel and Hospitality Competency Partner. As the Datalex team learned more about the transformative benefits of generative artificial intelligence (AI), they began investigating how to use Amazon Bedrock to enhance the developer experience (DX) and deliver revenue-generating capabilities more swiftly.

This blog post outlines how Datalex is using Amazon Bedrock to apply generative AI to reduce knowledge search times, and how the team plans to pursue further opportunities to enhance delivery to customers.

“Working with AWS gives us the flexibility to take advantage of innovative and emerging technologies, such as generative AI, through rapid prototyping and proofs of concept,” said Brian Lewis, Datalex CTO.

Considerations

Datalex has a rich heritage of product delivery within the travel industry, accumulating a vast repository of corporate knowledge alongside an extensive product suite. Its corporate knowledge is housed across platforms such as Microsoft SharePoint, Confluence, Jira, and internal developer portals. Searching across all the platforms was a cumbersome, time-consuming process, making it a perfect target for optimization.

Datalex wanted to mine all of its corporate knowledge stores effectively and surface the most relevant information for team members when needed. The team wanted a solution that could be developed with minimal impact on day-to-day operations and that avoided a lengthy, costly migration of all knowledge into new systems or formats.

The Datalex team built a solution combining Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models from leading AI companies, and Amazon Kendra, an intelligent search service powered by machine learning (ML), to access and take advantage of their extensive knowledge base.

Using these AWS services, Datalex was able to work within a set of complex regulatory frameworks while still moving fast. As a result, its developers can now navigate and search the company's knowledge base quickly through a single chatbot interaction.

The solution

The solution harnesses a suite of AWS services, all orchestrated as infrastructure as code through HashiCorp Terraform. For the user interface, Datalex opted for a Streamlit application hosted as a container on Amazon Elastic Container Service (Amazon ECS) with AWS Fargate. Streamlit is a widely adopted open source Python application framework built for people who aren’t web developers, such as machine learning and data science teams. Amazon ECS on AWS Fargate can be used to run containers without managing servers or clusters of Amazon Elastic Compute Cloud (Amazon EC2) instances.
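As an illustration of this interface layer, the following is a minimal Streamlit chat page sketch. It is not Datalex's actual code: the page title, session handling, and the stubbed retrieval call are assumptions for illustration, and the Amazon Cognito sign-in described below is omitted.

```python
# chat_app.py – minimal Streamlit chat front end (illustrative sketch only).
import streamlit as st

st.title("Knowledge assistant")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if question := st.chat_input("Ask a question about our documentation"):
    st.session_state.messages.append({"role": "user", "content": question})
    with st.chat_message("user"):
        st.markdown(question)

    # Placeholder for the retrieval-augmented call shown later in this post.
    answer = "…"

    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)
```

Running `streamlit run chat_app.py` locally, or packaging the script into a container image for Amazon ECS on AWS Fargate, serves the page on the framework's default port (8501).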

To secure access, Datalex uses Amazon Cognito, a service that helps customers implement customer identity and access management (CIAM) for web and mobile applications, linked to Microsoft Entra ID (previously Azure Active Directory), where the company's users are stored. The Streamlit application uses LangChain with an Amazon Kendra retriever for data source queries, facilitating a seamless integration with Amazon Bedrock.

LangChain is an open source framework offering a collection of architectures for building large language model (LLM) powered applications. Amazon Kendra uses natural language processing (NLP) and machine learning algorithms to provide search capabilities across a range of data sources.

A majority of LLM use cases fall within the Retrieval Augmented Generation (RAG) access pattern. RAG is a design pattern in which a user’s question to an LLM is augmented with relevant context retrieved from a company’s knowledge source. The LLM hosted on Amazon Bedrock can then generate an answer from the initial question and the retrieved context. The ability to draw on a company's own knowledge base, rather than the implicit knowledge within a particular LLM, increases the traceability and explainability of answers.
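Stripped to its essentials, the pattern is: retrieve relevant passages, place them in the prompt, and ask the model to answer only from that context. The sketch below shows this with the Amazon Kendra Retrieve API and the Amazon Bedrock InvokeModel API through boto3; the index ID, model ID, and prompt wording are placeholder assumptions rather than Datalex's implementation.

```python
# rag_sketch.py – illustrative RAG flow with Amazon Kendra and Amazon Bedrock.
import json

import boto3

kendra = boto3.client("kendra")
bedrock = boto3.client("bedrock-runtime")

KENDRA_INDEX_ID = "<your-kendra-index-id>"            # placeholder
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # any Bedrock text model you have access to


def answer(question: str) -> str:
    # 1. Retrieve semantically relevant passages from the Kendra index.
    retrieved = kendra.retrieve(IndexId=KENDRA_INDEX_ID, QueryText=question, PageSize=5)
    context = "\n\n".join(item["Content"] for item in retrieved["ResultItems"])

    # 2. Augment the user's question with the retrieved context.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generate a grounded answer with the Bedrock-hosted model
    #    (Anthropic Messages request format).
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })
    response = bedrock.invoke_model(modelId=MODEL_ID, body=body)
    return json.loads(response["body"].read())["content"][0]["text"]
```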

An initial challenge was how to bring the varying knowledge sources into a framework where the data can be efficiently retrieved. Amazon Kendra simplified this process by allowing different data sources to be plugged in and by providing the Retrieve API as a RAG integration point. As of this writing, the API can return up to 100 semantically relevant passages, ordered by relevance, of up to 200 token words each. LangChain has an Amazon Kendra retriever component that can be enabled with a few lines of code.
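As an example of those few lines, a retriever pointed at an existing index can be wired to a Bedrock-hosted model roughly as follows. Import paths vary by LangChain version, and the index ID, Region, and model ID here are placeholders rather than Datalex's configuration.

```python
# Sketch: LangChain retrieval chain over an Amazon Kendra index with a Bedrock LLM.
from langchain.chains import RetrievalQA
from langchain_community.llms import Bedrock
from langchain_community.retrievers import AmazonKendraRetriever

retriever = AmazonKendraRetriever(
    index_id="<your-kendra-index-id>",  # placeholder
    region_name="eu-west-1",            # placeholder
    top_k=5,                            # number of passages to retrieve
)

llm = Bedrock(model_id="anthropic.claude-v2", model_kwargs={"max_tokens_to_sample": 512})

# "stuff" places the retrieved passages directly into the prompt context.
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=retriever,
    return_source_documents=True,
)

result = qa_chain.invoke({"query": "How do I configure the pricing engine?"})
print(result["result"])
```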

The LangChain Amazon Kendra retriever allows the filtering of responses, and many Amazon Kendra connectors come with access control list (ACL) support. Using the data permissioning layer within Amazon Kendra offers an additional defense to mitigate the risk of sensitive information disclosure, one of the top ten security risks defined in the OWASP Top 10 for LLM Applications. Datalex’s implementation established an Amazon Kendra index with multiple data sources attached, ranging from Confluence and Jira connectors to web crawlers for internal documentation.
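The same retriever accepts an Amazon Kendra attribute filter and, where connectors index ACLs, a signed-in user's token so that results respect per-document permissions. The attribute key and token below are illustrative only, not part of Datalex's configuration.

```python
# Illustrative only: constrain retrieval with an attribute filter and pass the
# caller's identity token so Amazon Kendra can enforce per-document ACLs.
from langchain_community.retrievers import AmazonKendraRetriever

retriever = AmazonKendraRetriever(
    index_id="<your-kendra-index-id>",       # placeholder
    top_k=5,
    attribute_filter={
        "EqualsTo": {
            "Key": "_language_code",         # built-in Kendra document attribute
            "Value": {"StringValue": "en"},
        }
    },
    user_context={"Token": "<caller-jwt>"},  # token from the application's sign-in, placeholder
)
```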

As part of the proof of concept, Datalex collected reporting data to refine model parameters, prompts, and RAG context search relevance. This data was stored in a partitioned Amazon Simple Storage Service (Amazon S3) bucket in Parquet format. For analysis, AWS Glue and Amazon Athena are used to query the data, and insights are visualized in an Amazon QuickSight dashboard.
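One way to land such records, sketched below, is AWS SDK for pandas (awswrangler), which writes partitioned Parquet to Amazon S3 and can register the table in the AWS Glue Data Catalog so Athena can query it straight away. The bucket, database, and column names are illustrative assumptions, not Datalex's schema.

```python
# Illustrative sketch: write one interaction record as partitioned Parquet and
# register it in the AWS Glue Data Catalog for querying with Amazon Athena.
import datetime as dt

import awswrangler as wr
import pandas as pd

record = pd.DataFrame([{
    "question": "How do I configure the pricing engine?",
    "model_id": "anthropic.claude-3-sonnet-20240229-v1:0",
    "latency_ms": 1840,
    "retrieved_passages": 5,
    "dt": dt.date.today().isoformat(),                   # partition column
}])

wr.s3.to_parquet(
    df=record,
    path="s3://example-reporting-bucket/chat-metrics/",  # placeholder bucket
    dataset=True,
    partition_cols=["dt"],
    database="genai_reporting",                          # placeholder Glue database
    table="chat_metrics",
)
```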

Figure 1 – Using an Amazon Bedrock LLM to offer an easy way for users to ask questions across Amazon Kendra indexed data stores. AWS Glue, Amazon Athena, and Amazon QuickSight allow fast analysis of solution performance to quickly identify improvements.

Following the initial company-wide rollout, Datalex sought to enhance reporting by capturing user feedback in real time. The introduction of thumbs-up and thumbs-down buttons alongside responses allowed users to provide feedback effortlessly, which was then collated and stored for later analysis with Athena.
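In Streamlit, feedback capture of this kind can be a pair of buttons rendered under each answer, with each click written to the same reporting store. The helper, bucket, and key layout below are hypothetical and only illustrate the shape of the capture.

```python
# Illustrative feedback capture: thumbs-up/down buttons under a response, with
# each rating persisted to Amazon S3 for later analysis in Athena.
import datetime as dt
import json
import uuid

import boto3
import streamlit as st

s3 = boto3.client("s3")
FEEDBACK_BUCKET = "example-reporting-bucket"  # placeholder


def record_feedback(question: str, rating: str) -> None:
    # One small JSON object per rating; a table over this prefix makes it
    # queryable from Athena.
    key = f"feedback/dt={dt.date.today().isoformat()}/{uuid.uuid4()}.json"
    payload = json.dumps({"question": question, "rating": rating})
    s3.put_object(Bucket=FEEDBACK_BUCKET, Key=key, Body=payload.encode("utf-8"))


def feedback_buttons(question: str, message_index: int) -> None:
    col_up, col_down = st.columns(2)
    if col_up.button("👍", key=f"up-{message_index}"):
        record_feedback(question, "up")
    if col_down.button("👎", key=f"down-{message_index}"):
        record_feedback(question, "down")
```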

Business outcomes

Following implementation, Datalex has noted a favorable response from the workforce, with a measurable decrease in time spent navigating systems for information retrieval. Employee feedback has indicated an enhanced work experience. There has been a 25 percent week-on-week uplift in articles created and updated, bolstered by the confidence that these efforts will be more readily surfaced and used. This process has generated an AI/ML flywheel effect, a key driver of AI/ML transformation. For Datalex, the flywheel means better-quality documentation, which leads to a better user experience, greater adoption, and more feedback driving further improvements.

Long term

Looking to the future, Datalex intends to expand AI-augmented services across various departments, starting with a phase-2 trial to test whether this generative AI solution can help answer repetitive HR questions, saving the department time. Further on the horizon is deeper integration of Amazon Bedrock into the Datalex delivery model, embedding it into product capabilities to enhance customer delivery and speed ideas into production.

“Our Datalex Digital Commerce Platform, powered by AWS, is revolutionizing how airlines engage with travelers, allowing them to offer a more personalized and extensive range of products and services. AWS’s scalable infrastructure enhances our platform’s ability to meet the various peaks in demand that airlines face on a regular basis to ensure there is no lost revenue. The addition of AWS Bedrock and the possibilities this opens for use of Generative AI is exciting and we look forward to how this can further enhance our customer experiences.”

Conclusion

Datalex’s adoption of generative AI exemplifies how swiftly solutions can be crafted using Amazon Bedrock, showcasing its potent integration capabilities with existing services. Datalex’s strategy has not only elevated the developer experience, but also established a new standard for efficiency and agility in delivering customer solutions.

Contact an AWS Representative to learn how we can help accelerate your business.

About Datalex

Datalex is an industry provider of omnichannel retail solutions for airlines worldwide. The Datalex product portfolio enables comprehensive retail capabilities, including pricing, shopping, merchandising, offer and order management, and pricing AI.

Datalex has a strong track record of delivering cutting-edge digital transformation for progressive airline brands worldwide, including JetBlue, Aer Lingus, easyJet, Edelweiss, Air Transat, and Air China. In addition to the partnership with AWS, Datalex is an IATA strategic and ARMi partner.

Francis Flannery

Francis Flannery is a Senior Solutions Architect based in Ireland. For the past three and a half years, he has helped companies of all sizes build on AWS. Francis has worked in AI/ML for well over a decade. Recently, he has been focused on helping customers bring generative AI use cases into production. He has a degree in Electronic Engineering and a Research Masters in Cryptography. A young family means his spare time is spent picking up, and sometimes building with, LEGO.

Ashley Woods

Ashley Woods is the Director of Engineering Strategy for Datalex based in Manchester, United Kingdom. He works with the Executive team in Datalex and throughout all business areas to drive innovation, strategic planning, and operational excellence. Ashley has expertise in process design, ITSM, leadership, and programme/project management. He holds an MBA from the Australian Institute of Business and is undertaking a PhD in Information Systems (Virtual Teams) at the University of Southern Queensland. In his spare time, he enjoys going to the cinema and spending time with friends.

Morgan Chorlton

Morgan Chorlton is a Software Engineer specialising in AI and serverless computing for Datalex, based in Manchester, United Kingdom. He works on the AI team, building an AI pricing solution on AWS. Morgan has expertise in the serverless domain and has exposure to various services within AWS. He holds a Computer Science degree from the University of Huddersfield. In his spare time, he enjoys spending time with family and working on side software projects.