
2024
Dovetail

Dovetail Enhances Customer Insights with New Generative AI Products Developed in Weeks Using Amazon Bedrock

Learn how Dovetail accelerates development of new generative AI products and features using Amazon Bedrock, deploying the latest large language models while reducing security and administrative overhead.

1 day or less

to create prototypes

2 weeks

to develop new generative AI features

10 hours

saved by customers each week on data analysis

80%

improvement in Dovetail customers’ productivity

Overview

Dovetail is a global B2B SaaS company whose products help teams leverage customer insights. To improve scalability and automate data processing and analysis, Dovetail sought to adopt large language models (LLMs) and build out its generative AI capabilities.

By adopting Amazon Bedrock, Dovetail can now create prototypes within a day and release new generative AI products and features in just two weeks. Building on AWS ensures Dovetail maintains data security, residency, and low latency while delivering intuitive AI-driven tools that enhance customer insights.


Opportunity | Automating Analysis of Fragmented Data with Generative AI

Founded in Australia in 2017, Dovetail helps businesses analyze unstructured data from customer videos, audio recordings, and documents. Initially, the platform focused on offering productivity features for manual data handling, such as creating highlights and adding tags.

In 2019, Dovetail migrated to Amazon Web Services (AWS) to support its growing platform. Over time, Dovetail and AWS have strengthened their collaboration, with Dovetail providing feedback and AWS supporting its development. Peter Wooden, software engineer at Dovetail, shares, “AWS is highly responsive to our feedback. They consistently use our insights to enhance their services, allowing us to stay focused on our core product.” The company’s journey into artificial intelligence (AI) and machine learning (ML) started with automated transcription and basic sentiment analysis. As LLMs emerged, Dovetail began enhancing its products with automation that improves efficiency, such as identifying key moments in transcripts and classifying support tickets.

Recognizing the potential of LLMs, Dovetail saw an opportunity to develop new features to enhance data aggregation and pattern recognition across customer communication channels. Deploying generative AI would provide a higher level of end-to-end automation, greatly accelerating customers’ workflows and helping them uncover insights faster than ever. As Benjamin Humphrey, co-founder and CEO at Dovetail, explains, “We needed a managed service to access the latest LLMs and embedding models for rapid prototyping and delivery.” Dovetail sought to integrate generative AI to provide users with more efficient tools for analyzing unstructured data, enhancing the platform’s power and intuitiveness.


“Amazon Bedrock’s serverless design boosts our product development speed. We focus on writing the code that sets our product apart, without worrying about infrastructure or provisioning. Now we can create prototypes in under a day and launch new features within weeks.”

Peter Wooden
Software Engineer at Dovetail

Solution | Expanding Generative AI Capabilities Cost-Effectively and Securely on Amazon Bedrock

When AWS introduced Amazon Bedrock, a fully managed service offering a range of high-performing foundation models, Dovetail was invited to join the service preview cohort. Previously, Dovetail had used Amazon SageMaker to deploy deep learning models but needed a more flexible solution for its variable workloads. With Amazon Bedrock, Dovetail could access serverless compute on demand and autoscale its APIs during peak periods. “Amazon Bedrock has proven to be a more cost-effective, flexible, and seamless way to integrate and deploy generative AI capabilities,” says Wooden.
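To illustrate what this serverless, on-demand pattern can look like in practice, the sketch below invokes a foundation model through the Amazon Bedrock Converse API using Python and boto3. It is a minimal illustration only; the Region, model ID, prompt, and inference parameters are assumptions, not details of Dovetail’s implementation.

```python
# Minimal sketch: calling a foundation model on demand through Amazon Bedrock.
# No endpoints are provisioned or managed; requests are billed as they happen.
# Region, model ID, prompt, and parameters are illustrative assumptions.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize the key moments in this interview transcript: ..."}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The assistant's reply comes back as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

Because there is no endpoint to size or scale, the same call pattern serves both a quick prototype and a peak-traffic workload.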

Another key factor in Dovetail’s decision to choose Amazon Bedrock was the transparency AWS provided about the underlying technology and processes. AWS ensures that customer data remains confidential, without being used to train models or test concepts, thereby safeguarding the data clients entrust to Dovetail.

This focus on data security extends throughout Dovetail’s operations. By using AWS, Dovetail—and its customers—benefit from enhanced security with reduced overhead. All customer data remains securely within Dovetail’s private network, spanning from application servers to its databases. The company meets data residency requirements by hosting customer data in multiple AWS Regions. Meanwhile, Dovetail simplifies compliance by developing its generative AI capabilities on AWS. This approach eliminates the need to notify clients about third-party sub-processors and reduces the risk of clients rejecting those processors.
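As a hedged sketch of how data residency can be reflected in application code, the example below pins Amazon Bedrock calls to the AWS Region where a given customer’s data is hosted. The customer-to-Region mapping and helper function are hypothetical and are not taken from Dovetail’s architecture.

```python
# Hypothetical sketch: routing Bedrock requests to the AWS Region that hosts
# a customer's data, so inference traffic stays in the customer's home Region.
import boto3

# Illustrative mapping only, not Dovetail's actual configuration.
CUSTOMER_HOME_REGION = {
    "customer-eu": "eu-central-1",     # e.g., EU data residency
    "customer-au": "ap-southeast-2",   # e.g., Australian data residency
}

def bedrock_client_for(customer_id: str):
    """Return a Bedrock Runtime client scoped to the customer's home Region."""
    return boto3.client(
        "bedrock-runtime",
        region_name=CUSTOMER_HOME_REGION[customer_id],
    )
```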

Outcome | Boosting Data Analysis Efficiency by 80%, Saving Users 10 Hours Weekly

By using Amazon Bedrock, Dovetail maintains a rapid pace of innovation while controlling costs. “Amazon Bedrock’s serverless design boosts our product development speed,” Wooden explains. “We focus on writing the code that sets our product apart, without worrying about infrastructure or provisioning. Now we can create prototypes in under a day and launch new features within weeks.”

Dovetail’s AI-powered features speed up workflows for businesses, helping them curate end-customer data on a deeper level and unlock insights faster. This allows designers, product managers, and salespeople to personalize their services more effectively. One business utilizing Dovetail is Instawork, a flexible work app that connects local businesses with skilled hourly workers. Emi Fogg, UX researcher at Instawork, states, “Magic features live up to Dovetail’s goal of bringing us to insight faster. While we still take every step in analyzing, synthesizing, and reviewing our research data, Dovetail’s AI features like Magic search and Magic cluster have sped up the process, saving us around 1–2 hours per project, freeing up more time to focus on nuance instead of broad strokes.”

Building on the advanced capabilities of Amazon Bedrock, Dovetail introduced the preview version of its Channels product in early 2024. By deploying a range of Anthropic Claude models, including Claude 3.5 Sonnet on Amazon Bedrock, Dovetail launched new Magic features to enhance data analysis. According to a recent poll, product managers and designers using Magic features save an average of 10 hours weekly on data analysis and are 80 percent more efficient in arriving at insights.

Among these innovations, the Magic search feature has been particularly well received by Dovetail customers. Previously, users could only perform keyword searches on data within the Dovetail system. Now, with Magic search, users can ask natural language questions and receive detailed summaries as answers, based on relevant results. Additionally, Dovetail Channels uses Anthropic Claude models on Amazon Bedrock to support customers in running their Voice of the Customer programs, classifying customer feedback into actionable themes and helping customers stay attuned to omnichannel user feedback.
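A brief sketch of what theme classification of feedback can look like on Amazon Bedrock is shown below, using an Anthropic Claude model through the Converse API in Python. The theme list, prompt wording, and helper function are illustrative assumptions rather than Dovetail’s production logic.

```python
# Illustrative sketch: classify a piece of customer feedback into one theme
# with an Anthropic Claude model on Amazon Bedrock. The themes and prompt
# are assumptions for demonstration, not Dovetail's actual taxonomy.
import boto3

THEMES = ["pricing", "onboarding", "performance", "feature request", "bug report"]

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def classify_feedback(text: str) -> str:
    prompt = (
        "Classify the following customer feedback into exactly one of these themes: "
        f"{', '.join(THEMES)}.\n\nFeedback: {text}\n\nRespond with the theme name only."
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 20, "temperature": 0},
    )
    return response["output"]["message"]["content"][0]["text"].strip().lower()

# Example: classify_feedback("Exports keep timing out on large projects")
# would be expected to return "bug report" or "performance".
```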

Dovetail plans to keep refining its generative AI product offerings. “Dovetail is a prime example of how generative AI can accelerate daily tasks and make a tangible impact for customer-centered businesses,” says Humphrey. “It’s been an exciting journey exploring these technologies, and we’re always looking for new ways to leverage the latest generative AI on AWS services to push our platform even further.”

Learn More

To learn more, visit aws.amazon.com/ai/generative-ai.


About Dovetail

Dovetail is a rapidly growing Australia-based software company that helps organizations improve the quality of their products and services through the power of customer insights. The Dovetail customer insights hub turns unstructured, fragmented customer data into actionable insights for the company’s more than 4,000 customers.

AWS Services Used

Amazon Bedrock

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI.

Learn more »

Amazon Transcribe

Amazon Transcribe extracts key business insights from customer calls, video files, clinical conversations, and more.

Learn more »

Get Started

Organizations of all sizes across all industries are transforming their businesses and delivering on their missions every day using AWS. Contact our experts and start your own AWS journey today.