Partner Success with AWS / Software & Internet / United States


LILT Fine-Tunes Multilingual Generative AI Models with NVIDIA NeMo on AWS
Deployed NVIDIA NeMo for faster model fine-tuning
Enabled real-time model fine-tuning to incorporate linguists’ edits
Increased throughput by 30X
Overview
LILT, a multilingual content translation and generation platform, helps customers use generative artificial intelligence (AI) to localize content, support go-to-market outreach, and improve customer experiences. Since its inception in 2015, LILT has run on Amazon Web Services (AWS) and used Amazon EC2 G4dn instances, powered by NVIDIA GPUs, to drive its platform. Recently, LILT also deployed NVIDIA NeMo, part of the software stack from AWS Partner NVIDIA, on AWS to build its multilingual generative AI models and accelerate model fine-tuning for faster translation and higher-quality content generation.
Growing Content Production Solution Calls for Faster Model Fine-Tuning
LILT is known for bringing human-powered, technology-assisted translations to global enterprises. Its translation and content generation solution empowers product, marketing, support, ecommerce, and localization teams to deliver exceptional customer experiences to global audiences. To produce content, LILT developed its own multilingual generative AI models using NVIDIA GPUs to power model training, fine-tuning, and retraining. However, as LILT scaled, it needed the ability to fine-tune and retrain models that were five to 50 times bigger—all in real time—and capture higher-quality edits from its linguists.

“The speed and frameworks that NVIDIA provides are so important to LILT. With these, we’re able to improve our models with the terms, tone, and colloquialisms our customers use—ultimately delivering better multilingual content and translations.”
Omar Orqueda
Vice President, AI Research and Engineering, LILT
Fine-Tuning Models in Real Time with NVIDIA NeMo
By adopting NVIDIA NeMo, a cloud-native framework for building generative AI models, LILT has accelerated fine-tuning. NVIDIA NeMo is included as a part of NVIDIA AI Enterprise, a secure, cloud-native, end-to-end software platform that streamlines building and deployment of production-grade AI applications, including generative AI. “The speed and frameworks that NVIDIA provides are so important to LILT,” said Omar Orqueda, vice president, AI research and engineering at LILT. “With these, we’re able to improve our models with the terms, tone, and colloquialisms our customers use—ultimately delivering better multilingual content.”
The combination of Amazon EC2 G4dn.12xlarge instances, which feature NVIDIA GPUs, and NVIDIA NeMo allows LILT to implement a real-time human-in-the-loop approach for all verified translations. This ensures that every suggested change from LILT’s linguists is used for model fine-tuning, helping to produce more accurate content. “The NVIDIA computing power makes our linguists more efficient,” Orqueda said. “They create training data for us that we can use to make our models better.”
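The case study does not publish LILT's actual pipeline, but the human-in-the-loop pattern it describes (collect linguists' verified edits, turn them into training pairs, and trigger a fine-tuning pass once enough accumulate) can be sketched roughly as follows. All names here, such as `FeedbackBuffer` and the batch threshold, are illustrative and not part of LILT's or NeMo's real API.

```python
from dataclasses import dataclass, field

@dataclass
class VerifiedEdit:
    """One linguist-verified correction of a machine translation."""
    source: str           # original text
    machine_output: str   # model's draft translation
    linguist_edit: str    # human-corrected final translation

@dataclass
class FeedbackBuffer:
    """Accumulates verified edits and flushes them to fine-tuning in batches."""
    batch_size: int = 3
    pending: list = field(default_factory=list)
    fine_tune_runs: int = 0

    def add(self, edit: VerifiedEdit) -> None:
        self.pending.append(edit)
        if len(self.pending) >= self.batch_size:
            self._fine_tune()

    def _fine_tune(self) -> None:
        # In a real system this would launch a NeMo fine-tuning job on the
        # accumulated (source, linguist_edit) pairs; here we only count runs.
        pairs = [(e.source, e.linguist_edit) for e in self.pending]
        print(f"fine-tuning on {len(pairs)} verified pairs")
        self.fine_tune_runs += 1
        self.pending.clear()

buffer = FeedbackBuffer(batch_size=2)
buffer.add(VerifiedEdit("Hello", "Hola", "Hola"))
buffer.add(VerifiedEdit("Thank you", "Gracia", "Gracias"))  # triggers a run
print(buffer.fine_tune_runs)  # 1
```

The key design point the quote hints at is that linguists are not just quality control: each verified edit doubles as a labeled training pair, so review work and training-data creation are the same activity.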
Providing Customers the Freedom to Choose with Amazon Bedrock
As a solution that supports many different use cases, LILT structured its platform to provide customers the flexibility to choose from a variety of models. For multilingual content creation, LILT offers Amazon Bedrock, a fully managed service that makes foundation models from leading AI startups and Amazon available via an API. “We believe in the power of choice for LILT customers, and Amazon Bedrock lets them decide which model is best suited for their needs,” Orqueda said.
The fact that Amazon Bedrock can be deployed on AWS GovCloud (US) also allows LILT’s public sector customers to access a suite of models. “Many of our customers in the public sector need to deploy very sensitive content, so they’re using LILT on AWS GovCloud,” Orqueda said. “That way they can scale their loads and translate content in real time.”
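Amazon Bedrock exposes each foundation model through a single API: the caller picks a model ID and supplies a model-specific JSON body. A minimal sketch of building such a request for a translation prompt is below, using Anthropic's messages schema on Bedrock. The model ID and prompt wording are illustrative, and the actual invocation (via `boto3`'s `bedrock-runtime` client) is shown only as a comment because it requires AWS credentials.

```python
import json

def build_translation_request(text: str, target_language: str,
                              model_id: str = "anthropic.claude-3-haiku-20240307-v1:0"):
    """Build (model_id, body) for a Bedrock InvokeModel translation call.

    The body layout follows Anthropic's messages schema on Bedrock; other
    model providers on Bedrock expect differently shaped bodies.
    """
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [
            {"role": "user",
             "content": f"Translate into {target_language}: {text}"},
        ],
    }
    return model_id, json.dumps(body)

model_id, body = build_translation_request("Welcome to our store", "Spanish")

# An actual call would look roughly like this (needs AWS credentials):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(modelId=model_id, body=body)
#   print(json.loads(response["body"].read()))
print(model_id)
```

Because only the model ID and body change between providers, "the power of choice" Orqueda describes amounts to swapping those two values while the surrounding application code stays the same.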
Translating 150K Characters Per Second with 30X Faster Throughput
Ensuring Large Volume Translations Are Fast and Accurate
LILT not only lets customers choose their model but also lets them adapt it to their own requirements. This ensures high-quality batch machine translations, so customers receive accurate content even when there is no human in the loop. “In some cases when the answer is needed right away, customers can use our machine translations based on a model that still gives extremely excellent outputs,” Orqueda said.
In just a few years, LILT has grown its business and developed models that deliver a quality product for customers. “Looking ahead, LILT will continue delivering an extremely good experience for customers by using the best possible models and also creating the right training data for their particular use cases,” concluded Orqueda. “The next stage for us is making LILT a completely open platform where every single company can offer their multilingual services.”
About LILT
LILT is a multilingual content translation and generation platform bringing human-powered, AI-assisted language translation and localization services to global enterprises.
About AWS Partner NVIDIA
Since its founding in 1993, NVIDIA has been a pioneer in accelerated computing. The company’s invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined computer graphics, ignited the era of modern AI, and is fueling industrial digitalization across markets. NVIDIA is now a full-stack computing company with data-center-scale offerings that are reshaping industry.
AWS Services Used
Amazon Bedrock
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon via a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI.
AWS GovCloud (US)
AWS GovCloud (US) enables customers to adhere to ITAR regulations, FedRAMP requirements, the Defense Federal Acquisition Regulation Supplement (DFARS), Department of Defense (DoD) Cloud Computing Security Requirements Guide (SRG) Impact Levels 2, 4, and 5, and several other security and compliance requirements.