Overview
Tangram Therapeutics (Tangram) is redefining how new medicines are discovered by combining biology with advanced computation. To overcome the long timelines and inefficiencies of traditional drug discovery, the company worked alongside Amazon Web Services (AWS) to develop LLibra OS, a powerful agentic AI platform that unifies and analyzes data from disparate sources to fuel drug discovery. Built on AWS, LLibra OS helps researchers rapidly identify and evaluate promising disease-modifying drug targets and design medicines to silence them, advancing life-changing treatments faster.
About Tangram Therapeutics
Tangram Therapeutics is a UK biotech company on a mission to solve human disease through computation. It merges biology and technology to accelerate the discovery and development of innovative RNA interference (RNAi) medicines.
Opportunity | Using AI to accelerate RNAi discovery for Tangram
Tangram doesn’t simply want to improve on existing treatments—it aims to discover entirely new medicines for diseases with significant unmet therapeutic need. The company applies computation at every stage of the drug discovery and development process. Before Tangram developed LLibra OS, its platforms included GalOmic, which generates proprietary RNAi drugs that silence disease-modifying genes by blocking specific genetic instructions, and the legacy HepNet platform, which used earlier forms of AI such as deep learning and applied proprietary network science approaches to identify novel gene targets. Both platforms aim to overcome industry challenges such as the long timelines of drug design and optimization and the limited biological understanding of complex disease.
Solution | Building LLibra OS on AWS to transform target discovery
AI innovation thrives on AWS, and Tangram is part of a global community of customers—from Fortune 500 leaders to new businesses—that use AWS to run their ambitious AI workloads. As an AI-first company, Tangram built its pipeline on classical machine learning and graph networks, and has created a predictive AI model for small interfering RNA design.
To build its next-generation discovery platform, Tangram worked alongside AWS Professional Services, a global team of experts that helps businesses achieve their desired outcomes on AWS. Building on the foundations of HepNet, the Tangram and AWS teams created LLibra OS, an applied-AI computational platform for data ingestion and analysis. Built on an agentic infrastructure, LLibra OS uses proprietary analytical capabilities to identify novel targets, evaluate their therapeutic potential and opportunity in a medical indication, and support the predictive design of GalOmic medicines. Candidate target-indication pairs are quantitatively scored and ranked using novel graph neural network architectures designed specifically for the problems that LLibra OS solves. Active learning, driven by reinforcement learning, keeps the scoring system current and performant and tunes it to the requirements of its scientific users.
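Scoring target-indication pairs with a graph neural network can be pictured with a toy example. The sketch below is purely illustrative and makes no claim about Tangram's proprietary architectures: nodes in a hypothetical biological graph carry small feature vectors, one round of neighbor averaging ("message passing") produces embeddings, and a pair is scored by the dot product of its two embeddings. All node names and features here are invented.

```python
# Toy sketch of graph-based target-indication scoring (illustrative only;
# the production GNN architectures are proprietary and not shown here).

def message_pass(features, edges):
    """One round of mean-aggregation over each node's neighbors."""
    updated = {}
    for node, feats in features.items():
        neighbors = [features[m] for n, m in edges if n == node]
        neighbors += [features[n] for n, m in edges if m == node]
        if not neighbors:
            updated[node] = feats
            continue
        agg = [sum(vals) / len(neighbors) for vals in zip(*neighbors)]
        # Blend self features with the neighborhood average.
        updated[node] = [(s + a) / 2 for s, a in zip(feats, agg)]
    return updated

def score_pair(embeddings, target, indication):
    """Dot-product score for a candidate target-indication pair."""
    return sum(t * i for t, i in zip(embeddings[target], embeddings[indication]))

# Hypothetical knowledge graph: two genes and one disease indication.
features = {"GENE_A": [1.0, 0.0], "GENE_B": [0.0, 1.0], "LIVER_DISEASE": [1.0, 1.0]}
edges = [("GENE_A", "LIVER_DISEASE")]
emb = message_pass(features, edges)
# GENE_A, being connected to the indication, scores higher than GENE_B.
print(score_pair(emb, "GENE_A", "LIVER_DISEASE"),
      score_pair(emb, "GENE_B", "LIVER_DISEASE"))
```

A real system would stack many such layers with learned weights and train the scores against experimental outcomes; the active-learning loop described above would then keep refreshing those weights as new data arrives.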
LLibra OS uses agentic orchestration tools to automate the review of massive datasets, unifying proprietary, licensed, and curated public data to generate actionable insights. The platform’s multiformat document-processing pipelines analyze everything from internal lab results to public datasets. LLibra OS’s retrieval-augmented generation (RAG) tool aids knowledge discovery and decision-making, complemented by web search and text-to-SQL capabilities, ensuring that insights from diverse data sources are effectively combined.
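The core RAG data flow is simple to sketch, even though production systems use vector embeddings and a hosted LLM. In this minimal, self-contained illustration (the corpus, documents, and overlap-based retrieval are all invented stand-ins), retrieval is plain token overlap and "generation" is just assembling the context-grounded prompt an LLM would receive.

```python
# Minimal RAG sketch: retrieve the most relevant document for a query,
# then build the grounded prompt. Real retrieval would use embeddings.

CORPUS = {  # hypothetical unified store: internal results + public data
    "doc1": "GENE_A knockdown reduced disease markers in liver models",
    "doc2": "Public dataset links GENE_B expression to cardiac tissue",
}

def retrieve(query, corpus, k=1):
    """Rank documents by shared lowercase tokens with the query."""
    q = set(query.lower().split())
    ranked = sorted(corpus.items(),
                    key=lambda kv: len(q & set(kv[1].lower().split())),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

def build_prompt(query, corpus):
    """Assemble the context-grounded prompt an LLM would receive."""
    context = "\n".join(corpus[d] for d in retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Which liver disease markers respond to GENE_A?", CORPUS))
```

The web-search and text-to-SQL capabilities mentioned above slot into the same pattern: each is just another retriever whose results are merged into the context before generation.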
At the core of LLibra OS is a carefully curated selection of large language models (LLMs) operating within an agentic framework. The platform’s modular architecture makes it possible for Tangram to continually switch out or upgrade models or add specialized AI agents as technology evolves, supported by a powerful evaluation engine to assess LLMs as they are released. “We need the best of the best LLMs at any given time,” says Lee Clewley, vice president of applied AI at Tangram. “Everything we’ve done is modular, extensible, and changeable, so that as we get increasingly better technology, we can just plug it in.”
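The "plug it in" idea reduces to a familiar pattern: a registry maps role names to interchangeable model backends behind one call signature, so a backend can be swapped without touching the agents that use it. The sketch below uses stub backends; in production each would wrap a hosted LLM (for example, one accessed via Amazon Bedrock). The role name and stub outputs are invented for illustration.

```python
# Sketch of a swappable-model registry: agents call complete(role, prompt)
# and never know which backend currently serves that role.
from typing import Callable, Dict

MODEL_REGISTRY: Dict[str, Callable[[str], str]] = {}

def register(role: str, backend: Callable[[str], str]) -> None:
    """Install or replace the backend serving a given role."""
    MODEL_REGISTRY[role] = backend

def complete(role: str, prompt: str) -> str:
    return MODEL_REGISTRY[role](prompt)

register("summarizer", lambda p: f"[model-v1] summary of: {p}")
v1 = complete("summarizer", "trial results")
register("summarizer", lambda p: f"[model-v2] summary of: {p}")  # hot swap
v2 = complete("summarizer", "trial results")
print(v1, v2, sep="\n")
```

The evaluation engine described above would sit alongside this registry, benchmarking a newly released model in a given role before promoting it to serve live traffic.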
A core design choice throughout the AI in LLibra OS is the embedding of subject matter expertise wherever possible. From the prompts used by the LLMs across the agentic framework to the design of novel deep learning architectures with curated Bayesian priors, LLibra OS is built by Tangram, for Tangram.
LLibra OS runs on a serverless AWS architecture to support scalability and flexibility. It uses AWS Glue to discover, prepare, and integrate data at nearly any scale—processing more than 1,000 biological datasets. LLibra OS also uses Amazon Athena, an interactive query service that simplifies data analysis. Amazon Bedrock—a comprehensive, secure, and flexible service for building generative AI applications and agents—is used to access and evaluate LLMs.
By combining these AWS services, LLibra OS extends research capabilities beyond traditional limits, freeing scientists to explore new ideas for potential life-changing treatments. “Running LLibra OS on AWS, we not only accelerated research and boosted efficiency but also increased scientists’ productivity,” says Clewley.
Outcome | Transforming pathways to life-changing medicines
Launching LLibra OS on AWS accelerated and enhanced Tangram’s research. Document processing that previously took weeks now takes hours. The company can assess four to five target-indication propositions in just a few hours instead of a quarter, a speedup of up to 50 times. Tangram has built an in-house evaluation harness that rigorously tests every component of LLibra OS to ensure consistent, cutting-edge performance. As of November 2025, LLibra OS outperforms all available reasoning models and deep research frameworks on one of the most challenging available biomedical benchmarks, HLE Bio/Chem Gold, with all reasoning and provenance fully discoverable.
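An evaluation harness of the kind described can be illustrated in a few lines: run each configured component against a gold dataset and report per-component accuracy. This is a generic sketch, not Tangram's harness; the gold questions, the exact-match metric, and both stand-in components are invented for the example.

```python
# Toy evaluation harness: score callable components over (question, gold)
# pairs with a pluggable metric. Production harnesses add richer metrics,
# provenance tracking, and per-benchmark reporting.

def exact_match(prediction, gold):
    return prediction.strip().lower() == gold.strip().lower()

def evaluate(component, dataset, metric=exact_match):
    """Fraction of gold answers the component reproduces."""
    hits = sum(metric(component(q), gold) for q, gold in dataset)
    return hits / len(dataset)

# Hypothetical gold set and two stand-in components.
GOLD = [("target of drug X?", "GENE_A"),
        ("indication for GENE_A?", "liver disease")]
baseline = lambda q: "GENE_A"                  # always answers GENE_A
lookup = dict(GOLD)
improved = lambda q: lookup.get(q, "unknown")  # perfect lookup

print({"baseline": evaluate(baseline, GOLD),
       "improved": evaluate(improved, GOLD)})
```

Running every candidate model and agent through the same harness is what makes the plug-and-swap architecture safe: a replacement is only promoted when its scores beat the incumbent's.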
LLibra OS processes 300 times more data, with roughly 288 million input tokens and 16 million output tokens on a typical day, compared with fewer than 1 million output tokens previously. This opens the door to exploring answers to questions that were once out of reach by rapidly accelerating scientists’ ability to find and evaluate differentiated target-indication pairs. And by unifying, evaluating, and scoring data that’s constantly being added, LLibra OS is making it possible to discover hidden targets that might never have been found without the use of AI.
By building LLibra OS on AWS, Tangram is revolutionizing drug discovery, giving scientists the insights that they need to evaluate new targets faster and deliver life-changing treatments sooner. “The new solution is not only scaling and speeding up research,” says Clewley. “It’s actually supercharging the team doing it.”
Running LLibra OS on AWS, we not only accelerated research and boosted efficiency but also increased scientists’ productivity.
Lee Clewley,
Vice President of Applied AI, Tangram Therapeutics