AWS for Industries

Executive Conversations: Realizing the Potential of Cloud Technology in Healthcare

Vignesh Shetty, Senior Vice President and General Manager of Edison AI & Platform at GE Healthcare, joins Taha Kass-Hout, MD, MS, Vice President and Chief Medical Officer at Amazon Web Services (AWS), to discuss how to build a modern data strategy that fuels innovation and improves patient care. GE Healthcare provides medical technology, pharmaceutical diagnostics, and digital solutions for healthcare organizations across the globe. These solutions leverage artificial intelligence (AI) and machine learning (ML) to enable precision health approaches, delivering highly personalized, precisely targeted diagnostics and therapeutics that fulfill the promise of precision medicine.

This Executive Conversation is one in a series of discussions with thought leaders in healthcare and life sciences, where we seek to learn more about the impact of technological innovation and cloud computing on their industries.

Taha Kass-Hout: Healthcare is an unusual business—seven and a half trillion dollars worldwide. Despite that, four billion people don’t have reasonable access to care. AI and technology must be part of the solution to provide more equitable care. At GE Healthcare, you’ve been pioneering the Edison platform built on AI. How are you enabling your clinicians or customers to take care of every patient’s journey and make sense of health data?

Vignesh Shetty: Let me start with a specific customer example that encapsulates the challenges in health technology. In 2018, the CXO of a large European radiology firm explained to us that he viewed the fundamental problem of his company, and the industry, as one of unfulfilled potential.

He had three concerns: First, when operating within a public health system, he needed to understand how to value time per appointment and increase patient access to care. Second, he wanted to focus on increasing access to personalized care. Finally, healthcare stakeholders have all these AI tools being thrown at them. How do they quantify the value or lack thereof? Is it in terms of time saved, dollars saved, fewer readmissions, fewer rescans, fewer code blues? He knew these questions didn’t have easy answers.

In that case, we [at GE Healthcare] came up with point solutions that addressed protocol variation. We looked at CT and MRI scans. By evaluating the sequence in which they were carried out, we identified the top three procedures and saved a few minutes on each. That led to significant savings and increased throughput, which made better use of each appointment and significantly reduced the exam backlog.

Zooming out, a major challenge that remains is aggregating multimodal data—genetic data, radiogenomics, imaging, and so on. It could, in theory, create a cohesive, longitudinal view of a patient that becomes a foundational pillar on which a lot of innovation can happen. But that aggregation doesn’t exist yet in the context of AI.

In these areas, I believe there’s a strong case for more collaborations, like the one between GE and AWS. It’s critical to integrate AI seamlessly and then track model performance, model drift, and data drift to avoid creating or reinforcing bias. When people say, “I don’t trust AI,” it isn’t just about the algorithm; it’s also about the data used to train and validate an algorithm, which can lead to potential bias. But if we can join forces, I think there’s huge potential.

TK: Breaking the black box and understanding the bias that goes into models is not an easy task. But when you start breaking a model into its components, there are technical challenges. Many of the machine learning services we provide, such as Amazon SageMaker Clarify and Amazon SageMaker Model Monitor, can help customers zero in on a particular problem and find out where drift happens in the data or in model performance over time. You mentioned that multimodality is one major challenge. Can you describe a few operational and clinical use cases for AI in addressing that challenge?
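As an aside for builders: the sketch below shows one minimal way a data-drift monitoring schedule might be set up with the SageMaker Python SDK. It assumes a model is already deployed to a SageMaker endpoint with data capture enabled; the endpoint name, S3 paths, and schedule name are hypothetical placeholders.

```python
# Minimal sketch: scheduling data-quality drift monitoring against an
# existing SageMaker endpoint. Endpoint name and S3 paths are hypothetical.
import sagemaker
from sagemaker.model_monitor import CronExpressionGenerator, DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

role = sagemaker.get_execution_role()  # assumes a SageMaker execution context

monitor = DefaultModelMonitor(
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# Profile the training data so Model Monitor has a baseline to detect drift against.
monitor.suggest_baseline(
    baseline_dataset="s3://my-bucket/training/train.csv",  # hypothetical path
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://my-bucket/monitor/baseline",
)

# Compare hourly batches of captured inference traffic to that baseline.
monitor.create_monitoring_schedule(
    monitor_schedule_name="drift-monitor-schedule",  # hypothetical name
    endpoint_input="my-endpoint",                    # hypothetical endpoint
    output_s3_uri="s3://my-bucket/monitor/reports",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```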

VS: At GE Healthcare, we try to meet clinicians where they are, because we want to be able to supplement human interaction rather than replace it. When it comes to AI, we want it to be as invisible as possible and integrate it into our clinical and operational workflows.

Fundamentally, the Edison platform removes a lot of the heavy lifting required of the builder for multimodal applications. Organizations need to ensure that they can collect and query data from heterogeneous sources, whether unstructured or semi-structured. That’s where we need solutions like Amazon Comprehend Medical—a machine learning service that has been trained to analyze and structure medical text. It frees builders from agonizing over how to collect and organize this data.
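To make that concrete, here is a minimal sketch of calling Amazon Comprehend Medical from Python with boto3 to pull structured entities out of free text; the clinical note is invented for illustration.

```python
# Minimal sketch: structuring free-text clinical notes with Amazon
# Comprehend Medical. The note below is invented for illustration.
import boto3

client = boto3.client("comprehendmedical")

note = "Pt presents with chest pain. Started metoprolol 25 mg PO BID."

response = client.detect_entities_v2(Text=note)

# Each entity carries a category (e.g., MEDICATION), a type, a confidence
# score, and related attributes such as dosage, route, and frequency.
for entity in response["Entities"]:
    print(entity["Category"], entity["Type"], entity["Text"], entity["Score"])
    for attribute in entity.get("Attributes", []):
        print("  ->", attribute["Type"], attribute["Text"])
```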

When it comes to deployment, the Edison Open AI Orchestrator helps hospital executives create integrated workflows that deliver optimized clinical outcomes. Use cases might involve using the right protocol to find the right clinical indication, or even operational outcomes such as determining the correct staffing levels for a given week based on inpatient numbers and expected intake fluctuation.

We’ve seen a fundamental paradigm shift with the advent of AI embedded into devices themselves. In radiology, a deep learning reconstruction algorithm that operates on raw images has enabled organizations to achieve shorter scan times and better image quality, all while using less expensive hardware.

Finally, there are significant operational use cases at the enterprise level. How do you predict and minimize the likelihood of a no-show, which is a missed care opportunity? When it comes to things like staffing, how do you ensure that you map the right patient to the right device and then have the right staff at the right time, in the right location?

TK: Some people think of GE Healthcare as an imaging company, but you are clearly working to address multimodality in the cloud and on devices. How do you see the value of interoperability bringing all of this together?

VS: I view interoperability as necessary—a hard prerequisite to achieving any of those outcomes. The value of insights is increasingly a function of being able to collect data from multiple previously siloed services or departments within a hospital. We’ve got everything from radiogenomics, imaging data, and next-generation sequencing (NGS) data to unstructured and semi-structured notes.

How do you ensure that each vendor builds a standard set of services that doesn’t vary from version to version, so the APIs remain backward compatible? And then, across vendors, how do you minimize the heavy lifting for both the application builder and the people deploying it—the provider or the payor?

Those solutions are predicated on key interoperability standards being built either explicitly or implicitly. I think that’s going to be the future—when you do that well, you fundamentally shift the clinician’s focus back to where it belongs [on patients]. When systems talk to each other, clinicians can spend less time documenting things and instead spend more time looking patients in the eyes.
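The conversation doesn’t prescribe a mechanism, but a toy sketch can illustrate the backward-compatibility idea: a vendor keeps serving an old response shape under its original version path while adding a new one, so deployed integrations keep working. The routes and fields below are hypothetical, not any real vendor’s API.

```python
# Toy sketch of version-stable APIs: v1 keeps its original response shape
# even after v2 ships, so older clients never break. Flask is used only
# for brevity; all routes and fields are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

PATIENT = {"id": "pt-001", "given_name": "Ada", "family_name": "Lovelace"}

@app.route("/v1/patients/<pid>")
def get_patient_v1(pid):
    # v1 returned a single combined "name" field; keep serving that shape.
    full_name = f"{PATIENT['given_name']} {PATIENT['family_name']}"
    return jsonify({"id": pid, "name": full_name})

@app.route("/v2/patients/<pid>")
def get_patient_v2(pid):
    # v2 splits the name into parts, but v1 is never removed or changed.
    return jsonify({
        "id": pid,
        "given_name": PATIENT["given_name"],
        "family_name": PATIENT["family_name"],
    })

if __name__ == "__main__":
    app.run(port=8080)
```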

TK: Can you elaborate on how you see the role of AI in structuring unstructured data and improving outcomes?

VS: I view AI as a force multiplier for being able to sift through mountains of data—especially heterogeneous or multimodal data—to create a 360-degree view of a patient. Importantly, it’s not just data mining and data safety; it’s being able to uncover patterns that we didn’t even know existed while making sure we’re not mistaking correlation for causation. That’s where AI comes in. We use it for retrospective analytics. We typically have lots of historical data, and we should encourage more data sharing (as long as it’s compliant) because AI used correctly is a tool for good. But we also use AI for predictive and prescriptive insights.

For example, if you look at a healthcare command center at a large hospital, they fundamentally use machine learning and deep learning for operational use cases. They get about 500,000 messages per day—data from EHRs (electronic health records) or the machines themselves; multi-vendor, multimodality. They parse it to determine how to improve utilization and reduce indicators of patient deterioration, like code blues. These are turn-by-turn, real-time analytics that you couldn’t do with humans staring at a screen. It fundamentally needs pattern analysis, which is where AI comes in.
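The command center’s actual models are far more sophisticated than anything that fits in a few lines, but a toy sketch can show the shape of pattern analysis over a message stream; every field name and threshold below is hypothetical, and a production system would use trained models rather than fixed rules.

```python
# Toy sketch of rule-based pattern analysis over a stream of vitals
# messages. Fields, thresholds, and the scoring rule are all hypothetical;
# real command centers apply trained ML/DL models instead.
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class VitalsMessage:
    patient_id: str
    heart_rate: int   # beats per minute
    resp_rate: int    # breaths per minute
    spo2: float       # peripheral oxygen saturation, percent

def deterioration_score(msg: VitalsMessage) -> int:
    """Crude early-warning score: one point per out-of-range vital."""
    score = 0
    score += msg.heart_rate > 120 or msg.heart_rate < 45
    score += msg.resp_rate > 24
    score += msg.spo2 < 92.0
    return score

def flag_deteriorating(messages: Iterable[VitalsMessage]) -> Iterator[str]:
    """Yield patient IDs whose vitals suggest escalating for review."""
    for msg in messages:
        if deterioration_score(msg) >= 2:
            yield msg.patient_id

# One of the ~500,000 daily messages flowing through such a pipeline.
print(list(flag_deteriorating([VitalsMessage("pt-001", 130, 28, 90.5)])))
```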

TK: What are some strategies to address the challenges you see holding organizations back from developing better AI?

VS: I would say start small and then scale. Once you’ve got confidence, don’t get held back by the fact that there are silos within an enterprise. That’s the reality for almost every large enterprise today. You could use a combination: synthetic data for training, then real-world data and real-world evidence for actual validation. As you start to iterate in the early stages, maybe start with a single modality. Pick your biggest pain points, pick a use case, work backward from that, and then scale out to multimodal analyses. It’s as much a cultural shift as it is a technological one, and adoption drives the breaking of silos, which are the biggest impediment.
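A minimal sketch of that “train on synthetic, validate on real-world” pattern follows, with scikit-learn and fully simulated data standing in for both cohorts; in practice, the validation set would come from real-world evidence.

```python
# Minimal sketch of training on synthetic data and validating on real-world
# data. Both cohorts are simulated here; in practice only the training set
# would be synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
weights = np.array([1.5, -1.0])  # hypothetical signal in two features

# Synthetic cohort: cheap to generate at scale, used only for training.
X_syn = rng.normal(size=(5000, 2))
y_syn = X_syn @ weights + rng.normal(scale=0.5, size=5000) > 0
model = LogisticRegression().fit(X_syn, y_syn)

# "Real-world" cohort: simulated here, but this is where real-world
# data and evidence would enter the loop for actual validation.
X_real = rng.normal(size=(1000, 2))
y_real = X_real @ weights + rng.normal(scale=0.5, size=1000) > 0
auc = roc_auc_score(y_real, model.predict_proba(X_real)[:, 1])
print(f"real-world validation AUC: {auc:.3f}")
```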

TK: Looking to the future, what are you most excited about?

VS: I think healthcare and health technology are at an early inflection point. I’m incredibly excited about the specific marriage of human intelligence with some of these tools to drive better clinical and operational outcomes.

I think two specific areas are going to be huge in the future. We touched upon one—multimodal data aggregation. Multimodal AI can operate on this data and create actionable insights for various disease states such as prostate, lung, or breast cancer.

We’re still working through some of those interoperability challenges we’ve touched on, along with other issues around data privacy, data sovereignty, and data gravity. But I think we’ll be in significantly better shape 10 years from now. The other area is operational. I know a lot of the past focus has been on the clinical side, but as we’ve discussed here, the ability to iterate, learn, and improve lends itself to operational use cases.

To envision the future, we can learn from other industries. For example, think about ridesharing: it’s not about just saving time, saving money, or increasing safety. It does all those things, but the game-changing benefit is reduced uncertainty. Instead of waiting outside in the cold for a cab to turn the corner, I can sit back, track the car, have a cup of tea, and step outside just as my ride pulls up.

Now imagine that same reduction in uncertainty in the healthcare provider space. What if we could do that for both patients and providers, from scheduling to diagnosis to therapy planning to therapy application, for both acute and chronic illness? I think it’s going to be groundbreaking.

TK: We’re excited about this future, too. We’re making material investments in building tools that require no machine learning background. Software developers, app builders, data engineers, data scientists, or even clinicians with no coding ability should be able to start building these exciting applications. In the era of personalization, we’ll be able to start precisely targeting therapeutics and diagnostics at the individual level. The future is bright for AI, cloud technology, and big data.

At AWS, we provide the broadest and deepest set of AI and machine learning services built on the most comprehensive cloud to support customers’ machine learning workloads and make sense of their vast troves of health data. From tackling unstructured medical text with natural language processing to analyzing health data to derive insights, we are investing in purpose-built services for healthcare and life sciences customers. We are on a mission to democratize machine learning so healthcare organizations can build responsible machine learning models that allow clinicians to make accurate, personalized predictions. Forecasting events such as cancer, heart attacks, or strokes will enable them to intervene early and provide the ultimate patient experience.

Learn more about Medical Imaging on AWS. Learn more about AWS for Health.

Vignesh Shetty

Vignesh is the Senior Vice President and General Manager of Edison AI & Platform at GE Healthcare. He leads a product and engineering team focused on building the Edison Health Services platform as well as clinical and operational AI (ML/DL)-based retrospective, predictive, and prescriptive analytics and applications. Before joining GE, Vignesh spent 16 years at Cisco in various roles.

Taha Kass-Hout

Dr. Taha Kass-Hout is Vice President and Chief Medical Officer at Amazon Web Services, where he leads our health AI strategy and efforts, including Amazon Comprehend Medical and Amazon HealthLake. He works with teams at Amazon responsible for developing the science, technology, and scale for COVID-19 lab testing, including Amazon’s first FDA authorization for testing our associates—now offered to the public for at-home testing. A physician and bioinformatician, Taha served two terms under President Obama, including as the first Chief Health Informatics Officer at the FDA. During his time as a public servant, he pioneered the use of emerging technologies and the cloud (the CDC’s electronic disease surveillance), and established widely accessible global data sharing platforms: openFDA, which enabled researchers and the public to search and analyze adverse event data, and precisionFDA (part of the Presidential Precision Medicine Initiative).