Exploring the future of AI and ML in the public sector
From accelerating genomics research to addressing the teacher shortage to improving commutes, artificial intelligence (AI) and machine learning (ML) are fueling digital transformation in the government, education, nonprofit, and healthcare sectors.
Austin Tanney, head of AI at Kainos—an Amazon Web Services (AWS) Partner Network (APN) member—leads a team of almost 20 people focused on AI and ML. Kainos was listed as one of the leading suppliers on the GOV.UK Digital Marketplace in 2019, and recently worked with AWS to help the UK Driver and Vehicle Standards Agency (DVSA) use ML to minimize fraud and improve road safety.
After the success of this and other recent projects, Kainos decided that all future machine learning development work would be done on Amazon SageMaker.
Tanney shared what he has learned about ML and where he sees the technology going.
AWS: What’s your background in ML? How have you seen tech evolve?
Tanney: My journey has been unusual for AI. I’m a biologist—I have a PhD in biomedical sciences—and I became interested in biomedical technology and genomics when I began working in the diagnostics industry. Machine learning was being done, but we called it biomathematics. Over the past 20-25 years, what used to be niche or rare has become standard across the industry. We’ve seen AI and ML platforms and tools grow, and the cloud has fueled that growth.
How has the cloud helped the growth of ML?
The majority of machine learning projects and use cases that we see today could not be delivered using on-premises technology. AWS machine learning services make it possible for a service provider, like Kainos, to incorporate AI into systems that transcribe speech to text, extract text and data from virtually any document, and deliver real-time personalized recommendations and forecasts to customers.
What are some of the ML use cases that you find most fascinating?
Coming from medicine and diagnostics, I’ve seen interesting developments in the medical field. My background is in cancer research. AI is not trying to replace oncologists, but rather to help oncologists make better and faster decisions. And ML can help remove tedious labor—automating tasks people don’t want to do, so they can instead focus on the work that humans do better than machines.
For example, we developed a solution for processing doctors’ medical notes using Amazon Textract and Amazon Comprehend Medical along with bespoke NLP models. In doing that, we are eliminating work that’s manual and doesn’t require the expertise of a human clinical coder. Instead, the experts can focus on more difficult cases.
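A pipeline like the one described can be sketched with the AWS SDK for Python (boto3): Textract pulls the text out of a scanned note, and Comprehend Medical flags the medical entities in it. This is a minimal illustration, not Kainos’s implementation; the function names and the shape of `analyze_note` are our own, and the bespoke NLP models mentioned above are omitted.

```python
def lines_from_textract(response):
    """Join the detected LINE blocks of a Textract response into one string."""
    return "\n".join(
        block["Text"]
        for block in response.get("Blocks", [])
        if block["BlockType"] == "LINE"
    )

def analyze_note(image_bytes):
    """OCR a scanned note, then surface medical entities from the text.

    Requires AWS credentials; boto3 is imported here so the pure helper
    above stays usable without it.
    """
    import boto3

    textract = boto3.client("textract")
    medical = boto3.client("comprehendmedical")

    ocr = textract.detect_document_text(Document={"Bytes": image_bytes})
    text = lines_from_textract(ocr)

    entities = medical.detect_entities_v2(Text=text)
    # Each entity carries its detected text and a category such as
    # MEDICATION or MEDICAL_CONDITION.
    return [(e["Text"], e["Category"]) for e in entities["Entities"]]
```

In practice the OCR output would feed the custom NLP models as well; this sketch only shows the two managed services wired together.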
What are your tips to help start an ML project?
From the start, it is key to understand the problem. Often, technology-focused projects fail because they aren’t focused on the impact, so the business-critical problem isn’t addressed. You need to solve the right kind of problems. So start by thinking about the product you’re designing and who will be using it.
Who should be in the room as you build out projects?
Technology challenges used to be thrown over the fence to the IT or technology teams. We’re seeing that less and less now. Just as our delivery teams are multifaceted, the customer team has to be diverse as well. You need a subject matter expert (SME), data engineers, data scientists, AI engineers, and software engineers on both sides of the table. And I feel like the most critical person to have in the room is the person who owns the problem and wants to get it solved. For me, this is usually the key driver in a successful project.
What are the capabilities of a good data science team?
One of the things that we drive towards as a team is building things in the right way from the start. Projects have to involve software engineering from the outset, and software engineers are a core part of our team. We then also have data scientists, analysts, and the AI engineers. However, software engineering craftsmanship should be at the heart of any AI project for it to succeed.
What are some of the challenges that Kainos frequently encounters?
Our biggest challenge is access to data. Organizations still often struggle to share data. Sometimes we have a lot of data, but once it is filtered and normalized, we realize we can’t do much with it. So data cleanup is also important. Cloud services are making it easier and quicker to label data and run experiments on it. Service providers like Kainos also work with customers to help them realize the potential value in their data assets, which helps customers overcome this data access challenge. But we are still early on this journey.
When did you first start working with Amazon SageMaker?
I became interested in Amazon SageMaker when it was launched. When it was still in test, I played with it and I was impressed. It made things simple and I realized it would have a big impact. Our team made a decision early on that we should be working on a consistent platform and we decided on Amazon SageMaker. SageMaker helps our teams work more collaboratively and easily—allowing us to focus on the work.
AWS continues to add capabilities to SageMaker as well, and I am excited to see how Amazon SageMaker Ground Truth supports us in building and labeling training datasets, and how Amazon SageMaker Neo and Amazon SageMaker Debugger combine to support us in training deep learning models.
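For readers new to SageMaker, the workflow Tanney describes typically starts with the SageMaker Python SDK: you point an estimator at a training script and S3 data, and the service provisions the training infrastructure. The sketch below is a generic illustration, not Kainos’s setup; `train.py`, the role ARN, and the bucket paths are all placeholders you would replace.

```python
def training_channels(train_uri, validation_uri=None):
    """Build the input-channel mapping passed to estimator.fit().

    SageMaker channels map a name (e.g. "train") to an S3 location.
    """
    channels = {"train": train_uri}
    if validation_uri:
        channels["validation"] = validation_uri
    for uri in channels.values():
        if not uri.startswith("s3://"):
            raise ValueError(f"SageMaker channels expect S3 URIs, got {uri!r}")
    return channels

def make_estimator(role_arn, output_path):
    """Configure a scikit-learn training job (requires the sagemaker SDK)."""
    from sagemaker.sklearn.estimator import SKLearn

    return SKLearn(
        entry_point="train.py",        # your training script (placeholder)
        role=role_arn,                 # IAM role SageMaker assumes
        instance_type="ml.m5.xlarge",  # managed training instance
        instance_count=1,
        framework_version="1.2-1",
        output_path=output_path,       # where model artifacts land
    )

# Launching a job (needs AWS credentials, so shown here rather than run):
# estimator = make_estimator("arn:aws:iam::123456789012:role/SageMakerRole",
#                            "s3://my-bucket/output")
# estimator.fit(training_channels("s3://my-bucket/train"))
```

The same estimator pattern applies across frameworks; swapping `SKLearn` for a PyTorch or TensorFlow estimator keeps the rest of the workflow unchanged, which is part of what makes a single consistent platform attractive for a team.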
As data grows exponentially, where do you see the market growing? Do you see any unique challenges?
There’s interest in natural language processing (NLP). Governments have a lot of information that is text heavy. There’s also interest from academia to see how machine learning can help their research.
With this increase in demand, it’s critical that employers focus on workforce development. Skills are a challenge, so as the technology evolves it’s important that we focus on upskilling people. AWS supports this with programs like AWS Educate. Kainos also has a number of programs to do this, including the Kainos AI Camp, a two-week summer camp. I also run Northern Ireland’s AI meetup community (AINI), which holds monthly events with talks from local companies and academics to educate people about the potential of AI and to enable those working in the field to get together and learn from one another.
What is the untapped potential of AI and ML?
Within all industries, especially in government and education, we get caught up in doing things the way they have always been done. We haven’t thought about whether there’s a better way to do it. In the work we do, we know predictions are needed, and ML is a way to deliver them. A lot of what we’ve been doing in the last few years is in NLP, because we need to extract the data to make those predictions.
It is not beyond the imagination to suggest that virtually every application will be infused with ML and AI. This will impact customers and citizens around the world, and it’s amazing to think what can be done when we commoditize these services. We don’t have to just solve problems with numbers. You can use AI to do something simple like tell the difference between a dog and a cat in a picture, translate a doctor’s medical notes, or help personalize medicine for an individual.