AWS for Industries

Survey: Adoption, Barriers, and the Future of Generative AI in the Telco Industry

Half of surveyed telcos plan to integrate generative AI in the next two years and expect to increase spending by as much as six times.

It’s no illusion: generative AI is everywhere. And here at AWS, we believe it’ll have a profound impact across all industries. Generative AI is the next wave of widespread adoption of deep machine learning (ML), with the opportunity to reinvent customer experiences and business applications, including those within the telco industry.

To better understand generative AI outlook, sentiment, and adoption across the telco industry, AWS worked with Altman Solon, a global strategy consulting firm exclusively focused on the Technology, Media, and Telecom sectors, to survey more than 100 senior telco leaders from the U.S., Western Europe, and Asia Pacific. The results are summarized in a whitepaper exploring the current and anticipated adoption of generative AI. Here are some of the key findings.

1. Adoption of generative AI will grow significantly over the next two years

Among the 17 use cases tested across four telco domains (marketing and product, customer service, network, and IT), each use case has around 19% adoption on average, meaning respondents have implemented it or are in the process of implementing it. However, this will grow: the survey found that adoption will reach, on average, 34% within the next year and nearly half (48%) within the next two years. Alongside that, spending could surge to as much as six times current amounts. While chatbots are driving a significant portion of the momentum (more on that below), 64% of telcos agree that many of the generative AI use cases they’re considering are new applications, not yet served by existing non-generative AI applications and processes.

2. North American telcos slightly lead in generative AI adoption rates

North American telcos slightly lead in generative AI adoption rates (22% average use case adoption) compared to other parts of the globe. European telcos (19%) are more cautious about adopting generative AI given regional data residency restrictions such as the General Data Protection Regulation (GDPR). Particularly for telcos outside North America, present and future AI regulation, data-use restrictions, and data residency requirements remain key considerations. Countries such as China and many across the EU are tightening AI regulation and monitoring, while the U.S. and India are taking a more hands-off approach.

While telcos in APAC (16% adoption) face less data regulation, they are constrained by localization challenges such as language. Generative AI relies on large language models (LLMs), which need to be trained on a corpus of data in a specific language, and today most of the leading LLMs are built in English. AWS is working to close this gap. For example, earlier this year, AWS launched the AWS LLM Development Support Program to power Japan’s generative AI capabilities with a $6 million (868 million yen) investment in AWS credits to help spur the diversity of LLMs.

3. Customer-facing chatbots are the most widely adopted first use case for generative AI

Integrating generative AI into a customer-facing chatbot is a natural first wave of adoption and a widely adopted use case: 92% of survey respondents said they were highly likely to implement generative AI chatbots, and among those, 63% said this use case was already in production.

We believe this is the right first phase because it leverages existing foundation models, but down the road, generative AI will also aid in network operations. For example, generative AI can ingest data from the manuals that operators rely on when installing network elements. Combined with chatbots, this data can provide interactive guidance based on prompts to speed up and simplify installation tasks (a sketch of this pattern follows below).
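
To make this concrete, here is a minimal retrieval-augmented sketch of such an installation assistant. It is an illustration, not a definitive implementation: the retrieve_manual_sections() helper, its placeholder manual excerpts, and the choice of an Amazon Titan text model on the Amazon Bedrock runtime API are all assumptions you would swap for your own retrieval layer and preferred model.

```python
import json
import boto3

# Bedrock runtime client, running in the customer's own AWS account.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")


def retrieve_manual_sections(question: str) -> list[str]:
    """Hypothetical retrieval step: look up relevant excerpts from vendor
    installation manuals (for example, via a vector index). The excerpts
    below are placeholders for illustration only."""
    return [
        "Section 4.2: Mount the radio unit and torque the bracket bolts to 25 Nm.",
        "Section 4.3: Connect the fiber jumper to port CPRI-1 before powering on.",
    ]


def installation_assistant(question: str) -> str:
    """Answer a field engineer's question using retrieved manual excerpts."""
    context = "\n\n".join(retrieve_manual_sections(question))
    prompt = (
        "You are an assistant for network element installation.\n"
        f"Manual excerpts:\n{context}\n\n"
        f"Question: {question}\nAnswer step by step:"
    )
    # Amazon Titan text model is one possible choice; any Bedrock text model works.
    response = bedrock_runtime.invoke_model(
        modelId="amazon.titan-text-express-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps({
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.2},
        }),
    )
    return json.loads(response["body"].read())["results"][0]["outputText"]
```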

Other possible use cases center around increasing employee productivity, such as with guided assistance and documentation in customer service and IT.

4. Data security and governance are top challenges and critical enablers for adoption and implementation

Implementing generative AI is not without its challenges. Roughly three in five (61%) surveyed telcos indicated concerns around data security, privacy, and governance. Leveraging generative AI for company-specific purposes requires a large set of proprietary data, and while there are many public LLMs, there is concern that proprietary company data could be embedded into the public model itself, creating intellectual property risk.

Said one head of IT, “We need to ensure that our data is secure and not used by other actors.”

It’s a concern we take seriously, and it’s why we built Amazon Bedrock with features to ensure that none of the customer’s data is used to train the underlying models. Customers can trust their data will remain private and confidential.

The survey also found that among early adopters, organizations ranking in the top 30% for data proficiency are outpacing others in using generative AI for use cases beyond productivity. For example, these organizations are better positioned to leverage generative AI for revenue-generating use cases, such as in product and marketing. These data-proficient organizations share common characteristics: dedicated AI centers of excellence, pervasive use of advanced data analytics, and modern (e.g., cloud-based) data infrastructures.

5. Telcos expect to use off-the-shelf models rather than develop their own in the near future

Some telcos cited their lack of technical resources as a barrier to generative AI adoption. For this reason, it’s not surprising that only 15% of surveyed telcos indicated a desire to build foundation models in-house—with the rest expecting to use off-the-shelf models. That said, about two-thirds (65%) of respondents anticipate training those same off-the-shelf models with proprietary internal data to tailor them to specific needs. We expect early adopters that have a strong foundation in data platform modernization to be among the 15% that will build new foundation models, creating a new potential path to monetization.
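
As an illustration of the "customize an off-the-shelf model" path, the sketch below starts a fine-tuning (model customization) job through Amazon Bedrock using proprietary training data staged in Amazon S3. The job name, bucket paths, IAM role, base model identifier, and hyperparameters are placeholder assumptions, and the exact identifiers supported for customization vary by Region and model, so treat this as a starting point rather than a recipe.

```python
import boto3

# Control-plane client for Amazon Bedrock (distinct from the runtime client).
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Placeholder values: replace the role, buckets, and base model with your own.
response = bedrock.create_model_customization_job(
    jobName="telco-care-finetune-001",
    customModelName="telco-care-assistant-v1",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",  # assumed base model
    customizationType="FINE_TUNING",
    trainingDataConfig={
        # JSONL prompt/completion pairs built from proprietary internal data.
        "s3Uri": "s3://example-telco-data/care-transcripts/train.jsonl"
    },
    outputDataConfig={"s3Uri": "s3://example-telco-data/custom-models/"},
    hyperParameters={"epochCount": "2", "learningRate": "0.00001"},
)
print("Started customization job:", response["jobArn"])
```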

Along with assistance in fine-tuning models, telcos will look to LLM vendors like AWS to provide studio and professional services. “I personally believe that this will be the key accelerator for generative AI in telcos,” said one general manager of advanced analytics at a wireless provider. In fact, 44% of surveyed telcos expect to use a fully managed services platform to build applications using available models.

At AWS, we see massive potential for generative AI to transform how our customers operate and deliver new value to their own customers. We’re focused on providing flexible approaches to meet enterprises’ needs – whether building their own foundation models on purpose-built AWS ML infrastructure such as AWS Inferentia and AWS Trainium chips; leveraging pre-trained foundation models available from Amazon (Titan, Alexa) and from third parties to build their applications; or fine-tuning available models with additional data using services such as Amazon Bedrock and Amazon SageMaker JumpStart. Additionally, they can use pre-built applications, such as Amazon CodeWhisperer, that use generative AI without requiring expertise in foundation models or AI and ML technology.
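
For teams starting from a pre-trained model, a minimal sketch using the SageMaker Python SDK and SageMaker JumpStart might look like the following. The model ID, instance type, and request payload are illustrative assumptions, and payload formats differ by model, so check the JumpStart model card for the exact schema.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Illustrative JumpStart model ID; substitute any text-generation model you prefer.
model = JumpStartModel(model_id="huggingface-text2text-flan-t5-xl")

# Deploy a real-time endpoint in your own account (instance type is an assumption).
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")

# Payload keys differ per model; FLAN-T5 on JumpStart expects "text_inputs".
response = predictor.predict({
    "text_inputs": "Summarize the main causes of dropped calls in a 5G RAN."
})
print(response)

# Clean up the endpoint when finished to avoid ongoing charges.
predictor.delete_endpoint()
```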

Regardless of how telcos begin to approach generative AI, the most important thing is to experiment and explore now.

Go deeper into the findings by checking out the full whitepaper here.

Dr. Ishwar Parulkar

Dr. Ishwar Parulkar is the Chief Technologist for Telecom and Edge Cloud at Amazon Web Services. In this role he is responsible for setting AWS technology strategy, defining new cloud services, and leading initiatives to enable AWS’ edge cloud offerings and next-generation telecom networks and services. Prior to AWS, Dr. Parulkar was a Distinguished Engineer at Cisco and Chief Architect for business units responsible for telecom routing, mobile packet core, small cell, and network orchestration products. He was a founding member of industry-wide initiatives around mobile edge computing and 5G, and the conviction that cloud technology can transform the telecom field brought him to AWS in 2016. Before Cisco, Dr. Parulkar was a Distinguished Engineer at Sun Microsystems, where he led the design of data center computing infrastructure, including the industry’s first multi-core processor systems and first compute virtualization platforms. He started his career at Apple, where he worked on the Mac desktop/laptop product lines and on the Newton PDA technology, which had the seeds of the iPhone revolution. Dr. Parulkar holds an M.S. from Vanderbilt University and a Ph.D. from the University of Southern California, where he also served for several years as a member of the Industry Advisory Board. He holds several dozen patents, has published over 25 papers in IEEE/ACM journals and conferences, and has served as a Program Committee member and chair for IEEE/ACM conferences in networking and computing. In 2017, Dr. Parulkar was elected a Foreign Fellow of the Indian National Academy of Engineering for his excellence in and seminal contributions to the field of communication networks and data center computing.