AWS Cloud Enterprise Strategy Blog

What Will Generative AI Mean for Your Business?


It won’t surprise you to hear that there’s been lots of excitement and speculation about generative AI in our meetings with AWS customer executives lately. The question on their minds is: “What does this mean for my business?” That’s a good way to frame the question; it’s not about what generative AI can do, but what it can do for your business. And the seeds of the answer are there in that framing as well. How generative AI will affect your business depends on how you and your competitors will use it to innovate new business models and derive new competitive advantages. It’s not about what the technology itself does—exciting as that is—but about how you will combine it with other technologies, your people’s skills, your values and competencies, and your distinctive vision.

It is a question of how to manage innovation in your company, which is not a new question. Generative AI, whose full powers we have only begun to conceive of, joins other technology-influenced ways of solving business and mission challenges, ways of imagining the future, and technological tools like IoT, analytics, and the many services AWS offers for innovating new products and operating with excellence. The IT world has often made the mistake of confusing technologies with business models. What you will gain (or lose) from generative AI depends on the innovative uses you and your competitors find for it. The important questions—and ones that require some thought—are how to innovate with generative AI, scale with it, incorporate it into business models, and manage its risks.

With that in mind, the AWS approach to generative AI becomes clearer. As with other AWS services, our emphasis has always been on helping our customers drive their businesses forward—not just producing technical capabilities but helping our customers use those capabilities to be more successful. That’s what we mean by being “customer obsessed.” We speak of democratizing AI: making it so easily available that it can become part of an enterprise’s normal cycles of experimentation, learning, understanding customers’ needs, and building business capabilities.

Let’s look at generative AI from the standpoint of business innovation and excellence.

This Is Exciting

Generative AI, along with whatever grows out of it, appears to be the next big thing to transform how we do business. In the big picture view, recent advances in generative AI show us that extremely large foundation models are both practical and powerful and can be fine-tuned rather easily to accomplish important tasks. This is somewhat surprising. Even those at the cutting edge of AI research weren’t sure until recently how convincing the natural language content generated by even an extremely large model could be, let alone how large such a model would have to be. And there are emergent behaviors of large language models that are surprising and whose implications aren’t yet clear.

Language is not the only field that might be amenable to foundation modeling—foundation models of amino acid sequences can be used to engineer new proteins for use in healthcare, models based on financial markets can inform financial applications, and diffusion models like Stable Diffusion can create images. The unexpected emergent behaviors of very large language models go well beyond language manipulation. Generative AI will change how we think about solving a broad range of business and mission challenges. Innovating with generative AI is more than just finding uses for chatbots!

Sustainable Competitive Advantages

Businesses using generative AI will want to build sustainable competitive advantages. To do so, they must combine generative AI with resources that are unique and proprietary (or defensible). The large language models used by text-based services like ChatGPT are a type of foundation model (FM), a pretrained model that—in the case of the largest GPT models—contains hundreds of billions of parameters. Most companies will be unable to create their own FMs, as doing so requires tremendous resources and expertise. They will therefore need to use FMs from external providers, the same FMs available to their competitors and future disruptors. Sustainable competitive advantage can’t come just from using generative AI—if you can add a chatbot to the front end of your application, your competitors can as well.

Your long-term advantages will come from how you fine-tune the FM, what proprietary data you add or use to train the model, or how you integrate the generative AI into business processes that are truly unique to your company.

While the FM itself might not be unique to your company, you do have plenty of data that is unique: data about your customers, their prior transactions, sensors you own or control, and your research. Some of that data can be used to fine-tune the FM, derive prompts for your generative AI applications, build your own models, or simply create applications in conjunction with the FM. Amazon Bedrock allows you to use your proprietary data with an FM in a secure way that keeps your proprietary data private. This allows you to focus on managing the quality of your data and finding unique ways to use it to build differentiated services and competitive capabilities.
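To make the pattern concrete, here is a minimal Python sketch, using boto3, of folding proprietary records into a prompt before invoking a Bedrock-hosted FM. The model ID, record contents, and helper names are illustrative assumptions, and the exact request format varies by model provider:

```python
import json

def build_prompt(question: str, records: list[str]) -> str:
    """Fold proprietary records into the prompt as context (a simple
    retrieval-style pattern; records and wording are illustrative)."""
    context = "\n".join(f"- {r}" for r in records)
    return (
        "Use only the context below to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def ask_bedrock(prompt: str, model_id: str = "anthropic.claude-v2") -> str:
    """Send the prompt to a Bedrock-hosted FM. Requires AWS credentials
    and Bedrock access; request/response shapes differ by provider
    (this follows the Anthropic text-completion format)."""
    import boto3  # deferred so the pure-Python helper above stands alone
    client = boto3.client("bedrock-runtime")
    body = json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 300,
    })
    resp = client.invoke_model(modelId=model_id, body=body)
    return json.loads(resp["body"].read())["completion"]
```

Keeping the prompt-building step separate from the model call makes it easy to swap FMs, or to test the data-handling logic without touching the model at all.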

Incorporating generative AI into your company’s distinctive ways of providing value to your customers is an integration task; generative AI must be continuous with your everyday IT applications. With your other business applications running in it, the cloud can provide integration capabilities through tools like Amazon API Gateway, analytics services, data lakes, and asynchronous movement of data. And you’ll want your authentication and authorization policies to be consistent across all your IT capabilities, including generative AI.
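As a sketch of that integration seam, here is a minimal AWS Lambda-style handler of the kind that could sit behind Amazon API Gateway, passing requests through to a generative AI back end. The `generate` callable is a hypothetical stand-in for the actual model invocation:

```python
import json

def handler(event, context, generate=lambda prompt: "(model output)"):
    """A minimal Lambda-proxy-style handler: parse the API Gateway
    request body, validate it, delegate to the model, and return an
    HTTP-shaped response. `generate` is injected so the model back end
    can be swapped or stubbed."""
    body = json.loads(event.get("body") or "{}")
    prompt = body.get("prompt", "")
    if not prompt:
        return {"statusCode": 400,
                "body": json.dumps({"error": "prompt is required"})}
    return {"statusCode": 200,
            "body": json.dumps({"completion": generate(prompt)})}
```

In a real deployment, authentication and authorization would be enforced in front of this handler (for example, by API Gateway authorizers), consistent with the rest of your IT estate.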

The AWS approach to generative AI is to support our customers in building sustainable competitive differentiators, not just implementing new and exciting technology.

Management of Innovation

Enterprise leaders often mistakenly assume that becoming more innovative is a matter of getting employees to have more ideas. In truth, employees usually have plenty of ideas, especially those who work closely with customers. The challenge of innovation is to execute those ideas, to give them a chance to show that they can be effective. Innovative ideas are by definition risky: they are new and unproven. The key to managing innovation is to reduce that risk and then adjust governance processes to allow more freedom given the lower risk.

It is here that the cloud has always excelled. An employee can quickly spin up infrastructure to test an idea, then discard the infrastructure and stop paying for it if the idea doesn’t work, or quickly change the infrastructure if needed. An employee can inexpensively and quickly build functionality by combining AWS’s many high-level services as building blocks and integrating them through serverless functions—or stop using them if they discover a better way. For example, instead of spending years building image recognition capabilities, they can obtain them off the shelf with Amazon Rekognition, then stop using and paying for the service if their ideas don’t prove themselves.
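A brief sketch of what "off the shelf" means in practice: the Rekognition client is passed in as a parameter, so the same function can run against a stub during testing or against `boto3.client("rekognition")` in production. The confidence threshold and label cap are illustrative choices:

```python
def top_labels(rekognition_client, image_bytes: bytes,
               min_confidence: float = 80.0) -> list[str]:
    """Return label names detected in an image via Amazon Rekognition's
    off-the-shelf detect_labels API -- no model building or training."""
    resp = rekognition_client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=min_confidence,
    )
    return [label["Name"] for label in resp["Labels"]]

# In production the client would come from boto3:
#   import boto3
#   labels = top_labels(boto3.client("rekognition"),
#                       open("photo.jpg", "rb").read())
```

If the experiment doesn’t pan out, there is nothing to decommission beyond deleting this call: pay-per-use pricing means the cost stops when the calls stop.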

Because the cloud dramatically reduces the cost and risk of trying innovative ideas, it allows companies to consider ideas they would have previously rejected. With generative AI in the cloud, companies can combine it with other building block services to test the new ideas stimulated by the generative AI’s capabilities—at lowered risk and cost. Again, it’s not just a matter of testing generative AI’s capabilities but of embedding them in business processes that must be tested.

Critically, Amazon Bedrock allows employees to innovate with different FMs. The initial release of Bedrock supports models from AI21 Labs, Anthropic, and Stability AI, as well as two Amazon Titan models. Each of these is designed to specialize in certain types of applications. Employees testing new ideas can choose the FM that best supports their intentions or try several and compare.
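A minimal sketch of that try-several-and-compare workflow (the model IDs are illustrative; the models actually available depend on your account and AWS Region):

```python
# Candidate Bedrock model IDs to try side by side -- illustrative only.
CANDIDATE_MODELS = [
    "ai21.j2-ultra",
    "anthropic.claude-v2",
    "amazon.titan-text-express-v1",
]

def compare_models(invoke, prompt: str,
                   model_ids=CANDIDATE_MODELS) -> dict[str, str]:
    """Run one prompt through several FMs and collect the outputs side
    by side. `invoke` is a callable (model_id, prompt) -> str, so the
    real Bedrock call can be passed in -- or stubbed for a dry run."""
    return {model_id: invoke(model_id, prompt) for model_id in model_ids}
```

Because Bedrock exposes the models behind a common invocation API, the experiment reduces to a loop over model IDs rather than an integration project per vendor.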

The AWS approach to generative AI is amenable to good practices for managing innovation and stimulating innovation in business processes.

Responsive Agility: Keeping Pace

Although sustainable competitive advantage is a critical goal, companies can also use generative AI simply to improve how they serve their customers. When its customers’ needs change, a company that has learned the techniques of agility in the cloud can respond nimbly. And as generative AI evolves—and it surely will—companies can use that agility to incorporate new features and build new applications. As competitors release new capabilities, enterprises need to respond quickly to match them. As with other IT capabilities, companies must learn agility with respect to generative AI.

Companies have been learning agility over the last few decades, and the same considerations will apply as they begin to incorporate generative AI. How can they sense the need for change? Deliver incrementally and quickly? Govern investments to move quickly into execution and juggle requirements with shifting priorities? The cloud, along with contemporary practices like DevOps, is the key to building agility and speed.

Operationalizing Generative AI

IT leaders will quickly recognize that using generative AI is not simply a matter of coming up with an idea and rolling it out. Like other technologies, it must be operationalized effectively, and the challenges of doing so are well-known to IT practitioners. Off the top of my head, AI applications and models must have reliable deployment processes, be version-controlled, be tested, and meet compliance requirements. Users must be authorized, interfaces to other systems must be built, and helpdesk services must be available. Applications must be secured. There are ethical issues to address, and guardrails must be implemented.

Generative AI must become part of a business’s overall technical operations. The cloud excels in streamlining IT operations; AWS’s broad selection of services and the automation the cloud supports will be critical to making generative AI applications reliable, resilient, secure, and efficient. In particular, Amazon SageMaker is designed to make operationalizing AI applications easier. Among other features, it supports and automates governance processes, provides a centralized catalog for machine learning artifacts, integrates machine learning applications into automated testing and deployment (CI/CD) pipelines, and monitors data and models as they’re being used to ensure their quality.

Speaking of efficiency, when generative AI applications become part of a company’s core business processes, cost becomes an important factor. While AWS Inferentia and AWS Trainium chips are specially designed to cost-effectively train and deploy AI models, the entire suite of cloud services and the cloud’s ability to scale up and down seamlessly will likely play a critical role in managing the costs of whatever innovations companies develop.

Expressing Values

With AI, addressing ethical concerns and ensuring compliance with applicable frameworks is critical. Because Amazon Bedrock is based on a choice of FMs, AWS customers can choose the FMs that best fit their compliance needs and corporate values—even as those needs evolve. They can take advantage of AWS AI Service Cards that provide transparency into how individual AWS services address and influence fairness and bias, explainability, privacy and security, robustness, governance, and transparency.

Responsible AI, like the responsible use of other digital techniques, involves cultural change as well as governance processes. Governance processes establish guardrails and are critical. But the everyday activities of employees are guided by corporate culture, and building a culture of responsible AI use is a new frontier in the leadership of transformation.

In my upcoming book, I suggest that ethics in digital transformation is not just a matter of rules and compliance; it’s better thought of as a way that companies express their values, and it can even be a business advantage. Consumers today make spending decisions based on the values their vendors demonstrate; employees choose where to work based on prospective employers’ values. There is room for enterprises to go beyond compliance and industry frameworks to formulate an ethical vision and build it into their culture and operations. Generative AI, and AI in general, is an area where a company’s ethical vision comes to the surface far more visibly than it does in, say, ERP systems or logistics.


Generative AI is powerful new technology. But for AWS customers, it is more than that—it is a way to achieve business objectives and formulate new business goals. It is less a question of what the technology can do and more a question of how businesses will innovate to make it part of the value delivery to their consumers in ways that give them a competitive edge. This is the lens through which AWS’s approach to generative AI should be viewed.

Mark Schwartz


Mark Schwartz is an Enterprise Strategist at Amazon Web Services and the author of The Art of Business Value and A Seat at the Table: IT Leadership in the Age of Agility. Before joining AWS, he was the CIO of US Citizenship and Immigration Services (part of the Department of Homeland Security), CIO of Intrax, and CEO of Auctiva. He has an MBA from Wharton, a BS in Computer Science from Yale, and an MA in Philosophy from Yale.