AWS Executive in Residence Blog
You Wanted to Become AI-Native, and All You Got Was a Lousy Foundation

Always implement things when you actually need them, never when you just foresee that you need them.
—Ron Jeffries, co-founder of Extreme Programming (XP).
At a large enterprise I recently worked with, the board asked the Chief Digital Transformation Officer to develop an AI adoption strategy to drive innovation, growth, and cost efficiency. His consultant of choice conducted stakeholder interviews and proposed a three-phase program scheduled to last 3.5 years:
- Phase 1: Fix digital basics and address the leftover gaps in people, processes, and technology from an incomplete digital transformation.
- Phase 2: Build the AI foundation, including governance, tools, platforms, and an AI office.
- Phase 3: Introduce AI and agentic business initiatives most likely to reach customers.
It was a coherent program that optimized for efficient delivery. But it’s not what the board wanted. They were looking for a strategy to become more innovative and competitive in an environment where the time required to bring IT products to market has shrunk dramatically. What they got was a fix-the-basics project that would consume most of the budget before delivering actual business value.
The consultant’s cautious recommendation was rational. The 2025 DORA report declared: “AI is an amplifier.” It magnifies the strengths of high-performing organizations and the dysfunctions of struggling ones. Fewer than half of digital initiatives meet or exceed their business outcome targets, and most large enterprises carry that unfinished work into their AI programs under pressure from boards and competitors.
An enterprise that knows its foundations are incomplete has good reason to be anxious about what AI could amplify. Fixing the basics first feels like the responsible choice. But it shouldn’t be your primary goal or your starting point.
Why Foundation-First Fails Every Time
Building a foundation before you have a use case is speculative. Every design decision is made without real workloads, and every governance model is designed for processes that have not yet run.
Martin Fowler gave this pattern a name. In his foundational work on software design, he called it speculative generality: the habit of adding infrastructure and hooks for “things that aren’t required,” because you believe you will need them someday. At the code level, it is a recognized anti-pattern. At the program level, it becomes a multiyear, multimillion-dollar commitment.
The Standish Group’s CHAOS reports consistently show that agile, iterative delivery significantly outperforms sequential delivery. The 2020 CHAOS report found iterative programs are three times more likely to succeed than sequential, phased delivery. The gap widens dramatically at scale: the 2015 CHAOS report found large iterative programs are six times more successful than their sequential equivalents.
Teams that build foundations without real use cases rarely get them right. When the real use cases arrive, the team wastes time and effort fixing the foundation. By the time they’re done, the AI tooling landscape has already shifted.
The enterprise that chooses the foundation-first path typically ends up with three things: (1) an expensive platform designed for requirements that no longer apply, (2) an unhappy board that has watched the budget disappear with no visible return, and (3) an organization that has spent years not delivering innovation to customers.
The answer is not to abandon foundations but to build them correctly. And the best foundations are built for a specific purpose.
The Only Foundation That Holds
The organizations getting real returns from AI start with a specific, critical customer problem. And as they solve that problem, they develop the foundations on which future initiatives can be built.
Pick the most strategically important initiative in your backlog. Implement it and measure its real impact on your business. Build the foundation as you go; limit it to what the initiative requires. Then bring that outcome to the board—not a progress update, but proof of what you changed and how it impacted real customers.
A delivered outcome is the clearest signal an organization can send to itself that a new way of working is legitimate. From there, iterate the approach. The next initiative inherits what the previous one proved, and your foundation grows.
Overcoming Cultural Habits
This approach might feel unnatural, especially to organizations built on operational excellence. When faced with a trade-off between building more foundation and delivering to customers, teams may feel pulled toward the foundation. Establishing explicit principles at the outset of each program keeps teams oriented toward customer delivery. For guidance, look to Stephen Brozowich’s post about how to write memorable, actionable principles that drive alignment and accelerate decision-making.
Every organization should develop its own principles, but these four are a good starting point:
- Focused over comprehensive. A complete foundation is not the goal. The right outcome for the right customer is.
- Customers over requirements. Requirements tell you what was asked for. Customers tell you what actually works.
- Effective over efficient. Efficiency optimizes a known system. Effectiveness discovers whether you are building the right one.
- Lean over speculative. Build only what the current initiative requires. Each one adds a governed, architecturally proven component to the foundation the next one inherits.
Your board is not waiting for a perfect AI foundation. It wants a result that customers can experience and that improves core business KPIs.
Further Reading
- McKinsey. The State of AI 2025. Annual survey of AI adoption and impact across 1,993 organizations in 105 countries.
- Gartner. Gartner Survey Reveals That Only 48% of Digital Initiatives Meet or Exceed Their Business Outcome Targets. Annual global survey of more than 3,100 CIOs and technology executives, and more than 1,100 executive leaders outside of IT.
- Google Cloud DORA. 2025 State of AI-Assisted Software Development. Research on how AI amplifies organizational strengths and dysfunctions in software delivery.
- Martin Fowler. Refactoring (1999, 2nd ed. 2018).
- Fortune/Knowledge at Wharton. The AI Efficiency Trap. Exploration of why efficiency-only AI strategies produce commodity rather than competitive advantage.