AWS for Industries

Driving Outcomes for Capital Projects in the Energy Industry using Generative AI

According to the International Energy Agency, annual investment in new capital projects—which stood at just over $1.5 trillion before the COVID-19 pandemic—is set to reach nearly $2 trillion between 2025 and 2030. Capital projects in the energy industry typically involve billions of dollars in investment and millions of engineering and construction hours. Over the life cycle of a capital project, tens or even hundreds of thousands of documents are generated, reviewed, modified, tracked, executed, and handed over to operations. These documents are of diverse types, including engineering plans, regulatory permits, CAD drawings, instrumentation indexes, and construction details, among others. They are shared among hundreds of different parties—including engineering, procurement, and construction (EPC) contractors, equipment vendors, suppliers, quality assurance and control parties, and government bodies—using dozens of different knowledge management systems (KMS) over the many years of a project life cycle. At that scale, generative artificial intelligence (AI) can deliver meaningful business outcomes through savings in engineering and construction hours and through schedule acceleration, helping avoid all-too-common project delays and cost overruns.

Generative AI differs from traditional AI in its ability to understand context, think critically, and generate original and realistic content. During the initiation and definition phase of a capital project, generative AI can promptly summarize and interpret massive amounts of documentation from the public domain—for instance, it can obtain federal, state, and local regulations and standards from reports and other outputs of various agencies—alongside research materials from closed sources. Using generative AI in this way can accelerate a feasibility study, which might ordinarily include tasks such as site analysis, regulatory compliance review, and environmental impact reporting, to confirm that the project is legally, technically, and economically justified. Capital project teams can benefit in particular from using chatbots powered by Amazon Bedrock, the simplest way to build and scale generative AI applications with foundation models, with Retrieval Augmented Generation (RAG), a combination that helps deal with inquiries more responsibly and with higher relevance. The fully managed, RAG-based feature of Amazon Bedrock extends its large language model (LLM) capabilities so that project documents of varying formats, sizes, and types can be aggregated and made searchable promptly, reducing the time from conception to implementation. Amazon Bedrock–powered chatbots can provide answers based on both public and closed information, including rules on local zoning, permits, environment, mass transportation, traffic, parking, telecommunications, utilities, fire, and health and safety. Combining Amazon Bedrock with a ready knowledge base increases cost-effectiveness by removing the need to continuously train an LLM on constantly growing and changing project data. Chatbot responses can be further improved through fine-tuning and prompt engineering.
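The RAG pattern described above can be sketched at toy scale: retrieve the document chunks most relevant to a question, then prepend them to the prompt so the model answers from project data rather than from memory. This is a minimal illustration only—the bag-of-words similarity stands in for a real embedding model, and the document snippets and ordinance numbers are invented for the example; a managed knowledge base in Amazon Bedrock performs this retrieval and augmentation at scale.

```python
import re
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    """Toy bag-of-words vector; a production pipeline would call an
    embedding model instead."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(query: str, chunks: list[str]) -> str:
    """Augment the question with retrieved context before sending it to an LLM."""
    context = "\n".join(retrieve(query, chunks))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

# Hypothetical regulatory snippets a feasibility study might index.
docs = [
    "Zoning ordinance 12-4: industrial sites require a 30 m setback from wetlands.",
    "Parking standard: one space per 250 sq ft of office floor area.",
]
prompt = build_prompt("What setback do wetlands require?", docs)
```

Because the model only sees retrieved context, the knowledge base can grow and change without retraining the LLM—the cost advantage noted above.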

Agents for Amazon Bedrock, which empowers generative AI applications to execute multi-step tasks across company systems and data sources, can automate prompt engineering and the orchestration of user-requested tasks. Once configured, a fully managed agent automatically builds the prompt and securely adjusts it according to company-specific information, delivering a response to the user in natural language. The agent can also interact with existing models, proprietary algorithms, and relevant data sources: for example, it can forecast short-term and long-term crude oil prices to inform financial decisions and budget estimates for the project. A forecast of this kind can be continually refined using the high-quality synthetic data that generative AI produces for backtesting and simulations. Embedding models such as those in the Amazon Titan family, which incorporates Amazon’s 25 years of experience innovating with AI and machine learning (ML) across its business, and vector stores such as OpenSearch and pgvector can make non-transactional market data—such as press releases, breaking geopolitical news, earnings reports, and Environmental, Social, and Governance (ESG) reports—part of data-driven decision-making in near real time.
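The core idea behind an agent—mapping a natural language request to the right tool and returning its result in plain language—can be sketched as below. This is a simplified stand-in, not the Agents for Amazon Bedrock API: the keyword router plays the role the LLM planner fills in the managed service, and the forecast figures, PO number, and budget line are hypothetical.

```python
def forecast_crude_price(horizon_days: int) -> float:
    """Stand-in for a proprietary forecasting model (hypothetical USD/bbl)."""
    base, daily_drift = 82.0, 0.03
    return round(base + daily_drift * horizon_days, 2)

def lookup_budget_line(item: str) -> str:
    """Stand-in for a query against a project cost system."""
    budget = {"compressor skid": "$4.2M, PO-1183, approved"}
    return budget.get(item.lower(), "no budget line found")

def route(request: str) -> str:
    """Crude keyword router. A managed agent instead has the LLM plan the
    steps, choose the tool, and fill its parameters from the request."""
    text = request.lower()
    if "price" in text or "forecast" in text:
        return f"90-day crude forecast: ${forecast_crude_price(90)}/bbl"
    if "budget" in text:
        return f"Budget line: {lookup_budget_line('compressor skid')}"
    return "No matching tool."

answer = route("What is the crude price outlook?")
```

The value of the managed service is precisely that the planning and parameter-filling steps hard-coded here are generated by the model at run time.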

During the project’s design and construction phases, generative AI can analyze the latest data and documents across disparate KMS and promptly generate customized content, across different systems, for purposes ranging from purchase order addendums to project status reports. Amazon Kendra, which helps find answers faster with intelligent enterprise search powered by machine learning, can quickly integrate documents from multiple sources into a single knowledge base with its high-performance retrieval algorithms, empowering LLMs to produce more comprehensive and relevant content. These tools can significantly simplify project monitoring and reporting compared with searching multiple systems with keywords when there is a need, for instance, to assemble a complete picture of a specific instrument tag number, risk, or issue. Generative AI can empower EPC contractors to respond effectively and efficiently to requests for proposals (RFPs) from operators by filling RFP response templates with content recommendations nearly automatically. These methods can drastically reduce non-reimbursable business development hours and improve the consistency and quality of responses to RFPs.
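Assembling a complete picture of one instrument tag across systems amounts to merging every system's partial record into a single view—the aggregation a unified index such as Amazon Kendra performs at enterprise scale. A minimal sketch, with hypothetical tag numbers and field names:

```python
# Hypothetical records from three separate knowledge management systems,
# each keyed by the same instrument tag number.
engineering = {"PT-1042": {"service": "discharge pressure", "range": "0-250 bar"}}
procurement = {"PT-1042": {"po": "PO-7731", "vendor": "Acme Instruments"}}
quality     = {"PT-1042": {"ncr_open": 1, "last_inspection": "2024-11-02"}}

def tag_summary(tag: str, *systems: dict) -> dict:
    """Merge every system's view of one tag into a single record
    that an LLM can then summarize in natural language."""
    merged = {"tag": tag}
    for system in systems:
        merged.update(system.get(tag, {}))
    return merged

record = tag_summary("PT-1042", engineering, procurement, quality)
```

The merged record—design range, purchase order, open quality issues—is what would otherwise take keyword searches across three systems to reconstruct.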

During operations, generative AI can summarize timelines, discussions, evolutions, critical decisions, and other activities across structured and unstructured data, for a specific topic, across multiple KMS, and across different project phases, including initiation, definition, and implementation. An operations team can use a generative AI chatbot for ongoing maintenance tasks with minimal effort. For example, IoT sensor data—such as the revolutions per minute, temperature, and voltage of a hydrogen turbine (one powering a refinery, perhaps)—can be continuously compared with the turbine’s normal ranges, as specified in its maintenance and diagnostics manual, to monitor for problems. A chatbot can be built with Agents for Amazon Bedrock, Amazon Kendra, and Streamlit to take sensor data as user input and produce on-demand comparisons of actual and normal ranges to highlight anomalies.
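The comparison step at the heart of that chatbot can be sketched as follows. The normal ranges and sensor values here are invented for illustration; in the architecture above, the ranges would be retrieved from the indexed maintenance manual and the result handed to the LLM to phrase for the operator.

```python
# Normal operating ranges as they might appear in the turbine's
# maintenance and diagnostics manual (hypothetical values).
NORMAL_RANGES = {
    "rpm": (3000, 3600),
    "temperature_c": (400, 620),
    "voltage_v": (11000, 13800),
}

def find_anomalies(reading: dict) -> dict:
    """Compare one sensor reading against the manual's normal ranges
    and return only the out-of-range measurements."""
    anomalies = {}
    for metric, value in reading.items():
        lo, hi = NORMAL_RANGES[metric]
        if not lo <= value <= hi:
            anomalies[metric] = {"value": value, "normal": (lo, hi)}
    return anomalies

reading = {"rpm": 3450, "temperature_c": 655, "voltage_v": 12500}
flagged = find_anomalies(reading)
```

Only the temperature reading falls outside its range here, so only it would be surfaced to the operator as an anomaly.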

The above use cases apply as well to capital projects beyond the energy industry, such as those for data or fulfillment centers, manufacturing plants, and nuclear power plants. To learn more about how Amazon Web Services (AWS) is helping its customers and partners transform, innovate, and accelerate the energy transition, please visit this page. To learn more about how AWS approaches generative AI, please visit this page.

Ali Khoja

Ali Khoja is a principal customer solutions manager on the energy and utilities team at AWS. As a people manager, he is a role model to customer solutions managers, having delivered outcomes for multiple customers (Baker Hughes, ExxonMobil). Ali has over 21 years of experience in technology consulting and is a graduate of United Way Project Blueprint, which trains leaders of color for top roles on nonprofit and public sector boards and committees.

Yunjie Chen

Yunjie Chen is a senior customer solutions manager (CSM) at AWS. As part of the worldwide public sector scale CSM team at AWS, Yunjie works with major public sector customers and helps drive their success in cloud journeys with the latest AWS technology. She has led several generative AI EBAs (experience-based accelerators) and, prior to joining AWS, she held leadership roles in multiple capital projects in the energy industry for over 20 years.