AWS for Industries

Applying carbon value modeling to achieve net-zero

Climate change is one of the most pressing issues of our time, and it’s becoming increasingly clear that we need to take rapid action to reduce our carbon emissions. To be carbon neutral by 2050 and keep the rise in Earth’s mean temperature below 2°C above preindustrial levels, the world must curb emissions by 7.6 percent per year for 10 years. Current global emissions exceed 40 GtCO2 per year. In January 2023, researchers at the US climate center Berkeley Earth projected that Earth’s long-term average temperature would reach 1.5°C above preindustrial levels around 2033 and 2°C around 2060. This calls for deep emission cuts, and delaying the challenge will only add to the bill in the end.

One approach that’s gaining traction to drive reductions in carbon emissions is carbon value modeling (CVM), a data-driven approach to decarbonization.

What is carbon value modeling, and how can it help achieve net-zero?

CVM is a framework and tool kit that customers can use to analyze the current state of their emissions, generate pathways to decarbonize, and focus on continuous improvement. This can be achieved by implementing resource- and operational-efficiency initiatives, applying carbon-reduction strategies (such as investing in renewable energy and reducing waste), and altering the fuel mix in operations. For example, the tool kit helps customers understand their current emissions, perform scenario analysis, and compare future emissions trajectories against their targets.

Challenges and complexity of designing pathways for net-zero

Though CVM has many benefits, there are challenges and limitations to overcome. One challenge is assigning a monetary value to carbon emissions because there are multiple factors to consider. Another limitation is a lack of effectiveness if CVM is not implemented widely enough. Designing the pathways to achieve net-zero is a complex planning exercise because there are a number of plausible strategies that might impact the time to achieve net-zero, including the following:

  • Operational- and resource-efficiency measures, such as reducing the idle hours of running equipment
  • Energy-efficiency measures, such as energy planning with renewable sources
  • Circularity strategies, such as using steel furnace slag for cement production
  • Low-carbon projects, such as changing industrial lighting to LEDs, retrofitting trucks with liquified natural gas (LNG) kits, and investing in renewables and solar
  • Carbon capture, usage, and storage

Analysis is required for each of these strategies using a wide range of detailed operational variables that affect emissions. In essence, decision-makers need insights into the following:

  • What technical/physical variables can be changed, and what cannot? What will be the eventual impact on emissions?
  • Who would be accountable for the various areas of the emissions value hierarchy (shown below), and how would these be tracked?
  • What do future-state combinations of initiatives contribute quantitatively toward the net-zero targets?
  • What is the economic impact of low-carbon projects on the top line and bottom line?
  • What are the carbon tax liabilities for the organization’s net emissions?
  • Which low-carbon projects provide the greatest economic benefits and positive impact on emissions (marginal abatement cost)?
  • How do companies sustain planning as a recurring exercise with accurate data inputs for the variables that impact the carbon value?

Analyzing the current state

Analyzing emissions from a business requires a deep understanding of the underlying technical, physical, economic, and financial variables across the complete emissions scope and value chain, which helps us identify optimization opportunities, bottlenecks, and constraints. It also helps us test different options by running what-if scenarios. A typical business hierarchy of emissions covers the group-level and unit-level emissions and the combustion sources under them. Applying the activity and efficiency levers together with the related emission factors and global warming potentials yields the current emissions.
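
To make this calculation concrete, the following minimal Python sketch multiplies each combustion source’s activity lever (fuel consumed per hour) and efficiency lever (run hours) by per-gas emission factors and global warming potentials (GWPs), then rolls the results up to CO2-equivalent emissions. The source names, factor values, and GWPs are illustrative assumptions, not values from the CVM tool kit.

```python
# GWP values convert each gas to CO2-equivalent (IPCC AR5, 100-year horizon).
GWP = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

# Each combustion source carries an activity lever (fuel burned per hour),
# an efficiency lever (annual run hours), and per-gas emission factors in
# kg of gas per liter of fuel. All values here are illustrative.
sources = [
    {"name": "haul_truck_01", "fuel_l_per_hr": 120.0, "run_hours": 5200,
     "factors": {"CO2": 2.68, "CH4": 0.0001, "N2O": 0.00002}},
    {"name": "genset_02", "fuel_l_per_hr": 45.0, "run_hours": 8000,
     "factors": {"CO2": 2.68, "CH4": 0.0001, "N2O": 0.00002}},
]

def source_emissions_tco2e(src):
    """Tonnes of CO2e for one source: activity x efficiency x factor x GWP."""
    fuel_liters = src["fuel_l_per_hr"] * src["run_hours"]
    kg_co2e = sum(fuel_liters * ef * GWP[gas]
                  for gas, ef in src["factors"].items())
    return kg_co2e / 1000.0  # kg -> tonnes

total = sum(source_emissions_tco2e(s) for s in sources)
print(f"Current-state emissions: {total:,.0f} tCO2e")
```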

Figure 1. Carbon value tree

Typically, the top section of the carbon value tree (the effect) consists of variables relevant to executive management and related to the sustainability performance of the company, such as total emissions and energy intensity, whereas the lower levels, or leaves, of the value driver tree (the cause) generally consist of the operational variables that drive the performance of the variables higher up in the tree. For example, an activity variable of fuel consumed per hour, multiplied by an efficiency variable of equipment run hours, drives the emissions from a mobile emission source. Imagine doing this level of modeling for entire processes in the organization, including activities that impact Scope 3, such as wastewater treatment, water consumption, and third-party vehicle usage. Decision-making can be localized within a certain level of the value tree using tools like impact-analysis and sensitivity-analysis dashboards.
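
As a rough illustration, the sketch below models such a tree with leaf nodes holding the operational cause variables and internal nodes rolling emissions up to the group-level effect. The hierarchy, driver names, and factor values are illustrative assumptions, not the tool kit’s actual model.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)
    activity: float = 0.0         # e.g., fuel consumed per hour (liters)
    efficiency: float = 0.0       # e.g., equipment run hours per year
    emission_factor: float = 0.0  # kg CO2e per unit of activity

    def emissions_kg(self) -> float:
        """The effect rolls up from the cause: a leaf computes
        activity x efficiency x emission factor; an internal node
        sums its children."""
        if self.children:
            return sum(child.emissions_kg() for child in self.children)
        return self.activity * self.efficiency * self.emission_factor

# Group -> unit -> combustion source, mirroring the emissions hierarchy.
truck = Node("haul_truck_01", activity=120.0, efficiency=5200, emission_factor=2.68)
genset = Node("genset_02", activity=45.0, efficiency=8000, emission_factor=2.68)
site = Node("site_A", children=[truck, genset])
group = Node("group", children=[site])

print(f"Group-level emissions: {group.emissions_kg() / 1000:,.0f} tCO2e")
```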

Designing the future state

After the current state is modeled with the right data inputs, net-zero planners can experiment using sensitivity analysis at various levels to visualize the impact when a variable is modified. Planners can see the effect the change has on the entire model, which can depict anything from a single production unit to an entire portfolio of operations. For example, which processes have the most impact on emissions given a 5 percent change in their activity variables, and what if some of these are constrained activities? Planners can then test different scenarios to identify the optimal approach and pathways.
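
The sketch below illustrates the idea under simplified assumptions: each source’s activity variable is bumped by 5 percent in turn, and sources are ranked by the resulting change in total emissions. The source names and values are hypothetical.

```python
# Illustrative current-state sources: annual activity (liters of fuel)
# and an emission factor (kg CO2e per liter).
sources = {
    "haul_truck_fleet": {"activity": 120.0 * 5200, "factor": 2.68},
    "gensets":          {"activity": 45.0 * 8000,  "factor": 2.68},
    "kiln":             {"activity": 900.0 * 7000, "factor": 1.90},
}

def total_kg(srcs):
    return sum(s["activity"] * s["factor"] for s in srcs.values())

def sensitivity(srcs, delta=0.05):
    """Bump each activity variable by `delta` and rank sources by the
    resulting change in total emissions."""
    base = total_kg(srcs)
    impacts = {}
    for name, s in srcs.items():
        bumped = {**srcs, name: {**s, "activity": s["activity"] * (1 + delta)}}
        impacts[name] = total_kg(bumped) - base
    return sorted(impacts.items(), key=lambda item: -item[1])

for name, delta_kg in sensitivity(sources):
    print(f"{name}: +{delta_kg / 1000:,.1f} tCO2e for a 5% activity increase")
```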

Figure 2. Sensitivity analysis performed on the current state

The current-state model is also redesigned to include low-carbon project variables. For example, what is the fuel consumption per hour if a group of haul trucks is retrofitted to run on LNG, keeping the run hours constant as in the current state but with a different LNG fuel-consumption rate and emission factor? Modeling these scenarios provides a view of the carbon abatement compared to the current state. The future-state model would also include a financial model that calculates the net present value (NPV) for each of the low-carbon projects and generates insights into the marginal abatement cost (NPV / carbon abatement) so that the business can sequence the projects into a net-zero road map of initiatives. If the abatement strategies fall short of targets, the net-zero planner can model offsets, such as biodiversity projects, which roll up to gross emissions and impact the net emissions, thus providing a holistic framework for decarbonization.
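
The following sketch shows how such a financial model might rank projects by NPV per tonne abated, assuming illustrative cash flows, abatement volumes, and an 8 percent discount rate; none of these figures come from the CVM tool.

```python
# Illustrative low-carbon projects: yearly cash flows (capex negative in
# year 0, savings thereafter) and tonnes of CO2e abated per year.
projects = [
    ("LNG truck retrofit", [-2_000_000] + [450_000] * 10, 3_500),
    ("LED lighting",       [-300_000] + [90_000] * 8,       400),
    ("On-site solar",      [-5_000_000] + [700_000] * 20, 6_000),
]

def npv(cash_flows, rate=0.08):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

ranked = []
for name, flows, abated_per_year in projects:
    lifetime_abatement = abated_per_year * (len(flows) - 1)
    project_npv = npv(flows)
    # Marginal abatement metric: NPV per tonne abated over the project life.
    ranked.append((project_npv / lifetime_abatement, name, project_npv))

# Sequence the road map with the most value-accretive abatement first.
for npv_per_tonne, name, project_npv in sorted(ranked, reverse=True):
    print(f"{name}: NPV ${project_npv:,.0f}, {npv_per_tonne:,.1f} $/tCO2e")
```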

Sustaining the momentum

After current emissions levels and targets have been ascertained, impact tracking needs to indicate where variances occur between the actual and the planned emissions performance. It is, however, difficult for organizations to quantify precisely which of the underlying influencing factors, or drivers, contribute to a specific emissions variance, and more importantly by how much. The ability to understand exactly how much each driver impacts the emissions of a business is an important requirement for focusing management attention on the areas that will achieve the most effective results. Designing the above requires mature analytical planning and management solutions. With that in mind, we offer the Wipro Decarbonization and Carbon Value Modeling solution, which integrates modeling technologies with industry-specific knowledge to provide a capability that traditional management information systems (MIS) tools lack.
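
A minimal sketch of such driver-based variance attribution follows, assuming a single mobile source with hypothetical planned and actual drivers. It uses a simple waterfall scheme that swaps actual values in one driver at a time; this attribution is order-dependent, and production solutions typically use more elaborate decompositions.

```python
# Illustrative planned vs. actual drivers for one mobile emission source.
plan   = {"fuel_l_per_hr": 120.0, "run_hours": 5000, "factor": 2.68}
actual = {"fuel_l_per_hr": 128.0, "run_hours": 5300, "factor": 2.68}

def emissions_kg(d):
    return d["fuel_l_per_hr"] * d["run_hours"] * d["factor"]

# Waterfall attribution: swap actual values in one driver at a time and
# credit each driver with the incremental change it causes.
state = dict(plan)
for driver in plan:
    before = emissions_kg(state)
    state[driver] = actual[driver]
    contribution = emissions_kg(state) - before
    print(f"{driver}: {contribution / 1000:+,.1f} tCO2e")

total = (emissions_kg(actual) - emissions_kg(plan)) / 1000
print(f"total variance: {total:+,.1f} tCO2e")
```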

Wipro’s Decarbonization and Carbon Value Modeling Solution

Wipro’s Decarbonization and Carbon Value Modeling solution is powered by Amazon Web Services (AWS) solutions. Using AWS Carbon Data Lake, customers derive insights from their carbon emissions data, which helps them analyze the current state, design the future state, and track the impact to continuously improve.

Organizations often have raw emissions data in historians and Internet of Things (IoT) platforms and contextual data in manufacturing and business systems. This can result in greenhouse gas (GHG) emissions data being siloed, even within different departments of the same organization. AWS Carbon Data Lake provides mechanisms to merge and track data from various sources in a single repository and reduces the undifferentiated heavy lifting of ingesting, standardizing, transforming, and calculating GHG emissions data. AWS Carbon Data Lake uses emission factors based on open standards for calculation. It not only helps customers overcome one of the major pain points concerning underlying data issues around data acquisition, organization, and standardization but also helps mitigate inconsistency in the calculation of carbon emissions. Data lineage built into the framework provides an audit trail of data points at the most granular level.

AWS Carbon Data Lake customers have a standard, extensible carbon-data management framework on top of which they can build end-user-specific APIs for downstream visualization, business intelligence (BI), and optimization tools. Sourcing calculated GHG emissions data from these APIs, the CVM tool’s user interface (UI) helps customers visualize the current state, making organization- and site-level emission scorecards available alongside Scope 1, Scope 2, and Scope 3 emissions. With the help of prebuilt industry CVMs for mining, oil and gas, steel, cement, and more, customers can model the complete operations structure and the activity drivers to calculate emissions. After performing impact analysis and identifying the top contributors, customers can model the future state through scenario analysis, sensitivity analysis, and scenario comparisons. Customers can also use the tool to forecast based on production targets and to perform future-state carbon modeling with low-carbon options using financials.
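
As a hedged illustration of sourcing data from such an API, the sketch below posts a GraphQL query and totals the returned emissions. The endpoint, API key, query shape, and field names are hypothetical; the real schema is defined by the deployed AWS AppSync API.

```python
import json
import urllib.request

# Hypothetical AWS AppSync endpoint and API key; real values come from the
# deployed web application stack.
ENDPOINT = "https://example.appsync-api.eu-west-1.amazonaws.com/graphql"
API_KEY = "<API_KEY>"

# Hypothetical query shape; field names depend on the deployed schema.
QUERY = """
query {
  listCalculatedEmissions(site: "site_A", scope: 1) {
    sourceId
    co2eTonnes
    period
  }
}
"""

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps({"query": QUERY}).encode("utf-8"),
    headers={"Content-Type": "application/json", "x-api-key": API_KEY},
)
with urllib.request.urlopen(request) as response:
    records = json.load(response)["data"]["listCalculatedEmissions"]

# Roll the calculated records up into a simple Scope 1 scorecard figure.
print(f"Scope 1 total: {sum(r['co2eTonnes'] for r in records):,.1f} tCO2e")
```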

Customers can build their own dashboards or use built-in dashboards to track key performance indicators (KPIs) that provide targets to shoot for, milestones to gauge progress, and insights that help people across the organization make better decisions.

Solution overview

The diagram below explains the complete flow, from emissions data collected at the source through processing by AWS Carbon Data Lake. The solution is built using the design principles and best practices of the Sustainability Pillar of the AWS Well-Architected Framework, which helps you learn architectural best practices for designing and operating secure, reliable, efficient, cost-effective, and sustainable workloads in the AWS Cloud. The calculated GHG emissions data is available through APIs to the CVM tool for customers to perform analysis.

Figure 3. Solution architecture

As shown in figure 3, deploying the solution sets up the following application stacks:

  1. Customer emissions data from various sources is mapped to a standard CSV upload template. The CSV is uploaded, either directly to the landing bucket in Amazon Simple Storage Service (Amazon S3), an object storage service, or through the UI.
  2. An Amazon S3 landing bucket provides a single landing zone for all ingested emissions data. Data ingress to the landing-zone bucket initiates the data pipeline.
  3. A workflow in AWS Step Functions, a visual workflow service, orchestrates the data pipeline, including data quality check, data compaction, transformation, standardization, and enrichment, with an emissions calculator function in AWS Lambda, a serverless, event-driven compute service.
  4. AWS Glue DataBrew, a visual data preparation tool, provides data-quality auditing and an alerting workflow. AWS Lambda functions provide integration with Amazon Simple Notification Service (Amazon SNS), which sends notifications through application-to-application (A2A) and application-to-person (A2P) channels, and with a web application in AWS Amplify, a complete solution that lets frontend web and mobile developers easily build, ship, and host full-stack applications on AWS.
  5. AWS Lambda functions provide data-lineage processing, queued by Amazon Simple Queue Service (Amazon SQS), which lets you send, store, and receive messages between software components. Amazon DynamoDB—a fully managed, serverless, key-value NoSQL database—provides NoSQL pointer storage for the data ledger, and an AWS Lambda function provides data lineage audit functionality, tracing all data transformations for a given record.
  6. An AWS Lambda function outputs calculated CO2-equivalent emissions by referencing an Amazon DynamoDB table with the customer-provided emission factors (a minimal sketch in this spirit follows the list).
  7. An Amazon S3–enriched bucket provides data object storage for analytics workloads, and the Amazon DynamoDB–calculated emissions table provides storage for GraphQL API (a query language for APIs).
  8. Optionally deployable artificial intelligence (AI), machine learning (ML), and BI stacks let customers deploy a prebuilt notebook in Amazon SageMaker, which developers can use to build, train, and deploy ML models, and a prebuilt dashboard in Amazon QuickSight, which powers data-driven organizations with unified BI at hyperscale. Deployments come with prebuilt queries in Amazon Athena that can be used to analyze petabyte-scale data stored in Amazon S3. Each service is pre-integrated with the enriched object storage in Amazon S3.
  9. An optionally deployable web application stack uses AWS AppSync, which creates serverless GraphQL and Pub/Sub APIs, to provide a GraphQL API backend for integration with web applications and other data-consumer applications. AWS Amplify provides a serverless, preconfigured management application that includes basic data browsing, data visualization, data uploader, and application configuration.
  10. An AWS Lambda function queries the calculated CO2-equivalent emissions from the Amazon DynamoDB table and invokes an API to send the data to the CVM tool.
  11. The CVM tool runs on Amazon Elastic Container Service (Amazon ECS), a fully managed container orchestration service, and stores transactional data in Amazon RDS for MySQL, a popular open-source relational database, and model information in Amazon DynamoDB. When users access the tool, traffic is routed through Amazon Route 53, a highly available and scalable domain name system (DNS) web service, to Amazon CloudFront, a content delivery network (CDN). Traffic is then routed to Elastic Load Balancing (ELB), which distributes the incoming traffic to Amazon ECS, where data is stored in and retrieved from the databases. Content is cached in Amazon ElastiCache, a fully managed, Redis- and Memcached-compatible service, for faster retrieval.
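
As a minimal sketch in the spirit of step 6, the following AWS Lambda handler looks up a customer-provided emission factor in an Amazon DynamoDB table and returns the calculated CO2-equivalent value. The table name, key, and attribute names are assumptions for illustration, not those used by AWS Carbon Data Lake.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
# Hypothetical table of customer-provided emission factors, keyed by activity.
factors_table = dynamodb.Table("emission-factors")

def handler(event, context):
    """Expects an event like {"activityType": "diesel_stationary",
    "quantity": 1000.0} and returns calculated CO2-equivalent kilograms."""
    item = factors_table.get_item(
        Key={"activityType": event["activityType"]}
    )["Item"]
    co2e_kg = float(event["quantity"]) * float(item["kgCo2ePerUnit"])
    return {"activityType": event["activityType"], "co2eKg": co2e_kg}
```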

Conclusion

In conclusion, CVM is a promising approach for achieving net-zero. By assigning a monetary value to carbon emissions, businesses and individuals gain a financial incentive to reduce their emissions. Though there are challenges and limitations, CVM has the potential to be a powerful tool for reducing carbon emissions and mitigating climate change.

Wipro and AWS Cloud can support the implementation of CVMs to achieve net-zero by providing the requisite processes, tools, and services to facilitate sustainability innovation. By using these services, businesses can reduce their carbon emissions and contribute to a more sustainable future.

References

Robert Rohde, “Global Temperature Report for 2021,” Berkeley Earth, January 12, 2022, https://berkeleyearth.org/global-temperature-report-for-2021/.

Amazon Web Services, “AWS enables sustainability solutions,” 2021, https://aws.amazon.com/sustainability/.

Amazon Web Services, “AWS Carbon Footprint Calculator,” 2021, https://calculator.aws/#/footprint.

Sudip Kumar Chaudhuri

Sudip Kumar Chaudhuri is a partner in the Energy and Resources Domain and Consulting team at Wipro. Based in India, he has spent 25 years in the industrial consulting and solutions domain. He focuses on data- and digital-led initiatives with customers for operational decarbonization and specializes in carbon value modeling to help organizations transition to net-zero.

Bindhu Chinnadurai

Bindhu Chinnadurai is a senior partner solutions architect at AWS, based in London, United Kingdom. She has spent over 18 years working in large-scale enterprise environments. Currently, she engages with AWS Partners to help customers migrate their workloads to AWS with a focus on scalability, resiliency, performance, and sustainability. Her expertise is in DevSecOps.

Shailesh Tekurkar

Shailesh Tekurkar has over 30 years of global experience in IT consulting, sales, and program delivery. He has delivered innovation and business value by advising on and implementing big data and advanced-analytics data science solutions for marquee customers across industry domains, including the oil and gas, energy and utilities, transportation, media, pharma, and insurance sectors. He currently leads the GSI partnership business within EMEA for AWS, based in London.

V.A. Vaishnav

V. A. Vaishnav is a senior cloud leader at Wipro, driving platform solutions across industry and technology areas, innovation, and strategic initiatives in AWS Cloud. He is based in India and has 24 years of experience in the IT industry. His areas of expertise include cloud transformation, IT outsourcing, solutioning, and digital transformation, helping clients achieve their business outcomes.