AI and ESG: Understanding the Environmental Impact of AI and LLMs

March 29, 2024
Authored by
Ella Shoup
AI Policy Associate at Holistic AI

Powerful artificial intelligence (AI) models offer significant transformative potential with wide applications. However, the computing power needed for AI to work – especially Large Language Models (LLMs) – requires large amounts of energy, which can have a significant effect on the environment. The scope and size of that energy usage, however, depends on which phase of the ML lifecycle you measure: training or deployment.

In this blog post, we provide an overview of the environmental implications of AI, focusing on LLMs. We also discuss the importance of understanding the energy usage in different phases of the ML lifecycle, and how to respond to the environmental impact of AI.

Key Takeaways

  • Emissions related to the IT sector – including AI, cryptocurrency, and data centers – are set to sharply increase after 2023
  • LLMs are notably energy-intensive compared to other AI systems due to the high amount of compute needed for these models to work
  • From the manufacturing of chips to the powering and cooling of data centers, LLMs use huge amounts of energy at every phase in their lifecycle
  • While the training phase of an LLM is typically seen as the most energy-intensive, inference poses a potentially higher environmental cost

What is the impact of AI on the environment?

IT-related carbon emissions have steadily increased as more of everyday life and economic activity goes online. Before 2023, the information and technology industry was contributing 1.5-4% of worldwide emissions; data centers, cryptocurrency, and AI alone are estimated to have consumed about 460 TWh of electricity worldwide in 2022 – almost 2% of global electricity demand.

Given the rapid adoption of AI technology, researchers believe this will sharply increase after 2023. Alarmingly, it is projected that by 2027, AI could consume the energy equivalent of a country like Argentina or the Netherlands, with most of this energy stemming from the pre-training and training phases of a model’s lifecycle.

What is the Environmental Impact of AI Chips?

The computing power needed to train and deploy AI models relies on hardware known as Graphics Processing Unit (GPU) chips, with powerful LLMs like ChatGPT needing thousands of GPUs to operate. The hardware lifecycle of GPUs is itself energy consuming, particularly due to the manufacturing of GPU chips, which requires intensive mining, and their disposal, which often generates e-waste.

Indeed, each stage of creating an NVIDIA GPU chip – the hardware used in most of today’s most powerful LLMs – presents several harms to the environment. In general, chip manufacturing – rather than energy consumption or hardware use – accounts for most of the carbon output from electronic devices.

Environmental Impact of Training Large Language Models

In addition to the hardware that LLMs rely on, the training phase of an LLM’s development requires vast computing resources, adding to their environmental impact. A recent study found that training just one AI model can emit more than 626,000 pounds of carbon dioxide, which is equivalent to nearly five times the lifetime emissions of an average American car.

Another study found that training ChatGPT consumed 1,287 MWh, equivalent to the carbon dioxide emissions of 550 roundtrip flights between New York and San Francisco. However, training an LLM typically happens only once, which means the environmental impact of this phase is bounded: the cost stops accruing once training is complete.
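As a rough illustration, the reported training energy can be converted into emissions using an assumed grid carbon intensity. The intensity and per-flight figures below are illustrative assumptions, not values from the study:

```python
# Back-of-envelope: convert reported training energy to CO2-equivalent emissions.
TRAINING_ENERGY_MWH = 1_287       # reported estimate for training ChatGPT
GRID_INTENSITY_T_PER_MWH = 0.429  # assumed grid carbon intensity (tCO2e/MWh)
FLIGHT_T_CO2E = 1.0               # assumed tCO2e per passenger, NY-SF roundtrip

emissions_t = TRAINING_ENERGY_MWH * GRID_INTENSITY_T_PER_MWH
flights = emissions_t / FLIGHT_T_CO2E
print(f"~{emissions_t:.0f} tCO2e, ~{flights:.0f} NY-SF roundtrip flights")
# → ~552 tCO2e, ~552 NY-SF roundtrip flights
```

With these assumptions the result lands close to the 550-flight comparison above; a cleaner or dirtier grid would shift the figure substantially in either direction.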

Environmental Impact of Deploying Large Language Models

The next phase of the LLM lifecycle – when the model is actually used by people – has been subject to growing scrutiny. Once trained and deployed, the model performs what is known as inference: the live computation an LLM carries out to generate a prediction or response to a given prompt. With the widespread use of commercial LLMs like ChatGPT and their incorporation into basic features like search, this process occurs millions of times a day.

As a result, multiple studies have shown that most of an LLM’s carbon footprint will come from this part of the cycle. This has been corroborated by reporting from Big Tech companies; for example, in 2022, Google reported that 60% of its ML energy use came from inference, and the remaining 40% from training.
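A simple sketch shows why cumulative inference can overtake a one-off training cost. The per-query energy and daily query volume below are illustrative assumptions, not reported figures:

```python
# Sketch: how quickly cumulative inference energy can match one-time training energy.
TRAINING_MWH = 1_287          # one-time training energy (reported estimate)
WH_PER_QUERY = 3.0            # assumed energy per inference query (Wh)
QUERIES_PER_DAY = 10_000_000  # assumed daily query volume

daily_inference_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
days_to_match = TRAINING_MWH / daily_inference_mwh
print(f"Inference: {daily_inference_mwh:.0f} MWh/day; "
      f"matches training energy in ~{days_to_match:.0f} days")
# → Inference: 30 MWh/day; matches training energy in ~43 days
```

Under these assumptions, inference overtakes the entire training budget within weeks, which is consistent with inference dominating the lifetime footprint of a heavily used model.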

Tasks that require content generation, such as text and image generation, image captioning, and summarization, are the most energy and carbon intensive, according to a study by researchers from Hugging Face and the Allen Institute for AI. Images are especially energy intensive; generating a single image takes as much energy as fully charging a smartphone. Notably, the study also found that for the same task, using multi-purpose models for discriminative tasks is more energy intensive than using task-specific models.

Why does AI require so much energy?

Computing power, also known as “compute,” often refers to a stack that includes both hardware and software. In addition to GPUs, compute relies on the software that enables these chips, domain-specific languages designed for machine learning, and data management software. The stack also includes data center infrastructure – servers, data storage drives, and network equipment – all of which require massive amounts of energy to operate and cool.

Compute is measured in floating point operations per second (FLOPS), which has increased by a factor of 150 since 2004 – from 100 gigaFLOPS then to 15 teraFLOPS in the newest GPU models. Overall, global compute instances have risen as much as 550% in the last ten years, and this growth is likely to continue as AI becomes more ubiquitous in everyday systems.

The more computing power AI needs, the more energy the associated data centers will require. Each time there is a step change in online processing, there is also a significant increase in the computing power and cooling resources needed by large data centers.

A high-performance AI model requires vastly more energy than, for example, sending an email or buying something online. A 2023 paper found that ChatGPT uses 500 millilitres of water (roughly a 16-ounce bottle) for every 5 to 50 prompts, depending on the season and where its servers are located. The study follows figures released by Google and Microsoft showing that their water usage spiked in 2022, which researchers attribute to AI work.
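The study’s range implies a per-prompt water cost that can be worked out directly from the figures above:

```python
# Back-of-envelope: per-prompt water use implied by the reported 500 ml per 5-50 prompts.
WATER_ML = 500                 # reported water use per batch of prompts
PROMPTS_MIN, PROMPTS_MAX = 5, 50

per_prompt_max = WATER_ML / PROMPTS_MIN  # worst case (hot season, thirsty site)
per_prompt_min = WATER_ML / PROMPTS_MAX  # best case
print(f"~{per_prompt_min:.0f}-{per_prompt_max:.0f} ml of water per prompt")
# → ~10-100 ml of water per prompt
```

In other words, somewhere between a tablespoon and half a glass of water per prompt, depending on conditions.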

How to Respond to AI’s High Energy Use

The environmental impact of AI is a growing concern for society, lawmakers, manufacturers, and developers alike, and mitigating it will require a united approach with shared responsibility. Users of generative AI can be mindful of how often they use the technology and make their prompts precise enough to avoid unnecessary iterations; manufacturers and deployers can explore more sustainable methods; and lawmakers can codify best practices into law. In fact, the US has already proposed the Artificial Intelligence Environmental Impacts Act, which would require a study on the environmental impacts of AI and establish voluntary reporting, and other policymakers are starting to follow suit.

Schedule a demo with our experts to find out how Holistic AI’s Governance Platform can help you empower trust in your AI.

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.

