AI is exciting! But this post is going to be a wet blanket. Yes, artificial intelligence (AI) is transforming the world in unprecedented ways. From self-driving cars to chatbots, AI seems to be making our lives easier, more convenient and more futuristic. Sci-fi vibes here. But behind the scenes, AI is also consuming a lot of energy, which has serious environmental and economic implications.
How is this going to pan out from here on? Let’s peek a bit into the AI phenomenon. We’re not going to discuss what constitutes AI (trust me, I’ve had that argument with many people, and as always, definitions matter), and whether AI is or isn’t sentient yet. We’re only going to look into AI energy consumption and costs. Which is a scary enough story. Buckle up.
- What Makes AI So Energy-Hungry?
- What Are The Consequences Of AI's Energy Consumption?
- Jevons Paradox
- How Can We Reduce AI's Energy Consumption?
What Makes AI So Energy-Hungry?
AI models consume so much energy because of three main factors: the amount of data they are trained on, the complexity of the model, and the volume of requests made to the AI by users.
1 / Data
Training an AI model requires a large amount of data, which is used to teach the model how to behave based on a set of examples. The more data the model has, the more accurate and reliable it can be. However, processing this data requires a lot of computing power, which in turn consumes a lot of electricity.
For instance, one of the most advanced AI models today is GPT-3, a natural language processing system that can generate coherent text on almost any topic. GPT-3 has 175 billion parameters and was trained on roughly 45 terabytes of text data. According to one estimate, training GPT-3 consumed about 355 megawatt-hours of electricity, enough to power an average American home for 40 years.
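Estimates like this are usually back-of-envelope calculations from hardware specs: number of chips, power draw per chip, training time, and a data-center overhead factor (PUE). Here is a minimal sketch; the GPU count, wattage, and hours below are illustrative assumptions chosen to roughly reproduce the 355 MWh figure, not GPT-3's actual training setup:

```python
def training_energy_kwh(num_gpus, gpu_power_watts, hours, pue=1.5):
    """Estimate training energy: raw hardware draw scaled up by
    data-center overhead (PUE = power usage effectiveness)."""
    return num_gpus * gpu_power_watts / 1000 * hours * pue

# Illustrative only: 1,000 GPUs at 300 W each, running ~790 hours, PUE 1.5
energy = training_energy_kwh(1000, 300, 790, pue=1.5)
print(round(energy))  # 355500 kWh, i.e. ~355 MWh
```

Real estimates differ mainly in how well these four inputs are known, which is why published figures for the same model can vary widely.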
2 / Complexity
The complexity of an AI model refers to how many layers and parameters it has. A layer is a set of mathematical operations that transforms the input data into a more abstract representation. A parameter is a numerical value that determines how the layer operates. The more layers and parameters a model has, the more capable it is of learning complex patterns and solving difficult problems. However, this also means that the model requires more computing power and energy to run.
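To make "layers and parameters" concrete, here's a toy parameter count for a stack of fully connected layers (the layer sizes are arbitrary examples, not taken from any real model):

```python
def count_params(layer_sizes):
    """Each dense layer has (inputs * outputs) weights plus one bias
    per output; total parameters are the sum over consecutive layers."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A small 3-layer network: 784 -> 256 -> 64 -> 10
print(count_params([784, 256, 64, 10]))  # 218058
```

Every one of those parameters has to be read, multiplied, and updated on every training step, which is why parameter counts translate so directly into compute, and therefore energy.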
For example, one of the most complex AI models today is DALL-E, a generative model that creates realistic images from text descriptions. DALL-E has 12 billion parameters, about 70 times more than a typical image recognition model. Running DALL-E requires a powerful graphics processing unit (GPU), a chip designed to process large amounts of data in parallel. A single GPU can draw up to 300 watts of electricity, roughly six times more than a typical laptop.
3 / Volume
The volume of requests refers to how often, and by how many people, the AI service or application is used. The more popular the AI model, the more inferences it has to make, and the more energy it consumes. An inference is the process of answering a user’s query using the trained model.
For example, one of the most popular AI services today is Google Search, which uses AI to rank and display relevant web pages for each query. Google Search handles about 9 billion queries per day, roughly 100,000 queries per second. To handle this massive volume, Google operates thousands of servers around the world, and its operations as a whole consume about 12 terawatt-hours of electricity per year, roughly a third of the annual electricity consumption of Denmark.
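Inference energy scales linearly with volume, so a per-query figure is all you need for a rough total. A sketch, assuming 0.3 Wh per query (an often-cited older ballpark for a search query, used here purely for illustration; serving queries is only part of Google's total consumption):

```python
def annual_inference_energy_twh(queries_per_day, wh_per_query):
    """Scale a per-query energy figure to a yearly total.
    1 TWh = 1e12 Wh."""
    return queries_per_day * wh_per_query * 365 / 1e12

# 9 billion queries/day at an assumed 0.3 Wh per query
print(round(annual_inference_energy_twh(9e9, 0.3), 2))  # 0.99 TWh/year
```

Note how quickly this grows if the per-query cost rises: AI-generated answers are estimated to cost many times more energy per query than a classic ranked-links search.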
The energy use by AI has increased significantly in the past decade, driven by the growth of data centers, artificial intelligence models, and cryptocurrencies.
What Are The Consequences Of AI’s Energy Consumption?
AI’s energy consumption has both environmental and economic consequences. On one hand, it contributes to greenhouse gas emissions and climate change. On the other hand, it increases operational costs and resource constraints for AI companies and users.
1 / Environmental Impact
The environmental impact of AI’s energy consumption depends on how the electricity is generated. If the electricity comes from renewable sources such as solar or wind, then the impact is minimal (more on this later, though; the damn devil is always in the details!). However, if the electricity comes from fossil fuels such as coal or natural gas, then the impact is significant.
According to a study by Alex de Vries, a data scientist at the central bank of the Netherlands and a Ph.D. candidate at Vrije Universiteit Amsterdam, if current trends continue, worldwide AI-related electricity consumption could reach 85 to 134 terawatt-hours annually by 2027. That would be comparable to the annual electricity consumption of countries such as the Netherlands, Argentina or Sweden. It would also result in roughly 40 to 60 million tons of carbon dioxide emissions per year, on the order of 5 percent of global aviation emissions.
2 / Economic Impact
The economic impact of AI’s energy consumption affects both AI companies and users. For AI companies, energy consumption increases operational costs and reduces profit margins. For AI users, energy consumption increases service fees and reduces accessibility.
For example, according to Forbes, training an advanced AI model can cost up to $10 million in electricity bills alone. This means that only large corporations or well-funded startups can afford to develop and deploy such models. It also means these models are likely to charge high fees for their services or to limit availability to certain regions or markets, creating a digital divide between those who can access and benefit from AI and those who cannot.
Based on data and projections from the International Energy Agency (IEA), we estimate that AI-related energy use in 2023 cost roughly US$78 billion (assuming an average electricity price of $0.10 per kWh). Note that this is just an estimate; actual AI energy use in 2023 could be higher or lower.
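The arithmetic behind a dollar figure like this is just consumption times price. A sketch; the ~780 TWh below is simply what US$78 billion implies at $0.1 per kWh, a derived number rather than an independent measurement:

```python
def energy_cost_usd(twh, usd_per_kwh=0.10):
    """Convert terawatt-hours into a dollar cost.
    1 TWh = 1e9 kWh."""
    return twh * 1e9 * usd_per_kwh

# Working backwards from the estimate: $78 billion at $0.10/kWh
implied_twh = 78e9 / (1e9 * 0.10)
print(implied_twh)  # 780.0
```

Two inputs dominate any such estimate: how much electricity you attribute to "AI" (a contested boundary), and what price you assume per kWh, which varies severalfold between regions.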
Of course, there are loads of scientists who say AI will increase the energy efficiency of everything (and even help you lower your electric bill), and thus make things even better for us. But that brings us to something called the Jevons Paradox.
Jevons Paradox
The Jevons Paradox is a phenomenon in economics that occurs when an improvement in the efficiency of using a resource leads to an increase in the overall consumption of that resource, rather than a decrease. For example, if a car becomes more fuel-efficient, the cost of driving per mile goes down, which may encourage people to drive more, and thus use more fuel overall.
This paradox was first observed by William Stanley Jevons in 1865, who noticed that the invention of the more efficient Watt steam engine increased the demand for coal in England.
The Jevons Paradox has implications for energy conservation and environmental policy, particularly in the current context of AI, as it suggests that improving energy efficiency alone may not be enough to reduce energy consumption and greenhouse gas emissions.
Economists suggest that some possible ways to avoid or mitigate the paradox are by combining efficiency improvements with policies that increase the cost of using the resource (such as taxes or caps), or to change the behavior and preferences of consumers through education or incentives.
But everyone is so excited to use AI, is anyone going to listen? As if bitcoin mining wasn’t energy-intensive enough…
How Can We Reduce AI’s Energy Consumption?
Anyway, AI’s energy consumption is a serious challenge that requires collective action from all stakeholders: researchers, developers, providers, regulators and us humble users. Some possible solutions include:
– Improving the efficiency and performance of AI models and hardware
This could involve using techniques such as pruning, quantization, distillation and sparsification to reduce the size and complexity of AI models without compromising their accuracy. (It’s all gobbledegook to me, too.) It could also involve using specialized chips such as tensor processing units (TPUs) or neuromorphic chips that can process AI tasks faster and with less energy than conventional CPUs or GPUs.
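To demystify one of those terms: quantization stores model weights in fewer bits. A minimal pure-Python sketch of symmetric int8 quantization (the weight values are made up for illustration; real libraries do this per-tensor or per-channel over millions of weights):

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats onto integers in
    [-127, 127] using a single scale factor."""
    scale = max(abs(w) for w in weights) / 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integers."""
    return [v * scale for v in q]

weights = [0.81, -0.44, 0.05, -1.27]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each int8 value takes 1 byte instead of 4 (float32): a 4x memory
# saving, and integer arithmetic is cheaper in energy terms too.
```

The trade-off is a small rounding error in every weight; in practice models tolerate this remarkably well, which is why quantization is one of the standard levers for cutting inference energy.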
– Using renewable energy sources to power AI systems and data centers
This could involve investing in solar panels, wind turbines, hydroelectric dams or other green technologies that can generate clean and cheap electricity. It could also involve using carbon capture and storage (CCS) or carbon offsetting schemes to reduce the net emissions of AI operations.
– Developing ethical and sustainable standards and regulations for AI development and deployment
This could involve establishing guidelines and best practices for measuring and reporting AI’s energy consumption and environmental impact. It could also involve imposing taxes or incentives for AI companies and users to encourage them to adopt more energy-efficient and eco-friendly solutions.
This post was about AI energy consumption
AI is a powerful and promising technology that can bring many benefits to all of us. But, it also comes with a high energy cost that will have negative effects on the environment and the economy. Are we ready for this, when we’re already grappling with the consequences of two centuries of energy emissions?
I know we are generally not in a position to do anything about utility scale power and AI deployment, but it is important to be aware of this trade-off and to know what to say to your political representative if the time comes.
If you liked this post, please share it with your friends. Thank you! <3
THIS POST CONTAINS AFFILIATE LINKS. PLEASE READ MY DISCLOSURE FOR MORE DETAILS.