AI Could Soon Consume as Much Electricity as an Entire Country
Published: 10.16.2023
Artificial intelligence is transforming many industries and aspects of our lives, but it comes at a cost: energy consumption. A recent report by Alex de Vries, a data scientist at the Dutch central bank, found that AI servers could consume between 85 and 134 terawatt-hours (TWh) annually by 2027 in a middle-ground scenario. That is roughly 0.5 percent of the world's current electricity use, or about the annual electricity consumption of a country such as the Netherlands, Argentina, or Sweden.
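As a rough sanity check of that comparison, the projected range can be set against global electricity use. The sketch below assumes annual global consumption of about 25,000 TWh, a commonly cited ballpark figure that is not taken from the report itself.

```python
# Back-of-envelope check of the report's comparison.
# GLOBAL_ELECTRICITY_TWH is an assumed ballpark, not a figure from the report.
GLOBAL_ELECTRICITY_TWH = 25_000    # rough annual global electricity use
AI_LOW_TWH, AI_HIGH_TWH = 85, 134  # de Vries' 2027 projection range

for twh in (AI_LOW_TWH, AI_HIGH_TWH):
    share = twh / GLOBAL_ELECTRICITY_TWH * 100
    print(f"{twh} TWh/year is about {share:.2f}% of global electricity use")
# 85 TWh/year is about 0.34% of global electricity use
# 134 TWh/year is about 0.54% of global electricity use
```

The upper end of the range lands near the 0.5 percent figure quoted above.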
De Vries' conclusions are based on estimates of how many AI servers will be produced in the coming years and how much power they draw in operation. He found that the energy use of AI servers is growing faster than AI efficiency is improving, a trend driven by increasingly complex AI models that demand more powerful hardware to run.
The research firm SemiAnalysis estimated that OpenAI uses 3,617 of NVIDIA's HGX A100 servers, containing a total of 28,936 graphics processing units (GPUs), to support ChatGPT, and that running the service on this hardware requires 564 megawatt-hours (MWh) of electricity per day.
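Those figures are internally consistent under some simple assumptions. The sketch below assumes 8 GPUs per HGX A100 server and a full-load draw of roughly 6.5 kW per 8-GPU system; both numbers are illustrative assumptions based on typical A100 server configurations, not values quoted by SemiAnalysis.

```python
# Rough consistency check of the SemiAnalysis figures quoted above.
# GPUS_PER_SERVER and POWER_PER_SERVER_KW are assumptions, not report values.
SERVERS = 3_617
GPUS_PER_SERVER = 8          # typical HGX A100 configuration
POWER_PER_SERVER_KW = 6.5    # assumed full-load draw of an 8x A100 system

gpus = SERVERS * GPUS_PER_SERVER
daily_mwh = SERVERS * POWER_PER_SERVER_KW * 24 / 1_000  # kW over 24 h -> MWh/day

print(f"GPUs: {gpus}")                        # 28936, matching the quoted total
print(f"Daily energy: {daily_mwh:.0f} MWh")   # ~564 MWh, matching the quoted figure
```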
Running ChatGPT now consumes more energy than was expended to develop it in the first place. This is primarily because the model is in constant use for text generation, language translation, and answering queries, and every user request costs energy to process and respond to.
The report also found that the energy consumption of AI varies depending on how it is used. Training an AI model is the most energy-intensive single phase, followed by running the model in production, although inference costs accumulate with every use and can eventually dominate. De Vries also notes that compute-heavy applications such as natural language processing and image recognition can consume significantly more energy than narrower tasks such as fraud detection or medical diagnosis.
This raises important questions about the sustainability of AI. If AI's energy consumption continues to grow at its current rate, it could put significant strain on electricity grids. Developing more energy-efficient AI hardware and software, and using AI in ways that minimize its energy footprint, will be essential.