According to one estimate, a single training run of ChatGPT may consume as much energy as 126 Danish households use in a year. While the market enthusiastically discusses how the artificial intelligence revolution will bring more productivity to the world, some warn that "raising" ChatGPT is far too resource-intensive.
Experts point out that, just as when cryptocurrency first appeared, everyone talked about the profits the new technology would bring while ignoring the huge energy consumption behind it. Artificial intelligence demands even more computing power than cryptocurrency, and generative AI requires enormous amounts of training to become "intelligent" and keep improving; that training represents huge resource consumption.
The reality is that people always want larger models and larger training sets, meaning that improvements in AI processing capacity will always be accompanied by massive resource consumption. Martin Bouchard, co-founder of the Canadian digital infrastructure company QScale, stated that each generative-AI query requires four to five times the computing power of a regular search-engine query, resulting in a significant increase in energy consumption.
According to his estimates, OpenAI's electricity consumption in January 2023 may have been equivalent to what 175,000 Danish households use in a year. Bouchard added that this is only a projection based on current model usage; if AI is applied more widely, its energy consumption could reach the equivalent of the electricity used by millions of people.
According to Sandra Wachter, an AI expert and professor at the Oxford Internet Institute, statistical data show that, globally, the climate impact of information and communication technology far exceeds that of the aviation industry. The computing power needed for AI alone increased by about 300,000 times between 2012 and 2018.
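The cited ~300,000-fold growth over six years can be sanity-checked with a quick calculation. This is only a sketch using the two numbers quoted above (300,000x and the 2012–2018 window); it computes the doubling time those numbers imply:

```python
import math

# Figures quoted in the article: ~300,000x growth from 2012 to 2018.
growth_factor = 300_000
years = 2018 - 2012

# How many doublings does a 300,000x increase represent?
doublings = math.log2(growth_factor)  # ~18.2 doublings

# Spread over 6 years (72 months), that is one doubling roughly every 4 months.
doubling_time_months = years * 12 / doublings

print(f"{doublings:.1f} doublings, i.e. one doubling every "
      f"{doubling_time_months:.1f} months")
```

A doubling time of about four months is far faster than Moore's law, which is the point the comparison is meant to dramatize.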
Wachter also noted that each training of ChatGPT may consume as much energy as 126 Danish households use in a year. This points to a potential downside of AI: once ChatGPT is commercialized at scale and occupies a critical position in economic life, keeping it running will entail tremendous water and energy consumption, and a power outage at OpenAI's computing center could have outsized consequences.
AI's demand for water is also vast: it needs energy to compute and water to cool its data centers for safe operation. According to a preprint paper from the University of California, Riverside and the University of Texas, training GPT-3, the predecessor of ChatGPT, consumed nearly 700,000 liters of water, enough to produce 370 BMW cars.
The research also indicates that freshwater is consumed continuously as ChatGPT converses with ordinary users. To put it in perspective, a conversation of roughly 25 to 50 questions consumes about 500 milliliters of water.
Given the AI's popularity, it could put significant strain on water supplies in the areas around its data centers. Research also indicates that OpenAI must provide cooling for its 10,000 GPUs and more than 285,000 processor cores, a process said to require enough energy to produce 320 Tesla car batteries.
Furthermore, the cooling water cannot be ordinary water; it must come from a clean freshwater source to avoid corrosion and bacterial growth. Experts worry that the existing figures are only estimates based on GPT-3, and that the upcoming GPT-4, trained on even more data, is expected to consume still more energy and water.
ChatGPT is starting to resemble a troublesome “imp” that requires significant resources to maintain while spouting nonsense.
From this perspective, for AI to truly take hold in the real world, companies must find ways to reduce its unbearable cost to humanity even as they push the technology forward.