
“People don’t suspect it”: Is ChatGPT an environmental bomb?

Massive consumption of energy, water, electronic components… Generative artificial intelligences such as ChatGPT or Midjourney could be environmental time bombs. But their impact remains largely hidden.

Just type a few words and ChatGPT can generate an entire novel or video game script for you in the blink of an eye. But behind this apparent magic trick lies a huge and extremely resource-hungry machine. Energy, water, metals: every phase of the life cycle of these generative AIs makes them potential sinks for energy and natural resources.

“People don’t necessarily suspect that AI can have an impact on the environment,” Sasha Luccioni, researcher and climate lead at Hugging Face, a central platform in current AI development, tells Tech&Co. “You don’t see a physical machine, it’s in the cloud… But there are consequences.”

Substantial emissions

The problem starts at the design stage. Before these programs can be used, they must be trained by having them analyze several billion examples (text, images, etc.). For the largest models, this means running thousands of graphics cards stacked in data centers for millions of hours.

That translates into considerable energy consumption: training GPT-3 (ChatGPT’s predecessor) consumed almost 1,300 MWh, according to one estimate. This phase alone would thus have released 550 tons of CO2 equivalent into the atmosphere.
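The two estimates quoted above can be cross-checked with a quick back-of-the-envelope calculation: together they imply a certain carbon intensity for the electricity used. A minimal sketch, using only the figures from the article (both are third-party estimates, not official OpenAI data):

```python
# Implied carbon intensity from the two estimates quoted above.
training_energy_mwh = 1_300   # estimated GPT-3 training consumption
emissions_tco2e = 550         # estimated training emissions (tons CO2e)

kwh = training_energy_mwh * 1_000
grams_per_kwh = emissions_tco2e * 1_000_000 / kwh
print(f"implied grid intensity: {grams_per_kwh:.0f} g CO2e/kWh")
```

The result, a little over 400 g CO2e per kWh, is in the range of a fossil-heavy electricity mix, which illustrates the article's later point that the same training run would weigh far less on a low-carbon grid.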

But this is not the only energy-intensive stage. “Like a television left on standby, data centers consume energy even when no training is running,” Sasha Luccioni, who took part in estimating the energy consumption of Bloom, another language model, tells Tech&Co.

Training Bloom would thus have consumed more than 433 MWh of electricity, to which another 256 MWh must be added simply to keep the hardware powered on standby outside the training phases. And the carbon impact of an AI can be even more disproportionate if the electricity used comes from fossil sources, such as coal or gas, rather than renewable or nuclear sources.
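The dependence on the power source can be made concrete with Bloom's own numbers. A minimal sketch, assuming two illustrative grid intensities (roughly 60 g CO2e/kWh for a nuclear- or hydro-heavy grid and 820 g CO2e/kWh for a coal-heavy one — approximate values, not from the article):

```python
# Bloom's total electricity (training + standby), per the estimates above.
training_mwh = 433
standby_mwh = 256
total_kwh = (training_mwh + standby_mwh) * 1_000

# Illustrative grid carbon intensities, in g CO2e per kWh (assumed).
intensities = {
    "low-carbon grid (nuclear/hydro)": 60,
    "coal-heavy grid": 820,
}
for grid, g_per_kwh in intensities.items():
    tonnes = total_kwh * g_per_kwh / 1_000_000
    print(f"{grid}: ~{tonnes:.0f} t CO2e")
```

Under these assumptions the same 689 MWh produces roughly an order of magnitude more emissions on a coal-heavy grid than on a low-carbon one, which is the disproportion the article describes.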

Impacts still very secret

Generative AIs not only consume a lot of energy: they also require a lot of water and strategic metals. Water is essential for cooling the data centers used to train and run the models, so much so that training GPT-3 would have consumed 700,000 liters of water, according to one outside estimate.

And manufacturing the thousands of graphics cards packed into these same data centers, calculating tirelessly, requires large quantities of metals dug out of the ground, along with, once again, a lot of water and electricity.

But how much, exactly? “It’s extremely difficult to estimate,” laments Sasha Luccioni. First, because each generative AI is designed differently. But above all because this data is shrouded in a thick veil of secrecy.

OpenAI, for example, has not published any impact study on its latest language models, even though they are the most widely used in the world. And the studies that do exist are not necessarily comparable with one another: some only analyze the electricity consumption of the training phase, while others try to account for every stage, down to the design of the electronic components. But here again, the impact of all these parts is not necessarily made public, forcing researchers to rely on complex estimates.

One thing is certain: since these tools opened to the general public and their use exploded, the environmental impact of generative AIs has skyrocketed. “Given the scale of the ChatGPT deployment, the use phase has clearly become the most energy-consuming,” says Sasha Luccioni. “You have to run multiple copies of the model in parallel to keep up with demand,” she adds.

And the trend is not reassuring. “We want to put generative AI everywhere, build bigger and bigger models… Companies use it as a communication argument,” says Sasha Luccioni. Enough to turn generative AIs into veritable energy sinkholes.

“No need to talk to your fridge”

Is it possible to reduce these impacts? “There are many technical levers: for example, improving the efficiency of the models to obtain the same results with less computing power,” suggests Sasha Luccioni. Optimizing data collection and storage, inference, data center cooling, the power sources used… There are many avenues.

A study by a Google team estimates, for example, that by applying most of these measures, the carbon impact of a language model could be divided by a factor of 1,000 (though this study accounts for neither the manufacturing of components nor the use phase). When asked about it, ChatGPT writes that OpenAI has committed to making its electricity consumption carbon neutral by 2025.

But will these possible improvements be enough to offset the current explosion in the number of generative models, their size, and their use? For Sasha Luccioni, one of the simplest solutions remains to question the usefulness of these technologies.

“I would rather have an AI that improves cancer diagnosis than a generative AI that lets me talk to my fridge or my search engine.” She concludes by calling for more transparency: “In any other sector, when we create a product that will have an impact, we have to provide information about its efficiency rating, its energy consumption… When people understand how energy-intensive these models are, it will help raise awareness.”

Author: Lucas Chagnon
Source: BFM TV
