
A bottle per conversation: ChatGPT is a sink for drinking fresh water

Generative AIs like ChatGPT are particularly water-intensive, US researchers conclude, warning of repercussions for the water supply of local populations.

A simple conversation with the ChatGPT chatbot is like emptying a bottle of water, while training its predecessor GPT-3 would have required enough water to fill a nuclear cooling tower. These are the preliminary findings of a study still at the pre-publication stage, titled “Making AI Less ‘Thirsty’,” conducted by the University of California, Riverside and the University of Texas at Arlington, and reported by Gizmodo.

Half a liter for 25 to 50 questions

OpenAI, which developed this AI, has not disclosed how long it took to train its language model. But Microsoft, which forged a multibillion-dollar partnership with the American startup, has explained that its latest supercomputer contains 10,000 graphics cards and more than 285,000 processor cores. Such a mass of hardware requires huge amounts of water to cool Microsoft’s US data centers, and three times as much if the company had trained the model in its Asia-based centers, which consume more energy.

The researchers estimate that training GPT-3 consumed around 700 cubic meters of water, while a session of 25 to 50 questions exchanged with the ChatGPT chatbot requires half a liter. Newer models, like GPT-4, would require even more water.
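To put these figures in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above; these are the researchers’ estimates as relayed by the article, not official OpenAI or Microsoft data.

```python
# Back-of-the-envelope arithmetic based only on the figures quoted in the article.

TRAINING_WATER_LITERS = 700 * 1_000   # 700 cubic meters estimated for GPT-3 training
WATER_PER_SESSION_LITERS = 0.5        # half a liter per 25-50 questions
QUESTIONS_PER_SESSION = (25, 50)      # range quoted by the researchers

# Water per individual question, at both ends of the quoted range
low = WATER_PER_SESSION_LITERS / max(QUESTIONS_PER_SESSION)   # 0.01 L
high = WATER_PER_SESSION_LITERS / min(QUESTIONS_PER_SESSION)  # 0.02 L
print(f"Water per question: {low * 1000:.0f}-{high * 1000:.0f} ml")  # 10-20 ml

# Number of half-liter sessions with the same footprint as training
sessions = TRAINING_WATER_LITERS / WATER_PER_SESSION_LITERS
print(f"Sessions equivalent to training GPT-3: {sessions:,.0f}")  # 1,400,000
```

By that arithmetic, every question costs roughly a mouthful of water, and it would take about 1.4 million half-liter conversations to match the training footprint.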

To keep data centers at a temperature of roughly 10 to 26 degrees Celsius, cooling towers evaporate cold water. This is “consumed” water, which cannot be recovered, as opposed to water that is merely “withdrawn” and later returned to circulation. The type of water matters too: it must come from a freshwater source. Data centers also consume an indirect volume of water: the water needed to generate the extra electricity they draw.
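To make the distinction concrete, here is a minimal sketch of how such a water footprint can be tallied. The split into a direct (on-site cooling) term and an indirect (electricity generation) term follows the article’s description, but the two coefficients below are hypothetical placeholders, not figures from the study.

```python
# Minimal sketch of data-center water accounting: direct water evaporated by
# on-site cooling towers plus indirect water consumed to generate electricity.
# The two coefficients are assumed values for illustration only.

def water_footprint_liters(energy_kwh: float,
                           wue_l_per_kwh: float = 1.8,    # on-site water use per kWh (assumed)
                           ewif_l_per_kwh: float = 3.1):  # water intensity of grid power (assumed)
    """Split a workload's freshwater footprint into direct and indirect parts."""
    direct = energy_kwh * wue_l_per_kwh     # consumed (evaporated), not merely withdrawn
    indirect = energy_kwh * ewif_l_per_kwh  # water used by the power plants supplying the grid
    return {"direct_l": direct, "indirect_l": indirect, "total_l": direct + indirect}

# Example: a 1 MWh workload
print(water_footprint_liters(1_000))
# {'direct_l': 1800.0, 'indirect_l': 3100.0, 'total_l': 4900.0}
```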

Training GPT-3 also released 502 tonnes of carbon emissions, and the energy it consumed could have powered an average American home for hundreds of years, according to a Stanford artificial intelligence study also cited by Gizmodo.

Tensions over water

OpenAI is far from the only company concerned. Google’s data centers, which power Google Search as well as its LaMDA and Bard language models, required more than 8.7 million cubic meters of water in 2019 in just three US states. LaMDA would use even more water, because the model is hosted in data centers located in particularly hot regions, such as Texas.

The research also points to the tensions that this type of generative AI could create around water consumption in the United States, where 44 million inhabitants have “inadequate” access to the water distribution network. Stanford University estimates that by 2071, nearly half of the nation’s freshwater basins will be unable to meet monthly demand. Rising temperatures and population growth have already caused a record drought in the American West, the likes of which has not been seen in over 1,000 years.

Given these conditions, the researchers recommend choosing data center locations more carefully and running training workloads during the coolest hours of the night. Chatbot users could likewise shift their conversations to hours when overall water consumption is lower, much as authorities already encourage for dishwashers and washing machines; a minimal sketch of such scheduling follows.
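As an illustration, here is what such water-aware scheduling could look like for a deferrable job. The hourly water-intensity profile is invented for the example; a real scheduler would use measured data from the data center.

```python
# Minimal sketch of water-aware scheduling: run a deferrable job at the hour
# of day with the lowest water intensity. The hourly profile is invented for
# illustration; cooler night hours evaporate less water per kWh.

import math

def best_hour(hourly_water_intensity: list[float]) -> int:
    """Return the hour (0-23) with the lowest water intensity."""
    return min(range(len(hourly_water_intensity)),
               key=hourly_water_intensity.__getitem__)

# Hypothetical profile peaking in the afternoon heat
profile = [1.2 + 0.6 * math.sin(math.pi * (h - 4) / 24) for h in range(24)]
print("Schedule the job at hour:", best_hour(profile))  # -> 0 (midnight)
```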

Author: Lucia Lequier
Source: BFM TV
