With the development of artificial intelligence technologies, their environmental impact has come under growing scrutiny. A study by the University of California, Riverside examined the environmental footprint of large language models, particularly GPT-4. According to the study, running AI systems requires significant amounts of water to cool data centers: generating even a 100-word text can cost up to three bottles of water. So where does all that water go? Here are the details…
A 100-word text from the AI model GPT-4 can cost up to 3 bottles (1,408 ml) of water
In the data centers where artificial intelligence systems run, large amounts of water are used to dissipate the heat generated by power-hungry servers and prevent overheating. However, water consumption can vary depending on the geography of the data center.
For example, producing 100 words may take 235 ml of water in a data center in Texas, but up to 1,408 ml in one in Washington state. The difference comes down to local climate, water resources, and energy costs.
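Those per-100-word figures scale with the length of a response. As a back-of-envelope sketch only (the two per-location values come from the article; the linear-scaling assumption and the function itself are illustrative, not from the study):

```python
# Back-of-envelope estimate of cooling-water cost per AI response.
# The two per-100-word values are the ones quoted in the article;
# assuming water use scales linearly with word count (a simplification).

ML_PER_100_WORDS = {
    "Texas": 235,        # ml of water per 100-word response
    "Washington": 1408,  # ml of water per 100-word response
}

def water_cost_ml(words: int, location: str) -> float:
    """Estimate water used for one response, assuming linear scaling."""
    return ML_PER_100_WORDS[location] * words / 100

print(water_cost_ml(100, "Washington"))  # 1408.0 ml, about 3 bottles
print(water_cost_ml(250, "Texas"))       # 587.5 ml
```

Even this rough sketch shows why geography matters: the same 100-word answer costs roughly six times more water in one location than the other.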
The energy consumption of AI technologies is just as significant a problem. For example, if only 10% of employees in the US used GPT-4 once a week, the system's energy consumption would be equivalent to the energy needs of Washington, DC for 20 days.
Meta, for its part, used 2.2 million liters of water to train the LLaMA-3 model. That is the amount of water needed to grow 2 tons of rice, or the annual water consumption of 164 Americans. In short, AI is putting serious pressure not only on water supplies but also on electricity consumption.
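The 164-Americans comparison implies a per-person figure that is easy to derive. Treating the article's two numbers as given (this is pure arithmetic on the quoted figures, not an independent estimate):

```python
# Quick arithmetic check of the article's LLaMA-3 comparison:
# 2.2 million liters of training water vs. the stated annual
# water consumption of 164 Americans (figures as quoted above).

training_water_l = 2_200_000
people = 164

per_person_per_year = training_water_l / people
print(round(per_person_per_year))            # 13415 liters per person per year
print(round(per_person_per_year / 365, 1))   # 36.8 liters per day
```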
Major technology companies such as OpenAI, Microsoft, and Google say they will take various measures and work to reduce water consumption. For now, however, no concrete solution or alternative method has been developed.