By OWOEYE TOLUWANIMI JANET
The systems required to power Google’s AI tools have significantly boosted the company’s greenhouse gas emissions. This issue has been growing behind the scenes as the company has hurried to integrate artificial intelligence into its core products, often with less-than-stellar results.
AI systems need vast numbers of computers to run. The data centers that house them, essentially warehouses filled with powerful computing equipment, consume enormous amounts of energy both to process data and to keep all that hardware from overheating.
According to the internet giant’s annual environmental report, the result has been a 48% increase in Google’s greenhouse gas emissions since 2019. The company attributed the growth mostly to “increased data center energy consumption and supply chain emissions.”
Google now describes its target of reaching net-zero emissions by 2030 as “extremely ambitious,” and says the pledge is likely to be affected by “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict.” Put another way, AI has complicated the sustainability drive of a company whose code of conduct once featured the phrase “don’t be evil.”
Like its rivals in the tech industry, Google has poured money into artificial intelligence (AI), widely predicted to be the next big technological revolution, one that will drastically alter the way we work, live, and consume media. Google CEO Sundar Pichai has described it as an “AI-first company,” and it has integrated its Gemini generative AI technology into several of its key products, including Search and Google Assistant.
However, AI comes with a significant drawback: the energy-intensive data centers that Google and its Big Tech competitors are spending tens of billions of dollars each quarter to build in pursuit of their AI ambitions.
The International Energy Agency estimates that an average Google search query uses 0.3 watt-hours of electricity, while an average ChatGPT request uses 2.9 watt-hours, nearly ten times as much. The gap shows just how much more demanding AI models are than conventional computing systems. According to a study published in October by Dutch researcher Alex de Vries, in a “worst-case scenario” in which AI is fully integrated into Google’s existing hardware and software, the company’s AI systems could one day consume as much electricity annually as Ireland.

“Since AI is becoming more and more integrated into our products, lowering emissions could prove difficult because of the rising energy requirements of AI computation and the emissions from our anticipated increases in technical infrastructure spending,” Google stated in the report, which was released on Monday. It added that growth in data center electricity use is currently outpacing the availability of carbon-free electricity sources.
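For a sense of scale, here is a rough back-of-envelope sketch based only on the IEA per-query averages cited above; the one-million-query volume is an arbitrary illustrative figure, not a number from the report, and actual consumption varies by model, hardware, and workload.

```python
# Back-of-envelope comparison using only the IEA per-query averages quoted above.
SEARCH_WH_PER_QUERY = 0.3   # average Google search query (IEA estimate)
AI_WH_PER_QUERY = 2.9       # average ChatGPT request (IEA estimate)

# Ratio of per-query energy use: roughly 9.7x.
ratio = AI_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"An AI request draws roughly {ratio:.1f}x the electricity of a search query")

# Illustrative scale-up: 1 million queries is an arbitrary round number,
# not a figure from the article or from Google.
queries = 1_000_000
extra_kwh = (AI_WH_PER_QUERY - SEARCH_WH_PER_QUERY) * queries / 1000
print(f"Serving {queries:,} queries with an AI model instead of conventional "
      f"search would use about {extra_kwh:,.0f} extra kWh")
```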
Google plans to invest in renewable energy sources such as wind and geothermal power to run its data centers, but the company expects its overall greenhouse gas emissions to climb before they begin to decline.
Another sustainability concern is the large volume of water used as coolant to keep data centers from overheating. Google has pledged to restore 120% of the freshwater its offices and data centers consume by 2030. As of last year, however, it had restored only 18% of that water, although that was a significant increase from 6% the year before.