As researchers scrutinize Big Tech's utility bills, artificial intelligence has earned a reputation as a thirsty, energy-hungry beast. A single Q&A session with a large language model (LLM) can consume more than a half-liter of fresh water to cool servers. Asking ChatGPT one question reportedly consumes 10 times as much electricity as a conventional Google search. And generating an image uses about as much energy as charging a smartphone. But should we worry about it?

Five years ago, there was a similar panic over streaming video. A frenzy followed a French think tank's estimate that streaming a half-hour Netflix show generated as much CO2 as driving almost four miles. Globally, this implied, the streaming giant was consuming enough electricity annually to power Britain in service of shows like "Tiger King." But those estimates turned out to rest on faulty assumptions about the energy use of data centers and streaming video. The actual emissions, former International Energy Agency analyst George Kamiya calculated, were 25 to 53 times lower.

As AI commandeers more of our digital lives, it's worth asking again: Is this technology as voracious as we fear? And if so, what can we do about it?

So I teamed up with decarbonization analytics company Planet FWD. We analyzed the emissions associated with our digital lives, and the role AI is playing as we pelt it with questions billions of times per day.

Individuals asking LLMs questions, our data suggests, is not the problem. Text responses don't consume much energy. But AI's integration into almost everything, from customer service calls to algorithmic "bosses" to warfare, is fueling enormous demand. Despite dramatic efficiency improvements, pouring those gains back into bigger, hungrier models powered by fossil fuels will create the energy monster we imagine.

Read the full column. Write me at climatecoach@washpost.com. I read all your emails.
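To give the column's scale comparisons some rough shape, here is a back-of-envelope sketch. The per-search figure (~0.3 Wh) and the query volume are illustrative assumptions on my part, not numbers from the column; only the "10 times" multiplier comes from the text above.

```python
# Back-of-envelope: daily electricity for LLM queries vs. conventional search.
# GOOGLE_SEARCH_WH and QUERIES_PER_DAY are illustrative assumptions,
# not figures reported in the column.

GOOGLE_SEARCH_WH = 0.3   # assumed energy per conventional search, in watt-hours
LLM_MULTIPLIER = 10      # the column's "10 times as much electricity" claim
QUERIES_PER_DAY = 1e9    # assumed scale for "billions of times per day"

llm_query_wh = GOOGLE_SEARCH_WH * LLM_MULTIPLIER
daily_mwh = llm_query_wh * QUERIES_PER_DAY / 1e6  # convert Wh to MWh

print(f"Assumed energy per LLM query: {llm_query_wh:.1f} Wh")
print(f"Daily total at {QUERIES_PER_DAY:.0e} queries: {daily_mwh:,.0f} MWh")
```

Even under these rough assumptions, per-query energy stays tiny; the totals only become significant at the scale of billions of queries, which is consistent with the column's conclusion that individual questions are not the problem.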