When we talk about AI, we tend to talk about the impact on human tasks—making them easier, giving us superpowers, enabling better insights and results. But that superpower comes at an environmental cost, and it’s not something we generally think about.

A Gen Z influencer who posts on TikTok as @nakitadumptruck sums it up well in her post, “’Why AI is destroying the environment’ in 60 seconds” [edited for brevity]:

When you enter a prompt in ChatGPT, it goes through thousands of calculations to figure out the best possible answer. These calculations are run in data centers. These data centers heat up because they are going through so many calculations. To keep the servers from overheating, they need water [for cooling]. Water transfers the heat generated by data centers into cooling towers to help it escape the building (kind of like how you release sweat to keep cool). Depending on where the data center is based, it uses a different amount of water [or electricity to drive the AC for cooling]. In West Des Moines, Microsoft uses 6% of the water in that town. In Oregon, Google uses 25% of the water in their town. [Shakes a 16-oz water bottle]. ChatGPT uses this much water every time you write a 100-word email. So stop using it.

It Takes Power

Does she have a point? According to Bo Yang, Ph.D., vice president, Energy Solutions Lab, R&D Division, Hitachi America, Ltd., she does. In her article on the Hitachi website, under the Social Innovation tab, she notes the following:

  • AI data center racks are estimated to require seven times more power than traditional data center racks.
  • Goldman Sachs estimates there will be a 160% increase in demand for power propelled by AI applications by the end of this decade.
  • At 2.9 watt-hours per ChatGPT request, AI queries are estimated to require ten times the electricity of traditional Google queries, which use about 0.3 watt-hours apiece.
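The per-query comparison in that last bullet is easy to check with a quick back-of-the-envelope calculation. This sketch uses only the two figures cited above (2.9 and 0.3 watt-hours); everything else is simple arithmetic:

```python
# Back-of-the-envelope check of the per-query energy figures cited above.
CHATGPT_WH_PER_QUERY = 2.9  # watt-hours per ChatGPT request (cited estimate)
GOOGLE_WH_PER_QUERY = 0.3   # watt-hours per traditional Google query (cited estimate)

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"A ChatGPT request uses roughly {ratio:.1f}x the energy of a Google query")
# 2.9 / 0.3 ≈ 9.7, which rounds to the "ten times" figure in the bullet above
```

The exact ratio is closer to 9.7 than 10, but at this level of estimation the round number is the honest one.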

With so many companies investing in AI, both to expand existing uses and to find new ones (mostly the former), these energy demands aren’t going to slow. In fact, Morgan Stanley Research estimates that GenAI’s power demands will soar 70% each year.

Yang adds that power grids already account for 2.5% to 3.7% of global greenhouse gas emissions (higher than the aviation industry): “It’s estimated that the daily carbon footprint of GPT-3 adds up to roughly the equivalent of 8.4 tons of CO2 in an entire year. Some industry estimates suggest that GPT-4, OpenAI's latest version of its Large Language Model, has 1.76 trillion parameters. That would make it ten times bigger than GPT-3 with its 175 billion parameters.”
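The size comparison in that quote can also be verified directly. Both parameter counts (1.76 trillion and 175 billion) come from the article; the division is the only thing added here:

```python
# Verifying the "ten times bigger" claim from the quoted parameter counts.
GPT4_PARAMS = 1.76e12  # 1.76 trillion parameters (industry estimate cited above)
GPT3_PARAMS = 175e9    # 175 billion parameters

print(f"GPT-4 is roughly {GPT4_PARAMS / GPT3_PARAMS:.1f}x the size of GPT-3")
# 1.76 trillion / 175 billion ≈ 10.1, matching the "ten times bigger" claim
```

Parameter count is only a rough proxy for energy use, but the arithmetic behind the quote holds up.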

The Flip Side of the Argument

This is the dark environmental side of AI. But as Undark, the online magazine spotlighting the intersection of technology and culture, notes, there are arguments on the other side, as well.

In Undark’s article “The Growing Environmental Footprint of Generative AI,” it notes that even as AI is being embedded into “everything from resume-writing to kidney transplant medicine and from choosing dog food to climate modeling,” tech companies cite the many ways AI could help reduce humanity’s environmental footprint too.

Of course, part of the problem is that while we know the broad impact of server farms, AI’s share is not really broken out separately. Without clear, accurate numbers, we’re all taking a stab in the dark.

What we do know is that we aren’t really paying attention to the environmental aspect of AI. Even though the full impact is obscured, it’s still something we can keep in mind. Before we hit “generate” on that query, we might do well to ask ourselves, “Is this something we really need AI to do? Or should we use human intelligence this time?”

It could be that our planet is counting on a little restraint.