
I Don't Think You Understand How Much Power AI is Using

  • Writer: Mary Dippolito
  • 6 days ago
  • 3 min read
Meta's Eagle Mountain Data Center in Utah

ChatGPT, GrokAI, Google Gemini: AI models are sweeping college campuses, leaving both students and educators divided on their ethics. "Generative" use of AI, having it write text or create photos and art for you, is the easiest to condemn as unethical. But for brainstorming, grammar-checking, or breaking down an argument you feed it, some professors agree there are "ethical" ways to use AI models as a tool. However you see it, whatever uses you find "ethical," there's one big problem: the sheer amount of energy required to run these things.


A few semesters ago, I used ChatGPT for a few queries, presenting an argument I had written for class and asking for additional points I could research, and I felt this was an acceptable way to use it. But once I learned the cost, I couldn't help feeling that every time I asked it what to do with leftover zucchini, twelve trees died somewhere in Brazil. In The Dalles, Oregon, Google recently fought to keep its water records secret as the city's water use has tripled in the past five years, and the company's data centers now consume more than a quarter of all the water in the city. In Utah, the drought-prone Great Salt Lake is drying up faster than ever, not far from the Eagle Mountain Data Center, where residents are being urged to conserve water and electricity while data centers like this one drain the power grid at rates we've never seen before.


Effects like this can be hard to see, and when we can't see something firsthand, a gap naturally opens between what we're hearing and how deeply we feel it. That's a normal, human reaction to figures that are hard to picture. But we will feel the effects, and we're already feeling them now, in the form of our electricity bills. According to the Northeast Ohio Public Energy Council, the average household paid 10-15% more on their electricity bills this summer due to an imbalance between energy supply and demand (WFMJ).


One study from 2024 estimates that a single ChatGPT query uses ten times the energy of a Google search. Search seems to be one of the most popular ways to use it; I have friends who argue it's simply "better" than Googling something. Yet whenever I hear what the query was, a few extra moments of reading through the search results turns up what my friend was looking for within 30 seconds. Sometimes I even find the exact result the AI pulled its answer from, and notice it's something like a Reddit post or another unreliable source. If it takes me 30 seconds to find the same answer, or a more correct one, and costs the world 90% less in resources, I'm going to keep doing that.


The energy AI consumes isn't a "someday, maybe" scenario; it's happening now. People living around these data centers are already being told to use less energy, keep their lights off, and watch their water usage, and they suffer electricity prices that will keep spiking, all while these tech giants' megacenters use unprecedented amounts of energy and water. Next time you think of using it for a simple query, consider whether it's something you can Google instead; the extra time isn't much. Consider asking a friend to proofread your work or talk through ideas with you. It's old-school, but minimizing our use of AI conserves more resources than any amount of keeping the lights low or turning off the faucet while we brush our teeth.

