Following the money in AI-land

The Skinny | By Joan Zwagerman

If you’ve ever watched “Better Call Saul” (a great show, by the way), you’ll remember that the main character, Jimmy McGill, aka Saul Goodman, had a brother, Charles “Chuck” McGill.

Chuck was a brilliant lawyer but left a lucrative practice because he was suffering from electromagnetic hypersensitivity (EHS), a syndrome (real or imagined) where the person suffers anxiety, headaches, fatigue, even rashes from exposure to electromagnetic fields.

Modern life would be unbearable for such a sufferer. In the show, Chuck became a recluse and disconnected himself from the electrical grid, using an ice chest to cool food, a camping stove to cook, and a camping lantern for light. He used a manual typewriter when he felt well enough to work.

Having watched the show in its entirety, and now rewatching it, I began to realize that Chuck must have saved a buttload of money on utilities. That is, of course, completely beside the point, but it assists the point I'm trying to make, and it is no small thing.

The cost of our utilities (water, electricity, and gas) keeps rising and will continue to do so, but few of us are drilling wells, chopping wood, or reading by camping lantern.

Instead, we may set the A/C to 74 or 76, and if that doesn't yield enough savings, we may do without small luxuries, like Starbucks.

Sure, we will remind family members to turn off the lights when they leave the room, but few of us will stop using computers or handheld devices. The cost to power those is hidden in the overall electric bill. Running a laptop doesn’t cost that much, anyway. The real cost comes from how we store our digital stuff. Most of us rely on the cloud to save our photos and documents and spreadsheets.

And the cloud is, of course, not some nebulous entity, but banks and rows and racks of servers, whirring like synapses in some giant brain, requiring not just electricity to keep the servers running, but massive amounts of power to cool them, to light the facility, and to power the backup and security systems.

Years ago, I had a chance to visit a Microsoft office in the Seattle area; we knew one of the employees, and he allowed us to visit. Better yet, he gave a tour of their server room. Now, this was circa 2012, and cloud computing had not yet taken over at this office, so onsite server rooms like the one we saw were still standard.

Normally we would have had to put fabric booties over our shoes, and the floor in the room gleamed. I’ve never seen such a clean floor. The young man who swiped us in was a recent University of Northern Iowa graduate, and he oversaw this area.

As a digital immigrant, this was a jaw-dropping moment, but for my son, a digital native, it was no less amazing. Our host intimated, though, that change was coming, and that the young man in the server room would do well to keep abreast of coming changes.

Later, I realized our friend was alluding to the rise of data centers, which are constructed to meet the demands of cloud computing. They are basically mondo server rooms. IT people will surely be rolling their eyes at this description because data centers are much more than that, but bear with me.

I make no claim to be an IT expert. This is all just a platform for asking, of all of us: do we really understand the enormous energy demands of our hyperconnected world?

I struggle to grasp the size of a data center, but a fresher hell is to consider what AI is doing to this whole enterprise.

Let’s set aside ethical concerns about AI for a moment and think like businesspeople. Let’s follow the money.

Goldman Sachs Research says: “On average, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search. In that difference lies a coming sea change in how the US, Europe, and the world at large will consume power — and how much that will cost.”

Kanoppi.co, a company that helps WordPress website owners monitor their carbon footprint, agrees:

  • Google Search: Uses 0.0003 kWh per query, emitting 0.2g CO2.
  • ChatGPT: Uses 0.0029 kWh per query (10x more energy), emitting 68g CO2.
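For readers who like to check the math, the "10x" claim can be verified with a quick back-of-the-envelope calculation. The per-query figures below are the Kanoppi.co estimates quoted above; the daily query volume is a hypothetical round number chosen for illustration, not a measured statistic:

```python
# Back-of-the-envelope comparison of search vs. chatbot energy use.
# Per-query figures are the Kanoppi.co estimates quoted in the column;
# queries_per_day is a hypothetical volume for illustration only.

GOOGLE_KWH_PER_QUERY = 0.0003
CHATGPT_KWH_PER_QUERY = 0.0029

queries_per_day = 1_000_000  # hypothetical round number

google_kwh = GOOGLE_KWH_PER_QUERY * queries_per_day
chatgpt_kwh = CHATGPT_KWH_PER_QUERY * queries_per_day

print(f"Google:  {google_kwh:,.0f} kWh/day")   # 300 kWh/day
print(f"ChatGPT: {chatgpt_kwh:,.0f} kWh/day")  # 2,900 kWh/day
print(f"Ratio:   {chatgpt_kwh / google_kwh:.1f}x")  # 9.7x
```

At a million queries a day, the gap between 300 kWh and 2,900 kWh is the difference between a handful of households and a small neighborhood, and real query volumes run into the billions.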

In July 2024, National Public Radio reported that Google “said its greenhouse gas emissions rose last year by 48% since 2019. It attributed that surge to its data center energy consumption and supply chain emissions.”

In May 2025, NPR reported that a 2024 report by Lawrence Berkeley National Laboratory forecasted “that by 2028, U.S. data centers could consume as much as 12% of the nation’s electricity.”

I work in higher education, and educators are grappling with the ethical implications of AI — or trying to smooth them over — because it’s a flood that no one sandbagged for, as is the case with most technological breakthroughs.

Using generative AI based on large language models indiscriminately is bad for the environment. It's tempting to see ChatGPT and its ilk as the latest whizz-bang, but it carries a hefty price tag, to say nothing of how it will affect learning and the health of our educational system.

Our young people are worried about the health of the planet. So why don't we address with students the actual costs of AI and its reliance on data centers? We need to talk a lot more about both.

This is a teachable moment. We should seize it.

Joan Zwagerman is angered by the “AI Overview” that now appears as the first result in a Google search.
