Why Google Wants You to Know the Environmental Cost of AI Queries
WSJ Tech News Briefing · August 29, 2025 · 00:12:23

If a regular web search isn't doing it for you, and even a generative artificial intelligence chatbot response leaves you wanting more, you could try "deep research." Our personal tech columnist tried it out and breaks down how it works. But those queries come at an environmental cost: in a new report, Google details how much energy a single query uses. Julie Chang hosts.

Sign up for the WSJ's free Technology newsletter.


[00:00:33] Welcome to Tech News Briefing. It's Friday, August 29th. I'm Julie Chang for The Wall Street Journal. If a regular web search isn't doing it for you, and even a generative artificial intelligence chatbot response leaves you wanting more, you could try deep research. Our personal tech columnist did, and we'll break down how it works. But those queries come at a cost to the environment.

[00:00:58] Google has released a new report detailing how much energy a single query uses. But first, earlier this year, ChatGPT and other popular AI chatbots introduced a feature called Deep Research. When activated, the AI goes beyond basic chat. It examines more sources and compiles a more thorough response. But it also takes more time.

[00:01:24] Our personal tech columnist, Nicole Nguyen, has been experimenting with deep research and comparing it to a regular web search. Nicole discussed what she learned with the WSJ's Liz Young. Nicole, ChatGPT and other popular AI chatbots have rolled out what they call advanced research modes that go beyond basic chat. What is deep research? Deep research is a toggle that's shown up on Gemini, ChatGPT, Perplexity, Claude, Grok, etc.

[00:01:52] The world's most popular chatbots now all have a research toggle next to the AI chatbot box. And it triggers this very thorough and intense process that the AI goes through. So you have to wait a little longer, about 10 minutes, 20 minutes, sometimes 30 minutes. So it's not instantaneous like basic chat. And what happens is the bot will plunge into the depths of the internet.

[00:02:18] It will analyze thousands of words from its search and repeat that process again and again and again until it's satisfied. And so the difference between a basic chat response and a deep research response is that the response is much more thorough. It's rooted in web search. It's cited. And it's probably a lot more words than your basic chat response. I will say that it's more expensive for them to produce that response because it takes more time.

[00:02:46] And that's why most people are limited in the deep research queries that they can ask per day. How does it differ from using Google? It's like using Google many times over. So if you have a query with a very complicated decision matrix, so I'll give mine as an example. I'm looking for an electric vehicle. It has to fit a certain type of car seat in the back seat. I want the wheelbase to be shorter than my current model. I'm not going to bore you with the details, but it's a lot of them.

[00:03:14] I can feed all of that into the AI chatbot. And then it can sift through many different kinds of Google searches. So it's not like one Google search that looks for different wheelbase types. It looks for message boards where parents are anecdotally talking about fitting a car seat in the back of certain car models. And then it collates all of that information and puts it together in one long report or a table or whatever format you specify.

[00:03:43] How much time do you think that that saved you? Definitely hours. This is also because it's a part of my job, but I go on a lot of Google rabbit hole search quest type things. And a query like this would probably take me days. You know, when I'm thinking about the perfect place to go to a vacation with my family, I spend hours during the week just trying to find the ideal place.

[00:04:05] And this deep research tool allows me to plop out a prompt, click start query, and then I can make my coffee and come back. And in 10 minutes, it's done. This all sounds so magical on its surface, but what did you think of the reports that AI pulled together? What was the outcome and what were the takeaways that you were able to gather from it? So the results definitely varied.

[00:04:29] Over the course of my deep research, I was able to perfect the way that I prompt the AI chatbot for best results. The beauty of living in this era is that a lot of these tools are now available to us for free, and it probably won't be that way forever. And so what you can do is hit deep research on many different types of AI chatbots and compare notes. How does Perplexity differ from Gemini, from ChatGPT?

[00:04:55] And that's a really good way to kick off any type of complicated search. That was the WSJ's Liz Young speaking with personal tech columnist Nicole Nguyen. Coming up, what's the environmental cost of using Gen AI chatbots? One tech giant says a single query takes the same amount of energy as watching nine seconds of TV. More on that after the break.


[00:05:56] Google has released a new report on how much energy it takes to ask its Gen AI chatbot Gemini a question. The tech giant hopes that it'll encourage other tech companies to disclose the energy demands of their AI models in more detail. Google also appears to be looking to ease brewing anxieties that frequently using generative AI such as Gemini can be detrimental to the environment. The report notes that its Gemini energy figures come in lower than public estimates.

[00:06:24] Clara Hudson is a WSJ Pro sustainable business reporter and she joins me now. So Clara, tell us more about this report that Google released. So the report says that when you give Gemini a prompt, it takes the same amount of energy as watching nine seconds of TV. And it also consumes about five drops of water.
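As a rough back-of-envelope check on that comparison, those two figures can be converted into standard units. The TV wattage and drop volume below are assumptions for illustration, not numbers from the report:

```python
# Converting "nine seconds of TV" and "five drops of water" into
# watt-hours and milliliters, under stated assumptions.
TV_WATTS = 100          # assumed power draw of a modern TV (not from the report)
SECONDS_OF_TV = 9       # figure cited in the report
DROP_ML = 0.05          # assumed volume of one drop of water
DROPS = 5               # figure cited in the report

energy_wh = TV_WATTS * SECONDS_OF_TV / 3600   # watt-seconds -> watt-hours
water_ml = DROPS * DROP_ML

print(f"~{energy_wh:.2f} Wh and ~{water_ml:.2f} mL per prompt")
```

Under those assumptions, a prompt works out to roughly a quarter of a watt-hour and a quarter of a milliliter of water.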

[00:06:47] But what will likely be quite helpful to others in the tech industry is that Google went into detail about how it got to those numbers, as well as the formulas that they used for their research. Tell us more about the water use part. I feel like water isn't something commonly thought of when discussing the use of AI.

[00:07:08] Yeah, so it's interesting that Google did talk about its water use because most of the conversation about AI and energy has been typically focused on the electricity that it needs. But data centers use a huge amount of water to cool down all of the electrical equipment that powers AI. So this report might encourage more companies to talk about their water use.

[00:07:35] And it could also help people who use chatbots really frequently and weren't even aware that there is this water cost to asking their questions. So how does this report or the numbers presented in it by Google compare with other companies? So there isn't a lot of transparency about how much energy chatbots use across the tech industry. And where we do have those numbers, it depends on how the company calculates it.

[00:08:04] So the results could really vary depending on what you ask a chatbot to do. It could be churning out a short amount of text or a really lengthy answer. And then there are all kinds of other variables to account for, like images or video too.

[00:08:24] OpenAI CEO Sam Altman said recently in a blog post that the average ChatGPT query uses about the same amount of energy an oven would use in one second, and about one-fifteenth of a teaspoon of water. But he actually didn't include how he got to those numbers. Back to the report on Google. Does the tech giant feel that it's meeting its environmental goals then?
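Altman's comparisons can be put into the same units as Google's for a rough side-by-side. The oven wattage here is an assumption, not a figure from the blog post:

```python
# Converting "one oven-second" and "one-fifteenth of a teaspoon"
# into watt-hours and milliliters, under stated assumptions.
OVEN_WATTS = 2400        # assumed draw of an electric oven (not from the post)
TEASPOON_ML = 4.93       # volume of a US teaspoon

energy_wh = OVEN_WATTS * 1 / 3600     # one second of oven use, in watt-hours
water_ml = TEASPOON_ML / 15           # one-fifteenth of a teaspoon

print(f"~{energy_wh:.2f} Wh and ~{water_ml:.2f} mL per query")
```

On those assumptions, Altman's per-query figures land in the same ballpark as Google's: a fraction of a watt-hour and well under a milliliter of water.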

[00:08:52] It's a bit of a complicated answer because tech companies are pinning their business ambitions on AI. But they also have environmental goals that they've committed to meeting. So Google reported earlier this summer that its emissions jumped 51 percent since 2019 because of its AI needs.

[00:09:17] But then its Gemini report from last week said it's more efficient now than it was a year ago. So what can users do if they want to lower their energy use when they use these AI chatbots? So if you're someone who turns to chatbots a lot, you can tailor how you use them to consume less energy.

[00:09:40] So if you ask it shorter questions and keep things simple, it's ultimately better for the environment. But if you zoom out, a lot of users asking a lot of questions is going to require a lot of energy. So asking a few questions isn't necessarily the biggest deal.

[00:10:03] But if millions of people are asking dozens of questions every day, then that's really going to add up. And are the companies themselves doing anything to lower their energy use or have a lower environmental impact? Yeah, so it's not hopeless. There are ways to save energy. So you can slow down an AI response even by just a few milliseconds if it's not an urgent question.
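That scaling argument can be made concrete. All the inputs below are illustrative assumptions (the per-prompt figure follows from the nine-seconds-of-TV comparison at an assumed ~100 W; the user and prompt counts are hypothetical):

```python
# Scaling a small per-prompt cost across many users and many prompts.
WH_PER_PROMPT = 0.25          # assumed energy per prompt, in watt-hours
USERS = 50_000_000            # hypothetical daily users
PROMPTS_PER_USER = 24         # hypothetical "dozens of questions" per day

daily_mwh = USERS * PROMPTS_PER_USER * WH_PER_PROMPT / 1_000_000
print(f"~{daily_mwh:,.0f} MWh per day")
```

Even with a tiny per-prompt cost, those hypothetical numbers add up to hundreds of megawatt-hours a day, which is the point the reporter is making.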

[00:10:30] There are ways to better manage water use also, like using recycled water or water that isn't safe to drink. And then there are clean energy options too. So tech companies are investing in new renewable energy contracts to use hydropower, for example.

[00:10:53] But overall, that's still going to be a pretty small percentage of what is powering AI more broadly. That was WSJ Pro Sustainable Business Reporter, Clara Hudson. And that's it for Tech News Briefing. Today's show was produced by me, Julie Chang. Jessica Fenton and Michael LaValle wrote our theme music. Our supervising producer is Melanie Roy. Our development producer is Aisha Al-Muslim. Chris Zinsley is the deputy editor.

[00:11:22] And Falana Patterson is the Wall Street Journal's head of news audio. We'll be back later this morning with TNB Tech Minute. Thanks for listening.