The Hidden Watts: How Much Energy Does a Single AI Query Use?
Generative AI tools like large language models (LLMs) have become part of everyday life, and their energy use is actively discussed across news platforms and on social media. People worry about rising energy bills and even potential shortages. That naturally raises the question: how much energy does a single AI query actually use?
According to Google, the Gemini model requires about 0.24 watt-hours of electricity per query. For comparison, that is roughly the energy it takes to power a standard microwave for one second. Similarly, OpenAI CEO Sam Altman has stated that an average ChatGPT query consumes about as much energy as an oven running for a little over one second. To put this in perspective, one ChatGPT query uses approximately 10 times as much electricity as a standard Google search.
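The microwave comparison can be checked with simple unit arithmetic. The sketch below uses only the 0.24 Wh figure cited above; the 864 W result is an implied appliance power derived from it, not a sourced number, and it does land in the typical range for a household microwave:

```python
# Sanity check: delivering 0.24 Wh in one second implies an appliance
# power of 0.24 Wh * 3600 s/h / 1 s = 864 W, consistent with a
# standard microwave (roughly 700-1000 W).

WH_PER_GEMINI_QUERY = 0.24  # Google's reported figure, in watt-hours

def implied_power_watts(watt_hours: float, seconds: float) -> float:
    """Power (W) of an appliance that uses `watt_hours` Wh in `seconds` s."""
    return watt_hours * 3600 / seconds

print(round(implied_power_watts(WH_PER_GEMINI_QUERY, 1)))  # 864
```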
The actual energy consumption depends on several factors:
Type of query: More complex requests, such as generating travel itineraries, creating images or videos, or designing presentations, demand significantly more computational power than straightforward text or keyword searches.
Model size: Larger models consume substantially more energy than smaller ones for the same query.
Software and hardware: The efficiency of the software and the type of chips used to process data strongly influence energy use.
Query content: Even seemingly minor additions like polite words ("please," "thank you") increase the processing load, because the model must analyze the extra input. Likewise, keywords such as "analyze," "explain," and "justify" prompt longer responses and therefore more computation.
Infrastructure: Cooling systems and hardware design impact overall energy consumption.
Large language models are inherently more computationally demanding than traditional search engines. While AI offers powerful capabilities, it also carries a higher energy cost per interaction. Since a single generative AI query can use up to 10 times more energy than a standard search engine query, more research is needed on optimizing the parameters that affect energy consumption in AI systems.
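A back-of-the-envelope sketch shows how that 10x gap compounds at scale. The per-query figures below come from the numbers cited earlier in the article; the daily query volume is a purely hypothetical assumption chosen only for illustration:

```python
# Scaling sketch: per-query figures from the article; the query volume
# is a HYPOTHETICAL assumption, not a real traffic statistic.

WH_PER_AI_QUERY = 0.24                 # Gemini figure cited above
WH_PER_SEARCH = WH_PER_AI_QUERY / 10   # from the "10x a search" comparison
QUERIES_PER_DAY = 1_000_000            # hypothetical volume for illustration

ai_kwh = WH_PER_AI_QUERY * QUERIES_PER_DAY / 1000       # Wh -> kWh
search_kwh = WH_PER_SEARCH * QUERIES_PER_DAY / 1000     # Wh -> kWh
print(f"AI: {ai_kwh:.0f} kWh/day vs. search: {search_kwh:.0f} kWh/day")
# Under these assumptions: 240 kWh/day for AI vs. 24 kWh/day for search
```

The absolute numbers here are illustrative, but the ratio is fixed by the 10x claim: whatever the real volume, the same workload served by generative AI draws an order of magnitude more electricity than keyword search.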
In conclusion, as AI continues to become part of our daily lives, understanding and managing its energy footprint is essential. Innovations in hardware, software efficiency, and query optimization will be key to balancing AI's benefits with sustainable energy use.