Vietnam.vn - A platform for promoting Vietnam

Do AI chatbots consume as much electricity as rumored?

AI chatbots are booming with hundreds of millions of daily users, but behind that convenience lies enormous power consumption, raising concerns about sustainability.

VTC News, 19/09/2025

In just the last few years, ChatGPT has exploded in popularity, with nearly 200 million users submitting over a billion requests every day. Those responses, which seem to materialize out of thin air, actually consume a huge amount of energy behind the scenes.

In 2023, data centers—where AI is trained and operated—accounted for 4.4% of electricity consumption in the US. Globally, this figure was around 1.5% of total electricity demand. It is projected that consumption will double by 2030 as the demand for AI continues to escalate.

“Just three years ago, we didn’t even have ChatGPT,” said Alex de Vries-Gao, a researcher on the sustainability of new technologies at Vrije Universiteit Amsterdam and founder of Digiconomist, a platform that analyzes the unintended consequences of digital trends. “And now we’re talking about a technology that could potentially account for nearly half of the electricity consumed by data centers worldwide.”

Asking a question to a large language model (LLM) consumes about 10 times more electricity than a typical Google search. (Image: Qi Yang/Getty Images)

What makes AI chatbots so energy-intensive? The answer lies in their enormous scale. According to computer science professor Mosharaf Chowdhury at the University of Michigan, there are two particularly power-hungry phases: the training process and the inference process.

"However, the problem is that today's models are so large that they can't run on a single GPU, let alone fit into a single server," Professor Mosharaf Chowdhury explained to Live Science.

To illustrate the scale, a 2023 study by de Vries-Gao showed that an Nvidia DGX A100 server can draw up to 6.5 kilowatts of power. Training an LLM typically requires multiple servers, each with an average of 8 GPUs, running continuously for weeks or even months. In total, the electricity consumption is enormous: OpenAI's GPT-4 training alone consumed 50 gigawatt-hours, equivalent to enough electricity to power all of San Francisco for three days.
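The training-energy arithmetic above can be sketched with a back-of-envelope calculation. The 6.5 kW per DGX A100 server comes from the article; the fleet size and training duration below are hypothetical round numbers chosen purely for illustration, not GPT-4's actual configuration.

```python
# Back-of-envelope estimate of LLM training energy:
# total energy = server power x number of servers x hours of training.

SERVER_POWER_KW = 6.5   # one Nvidia DGX A100 server at full load (from the article)
NUM_SERVERS = 1_000     # ASSUMED fleet size, for illustration only
TRAINING_DAYS = 90      # ASSUMED duration ("weeks or even months")

energy_kwh = SERVER_POWER_KW * NUM_SERVERS * TRAINING_DAYS * 24
energy_gwh = energy_kwh / 1_000_000  # kWh -> GWh

print(f"{energy_gwh:.2f} GWh")  # 14.04 GWh under these assumptions
```

Even this modest hypothetical fleet lands in the tens of gigawatt-hours, which makes the reported 50 GWh for GPT-4 plausible in scale.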

OpenAI's GPT-4 training process was sufficient to power all of San Francisco for three days. (Image: Jaap Arriens/NurPhoto/Rex/Shutterstock)

The inference process is also quite energy-intensive. This is when the AI chatbot uses its learned knowledge to provide answers to the user. Although inference requires fewer computational resources than the training phase, it is still extremely power-hungry due to the sheer volume of requests sent to the chatbot.

As of July 2025, OpenAI estimates that ChatGPT users send over 2.5 billion requests daily. To respond instantly, the system must mobilize many servers operating simultaneously. And that's just ChatGPT alone; it doesn't include other platforms that are also becoming widely popular, such as Google's Gemini, which is expected to soon become the default choice when users access Google Search.
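The scale of inference can be illustrated the same way. The 2.5 billion requests per day figure is from the article; the per-request energy of 0.3 Wh below is an assumption used only to show the arithmetic, since vendors do not publish this number.

```python
# Rough daily inference energy at ChatGPT's reported request volume:
# daily energy = requests per day x energy per request.

REQUESTS_PER_DAY = 2_500_000_000  # reported by OpenAI, July 2025 (from the article)
WH_PER_REQUEST = 0.3              # ASSUMED energy per request, for illustration only

daily_mwh = REQUESTS_PER_DAY * WH_PER_REQUEST / 1_000_000  # Wh -> MWh
print(f"{daily_mwh:.0f} MWh per day")  # 750 MWh per day under these assumptions
```

At hundreds of megawatt-hours per day, even a small per-query cost adds up once multiplied by billions of requests, which is exactly Chowdhury's point about the number of users mattering more than any single query.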

"Even in the inference phase, you can't really save energy," Chowdhury observed. "The problem isn't the huge amount of data anymore. The model is already huge, but the bigger issue is the number of users."

Researchers like Chowdhury and de Vries-Gao are now looking for ways to measure energy consumption more accurately, and thereby find ways to reduce it. For example, Chowdhury maintains the ML Energy Leaderboard, which tracks the inference energy consumption of open-source models.

However, much of the data related to commercially viable AI platforms remains "secret." Large corporations like Google, Microsoft, and Meta either keep it confidential or only release very vague statistics that fail to accurately reflect the environmental impact. This makes it very difficult to determine how much electricity AI actually consumes, what the demand will be in the coming years, and whether the world can meet it.

Nevertheless, users can certainly exert pressure for transparency. This not only helps individuals make more responsible choices when using AI, but also contributes to promoting policies that hold businesses accountable.

“One of the core problems with digital applications is that their environmental impact is always hidden,” researcher de Vries-Gao emphasized. “Now the ball is in the policymakers’ court: they must encourage data transparency so that users can take action.”

Ngoc Nguyen (Live Science)

Source: https://vtcnews.vn/chatbot-ai-co-ngon-dien-nhu-loi-don-ar965919.html

