AI chatbots consume huge amounts of electricity, raising sustainability concerns
AI chatbots are growing ever more popular, but behind the convenience lies enormous power consumption, posing a serious challenge for the environment and for sustainability.
Báo Khoa học và Đời sống•23/09/2025
ChatGPT currently has nearly 200 million users and handles over 1 billion requests per day. Each question-and-answer session looks simple, yet it consumes a substantial amount of electricity.
In 2023, AI data centers accounted for 4.4% of US electricity consumption and 1.5% globally. Training GPT-4 consumed 50 gigawatt-hours, enough to power San Francisco for three days.
Training and inference are the two most power-hungry stages, according to computer science professor Mosharraf Chowdhury at the University of Michigan. A single Nvidia DGX A100 server draws up to 6.5 kilowatts; multiplied across thousands of machines, that is an enormous figure. Inference also consumes a great deal of electricity, with 2.5 billion requests per day from ChatGPT alone.
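To put these figures in perspective, here is a rough back-of-envelope calculation. It is only a sketch: the per-query energy value of 0.3 Wh is an outside assumption for illustration, not a figure from the article; only the 50 GWh training cost, the three-day San Francisco comparison, and the 2.5 billion daily requests come from the text above.

```python
# Back-of-envelope energy estimates based on the figures in the article.
# NOTE: WH_PER_QUERY is an assumed value for illustration only.

TRAINING_GWH = 50        # GPT-4 training energy (from the article)
QUERIES_PER_DAY = 2.5e9  # daily ChatGPT requests (from the article)
WH_PER_QUERY = 0.3       # assumed average energy per query, in watt-hours

# Total daily inference energy, converted from Wh to MWh.
daily_inference_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e6
print(f"Estimated daily inference energy: {daily_inference_mwh:,.0f} MWh")

# The article says 50 GWh powered San Francisco for three days,
# which implies the city draws roughly this much per day:
sf_daily_gwh = TRAINING_GWH / 3
print(f"Implied San Francisco daily consumption: ~{sf_daily_gwh:.1f} GWh")
```

Under these assumptions, inference alone would run to hundreds of megawatt-hours per day, on top of the one-time training cost, which is why experts treat both stages as significant.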
Experts are calling for transparency in energy-consumption data as a step toward more sustainable AI.