Vietnam.vn - The platform for promoting Vietnam

Is AI "sucking up" electricity worldwide?

Người Lao Động - 14/07/2024


The artificial intelligence (AI) race is intensifying as new models launch in quick succession. Amazon, Microsoft, Apple, Google, and Meta are expected to spend billions more dollars on AI infrastructure in the coming years.

Consuming huge amounts of electricity

Prominent AI models today include OpenAI's ChatGPT, Google's AI chatbot, Meta's Meta AI, Amazon's Olympus, and Microsoft's MAI-1. Amid AI's rapid development, many people are concerned about its risks and impacts, especially its energy use and growing carbon emissions.

The amount of energy AI consumes depends on many factors, including the type of AI, the size of the model, the hardware, and the deployment process. To train the large language model GPT-3, developed by OpenAI with 175 billion parameters, researchers had to run 1,024 GPUs (graphics processing units) continuously for about a month. Associate Professor Mosharraf Chowdhury at the University of Michigan (USA) estimated that each training run of GPT-3 consumed 1.287 million kWh of electricity, equivalent to what an average American household consumes in 120 years.
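The household-years equivalence above can be checked with simple arithmetic. A minimal sketch, assuming an average US household consumption of roughly 10,715 kWh per year (an EIA-style ballpark chosen here to reproduce the article's "120 years" figure; it is not stated in the article):

```python
# Back-of-the-envelope check of the GPT-3 training figure cited above.
# The ~10,715 kWh/year average US household consumption is an assumption
# (EIA-style ballpark), not a figure from the article itself.

TRAINING_RUN_KWH = 1_287_000        # ~1.287 million kWh per GPT-3 training run
HOUSEHOLD_KWH_PER_YEAR = 10_715     # assumed average US household consumption

household_years = TRAINING_RUN_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"One training run ≈ {household_years:.0f} household-years")  # ≈ 120
```

With that assumed household figure, the division lands almost exactly on the 120-year equivalence the article reports.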


ChatGPT consumes more than 500,000 kWh of electricity per day, equivalent to the average daily electricity consumption of 17,000 US households. Photo: REUTERS

GPT-3 was released four years ago, and the parameter counts of large language models (LLMs) have since grown exponentially. GPT-4, released in 2023, has a total of 1,760 billion parameters, 10 times as many as GPT-3. GPT-5, expected in late 2025, is faster and has more powerful language processing capabilities, so the energy consumed to train it will also be significantly higher.

As these apps become more popular and user numbers grow, power consumption will keep rising. According to the International Energy Agency (IEA), a ChatGPT request consumes an average of 2.9 watt-hours of electricity, nearly 10 times what Google uses to serve an average user search.

In terms of usage, ChatGPT handles about 200 million requests per day, consuming up to 182.5 million kWh of electricity per year. As for Google, with about 9 billion searches per day, the IEA estimates that integrating large-scale generative AI into search would require an additional 10 billion kWh of electricity per year.
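The yearly and "extra for AI search" figures above follow from the per-day numbers. A minimal sketch, assuming the 500,000 kWh/day consumption from the photo caption and a commonly cited IEA comparison figure of roughly 0.3 Wh per ordinary Google search (the 0.3 Wh baseline is an assumption, not stated in this article):

```python
# Cross-checking the usage figures cited above. The 500,000 kWh/day,
# 2.9 Wh per request, and 9 billion searches/day come from the article;
# the ~0.3 Wh per ordinary Google search is an assumed baseline.

CHATGPT_DAILY_KWH = 500_000              # article's daily ChatGPT consumption
GOOGLE_SEARCHES_PER_DAY = 9_000_000_000
WH_PER_AI_QUERY = 2.9                    # IEA figure per ChatGPT-style request
WH_PER_PLAIN_SEARCH = 0.3                # assumed baseline per ordinary search

chatgpt_yearly_kwh = CHATGPT_DAILY_KWH * 365
print(f"ChatGPT per year: {chatgpt_yearly_kwh / 1e6:.1f} million kWh")  # 182.5

extra_wh_per_day = GOOGLE_SEARCHES_PER_DAY * (WH_PER_AI_QUERY - WH_PER_PLAIN_SEARCH)
extra_kwh_per_year = extra_wh_per_day / 1000 * 365
print(f"Extra for AI-powered search: ~{extra_kwh_per_year / 1e9:.1f} billion kWh/year")
```

The first line reproduces the 182.5 million kWh/year figure exactly, and the second lands in the same ballpark as the IEA's roughly 10 billion kWh estimate.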

Promoting data center development

Data centers are the infrastructure of AI, providing the computing resources, storage capacity, and network bandwidth that allow AI applications to run and scale efficiently. They must also provide powerful cooling systems to maintain suitable temperatures, because thousands of servers and chips running around the clock generate a great deal of heat. As a result, the power consumption of data centers themselves is significant.

According to the IEA's 2022 report, global data center electricity consumption was an estimated 460 billion kWh in 2022, accounting for nearly 2% of total global electricity use. By 2026, data centers' total consumption could more than double to 1,000 terawatt-hours, nearly equivalent to the annual electricity consumption of all of Japan.

Along with electricity, the demand for clean water to cool these systems is also rising. Researchers estimate that operating AI services will require withdrawing 4.2 to 6.6 billion cubic meters of water by 2027, equivalent to half the water the UK uses each year.

Many observers believe technology corporations should use AI itself to support the green transition. According to Reuters, technology giants such as Amazon, Microsoft, and Google are investing heavily in solar and wind power projects, as well as redesigning data centers to save cooling water.

Risk of climate change

Mr. Lu Vincent The Hung, founder and CEO of eduX Global Institute Joint Stock Company, argues that using AI tools does not itself directly "consume" much electricity. Rather, it is the training of AI models such as GPT-3.5 and GPT-4o that drives electricity consumption, because the extremely large computing resources involved require a great deal of energy to operate. This contributes to climate change effects such as global warming, droughts, and floods. In addition, AI hardware often has a short life cycle of only about 1-2 years, so electronic waste is predicted to rise sharply in the near future.

"Organizations that train AI can create simpler algorithms and more compact AI models to save energy, which also helps reduce costs. At the same time, renewable energy such as solar and wind power should be used to supply AI systems, and there should be specific regulations on maximum electricity consumption in operating and developing this technology," Mr. Hung suggested.

According to an information technology lecturer, to save electricity and develop AI more sustainably and effectively, cloud computing can be applied to optimize resource usage when operating these models. In addition, technology companies that own AI systems should be required to disclose their systems' electricity usage, so that energy consumption standards suited to the long-term development of AI technology in Vietnam can be established.

L. Province



Source: https://nld.com.vn/ai-hut-dien-tren-toan-cau-196240713192735629.htm
