AI's Growing Energy Demand May Soon Rival Entire Nations

Artificial intelligence could consume up to 23 GW of power by the end of 2025, rivaling national electricity usage. Rising demand for AI chips and data centers sparks climate concerns and energy supply challenges.

Artificial intelligence is rapidly becoming one of the world's most power-hungry technologies. New research indicates that by 2025, the electricity used to run AI systems could overtake that of Bitcoin mining and rival the power consumption of the United Kingdom or Ireland.

The growing appetite for generative AI tools and smart systems has triggered an unprecedented boom in demand for specialized AI chips, particularly high-end accelerators such as Nvidia's H100. These chips are essential for AI workloads but are power-hungry: a single H100 draws around 700 watts in operation. With millions of these chips already shipped worldwide, the combined power demand is enormous.
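As a rough illustration of that scale, the back-of-envelope sketch below multiplies per-chip wattage by an assumed global chip count. The chip count, utilization factor, and overhead multiplier are illustrative assumptions, not figures from the research cited here.

```python
# Back-of-envelope estimate of aggregate AI accelerator power demand.
# Chip count, utilization, and overhead are illustrative assumptions,
# not figures from the cited research.

CHIP_POWER_W = 700        # approximate draw of one Nvidia H100 under load
CHIP_COUNT = 3_000_000    # assumed number of accelerators deployed worldwide
UTILIZATION = 0.6         # assumed average fraction of peak power drawn over time
OVERHEAD = 1.5            # assumed multiplier for cooling, networking and storage

total_watts = CHIP_POWER_W * CHIP_COUNT * UTILIZATION * OVERHEAD
print(f"Estimated continuous demand: {total_watts / 1e9:.1f} GW")
# With these assumptions: 700 W x 3,000,000 x 0.6 x 1.5 ≈ 1.9 GW
```

Even under conservative assumptions, the total lands in the gigawatt range, which is why deployments of this size invite comparison with national grids.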

Estimates by researcher Alex de Vries-Gao of Vrije Universiteit Amsterdam suggest that hardware manufactured in just two years, 2023 and 2024, may require between 5.3 and 9.4 gigawatts of electricity. That alone exceeds Ireland's total national electricity consumption. If AI infrastructure continues to expand at its current pace, global AI electricity demand could reach 23 gigawatts by the end of 2025, roughly matching the United Kingdom's average power consumption and exceeding the power footprint of global Bitcoin mining.

One of the main drivers of this surge in demand is the manufacturing expansion at Taiwan Semiconductor Manufacturing Company (TSMC), which plays a central role in producing AI chips. TSMC uses CoWoS (Chip-on-Wafer-on-Substrate) technology to improve chip packaging and performance. The company doubled CoWoS production in 2024 and plans to double it again in 2025 in response to growing industry demand.

This expansion comes with environmental consequences. The International Energy Agency (IEA) has warned that the rapid deployment of AI systems could soon double the electricity use of all data centers globally. Despite some progress in energy efficiency and a shift towards renewable energy sources, the industry's current pace is outstripping these improvements.

The growing sophistication of AI chip design also drives up energy costs. Emerging packaging technologies such as CoWoS-L boost performance but suffer from low production yields. That inefficiency adds to the carbon emissions and resource consumption involved in producing AI hardware.

Major technology firms are now experiencing what they describe as a "power capacity crisis." Companies such as Google have struggled to secure enough electricity for their growing AI workloads. In some cases, firms have turned to fossil fuels such as natural gas to power their operations; one project has even purchased 4.5 gigawatts of natural gas capacity solely to serve AI workloads.

The environmental cost of AI also depends heavily on where data centers are located. For instance, data centers operating in coal-dominated states such as West Virginia emit almost twice as much carbon dioxide as those running in states like California, where the grid relies more heavily on clean energy.
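As a simple illustration of how location shapes the footprint, the sketch below applies two assumed grid carbon intensities to the same annual consumption. The consumption and intensity figures are placeholders chosen for illustration, not measurements from the article.

```python
# Illustrative comparison of data center emissions under different grid mixes.
# The consumption and carbon-intensity values are rough assumptions, not
# figures taken from the article.

ANNUAL_CONSUMPTION_MWH = 100_000  # assumed yearly electricity use of one facility

GRID_INTENSITY_KG_PER_MWH = {
    "coal-heavy grid (e.g. West Virginia)": 900,  # assumed kg CO2 per MWh
    "cleaner grid (e.g. California)": 450,        # assumed kg CO2 per MWh
}

for grid, intensity in GRID_INTENSITY_KG_PER_MWH.items():
    tonnes_co2 = ANNUAL_CONSUMPTION_MWH * intensity / 1_000
    print(f"{grid}: ~{tonnes_co2:,.0f} tonnes of CO2 per year")
# Under these assumptions, the coal-heavy grid produces roughly twice the CO2
# for the same electricity use, in line with the article's comparison.
```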

A lack of transparency in the industry compounds the problem. Most technology firms do not disclose where or how their AI hardware sources its energy. That secrecy prevents governments, environmental groups, and the public from accurately estimating or regulating the environmental impact of AI's growth.

The issue is not only about emissions but also about power grid stability and resource management. As AI workloads grow, the load on national power grids could rise sharply, especially in regions with limited capacity for renewable energy.

With so much at stake, experts are calling for stricter regulation, better reporting requirements, and greater accountability across the tech industry. AI energy consumption is no longer a theoretical concern; increasingly, it is a defining challenge in balancing technological progress with climate objectives.

As AI technology becomes more deeply embedded in the global economy, its carbon footprint is growing just as quickly. The coming months and years will determine whether the world can align innovation with sustainability, or whether it will face ever harder choices between digitalization and climate goals.

Source: TECHSPOT
