IEA estimates that in 2024, data centres consumed approximately 415 terawatt-hours of electricity, representing about 1.5 pc of total global demand (Photo: OECD)
Artificial intelligence (AI) has rapidly become the defining technology of our age. From conversational assistants and autonomous systems to generative models that create lifelike videos and music, AI now powers an astonishing range of human activity.
Yet, behind this impressive digital revolution lies a growing environmental and infrastructural burden. As AI models become more sophisticated, the energy, water and computing hardware needed to build and sustain them are expanding at an alarming pace.
The world’s fascination with AI’s potential has overshadowed the more important questions surrounding its sustainability. Each innovation, every newer model or faster chip brings with it unseen costs measured not in dollars, but in carbon, electricity and water.
“With the advent of newer generative AI features, we are aiming to demonstrate advanced capabilities, which in turn means more computation and, consequently, higher energy consumption per unit of generated information,” a machine learning researcher working at a leading tech firm in the United States, who did not wish to be named, tells Media India Group.
“There is a trade-off right now, and currently the industry is biased towards an advancement in capabilities. There are ongoing efforts to procure the energy sustainably, but considering the scale at which we are advancing in the AI consumption space, it leaves a lot to be desired,” he adds.
The hidden cost of intelligence
According to the International Energy Agency (IEA), global data centres, which host the computing power behind AI systems, now account for around 180 million tonnes of CO₂-equivalent emissions each year, roughly 0.5 pc of all global fuel-combustion emissions.
The IEA estimates that in 2024, data centres consumed approximately 415 terawatt-hours of electricity, representing about 1.5 pc of total global demand. The agency further projects that, by 2030, global data centre electricity consumption could more than double to nearly 945 terawatt-hours, with a large share of that demand driven by AI workloads.
The environmental implications extend beyond energy. Training and operating AI systems require vast amounts of water for cooling, and their hardware depends on resource-intensive chip production.
A 2025 report by the International Telecommunication Union (ITU) found that large digital companies saw their indirect emissions rise by an average of 150 pc between 2020 and 2023, driven largely by AI-related energy use.
Similarly, researchers at Stanford University’s Human-Centered AI Institute estimate that training the GPT-4 model in 2023 generated about 5,184 tonnes of CO₂ emissions, while Meta’s LLaMA 3.1 in 2024 produced roughly 8,930 tonnes. Even moderate-sized language model training, according to academic studies, can emit nearly 493 tonnes of CO₂ and consume around 2.8 million litres of water, roughly the annual water use of 25 average households.
These findings reveal that AI’s environmental toll is growing rapidly, and efficiency gains are not keeping pace with demand. The IEA’s 2025 assessment also shows that AI-related data centre energy use has been growing by nearly 20 pc annually over the past three years, primarily driven by large language and multimodal models.
“Generative AI models, like those that make videos or create high-quality images, use huge amounts of electricity to train and run,” Ramanand Pandey, Director at the Centre of Policy Research and Governance (CPRG), an independent, non-profit think tank, tells Media India Group.
“For example, training a single large model can use as much power as hundreds of Indian households consume in a year. Most of this energy still comes from coal and other non-renewables, especially in countries like India, China, or the United States, where grids are not fully green yet. So, while AI is exciting, its rising energy demand puts more pressure on our power systems and increases carbon emissions unless we move to cleaner sources quickly. The concern lies not just in how much energy is used, but in where that energy comes from. In India, where coal still provides over half the electricity, more AI means more emissions unless renewable capacity scales up equally fast,” he adds.
Efficiency versus expansion
Despite mounting evidence of AI’s growing environmental impact, development continues at full speed. Companies like OpenAI, Google, and Meta are releasing ever more capable models, often highlighting improved efficiency or carbon neutrality in their announcements. OpenAI’s Sora 2, for instance, showcases breathtaking advances in video generation, while Google’s Gemini and Meta’s LLaMA 3.1 boast faster, more accurate performance with supposedly lower per-query energy use.
Yet, the overall picture is far less reassuring. Independent analyses consistently find that while newer AI systems may be more efficient per transaction, total emissions still rise because overall usage is growing far faster than per-query efficiency improves. As the IEA notes, efficiency alone cannot offset the sheer growth in global AI demand, much like how more fuel-efficient cars fail to reduce total emissions if the number of cars on the road keeps increasing.
“These efforts are definitely not sufficient. Many companies are trying, by making models more efficient, buying renewable energy credits, or improving data center cooling, but the progress is not keeping pace with how fast AI use is growing,” says Pandey.
“For example, tech giants like Google and Microsoft are investing in clean energy for their data centers, but smaller companies often don’t have those resources. Also, there’s still not enough transparency about how much energy each AI model actually uses,” he adds.
The pressure on infrastructure
The global AI boom is also reshaping the infrastructure landscape, especially in fast-growing economies like India. A recent Deloitte India report projects that by 2030, the country may require an additional 40 to 45 terawatt-hours of electricity each year to power new AI data centres, a figure equivalent to the annual consumption of millions of households. The same report anticipates a need for around 4.6 million square metres of new data centre space nationwide.
If current trends continue, India’s data centre carbon emissions, now around 7.3 million tonnes annually, could rise to over 19 million tonnes by the end of the decade.
While cloud providers like Amazon Web Services and Microsoft Azure are investing heavily in renewable energy and efficiency, smaller firms lag behind. Encouragingly, a joint study by AWS and Accenture found that migrating AI operations to optimised cloud environments could reduce carbon emissions in India by up to 99 pc, owing to cleaner power and better hardware utilisation.
Still, the problem persists.
“Efficiency gains are being outpaced by growth in demand. Just like how fuel-efficient cars do not reduce pollution if the total number of cars keeps rising, efficiency alone is not enough without overall emission limits and renewable integration. Looking at the concept of Green AI will be helpful in this case. Green AI principles encourage developers to track how much carbon their models produce, write code that uses less energy, and openly share information about their energy use. But in reality, these good practices are not followed consistently by all companies,” says Pandey.
The price of progress
Multiple companies speak of “greener AI” and “carbon-aware computing,” yet the global numbers tell a different story. The IEA warns that without faster renewable deployment, nearly half of the additional electricity required for data centres this decade will still come from fossil fuels. Meanwhile, the water and mineral footprints of chip manufacturing continue to rise, further burdening ecosystems already under stress.
As the technology grows more powerful, its infrastructure demands threaten to eclipse the sustainability gains made in other sectors. The challenge now is not merely to make AI smarter, but to make it sustainable.