AI’s Appetite: A Revelatory Look at Datacentre Costs That Could Force a Pause

The rapid push to embed AI in everyday services has triggered a global surge in datacentres, and with it an unexpected environmental bill that could reshape energy planning. Datacentre power demand is now growing four times faster than all other sectors, creating pressures on electricity systems, drinking-water supplies and consumer prices while raising tough questions about the social value of these new loads.
Background & context: why this matters now
International projections show datacentre power demand expanding at multiples of other sectors, on track to exceed the electricity use of an entire industrialized nation by 2030. Generative AI models — those that create text, images and video — use far more energy than traditional computing. Estimates vary, but many studies place that extra consumption severalfold higher than conventional approaches. A peer-reviewed study published in the journal Patterns estimates AI’s global carbon footprint in 2025 at 32.6 to 79.7 million tonnes of CO2, and its water use at 312.5 to 764.6 billion litres — a volume comparable to global bottled-water consumption. In one regional forecast, datacentre energy demand is expected to triple within five years and to surpass the electricity used by electric vehicles in that country by 2030, with authorities warning of significant demand on drinking-water supplies as well.
AI and the energy equation
Compute is no longer the primary bottleneck; the constraint is physical power. Modern models train on vast datasets using specialized chips in facilities that must run continuously to serve chatbots, image generators and other applications. The result: a new class of infrastructure that can consume as much electricity as whole cities. One large training facility would use roughly the annual electricity of 200,000 American homes if run at full capacity for a year; other announced projects could draw gigawatts of continuous power and, on some estimates, demand more electricity than whole metropolitan areas. This intensifying load creates two distinct problems: locating reliable, firm capacity where datacentres sit, and meeting peak demand without undercutting broader decarbonization goals.
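To put the 200,000-home comparison in grid terms, a rough conversion helps. This is a sketch, not a figure from the article: it assumes a typical US household uses about 10,500 kWh of electricity per year (an approximate national average; the true figure varies by year and region).

```python
# Back-of-envelope: what "200,000 American homes" means in grid terms.
# Assumptions (not from the article): an average US home draws roughly
# 10,500 kWh of electricity per year.
HOMES = 200_000
KWH_PER_HOME_PER_YEAR = 10_500   # assumed average annual household use
HOURS_PER_YEAR = 8_760

total_kwh = HOMES * KWH_PER_HOME_PER_YEAR
annual_twh = total_kwh / 1e9                       # terawatt-hours per year
continuous_mw = total_kwh / HOURS_PER_YEAR / 1_000  # average megawatts, 24/7

print(f"{annual_twh:.2f} TWh/year ≈ {continuous_mw:.0f} MW of continuous draw")
```

Under those assumptions the comparison works out to roughly 2.1 TWh a year, or about 240 MW of round-the-clock demand: on the order of a mid-sized power plant dedicated to a single facility.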
That spatial and temporal mismatch matters because it is not only total energy that counts but when and where it is available. Sampsa Samila, academic director of the AI and the Future of Management Initiative at IESE Business School, warns that “It’s not the overall supply of energy, but having reliable, firm capacity at the right place and the right time that is in short supply.” The implication is clear: building more compute does not solve an electricity shortfall and may instead shift strain onto grids and other consumers.
Expert perspectives and broader implications
Experts who study digital infrastructure and climate impacts emphasize opacity and inefficiency. Prof Jeannie Paterson, co-director of the Centre for AI and Digital Ethics at the University of Melbourne, says limited transparency from companies on energy, water and emissions complicates policymaking and public oversight, but “it’s clear that training models and running datacentres is an energy intensive task.” Ketan Joshi, a climate analyst associated with the Australia Institute, characterizes consumer-facing generative tools as “uniquely energy inefficient,” pointing to the vast datasets and computational strain beneath everyday queries.
On system-scale outcomes, scholars and analysts warn of growing trade-offs. Jesse Jenkins, climate modeler at Princeton, describes these facilities as “the largest single points of consumption of electricity in history.” Siddharth Singh, an energy-investment analyst at the International Energy Agency, projects that by 2030 U.S. datacentres will consume more electricity than the nation’s heavy industries combined — including cement, steel and chemicals. That scale risks slowing energy transitions, increasing emissions and raising power costs for other consumers.
The environmental consequences also extend beyond carbon. Intensive datacentre deployment can add significant pressure to drinking-water supplies used for cooling, and in some cases developers have turned to on-site fossil generation to meet immediate needs — a decision that intensifies local air-pollution risks.
Policymakers face a thorny choice: prioritize rapid deployment of generative services or impose tighter limits and disclosure to weigh benefits against environmental costs. Movements urging users to boycott certain AI services over surveillance and weaponization concerns have also prompted debate about whether environmental impact should drive similar personal or policy-level opt-outs.
As countries plan grids and regulators consider emissions and water allocation, the central question remains unsettled: will the social gains from ubiquitous AI justify the rising strain on energy, water and public budgets, or will limits be needed to align technological expansion with climate and public-interest goals?
How governments, utilities and communities reconcile that trade-off will determine whether the datacentre boom becomes a sustainable digital platform or an avoidable environmental burden — and whether individual choices to engage with AI will be seen as convenience or complicity.




