- Data storage: a new era of inefficiency?
- An abundance of static data taking up energy
- Consider the existing energy demands of software
- Strategies to reduce energy consumption in software design
- Beyond the blueprint: rethinking our approach to data storage
- Recognising data centres as national infrastructure
- Looking towards a sustainable digital future
Professor Aoife M. Foley of the University of Manchester writes on the AI energy consumption conundrum and how a critical piece of the puzzle has been missing from the dominant narrative: energy demand is about far more than silicon and servers.
The International Energy Agency (IEA) recently raised the alarm on the rapidly increasing energy footprint of data centres, cloud computing, and artificial intelligence (AI), although others have been highlighting this issue for several years. With electricity demand from data centres forecast to more than double to over 1,000 terawatt-hours (TWh) globally by 2026, equivalent to Japan’s total electricity consumption, the urgency to act is undeniable.
This surge in demand is being driven in particular by AI workloads, which consume 300 to 500 percent more energy than traditional cloud computing tasks. While hyperscale operators are investing in renewable energy to offset this growth, cleaner energy alone is not enough.
The dominant narrative misses a critical piece of the puzzle: energy demand from digital infrastructure is not just about silicon and servers. It is also about software, structure, and the way businesses produce, store, and process data. The underlying structure of digital systems (how data is stored, how code is written, and whether it is designed for efficiency) has been overlooked for too long.
Data storage: a new era of inefficiency?
Recent research at Avantern reveals that the industry is entering a new era of inefficiency, one defined not by hardware constraints but by the unchecked freedom given to bloated code, inefficient algorithms, and an economic model that treats data as virtually free. This is not merely a performance issue – it’s a fundamental breakdown in discipline that’s holding companies back.
Code and data are bloated because computation has become too cheap to question, leading to an excess of data and a widespread rise in energy consumption. From redundant data to dark data, each type consumes energy and storage resources. Code is also written without regard for energy efficiency, default configurations waste energy, and AI model design is too often based on brute force rather than thoughtful engineering. Organisations need to clean up the stack, from silicon to software, and lead by example.
An abundance of static data taking up energy
Both industry and academia have embraced the exponential growth of big data, but little attention has been paid to how much of it is unnecessary or poorly managed. Research has identified two pervasive types of underutilised data: redundant and dark. Redundant data, semi-structured or outdated content no longer needed for active processes, accumulates across databases and systems, often without oversight.
Meanwhile, dark data refers to the vast quantities of information collected but never used, like a digital junk drawer that consumes electricity and cooling without any return. It is estimated that over 90% of data collected via IoT devices is never analysed. Moreover, up to 60% of this data loses its value within milliseconds of being generated, yet it is still stored, driven by compliance concerns, habit, or the flawed belief that more data is always better. The result is billions of kilowatt-hours spent preserving digital dust.
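To make the point concrete, below is a minimal sketch of what a first "dark data" audit might look like: it walks a storage directory and totals up files that have not been touched within a chosen window. The path and the 180-day threshold are illustrative assumptions, not figures from the research discussed above, and access times may be unreliable on systems that do not record them.

```python
# Minimal sketch of a "dark data" audit: walk a directory tree and report
# files that have not been read or modified within a chosen window.
# The path and the 180-day threshold are illustrative assumptions.
import os
import time

STALE_AFTER_DAYS = 180          # assumed threshold for "dark" candidates
ROOT = "/data/archive"          # hypothetical storage mount to audit

def audit_dark_data(root: str, stale_days: int):
    cutoff = time.time() - stale_days * 86400
    stale_files, stale_bytes = 0, 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue                      # skip unreadable entries
            # Treat the most recent of access/modify time as "last touched".
            if max(st.st_atime, st.st_mtime) < cutoff:
                stale_files += 1
                stale_bytes += st.st_size
    return stale_files, stale_bytes

if __name__ == "__main__":
    files, size = audit_dark_data(ROOT, STALE_AFTER_DAYS)
    print(f"{files} files ({size / 1e9:.1f} GB) untouched for {STALE_AFTER_DAYS}+ days")
```

Even a crude report like this gives an organisation a number to act on: how many gigawatt-hours of storage, cooling, and replication are being spent on data nobody reads.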
Consider the existing energy demands of software
Software is just as problematic. For too long, developers have been insulated from the energy consequences of their code. Code is often written with speed and feature delivery as the main objectives, with little thought for energy use.
In a study by Cai and Karsten, modifying a standard Linux system to suppress unnecessary hardware interrupts led to a 45% improvement in throughput, with no hardware changes. This is a stark reminder that even minor adjustments can yield major efficiency gains. However, most systems still run with inefficient defaults, creating thousands of micro-inefficiencies that accumulate into significant waste.
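As a purely illustrative exercise, the short sketch below samples the standard Linux interrupt counters to show where interrupt load is coming from. It does not reproduce the kernel modification in the study; it only makes visible the kind of default behaviour that is rarely questioned. It is Linux-only, and the five-second sampling window is an arbitrary choice.

```python
# Minimal sketch: sample /proc/interrupts twice and report which sources
# fire most often. This only observes interrupt rates; it does not change
# any kernel behaviour. Linux-only.
import time

SAMPLE_SECONDS = 5  # arbitrary observation window

def read_interrupt_counts(path="/proc/interrupts"):
    counts = {}
    with open(path) as f:
        cpu_count = len(f.readline().split())   # header row: CPU0 CPU1 ...
        for line in f:
            parts = line.split()
            if not parts:
                continue
            label = parts[0].rstrip(":")
            # Per-CPU counters follow the label; sum whatever is numeric.
            counts[label] = sum(int(p) for p in parts[1:1 + cpu_count] if p.isdigit())
    return counts

if __name__ == "__main__":
    before = read_interrupt_counts()
    time.sleep(SAMPLE_SECONDS)
    after = read_interrupt_counts()
    rates = {k: (after.get(k, 0) - v) / SAMPLE_SECONDS for k, v in before.items()}
    for irq, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:10]:
        print(f"IRQ {irq:>8}: {rate:9.1f} interrupts/s")
```

Observation is only the first step, but it is the step most teams never take: what is never measured is never tuned.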
Despite this, developers continue to use brute-force approaches, throwing compute at problems that could be solved with better software design. AI has taken these trends to a new level. Recent research at Avanternii into thermal management shows that large-scale AI models are now among the biggest contributors to data centre heat.
For example, a single AI query can consume ten times the energy of a typical web search. Training a generative model may require hundreds of thousands of Graphics Processing Unit (GPU) hours, often powered by carbon-intensive grids.
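The arithmetic is straightforward, even if the exact figures vary by model and facility. The back-of-envelope sketch below uses assumed values for GPU count, power draw, training time, and data centre overhead purely to illustrate how quickly GPU-hours translate into megawatt-hours; it does not describe any specific model.

```python
# Back-of-envelope estimate of training energy. All figures are assumed
# for illustration, not taken from the article or any specific model.
GPU_COUNT = 1024            # assumed number of accelerators
GPU_POWER_KW = 0.7          # assumed average draw per GPU, in kW
TRAINING_HOURS = 500        # assumed wall-clock training time
PUE = 1.2                   # assumed data-centre power usage effectiveness

gpu_hours = GPU_COUNT * TRAINING_HOURS
energy_mwh = gpu_hours * GPU_POWER_KW * PUE / 1000

print(f"{gpu_hours:,} GPU-hours -> roughly {energy_mwh:,.0f} MWh of electricity")
```

Under these assumptions, a single training run of around half a million GPU-hours draws on the order of several hundred megawatt-hours before a single query is ever served.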
Strategies to reduce energy consumption in software design
While cooling technologies such as liquid immersion and direct-to-chip cooling can improve energy efficiency, they only address symptoms, not the underlying inefficiencies in model design. Professionals need to ask better questions. Are companies choosing the right model architectures for the task? Are they designing for efficiency or just scale? Are they optimising for performance or simply defaulting to brute force?
The problem goes deeper, to the economic model itself. Compute and storage have become so inexpensive that there is little incentive to optimise. Bloated data, bloated code, and bloated services persist because their cost is hidden, spread across thousands of servers over time. Meanwhile, developers write JavaScript to render static pages and refresh entire interfaces for minor changes.
Beyond the blueprint: rethinking our approach to data storage
As Moore’s Law slows and chip efficiencies plateau, the belief that performance will continue to improve without intervention is no longer valid. There is a need not only for better chips, but for better decisions. Digital housekeeping and greater collaboration, even among competitive players, must become part of our shared responsibility. That responsibility includes auditing our data, archiving meaningfully, deleting what is no longer necessary, and writing code with energy efficiency in mind.
Moreover, it means rethinking cost models and introducing carbon pricing for compute-heavy tasks, particularly AI. Strubell et al. (2019) estimated that training a state-of-the-art NLP model with hyperparameter tuning can emit up to 284 tonnes of CO₂, with even modest training runs exceeding 20 tonnes. Voices in the field argue for ‘Green AI’, promoting efficiency and transparency as key evaluation metrics in AI research.
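What an internal carbon price on compute could look like is easy to sketch. The example below uses assumed values for grid carbon intensity and carbon price, not figures from Strubell et al., to show how an energy estimate for a training run becomes an explicit cost line rather than a hidden one.

```python
# Illustrative sketch of an internal carbon charge on a compute job.
# All values are assumptions for illustration only.
ENERGY_MWH = 430              # assumed training energy (see earlier estimate)
GRID_KGCO2_PER_KWH = 0.4      # assumed grid carbon intensity
CARBON_PRICE_PER_TONNE = 80   # assumed internal carbon price, in currency units

emissions_tonnes = ENERGY_MWH * 1000 * GRID_KGCO2_PER_KWH / 1000
carbon_cost = emissions_tonnes * CARBON_PRICE_PER_TONNE

print(f"~{emissions_tonnes:.0f} tCO2 -> internal carbon charge of ~{carbon_cost:,.0f}")
```

Once that number appears on a project budget, the incentive to choose a smaller model, a cleaner grid, or a shorter training run becomes tangible.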
Recognising data centres as national infrastructure
Businesses must also reimagine the role of data centres as national infrastructure. Research on grid-interactive data centres shows they can support the grid by offering flexible demand, fast frequency response, and ancillary services. They also produce waste heat that can feed into district heating systems, especially in colder countries like the UK. To realise this potential, planning or land zoning reform is needed.
Data centres, unlike heavy industry, are quiet and have minimal traffic impact. Locating them within urban areas would reduce transmission losses and make waste heat recovery more feasible. Planning policy must reflect this reality. Their waste heat is suitable for heat networks and they can supply localised balancing, but electricity markets will need to be revisited to enable this.
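Flexible demand can be as simple as moving deferrable batch workloads into lower-carbon hours. The sketch below illustrates the idea with a made-up 24-hour carbon-intensity forecast; a real scheduler would use forecasts from the local system operator and respect job deadlines and capacity constraints.

```python
# Minimal sketch of "flexible demand": place a deferrable batch job in the
# lowest-carbon contiguous window of an hourly forecast. The forecast values
# are invented for illustration.
from typing import List

def best_start_hour(intensity: List[float], job_hours: int) -> int:
    """Return the start hour that minimises average grid carbon intensity
    over a contiguous run of job_hours hours."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - job_hours + 1):
        avg = sum(intensity[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

if __name__ == "__main__":
    # Assumed 24-hour forecast of grid carbon intensity (gCO2/kWh).
    forecast = [210, 190, 180, 170, 160, 150, 160, 200, 250, 300, 320, 330,
                310, 290, 270, 260, 280, 320, 350, 340, 300, 260, 230, 220]
    start = best_start_hour(forecast, job_hours=4)
    print(f"Run the 4-hour batch job starting at hour {start}")
```

The same logic, applied at data centre scale, is what turns a large, constant load into a grid asset rather than a grid burden.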
At the University of Manchester, researchers are examining these challenges at Avanternii, which integrates lifecycle assessment, energy modelling, and explainable AI to support practical, policy-ready solutions. Many believe data centres are no longer background infrastructure. They are essential to the energy transition and must be designed, regulated, and integrated as such. Blind faith in digital abundance cannot be afforded, nor can it be assumed that innovation alone will clean up the stack.
Looking towards a sustainable digital future
The age of digital housekeeping is overdue and is even more urgent with the rapid rise of AI at scale globally. It is time to begin. This transformation is not just technical, it is strategic. Rapid expansion of data centre capacity must be balanced against infrastructure pressures and carbon targets. But it will only work if companies do their housekeeping, from silicon to software. There is an increased need for Sustainable AI. If you prefer the term Green AI, so be it. Let us be the legends of AI, not creators of another plastic crisis.