Ed Ansett
14 November 2024
AI and the Future of Data Centre Design
Artificial Intelligence (AI) is no longer just a futuristic concept; it’s a technology that’s rapidly transforming industries across the globe. As businesses increasingly adopt AI to drive innovation and efficiency, data centres have become crucial in supporting this growth. This growing reliance on AI brings a new set of challenges: data centres must process larger workloads and manage unpredictable power demands. The need for flexible, scalable, and sustainable data centre solutions has never been more pressing.
In September, we hosted our first in-person AI event, “What Does AI Really Mean for Data Centre Design?”, in London’s iconic Turing Lecture Theatre. The event offered an engaging platform for leading industry experts and participants to explore how AI is transforming the future of data centres. Through presentations and panel discussions, together we addressed the emerging challenges and opportunities AI brings, and the importance of design flexibility.
The growing integration of AI into business poses significant challenges for data centres adapting to the demands of AI workloads. One of the most pressing issues is the massive increase in energy consumption, “where it’s not really in a few tens of megawatts anymore, it’s hundreds of megawatts,” said Zahl Limbuwala, Operating Partner at DTCP, during his fireside chat with Ed Ansett, Founder and Chairman at i3 Solutions Group, Part of Ramboll. This shift demands new power management strategies and design flexibility.
AI also introduces new complexities in the way a data centre is designed. “What are we designing? What differences are we seeing in terms of engineering design, in terms of density configurations, and the opportunities that AI presents in terms of efficiency and energy savings?” Limbuwala asked. Data centres must accommodate increased density while also improving energy efficiency.
AI’s influence on data centre design is creating a “step change in the Mechanical, Electrical, and Plumbing (MEP) infrastructure,” said Luke Neville, Managing Director at i3 Solutions Group, Part of Ramboll, during the main panel discussion.
Traditional data centres were never built to handle the unique demands of AI workloads, which are denser and more power-hungry than conventional applications. “High-density computing is not practical to cool with air anymore,” Neville said, pointing to the growing need for liquid cooling solutions. “Liquid cooling is a must,” he added.
However, with no single standard for liquid cooling, data centres must carefully choose the right technology for their specific needs. This evolving landscape demands flexibility. “You need flexibility in what you design at day one,” Ed Ansett explained: AI infrastructure must remain adaptable to future developments as the technology continues to evolve at a rapid pace.
Power management was another critical concern for the panellists. AI places significant and unpredictable pressure on power grids, with spikes, sometimes “within milliseconds”, that are difficult for grid operators to manage. “If we operate our data centres the way we do today, it’s predicted to triple the data centre-related consumption by 2030,” explained Christine Halberg, Renewable Energy Consultant at Ramboll. AI’s power demands “create unpredictable spikes and fluctuating power demands, and this turns out to be really, really difficult to manage from a power system perspective”.
To address these challenges, the panel discussed the potential of Battery Energy Storage Systems (BESS), which can smooth out spikes in power demand and ensure stable operations. “Batteries can provide almost instantaneous power supply,” helping to bridge the gap between renewable energy production and AI consumption, Halberg explained. (For more on Battery Energy Storage Systems – BESS, check out our article here.)
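The smoothing role described above can be sketched in a few lines of code. This is a minimal illustration with hypothetical numbers (the baseline, battery capacity, and spike profile are assumptions, not figures from the event): the grid supplies a steady baseline, while the battery discharges to cover short AI training spikes above it and recharges during the lulls.

```python
# Minimal sketch (hypothetical numbers): a BESS buffering AI load spikes
# so the grid sees a steadier draw than the raw workload demands.

BASELINE_MW = 100.0    # steady draw the grid can comfortably supply (assumed)
CAPACITY_MWH = 50.0    # battery capacity (assumed)
STEP_H = 1 / 3600      # one-second timesteps, expressed in hours

# Synthetic AI workload: steady 80 MW with a brief 150 MW training spike
demand = [80.0] * 10 + [150.0] * 5 + [80.0] * 10

charge = CAPACITY_MWH / 2  # start the battery half full
grid_draw = []
for d in demand:
    if d > BASELINE_MW:
        # Spike: battery discharges to cover the excess over baseline
        needed = (d - BASELINE_MW) * STEP_H
        supplied = min(needed, charge)
        charge -= supplied
        grid_draw.append(d - supplied / STEP_H)
    else:
        # Headroom: recharge from the grid, without exceeding baseline
        room = min((BASELINE_MW - d) * STEP_H, CAPACITY_MWH - charge)
        charge += room
        grid_draw.append(d + room / STEP_H)

print(f"peak workload demand: {max(demand):.0f} MW")
print(f"peak grid draw:       {max(grid_draw):.0f} MW")
```

With these assumed numbers the workload peaks at 150 MW, but the grid never sees more than the 100 MW baseline: the battery absorbs the millisecond-to-second volatility that grid operators struggle with.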
During the main panel discussion, the panellists dived into the uncertainty surrounding AI’s future infrastructure requirements. Andy Lawrence, Executive Director at Uptime Institute, described the current situation as a “massive, trillion-dollar experiment” with many unknowns in terms of cooling, power, and the long-term viability of AI systems. “People don’t know how they’re going to cool these systems, or what level of power they’ll need,” Lawrence added.
“The challenge for traditional IT was building an asset to last 20 years, but with AI, those cycles have compressed dramatically,” said Ambrose McNevin, Panel Moderator. Unlike previous data centre models that followed a predictable upgrade cycle, AI introduces much shorter and less predictable technology lifecycles. As a result, data centres must be designed with built-in flexibility, enabling them to evolve and scale as AI technologies continue to advance.
This demand for flexibility means that infrastructure needs to accommodate not only current workloads but also the shifts in power, cooling, and density that AI will require in the future.
AI’s rapidly escalating requirements mean that traditional models of data centre design can no longer keep up. Industry experts highlighted that data centres must prepare for higher power densities, more efficient cooling systems, and increased reliance on technologies like BESS to manage power fluctuations.
As discussed during the event, flexibility will be the key to supporting AI’s future needs. By adopting modular, scalable, and sustainable solutions, data centres can remain resilient and efficient, ensuring they meet the challenges posed by AI.
Want to know more?
Ed Ansett
Head of innovation and business development