The Story You’re Not Hearing About AI Data Centers | Ayşe Coskun | TED

AI data centers consume vast amounts of electricity, straining power grids and earning a reputation as energy hogs. Yet because their workloads are often predictable and delay-tolerant, these centers can act as flexible assets that help balance supply and demand, especially alongside intermittent renewable energy sources. Through AI-driven, power-aware scheduling and workload management, data centers can reduce grid stress, lower electricity costs, and accelerate sustainable AI development, turning an energy challenge into part of the solution for a cleaner, more resilient future.

The world is currently engaged in an intense race to develop more advanced AI models, which requires the construction of increasingly large data centers. These AI data centers demand enormous amounts of electricity, sometimes equivalent to the power consumption of entire cities. This surge in energy demand is straining existing power grids, leading to challenges in regions where utilities struggle to keep up. For example, data centers in Ireland consume nearly 20% of the nation’s electricity, and communities near such centers have experienced significant increases in their electricity bills. This has led to AI data centers being labeled as energy hogs, a reputation that is well-deserved but only tells part of the story.

Beyond their high energy consumption, AI data centers have the potential to become flexible assets for the power grid. Unlike homes or hospitals, the workloads in AI data centers are often predictable, controllable, and delayable, allowing these centers to adjust their power usage in response to grid demands. This flexibility can help balance supply and demand, making electricity more affordable and resilient. Moreover, the rise of AI data centers coincides with the growth of renewable energy sources like wind and solar, which are intermittent. By aligning AI data center operations with renewable energy availability, it is possible to support a cleaner and more sustainable energy future.

This vision of flexible AI data centers is the result of years of research into energy-efficient computing and power-aware scheduling. Early skepticism about intentionally slowing down computing tasks faded once researchers recognized that many AI workloads can tolerate delays or reduced speeds without hurting user experience. By reframing the problem as meeting power grid constraints while maintaining user performance, researchers developed strategies such as capping power usage, shifting workloads in time, and provisioning data centers as flexible reserves for the grid. These approaches have been successfully tested on real data center servers, demonstrating that power-flexible AI computing is feasible.
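As a toy illustration of the capping-and-shifting idea (not the researchers' actual system), a greedy scheduler can place delay-tolerant jobs into hours that have headroom under grid-imposed power caps. All job names, power figures, and deadlines below are invented for the sketch:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float   # steady power draw while running
    hours: int        # runtime in whole hours
    deadline: int     # must finish by this hour index

def schedule_under_caps(jobs, caps_kw):
    """Greedily place delay-tolerant jobs into hours with power headroom.

    Returns {job name: start hour}, or raises if a job cannot meet its
    deadline under the caps. Tighter deadlines are placed first.
    """
    load = [0.0] * len(caps_kw)  # power already committed per hour
    plan = {}
    for job in sorted(jobs, key=lambda j: j.deadline):
        for start in range(job.deadline - job.hours + 1):
            window = range(start, start + job.hours)
            if all(load[h] + job.power_kw <= caps_kw[h] for h in window):
                for h in window:
                    load[h] += job.power_kw
                plan[job.name] = start
                break
        else:
            raise RuntimeError(f"{job.name} cannot fit under the caps")
    return plan

# Hypothetical caps: tight during an early peak (hours 1-2), loose later.
caps = [400, 200, 200, 500, 500, 500]
jobs = [
    Job("training-run", power_kw=300, hours=2, deadline=6),
    Job("batch-inference", power_kw=150, hours=1, deadline=3),
]
print(schedule_under_caps(jobs, caps))
```

With these numbers, the small batch job runs immediately at hour 0, while the power-hungry training run is shifted to hour 3, after the capped peak, yet still finishes well before its deadline.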

The urgency of this approach is underscored by the limitations of traditional power infrastructure. Renewable energy supply fluctuates, nuclear power is slow and costly to deploy, and battery storage is expensive and environmentally challenging to scale. Meanwhile, AI data centers face long wait times, as much as five to seven years, to connect to the grid, which is incompatible with the rapid pace of AI development. Power-flexible data centers can immediately help absorb excess renewable energy, reduce peak demand, and act as virtual batteries, easing grid stress and lowering electricity costs. This flexibility can prevent blackouts and accelerate AI adoption by enabling faster grid connections without waiting for new infrastructure.
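The peak-reduction effect can be shown with back-of-the-envelope arithmetic; all the megawatt figures below are hypothetical, chosen only to illustrate the mechanism:

```python
# Hypothetical hourly demand (MW) from all other consumers, with a peak
# at hour 3, plus a data center drawing a flat 100 MW, of which 40 MW
# per hour is delayable work.
base = [900, 950, 1100, 1200, 1000, 900]
dc_firm = 60       # MW the data center must draw every hour
dc_flex = 40 * 6   # MWh of delayable work to place somewhere

# Inflexible case: the data center draws 100 MW in every hour.
peak_rigid = max(b + 100 for b in base)

# Flexible case: pour the delayable energy into the lowest-demand hours
# first (a simple water-filling heuristic), avoiding the system peak.
demand = [b + dc_firm for b in base]
energy_left = dc_flex
while energy_left > 0:
    h = demand.index(min(demand))  # fill the lowest-demand hour by 1 MWh
    demand[h] += 1
    energy_left -= 1
peak_flex = max(demand)

print(peak_rigid, peak_flex)  # 1300 1260
```

In this toy case the same total energy is consumed, but the system peak drops by the full 40 MW of flexible load, which is exactly the "virtual battery" behavior described above: no storage is built, yet the grid sees a flatter demand curve.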

However, managing this flexibility is complex due to fluctuating electricity prices, unpredictable workloads, and varying grid regulations. This is where AI itself plays a crucial role, acting as a conductor that orchestrates data center operations in real time to match power availability and user demands. The speaker’s team has developed software that dynamically adjusts workloads across data centers, ensuring performance commitments are met while optimizing power use. This innovation not only facilitates faster integration of AI data centers into the grid but also enhances the overall sustainability and resilience of energy systems. Ultimately, the future of AI depends not just on how much energy it consumes, but on how much flexibility and clean power it can unlock to build a sustainable AI-powered world.
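The orchestration software itself is not detailed in the talk, but its core idea of splitting an externally imposed power target between workload classes can be sketched in a few lines. The function name, the workload split, and all figures below are illustrative assumptions, not the team's actual design:

```python
def set_power_budget(grid_signal_mw, latency_load_mw, flex_capacity_mw,
                     min_flex_mw=0.0):
    """Split an external grid power target between workload classes.

    Latency-sensitive serving always gets the power it needs; delay-
    tolerant jobs (training, batch) absorb whatever remains, throttled
    between min_flex_mw and flex_capacity_mw.
    """
    leftover = grid_signal_mw - latency_load_mw
    flex_budget = max(min_flex_mw, min(flex_capacity_mw, leftover))
    return latency_load_mw + flex_budget, flex_budget

# Hypothetical event: the grid asks the site to drop from 250 MW
# to 180 MW for an hour. Latency-sensitive serving needs 120 MW.
total, flex = set_power_budget(grid_signal_mw=180,
                               latency_load_mw=120,
                               flex_capacity_mw=130)
print(total, flex)  # flexible jobs throttle to 60 MW; site meets 180 MW
```

A real controller would rerun a decision like this continuously as prices, grid signals, and workloads change, and would coordinate it across many sites, but the essential trade-off, protecting user-facing performance while letting delayable work absorb the power swings, is the one captured here.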