
Using the Power of Micro Grids to Feed AI’s Insatiable Appetite

December 11, 2025

Data centers have always been power hungry, but the AI revolution has transformed them into energy consumers on an unprecedented scale. My recent conversation at the Open Compute Project Global Summit with Dr. Andrew A. Chien, professor of computer science at the University of Chicago and senior scientist at Argonne National Laboratory, alongside Solidigm’s Jeniece Wnorowski, revealed how AI’s insatiable appetite for energy is forcing a fundamental rethink of data center infrastructure.

Andrew brings a unique perspective to these challenges. He has worked both in academia and the information technology industry, including as a senior executive leading research at Intel. Through his journey, he’s learned which problems academics are uniquely positioned to solve versus those better suited for industry. His current focus spans two areas: accelerators for scalable graph analytics, and how data centers interact with the power grid. The latter brought him to OCP Summit.

The Power Density Challenge

The trends driving change are impossible to ignore. Andrew said he began thinking about the shift toward higher power density more than a decade ago. “I’d sort of figured Moore’s Law was coming to an end,” he said. “And that means that computing, which we seem to have an infinite appetite for, was going to consume more and more power.” Anticipating this, he launched the Zero-carbon Cloud project to explore how data centers might harmonize with an increasingly renewable-based power grid.

The challenge extends beyond raw megawatts. As power grids transition to renewable sources, they’re becoming inherently volatile. Solar and wind generation fluctuate based on weather and time of day, creating periods of abundance and scarcity. Data centers, designed to run flat out at full load to maximize value, must now find ways to coexist with this fluctuating supply without sacrificing performance or reliability.

Micro Grids as the Bridge

Andrew’s solution centers on micro grids that provide flexibility in how data centers consume power and manage thermal loads. The concept addresses the fact that power grids are built to meet peak demand, which means they face stress during just one percent of the year. During those peak moments, if data centers could back off slightly from their demand on the main power grid for a few hours, they could dramatically ease its strain.

With a micro grid in place, such an offload could be possible without disrupting operations in the data center. AI training, inference, and other computing workloads could continue to run nearly uninterrupted, while the grid gains the breathing room it needs during stress periods. The micro grid would act as a buffer, filling gaps between what the grid can provide and what the data center requires.
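The buffering idea above can be sketched in a few lines of code. This is a toy model with entirely hypothetical numbers (a 100 MW facility, a four-hour evening curtailment), not figures from Andrew’s research: the micro grid simply covers the gap between what the main grid can supply each hour and what the data center needs.

```python
# Toy sketch of the "micro grid as buffer" idea.
# All numbers are hypothetical and for illustration only.

DC_LOAD_MW = 100.0  # constant data center demand, in megawatts

# Hour-by-hour cap on what the main grid supplies over one day;
# during a hypothetical evening peak the utility asks the site
# to back off by 15 MW for four hours.
grid_cap_mw = [100.0] * 17 + [85.0] * 4 + [100.0] * 3  # 24 hours

# The micro grid fires only to fill the gap between demand and
# what the main grid provides, so compute workloads keep running.
microgrid_output = [max(0.0, DC_LOAD_MW - cap) for cap in grid_cap_mw]

hours_firing = sum(1 for g in microgrid_output if g > 0)
energy_supplied_mwh = sum(microgrid_output)
print(f"Hours the micro grid must fire: {hours_firing}")
print(f"Energy supplied locally: {energy_supplied_mwh:.0f} MWh")
```

Even in this crude sketch, the asymmetry Andrew describes is visible: the local generation and storage only need to cover a small slice of the day, which is why the micro grid can be a small fraction of total facility cost.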

Andrew’s recent research demonstrates that these power micro grids can be deployed as a small fraction of total data center cost. Lightweight generators and small-scale storage technologies make the approach economically viable even for massive facilities. The technology exists and is affordable relative to overall data center investments.

Breaking New Ground

If the technology is ready and cost-effective, what’s holding back adoption? Andrew identified two primary barriers, neither purely technical. First comes the question of who pays. Second, and perhaps more complex, is establishing clear responsibility between data center operators and power utilities.

Current debates between data center companies and power grids focus on connection costs and shared responsibilities. These discussions are breaking new ground, creating precedents for an entirely new relationship between computing infrastructure and energy systems. Andrew emphasized a lesson from his industry experience. “It’s not that industry won’t pay. It’s they want everyone to pay fairly. They don’t want to be disadvantaged,” he noted. With clear standards in place, he says, “We can have our cake and eat it too.”

The cultural challenge may prove equally significant. Data center operations have historically prioritized reliability above all else, with infrastructure optimized for stable, predictable conditions. Moving to dynamically managed systems that respond to fluctuating power availability requires embracing flexibility in an industry built on consistency. For organizations where reliability culture defines their identity, this shift can feel uncomfortable.

Beyond AI

While AI dominates current data center conversations, Andrew sees the massive infrastructure being built today as enabling far more than machine learning. The scale of computing infrastructure now available will support diverse applications and create opportunities across many domains. “There are other kinds of computing that are going to be enriching our lives, creating commercial opportunity, leading to exciting research for many more years to come,” he says.

TechArena Take

Andrew A. Chien’s work illuminates the infrastructure challenges hiding beneath AI’s exponential growth. His vision of micro grid-enabled data centers is a fascinating blueprint for sustainable computing at scale. As renewable energy transforms power grids and AI-enhanced workloads push data centers to new extremes, the solutions emerging from collaborations between academia, industry, and organizations like OCP will determine whether we can support computing’s future. The path forward requires not just technology but clear standards, shared responsibility, and willingness to embrace dynamic management in an industry built on stability.

Learn more about Andrew’s research at the University of Chicago Computer Science website.

Watch the full podcast | Subscribe to our newsletter

