Powering AI with Speed: Pervez Siddique on Co-Locating Data Centers with Clean Energy Infrastructure

In the race to build out our AI-driven future, an often-overlooked truth is emerging: artificial intelligence runs on electricity before it runs on code. As model complexity increases, so does the computing power required to support it. A single large AI training run can consume as much electricity as dozens of U.S. homes use in a year. That energy has to come from somewhere—and how we generate and deliver it may be one of the defining infrastructure questions of the next decade.

While much of the AI industry’s growth has centered on cloud performance and compute scaling, a new priority is taking shape behind the scenes: ensuring that the energy needed to fuel this growth can be delivered reliably, sustainably, and locally. Time to power and available power capacity have become two of the most pressing constraints in digital infrastructure development.

And the answer may lie in co-location—a strategy that aligns large-scale data centers directly with renewable energy and grid-scale battery storage infrastructure.

The Grid Is Not Infinite

As a clean energy executive who has led the development of over 6 GW of utility-scale solar, wind, battery storage, and power-to-X projects across the United States—including more than 3 GW within Texas’s ERCOT market—I’ve seen firsthand how energy demand is shifting in both volume and geography. Data centers, once confined to tech corridors and business parks, are now driving demand for land and grid access in non-traditional regions like West Texas, Kansas, and Oklahoma.

But the grid isn’t infinite. In ERCOT alone, more than 70 GW of new load—much of it data center-related—is seeking interconnection over the next few years. Without strategic planning, these demands can quickly overwhelm infrastructure, especially in areas with limited transmission capacity or no firm generation backup.

Co-Location as a Practical Framework

Co-location isn’t a new idea. But its application to AI infrastructure is gaining renewed urgency. By strategically building data centers adjacent to—or directly integrated with—renewable generation and battery energy storage systems (BESS), developers and operators can align power demand with clean supply in real time while delivering the uptime and reliability that data center tenants demand.

In Texas, several 300+ MW BESS projects now under development are exploring dual-use zoning with hyperscale data facilities. These pairings offer several advantages:

  • Reduced grid congestion and transmission loss
  • Improved uptime through localized energy reserves
  • Stronger ESG positioning based on physical—not just financial—sustainability measures

From a development perspective, it’s complex. Coordinating land use, permitting, interconnection, and bilateral agreements with hyperscalers demands a multidisciplinary approach. But the payoff is a more resilient infrastructure footprint—and a clearer climate story.

Offsets Aren’t Enough

Many AI firms currently lean on power purchase agreements (PPAs) and carbon credits to meet sustainability goals. While helpful, these instruments often rely on virtual arrangements that do little to relieve local grid strain or reduce real-time emissions.

Co-location, in contrast, offers a physical answer to a physical problem. By shrinking the distance between generation and load, it reduces the losses and congestion that come with moving power over long distances. In a grid like ERCOT, which operates largely independently of the rest of the U.S., these efficiencies matter—especially during high-demand seasons when every megawatt counts.

According to the U.S. Department of Energy, co-located facilities can cut transmission-related energy losses by up to 8%, depending on distance and load profile. In fast-growth markets with no capacity margin to spare, that’s not trivial.
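To make that figure concrete, here is a minimal back-of-envelope sketch in Python. The facility size, capacity factor, and per-home consumption are illustrative assumptions rather than data from any specific project; the calculation simply applies the upper-end 8% figure to a hypothetical facility’s annual consumption.

```python
# Illustrative back-of-envelope arithmetic only; the facility size, capacity
# factor, and per-home usage are assumptions, not data from any real project.

facility_load_mw = 300        # hypothetical hyperscale campus load
hours_per_year = 8760
capacity_factor = 0.90        # assumed near-constant utilization
loss_reduction = 0.08         # upper-end figure cited above, applied to consumption

annual_consumption_mwh = facility_load_mw * hours_per_year * capacity_factor
avoided_losses_mwh = annual_consumption_mwh * loss_reduction

avg_home_mwh_per_year = 10.5  # rough U.S. residential average
homes_equivalent = avoided_losses_mwh / avg_home_mwh_per_year

print(f"Annual consumption:          {annual_consumption_mwh:,.0f} MWh")
print(f"Avoided transmission losses: {avoided_losses_mwh:,.0f} MWh/year")
print(f"Roughly the annual usage of  {homes_equivalent:,.0f} U.S. homes")
```

Under those assumptions, the avoided losses come to roughly 190,000 MWh per year, about what 18,000 average U.S. homes consume annually.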

A Moment of Infrastructure Choice

The conversation about AI’s impact on society often focuses on data privacy, ethics, or workforce displacement. But we can’t ignore the foundation beneath it all: the power it consumes.

The industry now faces a choice: continue down a path where infrastructure is an afterthought, or design energy responsibility into the AI ecosystem from the start.

The good news is that the tools exist. So do the markets, the land, and the capital. What’s needed now is coordination—and the willingness to plan for power as intentionally as we plan for performance.

Pervez Siddique is a clean energy executive with over 15 years of experience leading utility-scale development and infrastructure M&A. A graduate of MIT and former President of the MIT Climate & Energy Club, he has directed more than $6 billion in clean energy project development across the U.S., including grid-scale solar, wind, BESS, RNG (renewable natural gas), and power-to-X initiatives.