How Google turned its climate program into an AI booster

By Ariel Wittenberg | 04/22/2026 06:44 AM EDT

A “carbon-intelligent computing” tool has come in handy as the tech giant negotiates with utilities to connect data centers to the grid.

The exterior of a Google Data Center is shown on Thursday, April 2, 2026, in Henderson, Nevada. Ty O'Neil/AP

A computing platform Google first developed to meet its climate goals is now helping the technology giant quickly connect data centers to the grid.

Google created its “carbon-intelligent computing platform” in 2020 to assign energy-intensive tasks to data centers when their local grids were flush with renewable energy. Now the company is leveraging the platform — and its ability to shift computing tasks among data centers — to jump to the front of the grid connection queue.

Utilities have agreed to power Google’s supercomputing hubs in exchange for the company ramping down energy use at times of peak grid demand. As of last month, Google had integrated so-called demand response into five utility contracts across the South and Midwest, making up to 1 gigawatt of its data centers’ electricity demand available for curtailment.


“The capabilities we developed for carbon-intelligent computing, we have built those into our demand response capabilities,” Michael Terrell, Google’s head of advanced energy, said in an interview. “They are very much related. It’s very similar capabilities in terms of how we manage our compute.”

Google isn’t alone. Utilities and technology companies expressed growing interest in demand response deals at last month’s CERAWeek by S&P Global energy conference. The startup Emerald AI, which produces software to help data center workloads respond to grid demands, recently announced it raised $25 million in a funding round backed by computer chipmaker Nvidia and power giants GE Vernova and Siemens.

The pivot comes as utilities race to keep up with the unprecedented energy needs of artificial intelligence — and as multiple technology companies pull back on climate and clean energy goals.

When Google developed its carbon-intelligent computing program, the company was confident it could power all its operations with clean energy by 2030. Today, the company calls that goal a “climate moonshot.”

Google’s latest environmental report noted that “external factors — largely outside our direct control — are converging to create significant uncertainty,” including “AI’s energy demands, policy uncertainties, resource-challenged markets and more.”

The same report says that despite a 27 percent increase in electricity consumption, Google’s data centers cut their emissions from energy use by 12 percent in 2024 compared with 2023, largely because the U.S. grid added more clean energy.

Since then, however, President Donald Trump has torn down Biden-era climate policies — including programs to boost renewable energy — and pushed fossil fuels, most recently by invoking wartime powers.

Still, boosters of data center demand response programs say they will enable development of more clean energy down the road. Emerald AI CEO Varun Sivaram asserts that data centers can match renewable energy’s variability, using that power when and where it’s available.

“If you have flexible energy users, that’s a shock-absorber to the grid so that when you do have more variable renewables on the grid, you have data centers that can respond,” Sivaram said in an interview.

Shifting demand

The idea of shifting computing capabilities among data centers popped up in a 2024 Department of Energy advisory report, which included six recommendations for supplying power to data center infrastructure and AI. Among them were more research into renewable and nuclear energy, as well as into the ability of data centers to “leverage improvements in how AI models are trained and queried to mitigate stresses on the energy grid.”

The report homed in specifically on two potential ways to shift the energy loads of AI.

The first would be through coordinating the timing and location of one of data centers’ most energy-intensive tasks: training large language models for AI. Delaying training to times with less grid demand could potentially ease strain on the system.

The second suggestion was to spread out AI energy demand by tasking multiple data centers with responding to user queries of AI chatbots.
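The two strategies above — shifting training in time and spreading query traffic in space — amount to a simple scheduling problem. The sketch below is purely illustrative: the site names, grid-stress scores and carbon-intensity figures are invented for the example, not drawn from Google’s actual system.

```python
from dataclasses import dataclass

# Hypothetical per-region grid signal: how stressed the local grid is
# right now, and how clean its generation mix is. All values invented.
@dataclass
class SiteStatus:
    name: str
    grid_stress: float        # 0.0 (slack) .. 1.0 (peak demand)
    carbon_intensity: float   # gCO2 per kWh; lower is cleaner

def schedule_training(sites, stress_threshold=0.8):
    """Temporal shifting: start a training run only where the local grid
    is not near peak; if no site qualifies, defer the job to a later hour."""
    eligible = [s for s in sites if s.grid_stress < stress_threshold]
    if not eligible:
        return None  # delay training until grid demand eases
    # Among eligible sites, prefer the cleanest grid, as a
    # carbon-intelligent scheduler would.
    return min(eligible, key=lambda s: s.carbon_intensity)

def route_query(sites):
    """Spatial shifting: answer a chatbot query from whichever site's
    grid currently has the most headroom."""
    return min(sites, key=lambda s: s.grid_stress)

sites = [
    SiteStatus("oklahoma", grid_stress=0.9, carbon_intensity=450.0),
    SiteStatus("iowa", grid_stress=0.4, carbon_intensity=200.0),
    SiteStatus("nevada", grid_stress=0.6, carbon_intensity=320.0),
]

print(schedule_training(sites).name)  # iowa
print(route_query(sites).name)        # iowa
```

In this toy snapshot, both the delayed training job and the live query land in the region with the slackest, cleanest grid; a real scheduler would also weigh latency, capacity and contract terms.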

The report came out just as utilities started fielding requests for immense amounts of power from hyperscale data centers needing 100 megawatts or more, said John Dabiri, a professor at the California Institute of Technology who co-authored the report.

“At the time, people weren’t really sure if these requests would materialize, and there was concern that a utility could build out all this extra power generation and have the customers not materialize,” Dabiri said in an interview. “If it was real, we wanted to make sure that the demand didn’t become an undue burden on the American people.”

Just a few months later, the Electric Power Research Institute launched a new research and pilot program — DCFlex — to examine how to deploy those computational flexibilities as part of demand response.

Demand response programs are not new, but they have evolved for the digital age. Before the boom in AI and data centers, a factory or other large industrial client might agree to lower its energy use during times of peak demand in exchange for lower electricity prices overall.

Such deals were not attractive to technology companies until utilities realized there was another incentive to offer: faster connection to the overtaxed grid for large loads that can ramp down with a few minutes’ notice. EPRI’s DCFlex has a number of big-name partners, including Google, Meta, Nvidia, Compass Datacenters and QTS Data Centers, along with multiple electric utilities.

“The real reason we have all these hyperscalers interested in flexibility projects is that it can get them connected months to years faster, that’s really the incentive,” said Tom Wilson, an EPRI technical executive.

If the practice becomes widespread, it could have real impacts on the grid, according to a 2025 white paper published by Duke University’s Nicholas School of the Environment.

Some 76 gigawatts of new load could be added to the grid with “minimal capacity expansion” if new customers agreed to curtail their maximum potential annual energy consumption by 0.25 percent during the highest load times, the paper calculated. In that scenario, the average length of a load curtailment “would be relatively short, at 1.7 hours.”
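Those figures imply a fairly small annual commitment. A back-of-envelope check, assuming each curtailment cuts load fully to zero so that curtailed energy and curtailed hours scale together (the paper also models partial reductions, which would spread the hours differently):

```python
HOURS_PER_YEAR = 8760

curtailment_rate = 0.0025   # 0.25% of maximum potential annual consumption
avg_event_hours = 1.7       # average curtailment length cited in the paper

# Under full curtailment to zero, the energy fraction equals the
# fraction of hours curtailed.
annual_curtailed_hours = curtailment_rate * HOURS_PER_YEAR
events_per_year = annual_curtailed_hours / avg_event_hours

print(round(annual_curtailed_hours, 1))  # 21.9 hours per year
print(round(events_per_year))            # roughly 13 events per year
```

On that reading, a data center would give up about a day’s worth of full-power operation per year, spread over roughly a dozen short events.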

The grid could accommodate even more new customers if each agreed to higher average annual load curtailment rates, according to the paper. The paper’s author, Tyler Norris, now works at Google.

A bridge to clean energy?

Google won’t disclose the exact terms of its demand response agreements with utilities. But the company has published blog posts and released videos about specific incidents where its data centers shifted their workloads to accommodate the grid. That includes during February’s Winter Storm Fern, when Google reduced power demand at one of its Oklahoma data centers in response to a request from the Grand River Dam Authority.

While Google has publicized its capabilities, it also says it can’t enter into such agreements in every location. In an interview, Terrell declined to provide specifics, saying he did not want to “speculate about hypotheticals” and noting that each of Google’s demand response arrangements is unique, depending on the data centers in question, as well as the utility and its power generation mix.

Google said it still uses the carbon-intelligent computing platform for its climate program, as well as demand response. While Terrell did not discuss Google’s climate goals, he said the company views demand response arrangements as a “bridge” to allow utilities and data centers to delay building more fossil fuel generation and “go look for carbon-free resources.”

Sivaram agrees that more flexible data centers can aid deployment of renewable energy. He started Emerald AI after reading the Department of Energy advisory report on AI.

At the time, he said, “Google did this but no one else talked about it.” Now his company is supporting a DOE push for the Federal Energy Regulatory Commission to officially “fast track” grid connections for data centers that enter demand response programs.

Part of Sivaram’s pitch is to use data centers to offset renewable energy’s variability. He wrote to FERC in November that software such as Emerald AI could be used to “route” demand from data centers in an area where the grid is overtaxed to one with “excess wind generation.”

“This capability acts as a ‘virtual transmission line.’ Instead of moving megawatts of electrons across a congested wire to meet a static demand, the system moves the demand for these electrons to a location where power is abundant,” he wrote.
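The “virtual transmission line” idea Sivaram describes can be sketched as a routing decision: rather than pushing megawatts across a congested wire, move the flexible job to whichever region has spare generation. The region names and headroom figures below are invented for illustration and are not Emerald AI’s software or data.

```python
def virtual_transmission(regions, job_mw):
    """Pick the region whose generation headroom (available supply minus
    current demand, in MW) can best absorb a flexible computing job.
    The demand moves to where power is abundant, instead of the
    electrons moving over a congested line."""
    best = max(regions, key=lambda r: r["headroom_mw"])
    if best["headroom_mw"] < job_mw:
        return None  # no region can absorb the load right now
    return best["name"]

# Illustrative snapshot: one congested region, one with surplus wind.
regions = [
    {"name": "congested_metro", "headroom_mw": 40},
    {"name": "windy_plains", "headroom_mw": 600},
]
print(virtual_transmission(regions, job_mw=150))  # windy_plains
```

A 150 MW job here lands in the wind-rich region; if no region had enough headroom, the scheduler would hold the job rather than strain a congested grid.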

If more data centers can be flexible in their demand, Sivaram said in an interview, “they get transformed from this amazing load and burden on the grid into an amazing resource where you can send the variable solar or wind generation.”