IBM prepares software to better read an ‘intelligent grid’

By Colin Sullivan | 07/07/2015 08:23 AM EDT

YORKTOWN HEIGHTS, N.Y. — Researchers at IBM recently offered an inside glimpse into how the company’s energy team is developing software products and services for utilities looking to get ahead of the power grid’s transformation to a more distributed future.

The company’s software offering is still very much under development but already has a name — Opus — reflecting more than 15 years of thinking, plotting and anticipation at IBM’s Thomas J. Watson Research Center here, within an hour’s drive of New York City.

Opus will formally be introduced in October but is already active in the pilot stage, with several demonstration projects in play to work out the bugs. The problem Opus means to address is fairly simple, even if its application is about as complex as applied research can get: IBM wants to take all the data generated by new grid technologies and help make sense of it.

Take smart meters. According to an internal IBM presentation on Opus, screened for utilities during an event last month, the company expects smart meters to produce more than 150 quadrillion bytes of data per year worldwide by 2020.

That’s 150 followed by 15 zeros, or the amount of information that would be generated by more than 800 million smart meters worldwide by 2020, according to IBM calculations. All of which means an overload of data piled atop an already high level of uncertainty about weather, consumer behavior, fuel costs and energy regulations.
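As a rough back-of-the-envelope check, those figures imply a fairly modest per-meter data rate. The per-meter and per-reading numbers below are derived here from the article's totals, and the 15-minute reading interval is an assumption, not an IBM figure:

```python
# Back-of-the-envelope check of the smart meter projection above.
# Assumptions: "quadrillion" = 10**15 (short scale); 800 million meters;
# 15-minute interval data is an illustrative guess, not from IBM.
TOTAL_BYTES_PER_YEAR = 150 * 10**15   # 150 quadrillion bytes/year (IBM projection)
METER_COUNT = 800 * 10**6             # ~800 million meters worldwide by 2020

per_meter_per_year = TOTAL_BYTES_PER_YEAR / METER_COUNT
print(f"{per_meter_per_year / 1e6:.1f} MB per meter per year")  # prints: 187.5 MB per meter per year

readings_per_year = 4 * 24 * 365      # one reading every 15 minutes
per_reading = per_meter_per_year / readings_per_year
print(f"{per_reading / 1e3:.1f} KB per reading")  # roughly 5.4 KB on average
```

The challenge, in other words, is less the raw volume per meter than the aggregate: hundreds of petabytes a year, arriving continuously, that must be fused with weather, price and behavioral uncertainty.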

Enter Opus, which is meant to merge IBM’s long history of expertise in analytics with utility know-how into a single picture of projected supply and demand, all with the goal of wasting less energy and helping realize a more distributed grid that neither impairs reliability nor undermines industry profits.

The concept has evolved alongside new policies like New York’s forthcoming "Reforming the Energy Vision," or REV, which is supposed to establish new rules for managing a grid that IBM sees as increasingly instrumented and intelligent. Under this paradigm, utilities would serve as energy aggregators or distributed system platform providers, with the goal of a more flexible grid that fails less often and gets more out of its assets.

Ron Ambrosio, chief technology officer of Smarter Energy Research at IBM, has been analyzing all this uncertainty for years with an eye on how variables are bound to increase in the coming age of distributed generation and transmission. He says that not only will the utility industry have to solve equations with more variables — they’ll also have to do it faster than ever.

The trends appear to bear this out: Renewable energy is increasingly cost-competitive; state mandates have increased investment; demand-side approaches are growing; and energy prices increasingly depend on the time and place of use rather than being static.

An Opus to ‘pull it all together’

Ambrosio says uncertainty for the industry will only continue to increase as the business becomes fragmented into more instruments and what he calls "intelligence." With all that in mind, IBM just completed a five-year stint advising the Energy Department as part of the Pacific Northwest Smart Grid Demonstration Project — which was billed by DOE as the largest smart grid demonstration in the United States.

Ambrosio was a key figure in that pilot, which spanned five states and about 60,000 metered customers. The goal was to start building a more dynamic electricity infrastructure that does a better job of containing costs, lowering emissions, incorporating more renewable energy sources, improving grid reliability and giving consumers greater flexibility.

He described the concept as a "transactive control system" that uses economic value signals to distribute decision-making throughout the electric grid to individual devices and users. In other words, two-way communication lets a future smart grid shift consumption toward times when power is cheapest, making a host of new technologies work together seamlessly while extracting data and without compromising reliability.
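A transactive scheme can be sketched in a few lines. This toy illustration is not IBM's or the Pacific Northwest project's code; the device names, bid values and simple threshold rule are invented here purely to show the idea of loads responding locally to a broadcast price signal:

```python
# Toy sketch of transactive control: a broadcast price signal, and each
# device deciding locally whether to run. All names and numbers are
# illustrative; this is not IBM's implementation.

def device_decision(flexible: bool, willingness_to_pay: float, price: float) -> bool:
    """A flexible load runs only when the price is at or below its bid;
    an inflexible load always runs."""
    return (not flexible) or price <= willingness_to_pay

loads = [
    {"name": "water heater", "flexible": True,  "bid": 0.10},  # $/kWh
    {"name": "EV charger",   "flexible": True,  "bid": 0.08},
    {"name": "refrigerator", "flexible": False, "bid": None},  # must always run
]

for price_signal in (0.05, 0.12):  # a cheap hour, then an expensive hour
    running = [l["name"] for l in loads
               if device_decision(l["flexible"], l["bid"] or 0.0, price_signal)]
    print(f"price ${price_signal:.2f}/kWh -> running: {running}")
```

When the price signal spikes, the flexible loads defer themselves; no central dispatcher has to know about each device, which is the scaling property Ambrosio describes.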

The result of which will be IBM’s Opus, Ambrosio said, explaining that the company’s experience with the prototype grid has informed its work on software development. Though DOE has yet to report on the Pacific Northwest project, Ambrosio called it a success that has laid the groundwork for other demonstrations with forward-looking utilities, including Hydro Quebec, DTE and Xcel.

"Opus is starting to pull it all together," he explained in an interview. "Now that we’ve got a lot of the underlying pieces, we have real prototypes and have started to move ahead into the product line."

Ambrosio was emphatic that utility participation in his research is essential, and the company is still looking for more partners in the industry. He noted that Hydro Quebec "has already started to give serious thought to the implications of uncertainty from market operations to grid operations," and he thinks other utilities should do more of the same.

"We need partners in the industry, we can’t do this on our own," he said. "No one company can really solve these issues by themselves. We have been working for years on different pieces of this."

Flexible applications

The IBM presentation on Opus described a product that will be scalable and open, allowing utilities to contribute to an "uncertainty workbench" so that clients can make real-time decisions "without leaving performance or value on the table." It compared the evolving industry to IBM’s experience using statistics to "eke more performance" out of computer chips, which IBM says resulted in 9 percent cost savings in a given year for a $100 million company.

Opus itself is a kind of umbrella term, as the product could be adapted to take into account variables such as wind or solar energy forecasts along with demand expectations to, say, make decisions on which assets to repair, replace or upgrade. IBM says the correct solution for a utility that knows what it’s doing could be worth billions.

For Hydro Quebec in Montreal, IBM has deployed what Ambrosio called "a copy" of the current platform so that HQ can connect with live data feeds and make decisions. The goal is to use the same sort of platform as defined by clients (or partners) to identify how the system can best be deployed when factors change from utility to utility.

"This is very much applied research," Ambrosio said. "We are trying to push the boundary as fast as possible."

Which is also why IBM is sitting on advisory councils to the New York Public Service Commission as it works through draft REV rules. Ambrosio described the PSC’s work so far as "really very forward looking," and he views New York as approaching California in terms of willingness to take chances and open the field to innovations.

"New York is really right there now" in comparison to California, he said. "When I go around and talk to people, New York is on everyone’s mind."

Transformed by data

As for the upside of Opus at IBM, Ambrosio said the company is well-positioned but noted that it requires "significant changes to how an entrenched industry operates," so there is plenty of risk involved. Of energy conferences, he said, "10 or 12 years ago, they would ask, ‘Why are you here?’ We were able to identify and start to engage an understanding well ahead of the market emerging."

He added: "It’s been a very good example of how IBM research works well for IBM the company. It’s the growth opportunity that is really the carrot we’re going after."

Chandu Visweswariah, IBM fellow for Smarter Energy and Environmental Science, added that he feels utilities need the help because the old way of doing calculations was one-sided and "left money on the table." He thinks Opus can improve on that approach because it is a software approach being built out in consultation with "deep industry connections."

"IBM feels that data will transform a number of industries," he said. "If you can do the right things with this data, you can make better decisions. This is our interest."

Visweswariah also believes a crucial factor is how regulatory agencies like the PSC will incentivize customers to care and take advantage of new technologies. He called New York "a proving ground" under REV, which he hopes will move faster.

"For me, it can’t move fast enough," he said. "I love the REV. The first time I read it last year I was shouting out every few pages as to how much it was aligned with our own thinking.

"We all see the path forward," he added. "The question is, how do we get there from here?"