A Siebel Energy Institute project led by Laurent El Ghaoui, a professor of electrical engineering and computer science at the University of California, Berkeley, has demonstrated how machine learning algorithms can make energy production more efficient at cogeneration power plants.

Across the globe, local cogeneration, or “co-gen,” plants provide electricity and heat to specific areas, like a neighborhood or part of a city. Known for being efficient, these individually managed systems burn fossil fuels to heat water in giant furnaces. The resulting steam powers their generators.

A new machine learning model, produced with support from the Siebel Energy Institute, makes co-gen plants substantially more efficient, conserving natural resources and reducing carbon dioxide (CO2) emissions.

Co-gen plant technicians must consider future conditions when scheduling the plant to produce the amount of steam needed to power the generators. Variables include things like the weather and holidays, when more people will be at home. It takes many hours to heat water into steam, and if actual conditions differ from expected conditions, the plant will produce either too much steam or too little, resulting in costly or even impossible adjustments.

With an international team of researchers, Laurent El Ghaoui, a professor of electrical engineering and computer science at the University of California, Berkeley, developed an algorithm that manages a co-gen plant’s energy needs more efficiently than human operators do. By taking into account a complex array of variable data, the mathematical model has been shown to save a plant anywhere from 6% to 15% in energy production costs.

El Ghaoui collaborated with EDF, one of the world’s leading energy companies, to test the machine learning model at a co-gen plant in the United Kingdom that provides power and heat to 700 homes. El Ghaoui and Dr. Stephanie Jumel of EDF’s Innovation Lab discussed how data science and machine learning can be used to optimize the smart grid in an IEEE Smart Grid webinar.

Through simulations, El Ghaoui, with project partners Giuseppe Carlo Calafiore, a professor of electrical engineering at Politecnico di Torino, and Stéphane Gaubert, a professor of applied mathematics at École Polytechnique, found that machines are better at managing uncertain energy demands than people are.

Their project, “Robust Optimization for Local and Global Energy Management,” employs algorithms that calculate tradeoffs between expectations and uncertainty to make decisions more quickly than humans can.
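
As a rough sketch of what such a trade-off calculation can look like, the snippet below picks the production level whose worst-case cost over an uncertainty set of demands is smallest. The cost figures and demand range are invented for illustration; this shows the general robust optimization idea, not the project's actual model.

```python
import numpy as np

# Purely illustrative numbers: cost per unit of steam produced, penalty per
# unit of unmet demand (for example, firing up backup boilers), and cost per
# unit of surplus steam that has to be vented.
fuel_cost = 30.0
shortfall_cost = 120.0
surplus_cost = 10.0

# All that is assumed about tomorrow's demand is that it lies in this range.
demand_lo, demand_hi = 80.0, 110.0

def worst_case_cost(production):
    """Largest possible cost over every demand in the uncertainty interval."""
    demands = np.linspace(demand_lo, demand_hi, 301)
    cost = (fuel_cost * production
            + shortfall_cost * np.maximum(demands - production, 0.0)
            + surplus_cost * np.maximum(production - demands, 0.0))
    return cost.max()

# The robust decision is the production level with the smallest worst-case cost.
candidates = np.linspace(demand_lo, demand_hi, 301)
plan = min(candidates, key=worst_case_cost)
print(f"robust plan: {plan:.1f} units, worst-case cost {worst_case_cost(plan):.0f}")
```

Because a shortfall is priced higher than a surplus in this toy setup, the robust plan lands above the midpoint of the demand range: the optimizer accepts a little routine waste to avoid the larger penalty of running short.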

El Ghaoui spoke with the Siebel Energy Institute about the project.

Q: Co-gen systems are typically managed in an ad-hoc fashion; that is, operations are custom-designed at each facility. How is this individuality a weakness for the plants in terms of efficiency and lack of automation?

Humans can become quite efficient at managing such plants, the same way an F1 driver can become hard to compete with after intensive training. However, software can be better suited to operating co-gen plants, since it starts making more efficient operational choices very soon after it is installed.

Q: In a co-gen facility, are machines better at making predictions than people simply because they make more calculations more quickly?

It is not just this “fast computation” aspect that is important. Machines are better at making predictions and handling the uncertainty inherent in any prediction because they can handle more variables simultaneously than a person can.

Q: Could you describe an example of a scenario you encountered during the simulations where the algorithm made a more accurate prediction about energy demand than a human operator?

We conducted extensive simulations under a large array of scenarios, and there were hundreds of such examples. Energy demand is constantly fluctuating, from minute to minute and from second to second. In the simulations, the algorithm was able to respond to these fluctuations in near real time because it is constantly looking ahead in a “rolling horizon” fashion. The algorithm prepares a range of contingency plans based on all of the possible fluctuation scenarios, so it is poised to make adjustments immediately, much more quickly than a person can react.
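
A minimal sketch of the rolling-horizon idea, with invented demand numbers and a deliberately simplified cost model (an illustration of the general technique, not the project's software): at each step the controller plans against sampled demand scenarios over the next few hours, commits only to the first hour's decision, and then re-plans.

```python
import numpy as np

rng = np.random.default_rng(0)
HORIZON = 6        # hours looked ahead at each re-planning step
N_SCENARIOS = 50   # sampled demand fluctuation scenarios per step

def robust_schedule(forecast_window):
    """Plan production for each hour of the window against sampled scenarios.

    A real plant model would couple the hours through ramp-rate and boiler
    constraints; each hour is planned independently here to keep the sketch short.
    """
    scenarios = forecast_window + rng.normal(
        0.0, 5.0, size=(N_SCENARIOS, len(forecast_window)))
    schedule = []
    for h in range(len(forecast_window)):
        candidates = np.linspace(forecast_window[h] - 15, forecast_window[h] + 15, 61)

        def worst_cost(p, h=h):
            shortfall = np.maximum(scenarios[:, h] - p, 0.0)
            surplus = np.maximum(p - scenarios[:, h], 0.0)
            return (30.0 * p + 120.0 * shortfall + 10.0 * surplus).max()

        schedule.append(min(candidates, key=worst_cost))
    return schedule

# Rolling horizon: plan over the next HORIZON hours, commit only to the first
# hour's decision, then shift the window forward and re-plan with fresh data.
demand_forecast = 90.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 24))
for hour in range(24 - HORIZON):
    window = demand_forecast[hour:hour + HORIZON]
    decision = robust_schedule(window)[0]
    print(f"hour {hour:2d}: produce {decision:.1f}")
```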

Q: Which were the most challenging variables to incorporate into your model and why?

The most challenging was accounting for the fact that a human operator typically makes real-time adjustments in response to changing conditions, or to conditions that differ from the predictions. The model had to factor in these aspects as well. The factors include a wide variety of things that affect human behavior, such as fluctuations in the job market and the economy, the price of oil, and the weather.

Q: What role does historical data from the plant play in the machine learning process? 

This is a very important aspect, since we needed to provide the model with a variety of examples from the past so it could “learn” how to react optimally.
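
As a toy illustration of that role (the features, model, and numbers are assumptions made for the sketch, not the project's actual setup), historical plant records could be used to fit a simple one-step-ahead demand forecaster:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy historical record: 60 days of hourly outside temperature and heat demand.
hours = np.arange(24 * 60)
temperature = 10.0 + 8.0 * np.sin(2.0 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
demand = 100.0 - 2.5 * temperature + rng.normal(0, 3, hours.size)

# Features for predicting demand at hour t: a constant, demand at t-1, and the
# temperature at t. Targets are the demands that were actually observed.
X = np.column_stack([np.ones(hours.size - 1), demand[:-1], temperature[1:]])
y = demand[1:]

# Least-squares fit on the historical examples: this is the "learning" step.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast, using the latest temperature reading as a
# stand-in for a weather forecast.
latest = np.array([1.0, demand[-1], temperature[-1]])
print(f"forecast demand for the next hour: {latest @ coef:.1f}")
```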

Q: “Robust optimization” as it relates to this research is described as a way to address uncertainties in decision-making. Could you provide an example of a “decision” to illustrate what this looks like in action?

Think about driving along a one-way, winding road in foggy weather. You would typically make guesses as to where the roadsides are, and you would avoid getting too close to where you think the roadside is, simply because you are uncertain about the correctness of your guess. The more risk-averse you are, the closer you stay to the center of the road; the less risk-averse you are, the more closely you will cut into the curves in order to go faster. Making a calculated trade-off between the risk (driving more safely) and the reward (going faster) is what robust optimization is about.
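
Carrying the analogy back to the plant, and reusing the invented cost figures from the earlier sketch, the degree of risk aversion can be expressed as the width of the demand range the plan must protect against; a wider margin is the equivalent of hugging the center of the road.

```python
import numpy as np

def robust_plan(margin, nominal=95.0):
    """Plan against any demand within +/- margin of the nominal forecast."""
    demands = np.linspace(nominal - margin, nominal + margin, 201)
    candidates = demands  # search production levels over the same grid

    def worst_cost(p):
        shortfall = np.maximum(demands - p, 0.0)
        surplus = np.maximum(p - demands, 0.0)
        return (30.0 * p + 120.0 * shortfall + 10.0 * surplus).max()

    best = min(candidates, key=worst_cost)
    return best, worst_cost(best)

# A more risk-averse planner protects against a wider range of demands and
# pays for it with a higher guaranteed (worst-case) cost.
for margin in (2.0, 10.0, 25.0):
    plan, cost = robust_plan(margin)
    print(f"margin +/-{margin:4.1f}: plan {plan:.1f}, worst-case cost {cost:.0f}")
```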

Q: Why is robust capability so critical to creating a model that is resilient in the face of errors or unforeseen circumstances?

Robustness is a critical aspect of any real-time decision making, due to the inherent trade-off between risk and reward.

Q: You likened the model to the autopilot system in an airplane because both systems free the operator from onerous mathematical calculations. What are the things a co-gen plant operator could focus on if he/she could rely on an automated system?

In the future, such plants will be much more automated. I am not sure that there will be human operators running the plants on a day-to-day basis. This is the classical story of automation. Today, a refinery process, for example, is largely automated. Human operators monitor the process; they do not drive it.

Q: The project yielded a proof of concept with implementable software. What is the biggest challenge when it comes to integrating the software into an existing co-gen system?

The biggest challenge is developing software that is resilient and secure. A secondary challenge is the implementation, which requires custom installation.

Q: What are the next steps with EDF or other energy companies?

We are investigating follow-up research to expand our experiments, in collaboration with our partners at EDF Research and Development, notably Dr. Faille from the STEP department.