The smart power grid for cities

Published: 2022-07-01

Energy Transition: A smart power grid

To successfully decarbonize, we need more electricity – and it needs to be intelligently distributed. Only then can the many small power generators, battery storage systems and electric cars be integrated in a stable manner. Real-world use cases are showing how artificial intelligence can help.

By Tim Schröder

Everything used to be so simple. Electricity was generated in large power plants and transported across the country to towns, villages and buildings via high-voltage power lines. Crucially, electricity flowed in only one direction – from power plants to consumers. This longstanding system is now being turned upside down. With biogas plants in the countryside, solar panels on rooftops, and wind farms, there are thousands of small and large power plants that feed in their electricity to be distributed across the power grid. Households with photovoltaic systems on their roofs no longer merely consume electricity; they also generate it and pump it back into circulation.

The problem is that the local distribution networks in villages and towns have so far operated in a simple manner, without much communication technology. The extra-high-voltage grid, which carries electricity over long distances, is closely monitored and controlled. At the municipality level, however, electricity simply flows to customers like a river, with little detailed monitoring of consumption or of the state of the grid. That will no longer be tenable in the future. With households feeding in electricity from their solar panels and electric vehicles greatly increasing consumption in residential areas, the grid is becoming more dynamic than ever before. If many residents in a single area connect their electric cars to power outlets after work, for example, the local network can quickly become overloaded. The solution? Local distribution networks that work hand in hand with customers. One way to achieve this in the future is to use artificial intelligence to coordinate the power supplied by many plants with current consumption and align both with the larger power grid. This will require new sensors, and feed-in systems and consumers will also need to be integrated.

Smart meters: finally on the market after over 15 years of development

The concept isn’t new. People were already talking about smart meters that could communicate with the power grid via a router 15 years ago. One of the first ideas that caught on was to have smart meters turn on devices when electricity was cheap on the energy exchanges. For a long time, though, customers weren’t really involved in the conversation. “The demands placed on smart meters were high,” recalls Prof. Sebastian Lehnhoff, board chairman of the OFFIS Institute for Information Technology in Oldenburg. “They were supposed to do lots of things at the same time.” In addition to following the price of electricity, for example, they were expected to help stabilize the power grid. They also needed to be secure against hacker attacks and transmit data in an encrypted manner. Furthermore, the question of how smart meters would protect people's data remained unanswered for years. It was feared that the devices could reveal a great deal about residents’ habits. “The development took an extremely long time,” Lehnhoff affirms. “It was a difficult process.” 

In the meantime, local networks have undergone a change or two. Transformers are now being equipped with sensors and communication interfaces, and there are smart meters on the market that are both versatile and secure. “For about two years now, this technology has been spreading much faster than before,” Lehnhoff reports.

The roadmap “Towards a climate protection grid by 2030”, which was presented by the Forum Network Technology/Network Operation in the VDE (VDE FNN) at the end of March, is meant to solidify this progress. It explains that a crucial factor in climate-neutral electricity is an intelligent metering system that controls customer applications such as heat pumps and wallboxes in a targeted manner. Electricity consumption needs to be shifted to times of day when plenty of renewable energy is available. “We have the technology for the necessary infrastructure. The rollout is underway,” the roadmap states. “However, it is still not clear who has what rights and obligations and which incentives should be set to encourage end customers to participate.”
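To make the idea concrete, here is a minimal sketch of the kind of load-shifting logic such a metering system could apply, assuming an hourly forecast of the renewable share is available. The function name, the forecast values and the four-hour charging window are illustrative assumptions, not part of the VDE FNN roadmap.

```python
# Illustrative sketch: shift a wallbox charging window to the hours with the
# highest forecast share of renewable energy. All names and numbers are
# assumptions for illustration, not the VDE FNN roadmap itself.

def pick_charging_hours(renewable_share_forecast, hours_needed):
    """renewable_share_forecast: 24 values (0..1), one per hour of the day.
    Returns the hours (0-23) with the highest forecast renewable share."""
    ranked = sorted(range(24), key=lambda h: renewable_share_forecast[h], reverse=True)
    return sorted(ranked[:hours_needed])

# Example: a wallbox that needs four hours of charging tomorrow.
forecast = [0.2] * 6 + [0.5, 0.7, 0.8, 0.9, 0.9, 0.8] + [0.6] * 6 + [0.3] * 6
print(pick_charging_hours(forecast, hours_needed=4))  # [8, 9, 10, 11]
```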

As valuable as a small power plant: swarms of batteries and charging stations

A project led by Lehnhoff’s OFFIS colleague Prof. Astrid Nieße is showing what attractive solutions for more intelligent local networks might look like. Nieße, an executive board member of the energy R&D division at OFFIS, collaborated with the startup be.storaged to develop software agents for battery storage systems like the ones used in electric car charging stations. To promote electromobility even in places with poorly developed power grids, it makes sense to equip charging stations with batteries that can charge electric vehicles even when the grid is strained. The software agents add an interesting dimension to this simple battery function by configuring the battery storage for electricity trading – creating a completely new business model in the process. When the electricity price is high, the agent tells the charging station to sell power from its battery; when the price is low, the battery is charged.

What makes the arrangement even more distinctive is the agents’ ability to communicate with each other and thereby link many batteries in a region into a swarm. This swarm acts as a single system and can feed in or consume large amounts of electricity as needed. In the future, intelligent software agents will also be able to learn the typical usage profile of each charging station – for instance, how much electricity vehicles consume on which days and at what times. The agents will be able to use this data to determine how much power can be delivered to the network and when.

A swarm of charging stations ultimately functions like a small power plant in the grid, meaning it can one day also be used for nationwide power plant planning – also known as the dispatch system. In dispatching, all the power plant operators in a region coordinate one day in advance how much power each plant will produce. They take both the expected price of electricity and the power plants’ costs (such as the price of coal or natural gas) into account. The dispatch process produces a precise timetable for all power plants that indicates when each plant will be switched on, how long it will feed in electricity, and the output at which it will operate. Once they are linked into a swarm, charging station batteries can also be used for dispatching.
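The price-based behavior described above can be illustrated with a small sketch. The class, the thresholds and the price feed are assumptions for illustration only; this is not the be.storaged or OFFIS implementation.

```python
# Minimal sketch: an agent that discharges a charging-station battery when the
# exchange price is high and recharges when it is low. Thresholds, class names
# and the price values are illustrative assumptions.

class BatteryAgent:
    def __init__(self, capacity_kwh, sell_above, buy_below):
        self.capacity_kwh = capacity_kwh
        self.level_kwh = capacity_kwh / 2
        self.sell_above = sell_above   # EUR/MWh above which to sell
        self.buy_below = buy_below     # EUR/MWh below which to buy

    def decide(self, price_eur_mwh, reserve_kwh=10):
        """Return an (action, kWh) tuple for the next trading interval."""
        if price_eur_mwh >= self.sell_above and self.level_kwh > reserve_kwh:
            return ("sell", self.level_kwh - reserve_kwh)
        if price_eur_mwh <= self.buy_below and self.level_kwh < self.capacity_kwh:
            return ("buy", self.capacity_kwh - self.level_kwh)
        return ("hold", 0.0)

agent = BatteryAgent(capacity_kwh=100, sell_above=120, buy_below=40)
print(agent.decide(price_eur_mwh=150))  # ('sell', 40.0)
```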

If a charging station is unable to supply power because its battery is too low, the agents automatically determine which other charging station will step in. “The big advantage of this distributed intelligence is that the technology is more robust than a large control center that controls everything,” Prof. Nieße explains. If such a control center were to fail, the entire system would come to a standstill; with the agent concept, at most individual software agents in charging stations can fail. Another advantage is that no huge amount of data has to be sent back and forth between a control center and hundreds of charging stations, because the agents coordinate with each other directly. According to Nieße, the way in which they minimize data transfers makes such approaches to on-site data processing crucial for the intelligent power grid of the future.
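In its simplest form, the peer coordination Nieße describes could look roughly like this, again purely as an illustration rather than the actual agent software:

```python
# Sketch: if one station's battery is too low to cover a request, the agents
# themselves find a peer with enough charge, without a central control center.
# Names and values are invented for illustration.

def find_backup(stations, needed_kwh):
    """stations: dict mapping station id -> available kWh."""
    for station_id, available in stations.items():
        if available >= needed_kwh:
            return station_id
    return None  # no single peer can cover it; it could also be split across several

swarm = {"station_a": 5.0, "station_b": 32.0, "station_c": 18.0}
print(find_backup(swarm, needed_kwh=20.0))  # 'station_b'
```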

“Our software learns the tendencies of a local network down to the smallest detail.” Todor Kostov, Founder of Reasonance GmbH

Predicting power consumption with a city’s digital twin

A Karlsruhe startup (and VDE member) is also aiming to make local networks more intelligent. Reasonance is developing software that can very precisely analyze who is currently consuming and producing electricity in a given local network. At present, this kind of analysis is still more or less impossible. Local utilities only have a general overview of what is happening overall – a balance of electricity consumption and production. There is barely any detailed information on the amount of power individual transformer stations are feeding in and when. “Our software makes it possible to analyze the share each one is contributing bit by bit,” says Reasonance founder Todor Kostov. Since its software also relies on sensor data, the company is cooperating with local utilities in Baden-Württemberg that have retrofitted their local network transformers with corresponding sensors. Data from photovoltaic systems is also evaluated, and smart meter data is to be incorporated in the future.

Local utilities have to estimate their future power consumption at regular intervals and purchase electricity accordingly. These calculations are often based on historical values. However, the constantly increasing number of private photovoltaic installations, battery storage systems, electric cars and heat pumps is making such estimates more and more difficult. This can result in utilities overestimating power consumption and ordering too much electricity; in the worst case, the surplus has to be resold at a loss. This is where Reasonance’s technology comes in. The local network sensors give it a very precise overview of who is feeding in or consuming how much electricity, and when. This information is processed in a digital twin – a computer model of the local network – and combined with a variety of parameters that influence the state of the grid: the temperature, the position and angle of the sun, and the power lost in the inverters that convert the direct current from photovoltaic systems into alternating current. Machine learning techniques help the software develop a highly detailed understanding of the local network’s tendencies, and its analysis runs automatically. This leads to much better estimates of, say, a city’s expected power consumption than historical data alone would produce. “We can analyze lots of other things, too,” Kostov says – such as how a new cogeneration unit or a certain number of electric cars will affect the electricity grid.
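As a rough illustration of the forecasting approach, the sketch below trains a standard regression model on synthetic hourly data with the kinds of features mentioned in the text (temperature, sun position, photovoltaic feed-in). It is a generic example using scikit-learn, not Reasonance’s software; all feature names and numbers are invented for illustration.

```python
# Generic sketch of learning a local network's load from sensor history plus
# weather and sun-position features. Requires numpy and scikit-learn.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy training data: one row per hour with [temperature_C, sun_elevation_deg,
# hour_of_day, pv_feed_in_kw]; the target is the measured network load in kW.
X = rng.uniform([0, 0, 0, 0], [30, 60, 23, 500], size=(1000, 4))
y = 800 - 0.8 * X[:, 3] + 5 * X[:, 0] + rng.normal(0, 20, 1000)  # synthetic load

model = GradientBoostingRegressor().fit(X, y)

tomorrow_noon = np.array([[22.0, 55.0, 12, 350.0]])
print(f"Expected load: {model.predict(tomorrow_noon)[0]:.0f} kW")
```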

Only grid-friendly components have long-term viability

A digital twin is also the centerpiece of the Karlsruhe Institute of Technology’s Energy Lab 2.0, which is examining how to operate large power plant components and electrotechnical elements in the power grid of the future. The key concept here is grid-friendly operation: in the future, individual components are to contribute more than ever to keeping the network stable. This is challenging, because the power grid is a finely tuned physical system in which the frequency of the alternating current can deviate only minimally from its nominal value. Otherwise, there will be a power failure. The voltage, power output and frequency in the power grid have thus far been stabilized by the large rotating generators in coal-fired power plants. These massive machines turn at 50 revolutions per second, which means that the alternating current in the power lines also oscillates at a constant 50 hertz. In light of the increase in photovoltaic systems, wind turbines and many small, decentralized power generators, the large coal-fired power plants will be taken off the grid in Germany by 2038 at the latest. The power grid of the future therefore needs clever new concepts to compensate for the resulting loss of stability.
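As a quick check of the numbers above, and assuming a two-pole generator (a single pole pair, which is what the one-to-one relationship implies), the electrical frequency $f$ follows from the rotational speed $n$ and the number of pole pairs $p$:

$$ f = p \cdot n = 1 \cdot 50\,\mathrm{s}^{-1} = 50\,\mathrm{Hz} $$

A machine turning 50 times per second, i.e. 3,000 revolutions per minute, therefore produces alternating current at exactly 50 hertz.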

Machines that store power and release it in seconds

Energy Lab 2.0 is investigating how gas turbines, flywheels and supercapacitors – electricity storage devices capable of releasing their energy within seconds – can be used for this purpose. The highlight thus far is that the researchers have succeeded in digitally imitating the behavior of these machines in detail, so they can directly compare the behavior of the actual machines with their digital models. That has a big advantage, according to Giovanni De Carne, head of the Real Time System Integration group. “Only one working group can work on a machine at a time, but several can work on a digital twin. This can accelerate the introduction of new energy technologies on the market,” he explains.

The experts involved in Energy Lab 2.0 are testing how heavily the machines are stressed in different scenarios and how quickly they can react. Their large flywheel, for instance, rotates at 45,000 revolutions per minute. If the power grid requires more electricity because power-hungry industrial plants are starting up or electrolyzers are being ramped up for hydrogen production, the flywheel can convert its kinetic energy into electricity within seconds, ramping up to its full output of 120 kilowatts. That is roughly equivalent to the output of a rapid charging station for electric cars. As De Carne emphasizes, the way in which Energy Lab 2.0 combines a digital twin and machine learning tools with industrial-scale equipment makes it the only project of its kind in Europe.
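To get a feel for the orders of magnitude: the article does not state the flywheel’s moment of inertia, so the value below is purely an illustrative assumption. The energy stored in a flywheel is

$$ E = \tfrac{1}{2} I \omega^{2}, \qquad \omega = 45\,000\ \mathrm{rpm} = \frac{45\,000 \cdot 2\pi}{60}\ \frac{\mathrm{rad}}{\mathrm{s}} \approx 4\,700\ \frac{\mathrm{rad}}{\mathrm{s}} $$

With an assumed $I = 1\ \mathrm{kg\,m^{2}}$, that comes to roughly $11\ \mathrm{MJ}$, enough to sustain the full 120 kilowatts for about 90 seconds.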

Representation of a digital twin of a city.

Energy Lab 2.0, a project spearheaded by the Karlsruhe Institute of Technology, is examining the intelligent networking of environmentally friendly energy producers and storage methods. At the heart of the research is a digital twin that is being used to simulate the challenge of keeping the grid stable once the large coal-fired power plants go offline.

Together, many small plants have great potential

In the future, though, it won’t only be larger plants that operate in a grid-friendly manner; small generators such as private photovoltaic systems, battery storage systems in home basements, and electric cars will, too. Like swarms of charging stations, they collectively have great potential that can be harnessed to keep the network stable. According to OFFIS researcher Sebastian Lehnhoff, integrating them into “redispatching” will be crucial in the future. Once the actual dispatching has been completed, the transmission system operators check the extent to which transporting electricity between producers and consumers will strain the various sections of the network the following day. The output of certain power plants then needs to be reduced or ramped up to compensate. This detailed planning is referred to as redispatching, and it has traditionally been handled by large power plants: if a power line was expected to be overloaded the following day, the major network operators throttled the power plant upstream of the bottleneck and increased production at a power plant downstream, in the region with the high power requirements.
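In its simplest form, the redispatch logic described here boils down to shifting generation from one side of a congested line to the other. The sketch below is a deliberately simplified illustration with invented numbers, not an operator’s actual procedure.

```python
# Simplified redispatch sketch: if a line is forecast to be overloaded,
# generation upstream of the bottleneck is reduced and generation downstream
# is increased by the same amount. Values and names are illustrative.

def redispatch(forecast_flow_mw, line_limit_mw, upstream_mw, downstream_mw):
    """Return adjusted (upstream, downstream) schedules for a congested line."""
    overload = forecast_flow_mw - line_limit_mw
    if overload <= 0:
        return upstream_mw, downstream_mw          # no congestion, no action
    shift = min(overload, upstream_mw)             # cannot throttle below zero
    return upstream_mw - shift, downstream_mw + shift

# A line rated 1,200 MW is forecast to carry 1,350 MW:
print(redispatch(1350, 1200, upstream_mw=800, downstream_mw=400))  # (650, 550)
```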

In the fall of last year, Germany’s “Redispatch 2.0” regulation came into force across the nation. Since then, smaller network operators and power companies have also been required to participate in redispatching – and regulate their solar or wind farms accordingly. In the three-year “Redispatch 3.0” project, OFFIS and a number of partners are now investigating how private photovoltaic systems and electric cars can also be integrated in real time. If they hadn’t already considered it before, the war in Ukraine now has many consumers thinking about replacing their gas heating with a heat pump, Lehnhoff says. He reports that demand is currently going through the roof, which will also shift a great deal of demand from the gas market to the electricity market in the coming years. For Lehnhoff, this is another development that underscores the importance of digitalization and Redispatch 3.0 for the energy transition.

In the view of Jens Strüker, however, there is still a big obstacle in the way. “In terms of digital networking, there’s still no continuity from one end of the power grid to the other – from the producer and the transmission system operator to the consumer.” Strüker is a professor of information systems and digital energy management at the University of Bayreuth and head of the Fraunhofer Blockchain Lab. His pet topic is how to decarbonize the energy system through digitalization, and right now he is still seeing many communication breakdowns in this regard. “If we also want to integrate small producers in a grid-friendly manner and use them for redispatching power, they have to be controllable from the outside. We’re still a long way from that.” Germany’s Federal Network Agency does have its market master data register, which lists most energy producers with an output of 100 kilowatts or more. According to Strüker, though, this is merely a directory where one can register online; no digital interaction is possible. “When you register with an Internet provider as a private individual, you’re directly connected to the digital world,” he says. The same should be implemented for the market master data register. That would make it possible to quickly introduce integrated communication across all levels of the power grid.

Another shortcoming is that the register is not currently linked to the national proof-of-origin portal where operators register their green electricity plants. “In my opinion, this kind of combination is long overdue,” Strüker declares. A continuous digital link would not only pave the way for Redispatch 3.0 and grid-friendly plant operation. In the long term, it would also be possible to determine the level of carbon dioxide emissions the electricity in the grid was causing at any given time. “You could then control power plants according to a CO2 signal, for example, to make the overall balance particularly climate-friendly,” Strüker suggests. That would truly be a major coup for the energy transition.

Tim Schröder is a science journalist in Oldenburg, Germany.