Creating Information and Knowledge
The feature articles in this issue are devoted to the emerging field of data analytics, a computational capability for extracting a cause–effect understanding of power system events. This knowledge is extracted from field measurements through analytical methods and, in many cases, involves the use of various data and power system models. Because it links the cause of an event with its consequence, it may be readily used by operators or in designing controllers to enhance power system operation. Data analytics solutions use field data obtained from various intelligent electronic devices located in substations and from a variety of databases spread across the utility enterprise. The data analytics tools then convert the data to information and, eventually, information to knowledge. In this process, the knowledge of experts is formulated as a set of rules or equations and combined with computational models to match measured data against event hypotheses. Once the data and a hypothesis are matched, the desired knowledge about the cause–effect relationship is inferred. Since the process of matching prior experience with measured data is often automated, the results are typically made available online and may be used in real-time decision making. This process of converting data to knowledge using data analytics is illustrated in Figure 1.
Many existing applications in power systems also focus on processing data, but only a few use the innovative monitoring and control concepts enabled by data analytics solutions. To illustrate the trend, this issue of IEEE Power & Energy Magazine provides several examples of advanced solutions. Since this is an emerging field, the articles were selected from user and research groups that closely collaborate with industry in demonstrating the benefits. The examples that follow are an important step forward and illustrative of the new trend, but many other similar ideas did not make it into this issue due to practical publishing limitations.
A careful selection of authors and topics illustrates emerging data analytics for control center applications, enhanced security assessment and management, tuned state estimation, automated fault analysis, and renewable resource integration. The topics of the articles, shown in the context of automated data analytics, are depicted in Figure 1.
The first article, “The Situation Room,” discusses a suite of advanced data analytics solutions based on phasor measurement unit (PMU) measurements: a) angular separation, b) oscillatory stability, c) disturbance location identification, and d) islanding and resynchronization. The authors illustrate how such advanced solutions may be integrated with the legacy emergency management system (EMS) design to provide a major enhancement in the operator’s ability to make decisions. This requires advanced graphical representation of the data analytics results.
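As a rough illustration of the first of these analytics, angular separation, the sketch below compares voltage phase angles reported by two PMUs and raises an alarm when their difference exceeds a limit. The threshold, sample streams, and function names are hypothetical and not taken from the article.

```python
# Hedged sketch of an angular-separation monitor: all values are illustrative.
ALARM_DEG = 30.0   # hypothetical operating limit on angle difference


def angular_separation(angle_a_deg, angle_b_deg):
    """Smallest signed angle difference, wrapped into (-180, 180]."""
    return (angle_a_deg - angle_b_deg + 180.0) % 360.0 - 180.0


def check(stream_a, stream_b):
    """Return (sample index, separation) for every sample over the limit."""
    alarms = []
    for t, (a, b) in enumerate(zip(stream_a, stream_b)):
        sep = angular_separation(a, b)
        if abs(sep) > ALARM_DEG:
            alarms.append((t, round(sep, 1)))
    return alarms


# Synthetic 30-sample PMU angle streams: separation grows after a disturbance
a = [10.0 + 2.0 * t for t in range(30)]
b = [5.0] * 30
print(check(a, b))  # samples where separation exceeds the limit
```

In a real deployment the streams would arrive as time-aligned synchrophasor frames and the limit would come from operating studies; the wrap into (-180, 180] matters because raw phase angles are only defined modulo 360 degrees.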
The article presents operator displays that combine graphical and geographical views. Correlating the electrical and spatial components of decision making enhances the ability to make prudent decisions. As examples of the synergies created by the advanced analytics and visualization framework, the article discusses how operators can better monitor operating limits, understand complex events, and enhance post-mortem analysis. This development leads to the new concept of enhanced situational awareness.
The authors indicate that such an improvement “maximized human understanding and comprehension without increasing operator stress.” This is achieved through analytics that offer enhanced perception, comprehension, and projection, leading to better informed decision making and action. Since the implementation of new EMS solutions carries substantial risk, the authors use an example from a utility company deployment to illustrate how the risk may be managed. The company decided to first implement a proof of concept (POC) and then proceed with full implementation. The POC included a PMU as well as the stability and EMS analytics, and allowed for ample testing of equipment and software. As a result, a long list of benefits experienced during the POC is shared with readers.
The second article, “Operating in the Fog,” provides a broad user perspective on the new data analytics. The Pan-European network plans are outlined, and the main conclusion is that the uncertainty in short-term planning requires new tools to support operating decisions. This additional knowledge for decision making is envisioned as coming from better descriptions of neighboring systems, improved forecasting, and enhanced model accuracy. This leads to a discussion of the overall toolbox structure for future operator needs, which includes existing applications and new data analytics for security assessment. The security assessment tools are envisioned as being used for online decisions, but they will be widely supported by offline tools that help define security rules, validate dynamic models, and outline defense plans and restoration strategies. To achieve this new way of handling uncertainties, a framework for contingency assessment including corrective and preventive actions is proposed. This approach is illustrated through several examples of how the tools may be used in critical operating conditions. While given at a high level of abstraction, the new data analytics clearly indicate reliance on better models, more up-to-date data, and knowledge from past experience.
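The distinction between corrective and preventive actions can be caricatured in a few lines. The loading figures, the corrective margin, and all names below are invented for illustration and do not come from the article.

```python
# Illustrative sketch only: classify hypothetical contingencies by whether
# post-contingency loadings are secure, fixable after the event (corrective),
# or so severe that the system must be repositioned beforehand (preventive).
CORRECTIVE_MARGIN = 1.10   # assumed: up to 10% overload fixable post-fault


def classify(contingencies):
    """contingencies: name -> worst post-contingency loading (1.0 = limit)."""
    plan = {}
    for name, loading in contingencies.items():
        if loading <= 1.0:
            plan[name] = "secure"
        elif loading <= CORRECTIVE_MARGIN:
            plan[name] = "corrective action"   # e.g., redispatch after the event
        else:
            plan[name] = "preventive action"   # act before the event can occur
    return plan


cases = {"line A-B outage": 0.92, "line B-C outage": 1.06, "gen G1 trip": 1.25}
print(classify(cases))
```

A real framework would evaluate each contingency with a full network model and cost the candidate actions; this toy version only conveys the decision structure.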
The next article, “Metrics for Success,” illustrates data analytics applied to evaluating state estimator (SE) performance. As is well known, the SE is an indispensable tool for matching measurements with models to account for erroneous and missing data. The knowledge the authors use to evaluate SEs comes from combining experience with designing measurement systems for existing SEs with new experience in using synchrophasor data. This leads to the concept of synchrophasor-assisted state estimation (SPASE), which allows for improvements based on statistical properties of the measurements while taking into account model uncertainties. To make the point about how the new approach differs from the traditional one, the article explores network observability and bad data detection, two key design components of an SE. This leads to an analysis of what affects the accuracy of an SE, with measurement design and the selection of critical measurements recognized as the key factors.
As noted by the authors, these two issues become difficult to handle as the size of the SE design grows. Such scaling up occurs when attempts are made to represent the entire transmission and distribution system, or an entire electricity market with all its participants, using one unified power system model and a generalized SE. As the critical need to improve existing SEs is elaborated upon, the authors point to the importance of the metrics used to evaluate any improvements. The recommended metric for an SE solution is the number of iterations required for the results to converge. While this has always been known, additional insight is given to “distinguish such reasons,” and the quality of measurements and their design are pinpointed as the focus of additional metrics. The impacts quantified by the metrics are: a) the objective function and the largest normalized residual, which reflect the quality of measurements, and b) measurement system vulnerability, the pseudomeasurement ratio, and SE accuracy, which reflect measurement design. With such insight, data analytics for the extensive evaluation of SE solutions have been developed and demonstrated using cases from a utility company. It becomes obvious that this type of data analytics is quite useful in making decisions about SE improvements using new measurements and their optimal placement. Used as a metric for assessing measurement quality and design, the proposed data analytics enable SE designers to make the right choices as new measurement infrastructures, such as PMUs, become available for use in future SEs.
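To make the normalized-residual idea concrete, here is a minimal sketch of the classical largest-normalized-residual bad data test inside a linear weighted least squares (WLS) estimator. The two-state measurement model, the meter accuracy, and the injected error are all illustrative and not taken from the article.

```python
# Toy WLS state estimation with the largest-normalized-residual test.
# Model: z = H x + e, two states, five measurements, equal meter weights.
import math

H = [
    [1.0, 0.0],    # direct measurement of state x1
    [0.0, 1.0],    # direct measurement of state x2
    [1.0, -1.0],   # flow-type measurement x1 - x2
    [-1.0, 1.0],   # reverse flow x2 - x1
    [1.0, 1.0],    # combined measurement
]
sigma = 0.01                         # assumed meter standard deviation
x_true = [0.10, 0.05]
z = [row[0] * x_true[0] + row[1] * x_true[1] for row in H]
z[2] += 0.5                          # inject a gross error into measurement 3

# 2x2 gain matrix H^T H (the equal weights cancel in the estimate)
g11 = sum(row[0] * row[0] for row in H)
g12 = sum(row[0] * row[1] for row in H)
g22 = sum(row[1] * row[1] for row in H)
det = g11 * g22 - g12 * g12
b1 = sum(row[0] * zi for row, zi in zip(H, z))
b2 = sum(row[1] * zi for row, zi in zip(H, z))
# WLS estimate x_hat = (H^T H)^-1 H^T z
x_hat = [(g22 * b1 - g12 * b2) / det, (g11 * b2 - g12 * b1) / det]

# Normalized residuals r_i / sqrt(sigma^2 * (1 - K_ii)),
# where K = H (H^T H)^-1 H^T is the "hat" matrix
r_norm = []
for row, zi in zip(H, z):
    r = zi - (row[0] * x_hat[0] + row[1] * x_hat[1])
    k = (row[0] * (g22 * row[0] - g12 * row[1])
         + row[1] * (g11 * row[1] - g12 * row[0])) / det
    r_norm.append(abs(r) / (sigma * math.sqrt(1.0 - k)))

suspect = max(range(len(H)), key=lambda i: r_norm[i])
print("suspect measurement index:", suspect)   # flags the corrupted one
```

The test works because, with sufficient redundancy, a single gross error produces its largest normalized residual at the faulty measurement itself; critical measurements (zero residual by construction) are exactly the ones this test cannot screen, which is why the article ties measurement design to SE accuracy.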
The following article, “Measures of Value,” points out how the understanding embedded in manual disturbance analysis can be translated into a data analytics solution executed automatically online. The core benefit is the ability to determine, in a matter of seconds, the cause–effect relationship between an event, such as a transmission line fault, and a consequence, such as an incorrect relay or breaker operation. In this way, the nonoperational data obtained from digital relays and transient recorders are turned into operational knowledge available for operator decision making. The results of the data analytics processing can tell operators the basic information about the fault type and location, as well as whether the fault-clearing sequences were executed correctly and whether they included auto-reclosing that cleared a temporary fault or a circuit breaker operation that isolated a permanent fault. Based on this result, obtained seconds after the event has occurred, operators can make key decisions: whether to restore the line or to issue a work order for a repair crew to travel to an accurately located site and repair the damage.
To provide such a powerful processing capability, this data analytics function uses the knowledge of experts to develop a model of expert reasoning that links cause–effect rules in a software solution called an expert system, which, in this case, is the core of the data analytics approach. The article illustrates how, once the experts’ knowledge is embedded in a software solution, the rules formulated by the experts are “fired” automatically for each new set of measurements. The measurements come from intelligent electronic devices (IEDs) located in substations that are triggered by such events. The firing of the rules results in a cause–effect analysis that presents operators with clear decision-making options whenever inferior performance of the relaying system and/or circuit breakers requires their action. This benefit should be compared with the events of the 2003 blackout, when the post-mortem analysis took days and weeks; with the proposed data analytics, those events could have been identified in a matter of seconds, enabling operators to react and perhaps contain the cascade that led to the blackout.
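The rule-firing idea can be caricatured in a few lines. The event names and rules below are hypothetical stand-ins for the experts’ actual rule base, which in practice covers many more signals and timing relationships.

```python
# Minimal caricature of expert-system reasoning over substation IED events.
# Event names and rules are invented for illustration, not from the article.

def diagnose(events):
    """Map a set of IED-reported events to an operator-facing conclusion."""
    if "relay_trip" not in events:
        return "no fault detected"
    if "breaker_open" not in events:
        return "breaker failed to operate: operator action needed"
    if "reclose_success" in events:
        return "temporary fault cleared by auto-reclosing: line back in service"
    return "permanent fault isolated: dispatch repair crew to fault location"


# Each set is the group of events captured by relays/recorders for one fault.
print(diagnose({"relay_trip", "breaker_open", "reclose_success"}))
print(diagnose({"relay_trip", "breaker_open"}))
print(diagnose({"relay_trip"}))
```

A production expert system would chain many such rules, attach confidence levels, and time-align records from multiple substations, but the cause–effect structure is the same: measured events on the left, operator-facing conclusions on the right.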
The final article, “One Step Ahead,” focuses on the data analytics needed for the integration and use of renewable resources such as wind power. Since wind is widely known to be intermittent, the authors propose new data analytics for wind power forecasts that may be used for predictive control. This idea is already attracting several research groups, and many approaches using different forecasting techniques are being proposed. The authors introduce a simplified forecasting method that uses just the active and reactive power outputs of wind turbines to predict the next control action. A neural-network-based data analytics tool is developed and tested using data from multiple wind farms in Germany. An optimization scheme that takes into account load tap changers and shunt reactors is developed and tested using several cases of reactive power controllers embedded with the wind generators. This new data analytics tool for predictive control is incorporated in a system solution that, besides the wind farm, also has access to battery storage and a wind power balancing controller.
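The authors’ neural-network forecaster is far richer than can be sketched here. As a hedged stand-in, the following trains a single linear neuron (the simplest possible “network”) by gradient descent to predict the next sample of a synthetic wind power series from the previous four samples; the series, learning rate, and window length are all invented for illustration.

```python
# Stand-in for a neural-network short-term wind forecaster: one linear neuron
# trained by stochastic gradient descent on synthetic data (not real wind data).
import math

# Synthetic "wind power" series (MW): slow oscillation plus faster ripple
series = [5.0 + 2.0 * math.sin(0.3 * t) + 0.3 * math.sin(1.7 * t)
          for t in range(200)]

ORDER = 4                      # predict from the last four samples
w = [0.0] * ORDER              # neuron weights
bias = 0.0
lr = 0.001                     # learning rate (assumed, not tuned)

for epoch in range(1000):      # plain stochastic gradient descent
    for t in range(ORDER, len(series)):
        x = series[t - ORDER:t]
        pred = bias + sum(wi * xi for wi, xi in zip(w, x))
        err = pred - series[t]
        bias -= lr * err
        for i in range(ORDER):
            w[i] -= lr * err * x[i]

# One-step-ahead forecast from the most recent window
forecast = bias + sum(wi * xi for wi, xi in zip(w, series[-ORDER:]))
print("next-step wind power forecast:", round(forecast, 2), "MW")
```

A real tool would use a nonlinear network, both active and reactive power inputs, and measured data from operating wind farms; the sketch only shows the shape of the problem, a sliding window of past output mapped to a next-step prediction.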
The authors acknowledge the need for new data analytics to perform short-term wind power prediction on the order of seconds, minutes, and a few hours, and for its application in control centers. They also state: “This will become critical for the real-time operation of the electricity supply system as more and more wind power penetrates into it. The value of short-term wind power forecasting is high considering the reduction in power losses, as is maximizing the security and stability of the power system, especially when stochastic security-constrained optimal power flow is far from reaching control centers in the near future.” They also recognize that this solution may become quite attractive to wind power providers once short-term wind power forecast based system applications become common in control centers, as the results “enable the maximization of revenue by minimizing penalties.”
In summary, all the articles have something in common that paves the way for future thinking about new data analytics:
- Almost all of the applications use some new data not used in legacy solutions.
- The analytics take advantage of the formulation of experts’ knowledge and improved models.
- The advantages are obtained from being able to better understand cause–effect relationships.
- The combined physical, electrical, and data model views of the results enhance decision making.
- The applications are helping operators in more accurate planning and robust operations.
- Since the software tools for data analytics are new, their integration in legacy solutions is critical.
In closing, this special issue has targeted data analytics as a promising development that will enhance future EMS solutions. It will, however, require close attention to the methods for capturing experts’ knowledge and translating it into analytical tools that can produce new value out of an abundance of data about the power system.