Distribution management system

In recent years, the utilization of electrical energy has grown rapidly, and customer requirements and power-quality expectations have changed enormously. As electric energy has become an essential part of daily life, its optimal use and reliability have become increasingly important. A real-time view of the network and dynamic decision making are now instrumental for optimizing resources and managing demand, making a distribution management system that can handle the associated workflows very critical.

Overview

A Distribution Management System (DMS) is a collection of applications designed to monitor and control the entire distribution network efficiently and reliably. It acts as a decision support system to assist the control room and field operating personnel with the monitoring and control of the electric distribution system. The key deliverables of a DMS are improved reliability and quality of service: reducing outages, minimizing outage time, and maintaining acceptable frequency and voltage levels.

Most distribution utilities already make comprehensive use of IT solutions through their Outage Management System (OMS), which draws on other systems such as the Customer Information System (CIS), Geographical Information System (GIS) and Interactive Voice Response System (IVRS). An outage management system holds a network component/connectivity model of the distribution system. By combining the locations of outage calls from customers with knowledge of the locations of protection devices (such as circuit breakers) on the network, a rule engine predicts the locations of outages. Based on this, restoration activities are planned and crews are dispatched accordingly.

In parallel with this, distribution utilities began to roll out Supervisory Control and Data Acquisition (SCADA) systems, initially only at their higher voltage substations. Over time, use of SCADA has progressively extended downwards to sites at lower voltage levels.

DMSs access real-time data and provide all information on a single console at the control centre in an integrated manner. Their development varied across geographic territories. In the USA, for example, DMSs typically grew by taking Outage Management Systems to the next level, automating complete operational sequences and providing an end-to-end, integrated view of the entire distribution system. In the UK, by contrast, the much denser and more meshed network topologies, combined with stronger Health & Safety regulation, led to early centralisation of high-voltage switching operations, initially using paper records and schematic diagrams printed onto large wallboards which were 'dressed' with magnetic symbols to show the current running states. There, DMSs grew initially from SCADA systems as these were expanded to allow the centralised control and safety management procedures to be managed electronically. These DMSs required even more detailed component/connectivity models and schematics than those needed by early OMSs, as every possible isolation and earthing point on the networks had to be included. In territories such as the UK, therefore, the network component/connectivity models were usually developed in the DMS first, whereas in the USA they were generally built in the GIS.

The typical data flow in a DMS involves the SCADA system, the Information Storage & Retrieval (ISR) system, Communication (COM) servers, Front-End Processors (FEPs) and Field Remote Terminal Units (FRTUs).

DMS Functions

To support proper decision making and operation and maintenance (O&M) activities, a DMS solution has to provide a number of functions. The various functions carried out by the DMS are described below.

Network Connectivity Analysis (NCA)

A distribution network usually covers a large area and delivers power to different customers at different voltage levels, so locating a required source or load on a large GIS/operator interface is often very difficult. The panning and zooming provided by a normal SCADA GUI does not meet this operational requirement on its own. Network connectivity analysis is an operator-oriented function that helps the operator identify or locate the network or component of interest easily. NCA performs the required analysis and displays the feed point of the various network loads. Based on the status of all switching devices, such as circuit breakers (CBs), Ring Main Units (RMUs) and isolators, that affect the topology of the modelled network, the prevailing network topology is determined. NCA further helps the operator understand the operating state of the distribution network by indicating radial sections, loops and parallels in the network.
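
As an illustration of the topology processing behind NCA, the following minimal Python sketch traces which nodes are energised from a feed point, given the open/closed state of each switching device, and reports whether the resulting network is radial or contains loops. The toy network, switch names and data layout are illustrative assumptions, not part of any particular DMS product.

    from collections import deque

    # Each branch is (from_node, to_node, switch_name); a branch conducts
    # only when its switch is closed.
    branches = [("SRC", "A", "CB1"), ("A", "B", "RMU1"), ("B", "C", "ISO1"),
                ("C", "A", "ISO2")]                  # closing ISO2 would create a loop
    switch_states = {"CB1": True, "RMU1": True, "ISO1": True, "ISO2": False}

    def trace_topology(source):
        """Return the nodes energised from `source` and any loop-closing switches."""
        adjacency = {}
        for frm, to, sw in branches:
            if switch_states[sw]:
                adjacency.setdefault(frm, []).append((to, sw))
                adjacency.setdefault(to, []).append((frm, sw))
        energised, loops = {source}, set()
        queue = deque([(source, None)])
        while queue:
            node, via = queue.popleft()
            for nxt, sw in adjacency.get(node, []):
                if sw == via:
                    continue                         # do not traverse the arriving switch again
                if nxt in energised:
                    loops.add(sw)                    # node reached twice: meshed/parallel path
                else:
                    energised.add(nxt)
                    queue.append((nxt, sw))
        return energised, loops

    nodes, loop_switches = trace_topology("SRC")
    print("energised:", sorted(nodes))
    print("operating state:", "looped via " + ", ".join(sorted(loop_switches))
          if loop_switches else "radial")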

Switching Schedule & Safety Management

In territories such as the UK, a core function of a DMS has always been to support safe switching and work on the networks. Control engineers prepare switching schedules to isolate and make safe a section of network before work is carried out, and the DMS validates these schedules using its network model. Switching schedules can combine telecontrolled and manual (on-site) switching operations. When the required section has been made safe, the DMS allows a Permit To Work (PTW) document to be issued. When the work has been finished and the permit cancelled, the switching schedule then facilitates restoration of the normal running arrangements. Switching components can also be tagged to reflect any operational restrictions that are in force.

The network component/connectivity model, and associated diagrams, must always be kept absolutely up to date. The switching schedule facility therefore also allows 'patches' to the network model to be applied to the live version at the appropriate stage(s) of the jobs. The term 'patch' is derived from the method previously used to maintain the wallboard diagrams.
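
The sketch below illustrates, in simplified Python, one validation step a DMS might apply before a Permit To Work is issued: every switch bounding the work section must be confirmed open and an earth applied at each point of isolation. The data shapes, switch names and the specific checks are illustrative assumptions; the actual safety rules are defined by the utility's own procedures.

    def ptw_can_be_issued(points_of_isolation, switch_states, earths_applied):
        """points_of_isolation: names of the switches bounding the work section."""
        problems = []
        for sw in points_of_isolation:
            if switch_states.get(sw, "unknown") != "open":
                problems.append(f"{sw} is not confirmed open")
            if sw not in earths_applied:
                problems.append(f"no earth applied at {sw}")
        return (len(problems) == 0), problems

    ok, issues = ptw_can_be_issued(
        ["RMU1-ISO", "SW-23"],
        {"RMU1-ISO": "open", "SW-23": "closed"},
        earths_applied={"RMU1-ISO"},
    )
    print("PTW may be issued" if ok else f"PTW blocked: {issues}")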

State Estimation (SE)

The state estimator is an integral part of the overall monitoring and control systems for transmission networks. It is mainly aimed at providing a reliable estimate of the system voltages. This information from the state estimator flows to control centres and database servers across the network.[1] The variables of interest are indicative of parameters such as margins to operating limits, equipment health and required operator actions. State estimators allow these variables of interest to be calculated with high confidence even though the measurements may be corrupted by noise, missing or inaccurate.

Even though the state may not be directly observable, it can be inferred from a scan of measurements which are assumed to be synchronized. The algorithms must allow for the fact that noise may skew the measurements. In a typical power system the state is quasi-static: the time constants are short enough that system dynamics decay away quickly relative to the measurement frequency, so the system appears to progress through a sequence of static states driven by factors such as changes in the load profile. The outputs of the state estimator can be supplied to other applications such as load flow analysis and contingency analysis.
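
For a flavour of how such an estimate is computed, the following Python sketch solves a linearised (DC) weighted-least-squares state estimation problem with numpy: the state is the vector of bus voltage angles, and the measurements are line flows and an injection weighted by the inverse of their error variances. The measurement set, H matrix and noise levels are illustrative assumptions rather than a production formulation.

    import numpy as np

    # State x = [theta2, theta3]; bus 1 is the angle reference (theta1 = 0).
    # Rows of H: flow 1-2, flow 1-3, flow 2-3 and the injection at bus 2,
    # in a linearised model with line susceptances of 10 p.u.
    H = np.array([[-10.0,   0.0],
                  [  0.0, -10.0],
                  [ 10.0, -10.0],
                  [ 20.0, -10.0]])
    z = np.array([1.02, 0.51, -0.49, -1.48])         # noisy measurements (p.u.)
    W = np.diag([1 / 0.01**2] * 3 + [1 / 0.02**2])   # inverse measurement variances

    # Weighted least squares: solve (H^T W H) x = H^T W z
    G = H.T @ W @ H                                  # gain matrix
    x_hat = np.linalg.solve(G, H.T @ W @ z)
    print("estimated angles (rad):", x_hat)
    print("measurement residuals:", z - H @ x_hat)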

Load Flow Applications (LFA)

A load flow study is an important tool involving numerical analysis applied to a power system. It usually uses simplified notation such as a single-line diagram, and focuses on various forms of AC power (real and reactive) rather than on voltage and current, analysing the power system in normal steady-state operation. The goal of a power flow study is to obtain complete voltage angle and magnitude information for each bus in a power system, for specified load and generator real power and voltage conditions. Once this information is known, the real and reactive power flow on each branch, as well as the generator reactive power output, can be determined analytically.

Due to the nonlinear nature of this problem, numerical methods are employed to obtain a solution that is within an acceptable tolerance. The load model needs to automatically calculate loads to match telemetered or forecasted feeder currents. It uses customer type, load profiles and other information to properly distribute the load to each individual distribution transformer. Load flow or power flow studies are important for planning the future expansion of power systems as well as for determining the best operation of existing systems.
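
One of the classical iterative techniques is the Gauss-Seidel method; the short numpy sketch below solves a three-bus example with one slack bus and two PQ load buses. The line impedances, loads and convergence tolerance are illustrative assumptions; practical DMS load flow handles much larger and often unbalanced networks.

    import numpy as np

    # Per-unit line admittances: bus 1 = slack, buses 2 and 3 = PQ load buses.
    y12, y13, y23 = 1 / (0.02 + 0.06j), 1 / (0.02 + 0.06j), 1 / (0.05 + 0.10j)
    Y = np.array([[ y12 + y13, -y12,       -y13      ],
                  [-y12,        y12 + y23, -y23      ],
                  [-y13,       -y23,        y13 + y23]])

    S = np.array([0.0, -(0.5 + 0.2j), -(0.6 + 0.25j)])   # scheduled injections (loads negative)
    V = np.array([1.0 + 0j, 1.0 + 0j, 1.0 + 0j])         # flat start; bus 1 stays fixed

    for _ in range(100):
        V_prev = V.copy()
        for i in (1, 2):                                 # update the PQ buses only
            sigma = Y[i] @ V - Y[i, i] * V[i]            # sum of Y_ik * V_k for k != i
            V[i] = (np.conj(S[i] / V[i]) - sigma) / Y[i, i]
        if np.max(np.abs(V - V_prev)) < 1e-8:
            break

    print("bus voltages (p.u.):", np.round(V, 4))
    print("magnitudes:", np.round(np.abs(V), 4))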

Volt-VAR Control (VVC)

Volt-VAR Control, or VVC, refers to the process of managing voltage levels and reactive power (VAR) throughout the power distribution system. The two quantities are related because, as reactive power flows over an inductive line (and all lines have some inductance), the line sees a voltage drop. VVC encompasses devices that purposely inject reactive power into the grid to alter the size of that voltage drop, in addition to equipment that more directly controls voltage.

In the legacy grid, there are three primary tools for voltage management: Load Tap Changers (LTCs), voltage regulators, and capacitor banks. LTCs and voltage regulators are transformers with variable turns ratios that are placed at strategic points in a network and adjusted to raise or lower voltage as necessary. Capacitor banks manage voltage by “generating” reactive power, and have thus far been the primary tools through which true Volt-VAR control is carried out. These large capacitors are connected to the grid in shunt configuration through switches which, when closed, allow the capacitors to generate VARs and boost voltage at the point of connection. In the future, further VVC might be carried out by smart inverters and other distributed generation resources, which can also inject reactive power into a distribution network. A VVC application helps the operator mitigate dangerously low or high voltage conditions by suggesting action plans for all VVC equipment. The plan gives the required tap positions and capacitor switching states to keep the voltage close to its nominal value, thus optimizing the Volt-VAR performance of the network for the utility.
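
A heavily simplified, rule-based sketch of such an action plan is shown below in Python: given measured feeder voltages, it suggests capacitor-bank switching first and a tap move only if no bank is available. The voltage band, tap numbering and device names are illustrative assumptions; real VVC applications optimise over a network model rather than applying fixed rules.

    def suggest_vvc_actions(bus_voltages_pu, cap_banks, tap_position,
                            v_low=0.95, v_high=1.05):
        actions = []
        v_min, v_max = min(bus_voltages_pu.values()), max(bus_voltages_pu.values())
        if v_min < v_low:
            for name, closed in cap_banks.items():       # prefer switching in a bank
                if not closed:
                    actions.append(f"close capacitor bank {name}")
                    break
            else:
                actions.append(f"raise LTC tap from {tap_position} to {tap_position + 1}")
        elif v_max > v_high:
            for name, closed in cap_banks.items():
                if closed:
                    actions.append(f"open capacitor bank {name}")
                    break
            else:
                actions.append(f"lower LTC tap from {tap_position} to {tap_position - 1}")
        return actions or ["no action: voltages within band"]

    print(suggest_vvc_actions({"F1": 0.942, "F2": 0.958},
                              {"C1": False, "C2": True}, tap_position=3))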

Beyond maintaining a stable voltage profile, VVC has potential benefits for the ampacity (current-carrying capacity) of power lines. Loads that contain reactive components such as inductors and capacitors (electric motors, for example) strain the grid, because the reactive portion of these loads causes them to draw more current than an otherwise comparable, purely resistive load would. The extra current heats equipment such as transformers and conductors, which might then need to be resized to carry the total current. An ideal power system therefore controls current flow by carefully planning the production, absorption and flow of reactive power at all levels in the system.

Load Shedding Application (LSA)

Electric distribution systems have long stretches of line, multiple injection points and fluctuating consumer demand. These features make them inherently vulnerable to instabilities or unpredicted system conditions that may lead to critical failure. Instability usually arises from power system oscillations due to faults, peak deficits or protection failures. Distribution load shedding and restoration schemes therefore play a vital role in the emergency operation and control of any utility.

An automated Load Shedding Application detects predetermined trigger conditions in the distribution network and performs predefined sets of control actions, such as opening or closing non-critical feeders, reconfiguring downstream distribution or sources of injection, or operating a transformer tap changer. When a distribution network is complex and covers a large area, emergency actions taken downstream may reduce the burden on upstream portions of the network. In a non-automated system, operator awareness and manual intervention play a key role in trouble mitigation; if problems are not addressed quickly enough, they can cascade and cause a major catastrophic failure.

A DMS needs to provide a modular, automated load shedding and restoration application that automates emergency operation and control for the utility. The application should cover schemes such as Under-Frequency Load Shedding (UFLS), limit-violation-based and time-of-day-based load shedding, which are otherwise performed manually by the operator.
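
The following Python fragment sketches the staged trigger logic behind a simple UFLS scheme: each stage pairs a frequency threshold with a group of non-critical feeders to shed. The thresholds, stage sizes and feeder names are illustrative assumptions; actual settings come from the utility and the applicable grid code.

    UFLS_STAGES = [
        {"freq_hz": 49.2, "shed_feeders": ["F-101", "F-102"]},   # stage 1
        {"freq_hz": 48.8, "shed_feeders": ["F-103"]},            # stage 2
        {"freq_hz": 48.4, "shed_feeders": ["F-104", "F-105"]},   # stage 3
    ]

    def ufls_actions(measured_freq_hz, already_shed):
        """Return the feeders to trip at the measured system frequency."""
        to_shed = []
        for stage in UFLS_STAGES:
            if measured_freq_hz <= stage["freq_hz"]:
                to_shed += [f for f in stage["shed_feeders"] if f not in already_shed]
        return to_shed

    print(ufls_actions(48.7, already_shed={"F-101"}))    # -> ['F-102', 'F-103']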

Fault Management & System Restoration (FMSR)

Reliability and quality of power supply are key parameters which any utility needs to ensure. Reducing the duration of outages experienced by customers improves the utility's overall reliability indices, so FMSR, or automated switching, applications play an important role. The two main features required of an FMSR application are switching management and the generation of suggested switching plans.

The DMS application receives fault information from the SCADA system and processes it to identify the fault; when the switching management application is run, the results are converted into action plans. An action plan includes switching automatic load break switches, RMUs and sectionalizers on or off. The action plan can be verified in a study mode provided by the functionality, and switching management can be manual or automatic depending on the configuration.
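
As a rough illustration of the restoration-plan step, the Python sketch below assumes the faulted section of a radial feeder has already been identified and then lists the switching actions: isolate the section, re-close the feeder breaker for the healthy upstream part, and close a tie switch to back-feed the healthy downstream part. Section and switch names are hypothetical, and a real application would validate each step against the network model.

    def restoration_plan(faulted_section, sections, tie_switch):
        """sections: ordered (name, upstream_switch, downstream_switch) along the feeder."""
        plan = []
        for name, up_sw, down_sw in sections:
            if name == faulted_section:
                plan.append(f"open {up_sw} and {down_sw} to isolate {name}")
                plan.append("close the feeder circuit breaker to restore upstream sections")
                plan.append(f"close tie switch {tie_switch} to back-feed downstream sections")
                break
        return plan

    feeder = [("S1", "CB-F1", "SW-12"), ("S2", "SW-12", "SW-23"), ("S3", "SW-23", "SW-3T")]
    for step in restoration_plan("S2", feeder, tie_switch="TIE-9"):
        print(step)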

Load Balancing via Feeder Reconfiguration (LBFR)

Load balancing via feeder reconfiguration is an essential application for utilities that have multiple feeders supplying a load-congested area. To balance the loads on a network, the operator re-routes loads to other parts of the network. A Feeder Load Management (FLM) function is needed to manage energy delivery in the electric distribution system and identify problem areas. FLM monitors the vital signs of the distribution system and identifies areas of concern, so that the distribution operator is forewarned and can focus attention where it is most needed. It allows more rapid correction of existing problems and opens up possibilities for problem avoidance, leading to both improved reliability and better energy delivery performance.

On a similar note, feeder reconfiguration is also used for loss minimization. Due to various network and operational constraints, a utility network may be operated close to its maximum capability without the resulting losses being known. For effective operation, these energy losses and the associated revenue losses should be minimized. The DMS uses the switching management application for this: the loss minimization problem is solved by an optimal power flow algorithm, and switching plans are created in the same way as for the function above.
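
In the simplest terms, the choice between candidate radial configurations comes down to comparing their I²R losses, as in the Python sketch below. The section currents and resistances for each configuration are illustrative assumptions and would in practice come from a load flow or optimal power flow run.

    configs = {
        # open switch -> list of (section resistance in ohm, section current in A)
        "TIE-9 open": [(0.12, 180), (0.10, 150), (0.15, 90)],
        "SW-23 open": [(0.12, 140), (0.10, 120), (0.15, 160)],
        "SW-12 open": [(0.12, 60),  (0.10, 210), (0.15, 200)],
    }

    def total_loss_kw(sections):
        return sum(r * i ** 2 for r, i in sections) / 1000.0

    for name, sections in configs.items():
        print(f"{name}: {total_loss_kw(sections):.1f} kW loss")
    print("recommended configuration:",
          min(configs, key=lambda c: total_loss_kw(configs[c])))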

Distribution Load Forecasting (DLF)

Distribution Load Forecasting (DLF) provides a structured interface for creating, managing and analyzing load forecasts. Accurate load forecasting models are essential to the operation and planning of a utility company. DLF helps an electric utility make important decisions, including decisions on purchasing electric power, load switching and infrastructure development.

Load forecasting is classified in terms of the planning horizon: short-term load forecasting or STLF (up to 1 day), medium-term load forecasting or MTLF (1 day to 1 year), and long-term load forecasting or LTLF (1 to 10 years). To forecast load precisely throughout a year, various external factors including weather, solar radiation, population, per capita gross domestic product, seasons and holidays need to be considered. For example, in the winter season, the average wind chill factor could be added as an explanatory variable in addition to those used in the summer model. In transitional seasons such as spring and fall, a transformation technique can be used. For holidays, a holiday effect load can be deducted from the normal load to better estimate the actual holiday load.

Various predictive models have been developed for load forecasting based on techniques such as multiple regression, exponential smoothing, iteratively reweighted least squares, adaptive load forecasting, stochastic time series, fuzzy logic, neural networks and knowledge-based expert systems. Amongst these, the most popular STLF techniques are stochastic time series models such as the autoregressive (AR) model, the autoregressive moving average (ARMA) model and the autoregressive integrated moving average (ARIMA) model, along with models using fuzzy logic and neural networks.
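
As a minimal example of the autoregressive approach, the numpy sketch below fits an AR model with one day of hourly lags to a synthetic load history and produces a one-hour-ahead forecast. The synthetic data, AR order and fitting method are illustrative assumptions; production STLF adds weather, calendar and holiday effects as additional explanatory variables.

    import numpy as np

    rng = np.random.default_rng(0)
    hours = np.arange(24 * 14)                                   # two weeks of hourly data
    load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

    p = 24                                                       # AR order: one day of lags
    X = np.column_stack([load[i:len(load) - p + i] for i in range(p)])
    y = load[p:]
    coeffs, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)

    # One-step-ahead forecast from the last p observed hours.
    next_hour = coeffs[0] + load[-p:] @ coeffs[1:]
    print(f"forecast for next hour: {next_hour:.1f} MW")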

DLF provides data aggregation and forecasting capabilities that are configured to address today's requirements and can adapt to future requirements, and it should be capable of producing repeatable and accurate forecasts.

Standards Based Integration

In any integrated energy delivery utility, different functional modules such as GIS, billing and metering, ERP and asset management systems operate in parallel and support routine operations. Quite often, these functional modules need to exchange periodic or real-time data with each other to assess the present operating condition of the network, workflows and resources (such as crews and assets). Unlike other power system segments, the distribution system changes or grows every day, whether through the addition of a new consumer, a new line or the replacement of equipment. If the different functional modules operate in a non-standard environment and use custom APIs and database interfaces, the engineering effort needed to manage them becomes too large. It soon becomes difficult to manage the growing number of changes and additions, which can render system integrations non-functional. Utilities then cannot realise the full benefit of the functional modules, and in some cases the systems may even need to be migrated to suitable environments at very high cost.

As these problems came to light, various standardization efforts for inter-application data exchange were initiated. It was understood that standards-based integration would ease integration with other functional modules and also improve operational performance. It keeps the utility in a vendor-neutral environment for future expansion, which means new functional modules can be added on top of existing functionality, and data can be pushed or pulled effectively without developing new interface adapters.

IEC 61968 Standards Based Integration

IEC 61968 is a series of standards developed by Working Group 14 of Technical Committee 57 of the IEC; it defines standards for information exchange between electrical distribution system applications. It is intended to support the inter-application integration of a utility enterprise that needs to collect data from different applications, whether new or legacy.

As per IEC 61968, a DMS encapsulates various capabilities such as monitoring and control of equipment for power delivery, management processes to ensure system reliability, voltage management, demand-side management, outage management, work management, automated mapping and facilities management. The crux of the IEC 61968 standards is the Interface Reference Model (IRM), which defines standard interfaces for each class of application. Abstract (logical) components are listed to represent concrete (physical) applications. For example, a business function such as Network Operation (NO) may be represented by business sub-functions such as Network Operation Monitoring (NMON), which in turn is represented by abstract components such as substation state supervision, network state supervision and alarm supervision.

IEC 61968 recommends that the system interfaces of a compliant utility inter-application infrastructure be defined using the Unified Modelling Language (UML). UML includes a set of graphic notation techniques that can be used to create visual models of object-oriented, software-intensive systems. The IEC 61968 series of standards extends the Common Information Model (CIM), which is maintained as a UML model, to meet the needs of electrical distribution. For structured document interchange, particularly on the Internet, the data format used can be the Extensible Markup Language (XML). One of its primary uses is information exchange between different and potentially incompatible computer systems, which makes XML well suited to the domain of system interfaces for distribution management. XML is used to format the message payloads so that they can be carried over messaging transports such as SOAP (Simple Object Access Protocol).
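
To make the message structure concrete, the short Python sketch below uses the standard library to assemble a simplified IEC 61968-style request message following the verb/noun header pattern. The element names, namespace URI and payload fields are simplified illustrations rather than a verbatim rendering of the published schemas.

    import xml.etree.ElementTree as ET

    NS = "http://iec.ch/TC57/2011/schema/message"    # illustrative namespace
    ET.register_namespace("m", NS)

    msg = ET.Element(f"{{{NS}}}RequestMessage")
    header = ET.SubElement(msg, f"{{{NS}}}Header")
    ET.SubElement(header, f"{{{NS}}}Verb").text = "get"
    ET.SubElement(header, f"{{{NS}}}Noun").text = "EndDeviceEvents"
    request = ET.SubElement(msg, f"{{{NS}}}Request")
    ET.SubElement(request, f"{{{NS}}}StartTime").text = "2024-01-01T00:00:00Z"

    print(ET.tostring(msg, encoding="unicode"))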

References

  1. Yih-Fang Huang; Werner, S.; Jing Huang; Kashyap, N.; Gupta, V., "State Estimation in Electric Power Grids: Meeting New Challenges Presented by the Requirements of the Future Grid," IEEE Signal Processing Magazine, vol. 29, no. 5, pp. 33–43, Sept. 2012.
