Load factor (electrical)
In electrical engineering the load factor is defined as the average load divided by the peak load in a specified time period.[1] It is a measure of variability of consumption or generation; a low load factor indicates that load is highly variable, whereas consumers or generators with steady consumption or supply will have a high load factor.
An example, using a large commercial electrical bill:

- peak demand = 436 kW
- energy consumed during the 30-day billing period = 57,200 kWh

Hence:

- load factor = { 57,200 kWh / ( 30 d × 24 hours per day × 436 kW ) } × 100% = 18.22%
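The calculation above can be sketched as a small helper function; the figures are those from the example bill.

```python
def load_factor(energy_kwh: float, peak_kw: float, days: int) -> float:
    """Load factor = average load / peak load over the billing period."""
    hours = days * 24
    average_kw = energy_kwh / hours  # average load over the period
    return average_kw / peak_kw

# Figures from the bill above: 57,200 kWh over 30 days, 436 kW peak demand.
lf = load_factor(57_200, 436, 30)
print(f"{lf:.2%}")  # → 18.22%
```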
It can be derived from the load profile of the specific device or system of devices. Its value is always less than one because maximum demand is always greater than average demand, since a facility is unlikely to operate at full capacity for an entire 24-hour day. A high load factor means power usage is relatively constant; a low load factor means that high demand occurs only occasionally. To serve that peak, capacity sits idle for long periods, imposing higher costs on the system. Electricity rates are therefore designed so that customers with a high load factor are charged less overall per kWh. This process, along with others, is called load balancing or peak shaving.
The load factor is closely related to and often confused with the demand factor.
The major difference is that the denominator of the demand factor is fixed by the system under consideration. Because of this, the demand factor cannot be derived from the load profile alone; it also requires the full (connected) load of the system in question.
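To illustrate the distinction, here is a minimal sketch assuming the standard definition of demand factor as peak demand divided by total connected load; the 1,000 kW connected load is an illustrative, assumed figure.

```python
def demand_factor(peak_kw: float, connected_load_kw: float) -> float:
    """Demand factor = peak demand / total connected (full) load.

    Unlike the load factor, the denominator here is a fixed property of
    the installation, not something derived from the load profile."""
    return peak_kw / connected_load_kw

# Assumed illustrative figures: 436 kW peak on a 1,000 kW installation.
print(f"{demand_factor(436, 1_000):.1%}")  # → 43.6%
```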
References
1. Watkins, G. P. (1915). "A Third Factor in the Variation of Productivity: The Load Factor". American Economic Review. American Economic Association. 5 (4): 753–786. JSTOR 1809629.