Low-power

In electronics, the term low-power means one of two things about a device: either that it transmits radio signals at deliberately low power (as in low-power broadcasting), or that it is designed to consume very little electrical power.

Radio

Advocates of low-power "smart" radio argue that many cooperating low-power transmitters can use spectrum more efficiently than a single high-power broadcast transmitter, although this claim is disputed.

"Technologists are increasingly discussing a related kind of gain called 'cooperation gain.' ... think about a party. If I need to tell you that it's time to leave, I could choose to shout that message across the room. Shouting, however, is rude. So instead, imagine I choose to whisper my message to the person standing next to me, and he whispered it to the next person, and she to the next person, and so on. This series of whispers could get my message across the room without forcing me to shout." -- "Wireless Spectrum: Defining the 'Commons'" by Lawrence Lessig 2003 (mirror)

"if nodes repeat each other's traffic. If I want to talk to someone across the room, I don't have to shout. I can just whisper it to someone near me, who can pass it on, and so on. ... as we add more transmitters, the total capacity goes up slightly, but we still have to face the fact that each transmitter's capacity goes down (just slower). Even better, we all end up using less energy (since we don't have to transmit as far), saving battery life." -- Open Spectrum: A Global Pervasive Network by Aaron Swartz

"Every time a broadcaster receives a license, the amount of available spectrum goes down. ... New technology, however, increases bandwidth with the number of users." -- "Why Open Spectrum Matters: The End of the Broadcast Nation" by David Weinberger

Electronics

The density and speed of integrated-circuit computing elements have increased roughly exponentially for several decades, following a trend described by Moore's Law. While it is generally accepted that this exponential improvement will eventually end, it is unclear how dense and fast integrated circuits will have become by the time that point is reached. Working devices have been demonstrated with a MOSFET transistor channel length of 6.3 nanometres using conventional semiconductor materials, and devices have been built that used carbon nanotubes as MOSFET gates, giving a channel length of approximately 1 nanometre.

The ultimate density and computing power of integrated circuits are limited primarily by power dissipation concerns.

An integrated circuit chip contains many capacitive loads, formed both intentionally (as is the case with gate to channel capacitance) and unintentionally (between any conductors that are near each other but not electrically connected). Changing the state of the circuit causes a change in the voltage across these parasitic capacitances, which involves a change in the amount of stored energy. As the capacitive loads are charged and discharged through resistive devices, an amount of energy comparable to that stored in the capacitor is dissipated as heat.

E_\mathrm{stored} = \frac{1}{2} C V^2

where C is the capacitance of the node and V is the voltage across it.

This heat dissipation on every state change limits the amount of computation that can be performed within a given power budget. While device shrinkage can reduce some of the parasitic capacitances, the number of devices on an integrated circuit chip has increased more than enough to compensate for the reduced capacitance of each individual device.
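To make the scale of this trade-off concrete, dynamic switching power is commonly approximated as P = αCV²f, where α is the activity factor (the fraction of the switchable capacitance toggled each cycle), C the total switchable capacitance, V the supply voltage, and f the clock frequency. The following Python sketch evaluates that approximation; all of the numbers in it are invented for illustration and do not describe any real chip.

```python
# Back-of-envelope estimate of dynamic switching power, P = alpha * C * V^2 * f.
# All numbers below are illustrative assumptions, not measurements of any real chip.

def dynamic_power(alpha, c_switched, v_dd, f_clk):
    """Average power dissipated by charging/discharging capacitance.

    alpha      -- activity factor (fraction of capacitance switched per cycle)
    c_switched -- total switchable capacitance in farads
    v_dd       -- supply voltage in volts
    f_clk      -- clock frequency in hertz
    """
    return alpha * c_switched * v_dd ** 2 * f_clk

if __name__ == "__main__":
    # Hypothetical chip: 100 nF of total switchable capacitance,
    # 10% activity, 1.0 V supply, 1 GHz clock.
    p = dynamic_power(alpha=0.1, c_switched=100e-9, v_dd=1.0, f_clk=1e9)
    print(f"Dynamic power at 1.0 V: {p:.2f} W")                            # ~10 W
    # Halving the supply voltage cuts switching power by 4x (V^2 dependence).
    print(f"Dynamic power at 0.5 V: {dynamic_power(0.1, 100e-9, 0.5, 1e9):.2f} W")  # ~2.5 W
```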

As circuits shrink, subthreshold leakage current becomes much more significant. This leakage current results in power consumption even when no switching is taking place (static power consumption), and in modern chips it frequently accounts for more than 50% of the power used by the IC. This loss can be reduced by raising the threshold voltage and lowering the supply voltage. Both of these changes slow the circuit down significantly, so some modern low-power circuits use dual supply voltages to provide speed on critical parts of the circuit and lower power on non-critical paths. Some circuits even use different transistors (with different threshold voltages) in different parts of the circuit in an attempt to further reduce power consumption without significant performance loss.
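The benefit of raising the threshold voltage can be illustrated with the usual first-order model, in which subthreshold leakage falls off exponentially with threshold voltage. The sketch below assumes example values for the leakage prefactor and the subthreshold slope factor; it shows only the trend, not any particular process.

```python
import math

# Illustrative model of how raising the threshold voltage suppresses
# subthreshold leakage: I_leak ~ I0 * exp(-Vth / (n * kT/q)).
# I0 and n are made-up illustrative values, not data for any real process.

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # electron charge, C

def leakage_current(v_th, i0=1e-6, n=1.5, temp_k=300.0):
    """Rough subthreshold leakage per transistor (amps) at threshold v_th (volts)."""
    v_thermal = K_B * temp_k / Q_E          # ~26 mV at 300 K
    return i0 * math.exp(-v_th / (n * v_thermal))

if __name__ == "__main__":
    for v_th in (0.2, 0.3, 0.4):
        # Each extra ~100 mV of threshold voltage cuts leakage by roughly
        # an order of magnitude (with the assumed slope factor).
        print(f"Vth = {v_th:.1f} V -> leakage ~ {leakage_current(v_th):.2e} A")
```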

Another method used to reduce static power consumption is the use of sleep transistors to disable entire blocks when not in use. By shutting down a leaky functional block until it is needed, leakage current can be reduced significantly. For some embedded systems that only function for short periods at a time, this can dramatically reduce power consumption. Systems that lie dormant for long periods and "wake up" to perform a periodic activity are often in isolated locations monitoring some sort of activity; they are generally battery or solar powered, so power consumption is a key design factor.
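A back-of-envelope duty-cycle calculation shows why this matters so much for such systems: at very low duty cycles, the sleep-mode (leakage) power dominates the average unless idle blocks are power-gated. In the sketch below, the active power, sleep power, and duty cycle are assumed example values.

```python
# Rough average-power budget for a duty-cycled embedded node.
# Active/sleep powers and the duty cycle below are assumed example values.

def average_power(p_active, p_sleep, duty_cycle):
    """Average power when the device is active for a fraction `duty_cycle`
    of the time and asleep the rest of the time."""
    return duty_cycle * p_active + (1.0 - duty_cycle) * p_sleep

if __name__ == "__main__":
    duty = 0.001          # active 0.1% of the time (e.g. a brief periodic sensor burst)
    p_active = 30e-3      # 30 mW while awake (assumed)
    # Without power gating, leakage keeps the sleep power relatively high;
    # with sleep transistors, the idle blocks are cut off from the supply.
    for label, p_sleep in (("no power gating", 300e-6), ("with sleep transistors", 3e-6)):
        p_avg = average_power(p_active, p_sleep, duty)
        print(f"{label}: average power ~ {p_avg * 1e6:.0f} microwatts")
```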

Two other approaches exist for lowering the power cost of state changes. The first is to reduce the operating voltage of the circuit, or to reduce the voltage swing involved in a state change (making a state change move the node voltage by only a fraction of the supply voltage, as in low-voltage differential signaling). This approach is limited by thermal noise within the circuit. There is a characteristic voltage, proportional to the device temperature and to the Boltzmann constant, which the state-switching voltage swing must exceed for the circuit to be resistant to noise. This is typically on the order of 50–100 mV for devices rated to 100 degrees Celsius external temperature (about 4 kT/q, where T is the device's internal temperature, k is the Boltzmann constant, and q is the charge of the electron).
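For reference, the thermal voltage kT/q can be evaluated directly; the short sketch below computes kT/q and a 4 kT/q margin at room temperature and at 100 degrees Celsius.

```python
# The thermal voltage kT/q sets the scale of the noise floor mentioned above.
# This simply evaluates kT/q and a 4*kT/q margin at two temperatures.

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # electron charge, C

def thermal_voltage(temp_c):
    """kT/q in volts at the given temperature in degrees Celsius."""
    return K_B * (temp_c + 273.15) / Q_E

if __name__ == "__main__":
    for temp_c in (25, 100):
        vt = thermal_voltage(temp_c)
        print(f"{temp_c} C: kT/q = {vt*1e3:.1f} mV, 4*kT/q = {4*vt*1e3:.0f} mV")
```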

The second approach is to supply charge to the capacitive loads through paths that are not predominantly resistive. This is the principle behind adiabatic circuits. The charge is supplied either from a variable-voltage inductive power supply or by other elements in a reversible-logic circuit. In both cases, the charge transfer must be regulated primarily by the non-resistive load; as a practical rule of thumb, this means the rate of change of a signal must be much slower than that dictated by the RC time constant of the circuit being driven. In other words, the price of reduced power consumption per unit of computation is a reduced absolute speed of computation.
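The speed-for-energy trade can be seen in a standard first-order comparison: abruptly charging a node dissipates about (1/2)CV² in the series resistance regardless of its value, while a slow linear voltage ramp over a time T much longer than RC dissipates only roughly (RC/T)CV². The sketch below evaluates both; the resistance, capacitance, voltage swing, and ramp times are assumed example values.

```python
# Compares the energy lost charging a node capacitance the conventional way
# (abrupt step, dissipating ~ 1/2 C V^2 in the resistance) with a slow,
# ramped "adiabatic" charge, where the loss falls to roughly (RC/T) * C * V^2
# once the ramp time T is much longer than the RC time constant.
# R, C, V and the ramp times are assumed example values.

def conventional_loss(c, v):
    return 0.5 * c * v ** 2

def adiabatic_loss(r, c, v, ramp_time):
    # Valid approximation only when ramp_time >> R*C.
    return (r * c / ramp_time) * c * v ** 2

if __name__ == "__main__":
    r, c, v = 1e3, 10e-15, 1.0           # 1 kOhm, 10 fF node, 1 V swing
    tau = r * c                          # RC time constant = 10 ps
    print(f"Step charging:    {conventional_loss(c, v)*1e15:.2f} fJ")
    for ramp in (10 * tau, 100 * tau):
        # Slower ramps dissipate less energy -- the speed/power trade-off
        # described above.
        print(f"Ramp over {ramp/tau:.0f} RC: {adiabatic_loss(r, c, v, ramp)*1e15:.3f} fJ")
```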

In practice, while adiabatic circuits have been built, it has proven very difficult to use them to substantially reduce computation power in practical circuits.

Lastly, there are several techniques for reducing the number of state changes associated with a given computation. For clocked-logic circuits, clock gating is used to avoid changing the state of functional blocks that are not required for a given operation. As a more extreme alternative, the asynchronous logic approach implements circuits in such a way that an explicit, externally supplied clock is not required. While both of these techniques are used to varying extents in integrated circuit design, the limit of practical applicability for each appears to have been reached.
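As a toy illustration of clock gating, the sketch below counts how often a hypothetical register bank's outputs toggle, with and without gating, over a workload in which the block's result is only occasionally needed. The block width, enable probability, and workload are assumptions, and the toggle count is only a crude stand-in for dynamic switching energy.

```python
import random

# Toy illustration of clock gating: count how many times a register bank's
# outputs toggle with and without gating the clock when the block's result
# is not needed. Toggle count stands in for dynamic switching energy.
# The block width, enable probability, and workload are all assumptions.

def simulate(cycles, enable_prob, gated, width=32, seed=0):
    rng = random.Random(seed)
    state = [0] * width
    toggles = 0
    for _ in range(cycles):
        needed = rng.random() < enable_prob      # is this block's output used this cycle?
        if gated and not needed:
            continue                             # clock gated off: no state change
        new_state = [rng.randint(0, 1) for _ in range(width)]
        toggles += sum(a != b for a, b in zip(state, new_state))
        state = new_state
    return toggles

if __name__ == "__main__":
    kwargs = dict(cycles=10_000, enable_prob=0.05)
    free_running = simulate(gated=False, **kwargs)
    gated = simulate(gated=True, **kwargs)
    print(f"Toggles without clock gating: {free_running}")
    print(f"Toggles with clock gating:    {gated}")   # far fewer state changes
```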
