Signal (electrical engineering)

In the fields of communications, signal processing, and in electrical engineering more generally, a signal is any time-varying quantity. Signals are often scalar-valued functions of time (waveforms), but may be vector valued and may be functions of any other relevant independent variable.

The concept is broad and hard to define precisely, so definitions specific to subfields are common. For example, in information theory, a signal is a codified message, i.e., the sequence of states in a communications channel that encodes a message. In a communications system, a transmitter encodes a message into a signal, which is carried to a receiver by the communications channel. For example, the words "Mary had a little lamb" might be the message spoken into a telephone. The telephone transmitter converts the sounds into an electrical voltage signal, which is transmitted to the receiving telephone by wires; at the receiver it is reconverted into sounds.

Signals can be categorized in various ways. The most common distinction is between the discrete and continuous spaces over which the functions are defined, for example discrete-time and continuous-time domains. Discrete-time signals are often referred to as time series in other fields. Continuous-time signals are often referred to as continuous signals even when the signal functions are not continuous; an example is a square-wave signal.

A second important distinction is between discrete-valued and continuous-valued. Digital signals are discrete-valued, but are often derived from an underlying continuous-valued physical process.

Discrete-time and continuous-time signals

If the quantities of a signal are defined only on a discrete set of times, it is called a discrete-time signal. In other words, a discrete-time real (or complex) signal can be seen as a function from the set of integers to the set of real (or complex) numbers.

A continuous-time real (or complex) signal is any real-valued (or complex-valued) function which is defined for all time t in an interval, most commonly an infinite interval.
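
The distinction can be made concrete in a short Python sketch (NumPy assumed; the particular functions and values are chosen only for illustration): the continuous-time signal is a rule defined for every real t in an interval, while the discrete-time signal is a function of an integer index.

    import numpy as np

    # Continuous-time signal: a rule x(t) defined for every real t in an interval.
    def x(t):
        return np.exp(-t) * np.cos(2 * np.pi * t)

    # Discrete-time signal: a function from the integers n to real (or complex) values.
    # Here it is written directly as a formula in the integer index n.
    n = np.arange(0, 10)            # a finite portion of the integer domain
    x_n = (-0.9) ** n               # the discrete-time signal x[n]

    print(x(0.5), x_n[:5])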

Analog and digital signals

Less formally than the theoretical distinctions above, the two main types of signals encountered in practice are analog and digital. In short, the difference between them is that digital signals are discrete and quantized, as defined below, while analog signals possess neither property.

Discretization

Main article: Discrete signal

One of the fundamental distinctions between different types of signals is between continuous and discrete time. In the mathematical abstraction, the domain of a continuous-time (CT) signal is the set of real numbers (or some interval thereof), whereas the domain of a discrete-time signal is the set of integers (or some interval). What these integers represent depends on the nature of the signal.

DT signals often arise via sampling of CT signals. For instance, sensors output data continuously, but since a continuous stream may be difficult to record, a discrete-time signal is often used as an approximation. Computers and other digital devices are restricted to discrete time.
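
As a sketch of this idea (NumPy assumed; the sensor model and sampling rate are made up for illustration), a continuous-time quantity is recorded only at regularly spaced instants, yielding the discrete-time signal a digital device can actually store:

    import numpy as np

    # A continuous-time "sensor reading", modelled here as an arbitrary smooth function of time.
    def sensor(t):
        return 3.0 * np.exp(-0.5 * t) * np.cos(2 * np.pi * t)

    fs = 50.0                            # sampling rate in samples per second (assumed)
    t_n = np.arange(0.0, 2.0, 1.0 / fs)  # sampling instants over a 2-second window
    samples = sensor(t_n)                # the discrete-time approximation that is recorded

    print(len(samples), "samples stand in for the continuous stream")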

Quantization

If a signal is to be represented as a sequence of numbers, it is impossible to maintain arbitrarily high precision, since each number in the sequence must have a finite number of digits. As a result, the values of such a signal are restricted to belong to a finite set; in other words, it is quantized.
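
A minimal sketch of one common scheme, uniform quantization (NumPy assumed; the step size corresponds roughly to a hypothetical 8-bit representation of values in [-1, 1)), rounds each value to the nearest member of a finite set of levels:

    import numpy as np

    def quantize(values, step):
        """Round each value to the nearest multiple of `step`, so the
        signal takes values only in a finite set of levels."""
        return step * np.round(values / step)

    samples = np.array([0.137, -0.402, 0.999, 0.25])  # continuous-valued samples (made up)
    step = 2.0 / 256                                   # 256 levels spanning the range [-1, 1)
    print(quantize(samples, step))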

Examples of signals

  • Motion. The motion of a particle through some space can be considered to be a signal, or can be represented by a signal. The domain of a motion signal is one-dimensional (time), and the range is generally three-dimensional. Position is thus a 3-vector signal; position and orientation is a 6-vector signal.
  • Sound. Since a sound is a vibration of a medium (such as air), a sound signal associates a pressure value to every value of time and three space coordinates. A microphone converts sound pressure at some place to just a function of time, using a voltage signal as an analog of the sound signal.
  • Compact discs (CDs). CDs contain discrete signals representing sound, recorded at 44,100 samples per second. Each sample contains data for a left and right channel, which may be considered to be a 2-vector (since CDs are recorded in stereo).
  • Pictures. A picture assigns a color value to each of a set of points. Since the points lie on a plane, the domain is two-dimensional. If the picture is a physical object, such as a painting, it is a continuous signal. If the picture is a digital image, it is a discrete signal (see the sketch after this list). It is often convenient to represent color as the sum of the intensities of three primary colors, so that the signal is vector-valued with dimension three.
  • Videos. A video signal is a sequence of images. A point in a video is identified by its position (two-dimensional) and by the time at which it occurs, so a video signal has a three-dimensional domain. Analog video has one continuous domain dimension (across a scan line) and two discrete dimensions (frame and line).
  • Biological membrane potentials. The value of the signal is a straightforward electric potential ("voltage"). The domain is more difficult to establish. Some cells or organelles have the same membrane potential throughout; neurons generally have different potentials at different points. These signals have very low energies, but are enough to make nervous systems work; they can be measured in aggregate by the techniques of electrophysiology.
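
Following the picture example above, a brief sketch (NumPy assumed; the image dimensions and pixel values are arbitrary) represents a digital image as a discrete signal: a function on a two-dimensional grid of points whose value at each point is a three-component color vector.

    import numpy as np

    height, width = 4, 6                  # size of the (hypothetical) image grid
    # The signal: each point (row, col) of the two-dimensional domain maps to an RGB 3-vector.
    image = np.zeros((height, width, 3), dtype=np.uint8)

    image[1, 2] = (255, 0, 0)             # assign a pure-red color vector at one domain point
    r, g, b = image[1, 2]                 # read back the 3-vector value of the signal there
    print(r, g, b)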

Frequency analysis

Main article: Frequency domain

It is often useful to analyze the frequency spectrum of a signal. This technique is applicable to all signals, both continuous-time and discrete-time. For instance, if a signal is passed through an LTI system, the frequency spectrum of the resulting output signal is the product of the frequency spectrum of the original input signal and the frequency response of the system.
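
This multiplication property can be checked numerically for a finite-length discrete-time example. The sketch below (NumPy assumed; the input signal and impulse response are made up) passes a signal through an LTI system realized as a circular convolution and compares the DFT of the output with the product of the input DFT and the system's frequency response.

    import numpy as np

    N = 64
    n = np.arange(N)
    x = np.cos(2 * np.pi * 3 * n / N) + 0.5 * np.cos(2 * np.pi * 10 * n / N)  # input signal
    h = np.zeros(N)
    h[:4] = 0.25                       # impulse response of a simple averaging filter

    # Output of the LTI system, computed directly in the time domain (circular convolution).
    y = np.zeros(N)
    for k in range(N):
        for m in range(N):
            y[k] += x[m] * h[(k - m) % N]

    # The spectrum of the output equals the input spectrum times the frequency response.
    print(np.allclose(np.fft.fft(y), np.fft.fft(x) * np.fft.fft(h)))   # expected: True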

Entropy

Another important property of a signal (actually, of a statistically defined class of signals) is its entropy or information content.
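
As a small numerical illustration (NumPy assumed; the symbol probabilities are made up), the Shannon entropy H = -Σ p_i log2 p_i of a discrete source gives the average information content per emitted symbol, in bits:

    import numpy as np

    # Probabilities of the symbols emitted by a hypothetical discrete source.
    p = np.array([0.5, 0.25, 0.125, 0.125])

    # Shannon entropy: average information per symbol, in bits.
    H = -np.sum(p * np.log2(p))
    print(H)   # 1.75 bits per symbol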
