Heaviside step function

The Heaviside step function, using the half-maximum convention

The Heaviside step function, or the unit step function, usually denoted by H (but sometimes u or θ), is a discontinuous function whose value is zero for negative argument and one for positive argument. It is an example of the general class of step functions, all of which can be represented as linear combinations of translations of this one.

It seldom matters what value is used for H(0), since H is mostly used as a distribution. Some common choices can be seen below.

The function is used in operational calculus for the solution of differential equations, and represents a signal that switches on at a specified time and stays switched on indefinitely. Oliver Heaviside, who developed the operational calculus as a method for telegraphic communications, represented the function as 1.

It is the cumulative distribution function of a random variable which is almost surely 0. (See constant random variable.)

The Heaviside function is the integral of the Dirac delta function: H′ = δ. This is sometimes written as

 H(x) = \int_{-\infty}^x { \delta(s)} \, \mathrm{d}s

although this expansion may not hold (or even make sense) for x = 0, depending on which formalism one uses to give meaning to integrals involving δ.

Discrete form

An alternative form of the unit step, as a function of a discrete variable n:

H[n]=\begin{cases} 0, & n < 0, \\ 1, & n \ge 0, \end{cases}

where n is an integer. Unlike the usual (not discrete) case, the definition of H[0] is significant.

The discrete-time unit impulse is the first difference of the discrete-time step

 \delta\left[ n \right] = H[n] - H[n-1].

This function is the cumulative summation of the Kronecker delta:

 H[n] = \sum_{k=-\infty}^{n} \delta[k] \,

where

 \delta[k] = \delta_{k,0} \,

is the discrete unit impulse function.
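The discrete-time relations above can be checked directly. The following is a minimal sketch (the function names are mine, not from the source), using the convention H[0] = 1 given in the definition:

```python
# Sketch: discrete unit step H[n], its first difference, and the
# cumulative sum of the Kronecker delta.

def H(n):
    """Discrete-time unit step: 0 for n < 0, 1 for n >= 0 (so H[0] = 1)."""
    return 1 if n >= 0 else 0

def kronecker_delta(n):
    """Discrete unit impulse: delta[n] = delta_{n,0}."""
    return 1 if n == 0 else 0

# First difference of the step recovers the impulse: delta[n] = H[n] - H[n-1].
assert all(H(n) - H(n - 1) == kronecker_delta(n) for n in range(-5, 6))

# Cumulative summation of the impulse recovers the step:
# H[n] = sum over k <= n of delta[k]; a finite lower limit suffices
# because the impulse vanishes for k != 0.
for n in range(-5, 6):
    assert sum(kronecker_delta(k) for k in range(-10, n + 1)) == H(n)
```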

Analytic approximations

For a smooth approximation to the step function, one can use the logistic function

H(x) \approx \frac{1}{2} + \frac{1}{2}\tanh(kx) = \frac{1}{1+\mathrm{e}^{-2kx}},

where a larger k corresponds to a sharper transition at x = 0. If we take H(0) = ½, equality holds in the limit:

H(x)=\lim_{k \rightarrow \infty}\frac{1}{2}(1+\tanh kx)=\lim_{k \rightarrow \infty}\frac{1}{1+\mathrm{e}^{-2kx}}.

There are many other smooth, analytic approximations to the step function.[1] Among the possibilities are:

\begin{align}
  H(x) &= \lim_{k \rightarrow \infty} \left(\frac{1}{2} + \frac{1}{\pi}\arctan(kx)\right)\\
  H(x) &= \lim_{k \rightarrow \infty}\left(\frac{1}{2} + \frac{1}{2}\operatorname{erf}(kx)\right)
\end{align}

These limits hold pointwise (except at x = 0, where each approximation converges to ½) and in the sense of distributions. In general, however, pointwise convergence need not imply distributional convergence, and distributional convergence need not imply pointwise convergence.
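The three approximations above (logistic, arctangent, and error function) can be evaluated with the standard library; the sketch below (function names are mine) illustrates that each tends to 0 for negative x and to 1 for positive x as the sharpness parameter k grows:

```python
import math

def H_logistic(x, k):
    """Logistic approximation: 1/2 + (1/2) tanh(kx) = 1 / (1 + e^(-2kx))."""
    return 0.5 + 0.5 * math.tanh(k * x)

def H_arctan(x, k):
    """Arctangent approximation: 1/2 + (1/pi) arctan(kx)."""
    return 0.5 + math.atan(k * x) / math.pi

def H_erf(x, k):
    """Error-function approximation: 1/2 + (1/2) erf(kx)."""
    return 0.5 + 0.5 * math.erf(k * x)

# Every approximation equals exactly 1/2 at x = 0, and for large k
# approaches 0 for x < 0 and 1 for x > 0.
for f in (H_logistic, H_arctan, H_erf):
    assert f(0.0, 1000.0) == 0.5
    assert f(-1.0, 1000.0) < 1e-3
    assert f(1.0, 1000.0) > 1 - 1e-3
```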

In general, any cumulative distribution function of a continuous probability distribution that is peaked around zero and has a parameter controlling its variance can serve as an approximation, in the limit as the variance approaches zero. For example, all three of the above approximations are cumulative distribution functions of common probability distributions: the logistic, Cauchy and normal distributions, respectively.

Integral representations

Often an integral representation of the Heaviside step function is useful:

H(x)=\lim_{ \epsilon \to 0^+} -{1\over 2\pi i}\int_{-\infty}^\infty {1 \over \tau+i\epsilon} \mathrm{e}^{-i x \tau} \mathrm{d}\tau =\lim_{ \epsilon \to 0^+} {1\over 2\pi i}\int_{-\infty}^\infty {1 \over \tau-i\epsilon} \mathrm{e}^{i x \tau} \mathrm{d}\tau.

Zero argument

Since H is usually used in integration, and the value of a function at a single point does not affect its integral, it rarely matters which particular value is chosen for H(0). Indeed, when H is considered as a distribution or an element of L^\infty (see Lp space), it does not even make sense to speak of a value at zero, since such objects are only defined almost everywhere. If an analytic approximation is used (as in the examples above), the relevant limit at zero is often taken as the value.

There are, however, various reasons for choosing a particular value:

 H(0) = ½ gives the graph rotational symmetry about (0, ½); equivalently, H can then be written in terms of the sign function: H(x) = \tfrac{1}{2}\bigl(1+\sgn(x)\bigr).
 H(0) = 1 makes H right-continuous, so that it is the indicator function of the closed half-line [0, ∞): H(x) = \mathbf{1}_{[0,\infty)}(x). This is the convention required of a cumulative distribution function; the corresponding probability distribution is the degenerate distribution.
 H(0) = 0 makes H left-continuous, so that it is the indicator function of the open half-line (0, ∞): H(x) = \mathbf{1}_{(0,\infty)}(x).
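A minimal sketch of the three conventions, with the value at zero as a parameter (the function name and signature are mine, though NumPy's numpy.heaviside takes the value at zero as a second argument in the same spirit):

```python
def heaviside(x, h0=0.5):
    """Heaviside step with a chosen value h0 at x = 0.

    Common conventions: h0 = 0.5 (half-maximum), 1 (right-continuous),
    0 (left-continuous). All conventions agree away from zero.
    """
    if x < 0:
        return 0.0
    if x > 0:
        return 1.0
    return h0

assert heaviside(-2.0) == 0.0 and heaviside(2.0) == 1.0  # independent of h0
assert heaviside(0.0) == 0.5             # half-maximum convention
assert heaviside(0.0, h0=1.0) == 1.0     # right-continuous (cdf) convention
assert heaviside(0.0, h0=0.0) == 0.0     # left-continuous convention
```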

Antiderivative and derivative

The ramp function is the antiderivative of the Heaviside step function: R(x) := \int_{-\infty}^{x} H(\xi)\mathrm{d}\xi = x H(x).

The distributional derivative of the Heaviside step function is the Dirac delta function:  \tfrac{d H(x)}{dx} = \delta(x)
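The antiderivative relation can be checked numerically: integrating the step from any point on the negative axis (where H vanishes, so the exact lower limit is irrelevant) reproduces the ramp x H(x). A sketch, with my own function names:

```python
def H(x):
    """Heaviside step (right-continuous convention, H(0) = 1)."""
    return 1.0 if x >= 0 else 0.0

def ramp(x):
    """Ramp function R(x) = x * H(x): 0 for x <= 0, x for x >= 0."""
    return x * H(x)

def integral_of_H(x, lo=-10.0, n=200000):
    """Midpoint-rule approximation of the integral of H from lo to x."""
    dx = (x - lo) / n
    return sum(H(lo + (i + 0.5) * dx) for i in range(n)) * dx

# The numerical antiderivative of the step matches the ramp.
for x in (-3.0, 0.0, 1.0, 2.5):
    assert abs(integral_of_H(x) - ramp(x)) < 1e-3
```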

Fourier transform

The Fourier transform of the Heaviside step function is a distribution. Using one choice of constants for the definition of the Fourier transform we have


\hat{H}(s) = \lim_{N\to\infty}\int^N_{-N} \mathrm{e}^{-2\pi i x s} H(x)\,\mathrm{d}x  = \frac{1}{2} \left( \delta(s) - \frac{i}{\pi}\mathrm{p.v.}\frac{1}{s} \right).

Here \mathrm{p.v.}\frac{1}{s} is the distribution that takes a test function \varphi to the Cauchy principal value of \int^{\infty}_{-\infty} \varphi(s)/s\,\mathrm{d}s. The limit appearing in the integral is also taken in the sense of (tempered) distributions.

Hyperfunction representation

This can be represented as a hyperfunction as H(x) = \left(1-\frac{1}{2\pi i}\log(z),\ -\frac{1}{2\pi i}\log(z)\right), where \log(z) denotes the principal value of the complex logarithm.

References