Sigmoid function

Plot of the error function

A sigmoid function is a mathematical function having an "S" shape (sigmoid curve). Often, sigmoid function refers to the special case of the logistic function shown in the first figure and defined by the formula

S(t) = \frac{1}{1 + e^{-t}}.
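For illustration, the logistic sigmoid can be evaluated with a few lines of Python; this is a minimal sketch, and the function name logistic_sigmoid and the branch on the sign of t are implementation choices rather than part of the definition.

    import math

    def logistic_sigmoid(t):
        """Logistic sigmoid S(t) = 1 / (1 + exp(-t))."""
        if t >= 0:
            return 1.0 / (1.0 + math.exp(-t))
        # For t < 0, rewrite as exp(t) / (1 + exp(t)) so exp() never overflows.
        z = math.exp(t)
        return z / (1.0 + z)

    # The curve passes through (0, 1/2) and tends to 0 and 1 asymptotically.
    print(logistic_sigmoid(0.0))   # 0.5
    print(logistic_sigmoid(6.0))   # about 0.9975
    print(logistic_sigmoid(-6.0))  # about 0.0025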

Other examples of similar shapes include the Gompertz curve (used in modeling systems that saturate at large values of t) and the ogee curve (used in the spillway of some dams). A wide variety of sigmoid functions have been used as the activation function of artificial neurons, including the logistic and hyperbolic tangent functions. Sigmoid curves are also common in statistics as cumulative distribution functions, such as the integrals of the logistic distribution, the normal distribution, and Student's t probability density functions.

Definition

A sigmoid function is a bounded differentiable real function that is defined for all real input values and has a positive derivative at each point.[1]

Properties

In general, a sigmoid function is real-valued and differentiable, with a first derivative that is bell shaped and either non-negative or non-positive at every point. It also has a pair of horizontal asymptotes as t \rightarrow \pm \infty. The differential equation \tfrac{\mathrm{d}}{\mathrm{d}t} S(t) = c_1 S(t) \left( c_2 - S(t) \right), together with a boundary condition supplying a third degree of freedom, c_3, yields a class of functions of this type.
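For example, taking c_1 = c_2 = 1 with the boundary condition S(0) = \tfrac{1}{2} (that is, c_3 = 0) recovers the logistic function: differentiating S(t) = \frac{1}{1 + e^{-t}} gives

\frac{\mathrm{d}}{\mathrm{d}t} S(t) = \frac{e^{-t}}{\left(1 + e^{-t}\right)^2} = S(t) \left( 1 - S(t) \right).

More generally, the non-constant solutions of the differential equation above are shifted and rescaled logistic curves, S(t) = \frac{c_2}{1 + e^{-c_1 c_2 (t - c_3)}}, where c_3 locates the inflection point.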

Examples

Some sigmoid functions compared. In the drawing, all functions are normalized so that their slope at the origin is 1.

Many natural processes, such as the learning curves of complex systems, exhibit a progression from small beginnings that accelerates and approaches a climax over time. When a detailed description is lacking, a sigmoid function is often used.[2]

Besides the logistic function, sigmoid functions include the ordinary arctangent, the hyperbolic tangent, the Gudermannian function, and the error function, but also the generalised logistic function and algebraic functions like f(x)=\tfrac{x}{\sqrt{1+x^2}}.
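As a rough illustration of the comparison figure above, the following Python sketch evaluates several of these functions after rescaling each so that its slope at the origin is 1; the particular scalings and the name curves are choices made for this example, not standard definitions.

    import math

    # A few sigmoid-shaped functions, each rescaled so that its derivative
    # at the origin equals 1 (as in the comparison figure).
    curves = {
        "tanh":              math.tanh,
        "arctan":            math.atan,  # slope at the origin is already 1
        "Gudermannian":      lambda x: 2.0 * math.atan(math.tanh(x / 2.0)),
        "error function":    lambda x: math.erf(math.sqrt(math.pi) * x / 2.0),
        "x / sqrt(1 + x^2)": lambda x: x / math.sqrt(1.0 + x * x),
    }

    h = 1e-6
    for name, f in curves.items():
        slope_at_origin = (f(h) - f(-h)) / (2.0 * h)  # central difference, about 1
        print(f"{name:18s} f(2) = {f(2.0):+.4f}  slope at 0 = {slope_at_origin:.4f}")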

The integral of any smooth, positive, "bump-shaped" function will be sigmoidal; thus the cumulative distribution functions of many common probability distributions are sigmoidal. The most famous example is the error function, which is related to the cumulative distribution function (CDF) of a normal distribution.
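Concretely, the CDF of the standard normal distribution can be written in terms of the error function as \Phi(x) = \tfrac{1}{2} \left( 1 + \operatorname{erf}\left( x / \sqrt{2} \right) \right). The short Python check below (the function name norm_cdf is chosen here for illustration) evaluates this sigmoidal curve at a few points.

    import math

    def norm_cdf(x):
        """Standard normal CDF via the error function:
        Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    # The CDF is itself a sigmoid: it rises from 0 to 1 with Phi(0) = 0.5.
    for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
        print(f"Phi({x:+.1f}) = {norm_cdf(x):.4f}")
    # Approximate output: 0.0013, 0.1587, 0.5000, 0.8413, 0.9987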

References

  1. Han, Jun; Moraga, Claudio (1995). "The influence of the sigmoid function parameters on the speed of backpropagation learning". In Mira, José; Sandoval, Francisco (eds.). From Natural to Artificial Neural Computation. pp. 195–201.
  2. Gibbs, M.N. (Nov 2000). "Variational Gaussian process classifiers". IEEE Transactions on Neural Networks 11 (6): 1458–1464. doi:10.1109/72.883477.