Propagation of uncertainty

In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors) on the uncertainty of a function based on them. Typically, the variables are measured in an experiment and have uncertainties due to measurement limitations (e.g. instrument precision), which propagate to the result.

The uncertainty is usually given as the absolute error: a variable whose value is likely to lie within x ± Δx is said to have an uncertainty (or margin of error) of Δx. In other words, for a measured value x, it is probable that the true value lies in the interval [x − Δx, x + Δx]. Uncertainties can also be given as the relative error Δx/x, usually written as a percentage; for example, a measurement of 5.0 ± 0.1 has a relative error of 0.1/5.0 = 2%. In many cases it is assumed that the difference between a measured value and the true value is normally distributed, with the standard deviation of the distribution being the uncertainty of the measurement.

This article explains how to calculate the uncertainty of a function if the variables' uncertainties are known.

General formula

Let f(x1,x2,...,xn) be a function which depends on n variables x1,x2,...,xn. The uncertainty of each variable is given by Δxj:

x_j \pm \Delta x_j\, .

If the variables are uncorrelated, we can calculate the uncertainty Δf of f that results from the uncertainties of the variables:

\Delta f = \Delta f \left(x_1, x_2, ..., x_n, \Delta x_1, \Delta x_2, ..., \Delta x_n \right) = \left( \sum_{i=1}^n \left(\frac{\partial f}{\partial x_i}\Delta x_i \right)^2 \right)^{1/2} \, ,

where \frac{\partial f}{\partial x_j} designates the partial derivative of f with respect to the j-th variable.
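
As an illustration, the following Python sketch applies this formula with numerically estimated partial derivatives. The function propagate_uncorrelated, the example function f, the example values, and the step size h are assumptions chosen only for the example, not part of the formula itself:

    import math

    def propagate_uncorrelated(f, x, dx, h=1e-6):
        """Delta_f = sqrt(sum_i (df/dx_i * Delta x_i)^2), with the partial
        derivatives estimated by central differences of step h."""
        total = 0.0
        for i in range(len(x)):
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            dfdxi = (f(xp) - f(xm)) / (2 * h)  # partial derivative of f w.r.t. x_i
            total += (dfdxi * dx[i]) ** 2
        return math.sqrt(total)

    # Example: f(x1, x2) = x1 * x2 with x1 = 2.0 +/- 0.1 and x2 = 3.0 +/- 0.2
    f = lambda x: x[0] * x[1]
    print(propagate_uncorrelated(f, [2.0, 3.0], [0.1, 0.2]))  # ~0.5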

If the variables are correlated, the covariance between variable pairs, C_{i,k} := cov(x_i, x_k), enters the formula with a double sum over all pairs (i,k):

\Delta f = \left( \sum_{i=1}^n \sum_{k=1}^n \left(\frac{\partial f}{\partial x_i}\frac{\partial f}{\partial x_k}C_{i,k} \right) \right)^{1/2}\, ,

where C_{i,i} = var(x_i) = (Δx_i)².
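
A minimal sketch of the correlated case follows; here the gradient is supplied directly and the covariance matrix is an assumed example rather than one derived from real data:

    import math

    def propagate_correlated(grad, cov):
        """Delta_f = sqrt(sum_i sum_k (df/dx_i)(df/dx_k) C_{i,k})."""
        total = 0.0
        for i in range(len(grad)):
            for k in range(len(grad)):
                total += grad[i] * grad[k] * cov[i][k]
        return math.sqrt(total)

    grad = [3.0, 2.0]            # partial derivatives df/dx_1, df/dx_2
    cov = [[0.01, 0.01],
           [0.01, 0.04]]         # C[i][i] = (Delta x_i)^2; off-diagonal entries are covariances
    print(propagate_correlated(grad, cov))  # sqrt(0.37) ~ 0.61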

After calculating Δf, we can say that the value of the function with its uncertainty is:

f \pm \Delta f \, .

Example formulas

This table shows the uncertainty of simple functions of uncorrelated variables A, B, C with uncertainties ΔA, ΔB, ΔC, and a precisely known constant c.

Function: X = A \pm B
Uncertainty: (\Delta X)^2 = (\Delta A)^2 + (\Delta B)^2

Function: X = cA
Uncertainty: \Delta X = |c| \Delta A

Function: X = c (A \cdot B) or X = c \left( \frac{A}{B} \right)
Uncertainty: \left( \frac{\Delta X}{X} \right)^2 = \left( \frac{\Delta A}{A} \right)^2 + \left( \frac{\Delta B}{B} \right)^2

Function: X = c (A \cdot B \cdot C) or X = c \left( \frac{A}{B} \right) \cdot C
Uncertainty: \left( \frac{\Delta X}{X} \right)^2 = \left( \frac{\Delta A}{A} \right)^2 + \left( \frac{\Delta B}{B} \right)^2 + \left( \frac{\Delta C}{C} \right)^2

Function: X = cA^n
Uncertainty: \frac{\Delta X}{X} = |n| \frac{\Delta A}{A}

Function: X = \ln (cA)
Uncertainty: \Delta X = \frac{\Delta A}{A}

Function: X = e^A
Uncertainty: \frac{\Delta X}{X} = \Delta A
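
These rows can be checked numerically against the general formula. The following sketch does so for the product row, using made-up values A = 4.0 ± 0.2, B = 5.0 ± 0.3 and c = 2.0:

    import math

    c, A, dA, B, dB = 2.0, 4.0, 0.2, 5.0, 0.3
    X = c * A * B

    # Table row: (Delta X / X)^2 = (Delta A / A)^2 + (Delta B / B)^2
    dX_table = abs(X) * math.sqrt((dA / A) ** 2 + (dB / B) ** 2)

    # General formula: (Delta X)^2 = (dX/dA * dA)^2 + (dX/dB * dB)^2,
    # with dX/dA = c*B and dX/dB = c*A
    dX_general = math.sqrt((c * B * dA) ** 2 + (c * A * dB) ** 2)

    print(dX_table, dX_general)  # both ~3.12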

Example calculation: Inverse tangent function

We can calculate the uncertainty propagation for the inverse tangent function as an example of using partial derivatives to propagate error.

Define

f(θ) = arctan θ,

where σ_θ is the absolute uncertainty on our measurement of θ.

The partial derivative of f(θ) with respect to θ is

\frac{\partial f}{\partial \theta} = \frac{1}{1+\theta^2}.

Therefore, our propagated uncertainty is

\sigma_{f} = \frac{\sigma_{\theta}}{1+\theta^2},

where σ_f is the absolute propagated uncertainty.
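
For instance, with an assumed measurement θ = 0.5 ± 0.02 (values chosen only for illustration), the result can be evaluated directly:

    import math

    theta, sigma_theta = 0.5, 0.02
    f = math.atan(theta)                       # f(theta) = arctan(theta)
    sigma_f = sigma_theta / (1 + theta ** 2)   # propagated uncertainty
    print(f, sigma_f)                          # ~0.4636 +/- 0.016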

Example application: Resistance measurement

A practical application is an experiment in which one measures current, I, and voltage, V, on a resistor in order to determine the resistance, R, using Ohm's law, R = V / I.

Given the measured variables with uncertainties, I ± ΔI and V ± ΔV, the uncertainty in the computed quantity, ΔR, is

\Delta R = \left( \left(\frac{\Delta V}{I}\right)^2+\left(\frac{V}{I^2}\Delta I\right)^2\right)^{1/2} = R\sqrt{\left(\frac{\Delta V}{V}\right)^2+\left(\frac{\Delta I}{I}\right)^2}.

Thus, in this simple case, the relative error ΔR/R is simply the square root of the sum of the squares of the two relative errors of the measured variables.
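
As a numerical illustration, with assumed readings V = 12.0 V ± 0.1 V and I = 2.5 A ± 0.05 A (made-up values):

    import math

    V, dV = 12.0, 0.1
    I, dI = 2.5, 0.05

    R = V / I
    dR = R * math.sqrt((dV / V) ** 2 + (dI / I) ** 2)  # relative errors add in quadrature
    print(f"R = {R:.3f} ohm +/- {dR:.3f} ohm")         # R = 4.800 ohm +/- 0.104 ohm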
