Orthogonality


In mathematics, orthogonal is synonymous with perpendicular when used as a simple adjective that is not part of a longer phrase with a standard definition. It means "at right angles". The word comes from the Greek ὀρθός (orthos), meaning "straight" (used by Euclid to mean "right"), and γωνία (gonia), meaning "angle". Two streets that cross each other at a right angle are orthogonal to one another.


Explanation

Formally, two vectors x and y in an inner product space V are orthogonal if their inner product \langle x, y \rangle is zero. This situation is denoted x \perp y.

Two vector subspaces A and B of vector space V are called orthogonal subspaces if each vector in A is orthogonal to each vector in B. The largest subspace that is orthogonal to a given subspace is its orthogonal complement.

A linear transformation T : V \rightarrow V is called an orthogonal linear transformation if it preserves the inner product. That is, for all pairs of vectors x and y in the inner product space V,

\langle Tx, Ty \rangle = \langle x, y \rangle.

This means that T preserves the angle between x and y, and that the lengths of Tx and x are equal.
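As an illustration, the following Python sketch (using NumPy; the rotation angle and test vectors are arbitrary choices made for this example) checks numerically that a plane rotation, a typical orthogonal transformation, preserves the inner product:

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation: an orthogonal map

x = np.array([2.0, -1.0])
y = np.array([0.5,  3.0])

# <Tx, Ty> equals <x, y>, so T preserves angles and lengths.
print(np.dot(T @ x, T @ y))  # -2.0 (up to rounding)
print(np.dot(x, y))          # -2.0
```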

A term rewriting system is said to be orthogonal if it is left-linear and is non-ambiguous. Orthogonal term rewriting systems are confluent.

The word normal is sometimes also used in place of orthogonal. However, normal can also refer to unit vectors. In particular, orthonormal refers to a collection of vectors that are both orthogonal and normal (of unit length). So, using the term normal to mean "orthogonal" is often avoided.

In some contexts, two things are said to be orthogonal if they are mutually exclusive.

In Euclidean vector spaces

In 2- or 3-dimensional Euclidean space, two vectors are orthogonal if their dot product is zero, i.e. they make an angle of 90° (π/2 radians). Orthogonality of vectors is therefore a generalization of the concept of perpendicular. In terms of vector subspaces, the orthogonal complement of a line is the plane perpendicular to it, and vice versa. Note, however, that there is no such correspondence for perpendicular planes, because vectors in subspaces start from the origin.

In 4-dimensional Euclidean space, the orthogonal complement of a line is a hyperplane and vice versa, and that of a plane is a plane.

Several vectors are called pairwise orthogonal if any two of them are orthogonal, and a set of such vectors is called an orthogonal set. Such a set is an orthonormal set if all its vectors are unit vectors. Non-zero pairwise orthogonal vectors are always linearly independent.
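These properties can be checked numerically. The Python/NumPy sketch below (reusing the three vectors from the Examples section further down) verifies pairwise orthogonality via the Gram matrix, whose off-diagonal entries are exactly the pairwise inner products, and confirms linear independence via the rank:

```python
import numpy as np

V = np.array([[1.0, 3.0, 2.0],
              [3.0, -1.0, 0.0],
              [1.0 / 3.0, 1.0, -5.0 / 3.0]])  # one vector per row

gram = V @ V.T  # entry (i, j) is the inner product <v_i, v_j>

# Pairwise orthogonal: every off-diagonal entry of the Gram matrix is 0.
print(np.allclose(gram - np.diag(np.diag(gram)), 0))  # True
# Non-zero pairwise orthogonal vectors are linearly independent.
print(np.linalg.matrix_rank(V) == 3)                  # True
```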

Orthogonal functions

It is common to use the following inner product for two functions f and g:

\langle f, g\rangle_w = \int_a^b f(x)g(x)w(x)\,dx.

Here w(x) is a nonnegative weight function. We say that f and g are orthogonal if this inner product is zero:

\int_a^b f(x)g(x)w(x)\,dx = 0.

The norm with respect to this inner product and weight function is

||f||_w = \sqrt{\langle f, f\rangle_w}

The members of a sequence { f_i : i = 1, 2, 3, ... } are:

  • orthogonal if
\langle f_i, f_j \rangle=\int_a^b f_i(x) f_j(x) w(x)\,dx=||f_i||_w^2\,\delta_{i,j}
  • orthonormal if
\langle f_i, f_j \rangle=\int_a^b f_i(x) f_j(x) w(x)\,dx=\delta_{i,j}

where

\delta_{i,j}=\begin{cases}1 & \mathrm{if}\ i=j \\ 0 & \mathrm{if}\ i\neq j\end{cases}

is the Kronecker delta. In other words, any two of them are orthogonal, and in the orthonormal case the norm of each is 1. See in particular orthogonal polynomials.
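A minimal numeric sketch of these definitions, assuming NumPy, a simple Riemann-sum quadrature, the weight w(x) = 1, and the first Legendre polynomials as a test family (all choices made for illustration, not taken from the text above):

```python
import numpy as np

def inner_product(f, g, w=lambda x: 1.0, a=-1.0, b=1.0, n=100_000):
    # Riemann-sum approximation of <f, g>_w = int_a^b f(x) g(x) w(x) dx
    x = np.linspace(a, b, n)
    dx = (b - a) / (n - 1)
    return np.sum(f(x) * g(x) * w(x)) * dx

# First three Legendre polynomials: orthogonal for w(x) = 1 on [-1, 1].
p0 = lambda x: np.ones_like(x)
p1 = lambda x: x
p2 = lambda x: 0.5 * (3 * x**2 - 1)

print(inner_product(p0, p1))  # ~0: orthogonal
print(inner_product(p1, p2))  # ~0: orthogonal
print(inner_product(p1, p1))  # ~2/3: the squared norm ||p1||_w^2
```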

Examples

  • The vectors (1, 3, 2), (3, −1, 0), (1/3, 1, −5/3) are orthogonal to each other, since (1)(3) + (3)(−1) + (2)(0) = 0, (3)(1/3) + (−1)(1) + (0)(−5/3) = 0, and (1)(1/3) + (3)(1) − (2)(5/3) = 0. Observe also that the dot product of a vector with itself is the square of its norm, so to check for orthogonality we need only check the dot products between distinct vectors.
  • The vectors (1, 0, 1, 0, ...)^T and (0, 1, 0, 1, ...)^T are orthogonal to each other, since their dot product is clearly 0. We can generalize to vectors in Z_2^n:
\mathbf{v}_k = \sum_{\substack{i \ge 0 \\ ai+k \le n}} \mathbf{e}_{ai+k}
for some positive integer a and for 1 ≤ k ≤ a. These vectors are orthogonal because their supports are disjoint: for example, with n = 8 and a = 3, the vectors (1, 0, 0, 1, 0, 0, 1, 0)^T, (0, 1, 0, 0, 1, 0, 0, 1)^T, (0, 0, 1, 0, 0, 1, 0, 0)^T are orthogonal.
  • The polynomials 2t + 3 and 5t^2 + t − 17/9 are orthogonal with respect to a unit weight function on the interval from −1 to 1 (a numeric spot-check appears after this list). Their product is 10t^3 + 17t^2 − (7/9)t − 17/3, and now,
\int_{-1}^{1} \left(10t^3+17t^2-{7\over 9}t-{17\over 3}\right)\,dt = \left[{5\over 2}t^4+{17\over 3}t^3-{7\over 18}t^2-{17\over 3}t\right]_{-1}^{1}
=\left({5\over 2}(1)^4+{17\over 3}(1)^3-{7\over 18}(1)^2-{17\over 3}(1)\right)-\left({5\over 2}(-1)^4+{17\over 3}(-1)^3-{7\over 18}(-1)^2-{17\over 3}(-1)\right)
={19\over 9}-{19\over 9}=0.
  • The functions 1, sin(nx), cos(nx) : n = 1, 2, 3, ... are orthogonal with respect to Lebesgue measure on the interval from 0 to 2π. This fact is basic in the theory of Fourier series.
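The following Python/NumPy sketch is the numeric spot-check referred to above; the grid size is an arbitrary choice:

```python
import numpy as np

# The polynomial example: (2t + 3) and (5t^2 + t - 17/9) on [-1, 1], w = 1.
t = np.linspace(-1.0, 1.0, 200_001)
dt = t[1] - t[0]
print(np.sum((2 * t + 3) * (5 * t**2 + t - 17.0 / 9.0)) * dt)  # ~0

# The Fourier example: sines and cosines on [0, 2*pi].
u = np.linspace(0.0, 2 * np.pi, 200_001)
du = u[1] - u[0]
print(np.sum(np.sin(2 * u) * np.cos(3 * u)) * du)  # ~0
print(np.sum(np.sin(2 * u) * np.sin(3 * u)) * du)  # ~0
```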

Derived meanings

Other meanings of the word orthogonal evolved from its earlier use in mathematics.

Art

In art, the imagined lines of perspective that point to the vanishing point are referred to as "orthogonal lines".

Computer science

Orthogonality is a system design property that facilitates the feasibility and compactness of complex designs. Orthogonality guarantees that modifying the technical effect produced by a component of a system neither creates nor propagates side effects to other components. The emergent behavior of a system consisting of components should be controlled strictly by the formal definitions of its logic, not by side effects resulting from poor integration, i.e. non-orthogonal design of modules and interfaces. Orthogonality reduces testing and development time, because it is easier to verify designs that neither cause side effects nor depend on them.

For example, a car has orthogonal components and controls (e.g. accelerating the vehicle does not influence anything else but the components involved exclusively with the acceleration function). On the other hand, a non-orthogonal design might have its steering influence its braking (e.g. Electronic Stability Control), or its speed tweak its suspension.[1] Consequently, this usage is seen to be derived from the use of orthogonal in mathematics: One may project a vector onto a subspace by projecting it onto each member of a set of basis vectors separately and adding the projections if and only if the basis vectors are mutually orthogonal.
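A toy sketch of the distinction in Python (the classes and attribute names below are hypothetical, invented purely for illustration):

```python
class OrthogonalCar:
    """Each control changes exactly one piece of state: no cross-effects."""
    def __init__(self):
        self.speed = 0.0
        self.steering_angle = 0.0

    def accelerate(self, dv):
        self.speed += dv             # touches only the speed

    def steer(self, angle):
        self.steering_angle = angle  # touches only the steering

class NonOrthogonalCar(OrthogonalCar):
    """Steering also alters speed: a hidden coupling between controls."""
    def steer(self, angle):
        self.steering_angle = angle
        self.speed *= 0.9            # side effect: steering changes speed
```

In the orthogonal version, each method can be tested in isolation; in the non-orthogonal one, any test of steering must also account for speed.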

An instruction set is said to be orthogonal if any instruction can use any register in any addressing mode. This terminology results from considering an instruction as a vector whose components are the instruction fields. One field identifies the registers to be operated upon, and another specifies the addressing mode. An orthogonal instruction set uniquely encodes all combinations of registers and addressing modes.
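A hypothetical miniature encoding illustrates the idea; the field names, widths, and instruction set below are invented for this sketch and do not correspond to any real architecture:

```python
from itertools import product

# Invented 5-bit format: 2-bit opcode | 2-bit register | 1-bit addressing mode
OPCODES = {"LOAD": 0b00, "STORE": 0b01, "ADD": 0b10}
REGISTERS = {"R0": 0b00, "R1": 0b01, "R2": 0b10, "R3": 0b11}
MODES = {"immediate": 0b0, "indirect": 0b1}

def encode(op, reg, mode):
    # The three fields occupy disjoint bit ranges and vary independently.
    return (OPCODES[op] << 3) | (REGISTERS[reg] << 1) | MODES[mode]

# Orthogonality: every opcode/register/mode combination is a distinct,
# valid instruction -- none is forbidden or aliased.
codes = {encode(*combo) for combo in product(OPCODES, REGISTERS, MODES)}
assert len(codes) == len(OPCODES) * len(REGISTERS) * len(MODES)  # 24 codes
```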

Radio communications

In radio communications, multiple access schemes are orthogonal when an ideal receiver can completely reject an arbitrarily strong unwanted signal from another user. Schemes that separate users in time (TDMA) or in frequency (FDMA) are orthogonal in this sense, as is synchronous Code Division Multiple Access (CDMA) with orthogonal spreading codes; asynchronous CDMA, whose codes are only approximately uncorrelated, is non-orthogonal.
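A small Python/NumPy sketch of orthogonal spreading codes (the 4-chip Walsh codes and data bits are illustrative choices) shows how a receiver correlating against one code fully rejects the other user's signal:

```python
import numpy as np

walsh_a = np.array([1, 1, 1, 1])    # user A's spreading code
walsh_b = np.array([1, -1, 1, -1])  # user B's code: <walsh_a, walsh_b> = 0

bit_a, bit_b = 1, -1                          # one data bit per user
channel = bit_a * walsh_a + bit_b * walsh_b   # the signals add on the air

# Correlating with one code recovers that user's bit and rejects the
# other user completely, no matter how strong the other signal is.
print(np.dot(channel, walsh_a) / 4)  # 1.0  -> bit_a
print(np.dot(channel, walsh_b) / 4)  # -1.0 -> bit_b
```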

Statistics

When performing statistical analysis, the variables that affect a particular result are said to be orthogonal if they are independent (uncorrelated): by varying each separately, one can predict the combined effect of varying them jointly. If correlation is present, the factors are not orthogonal and the separate effects do not simply add. In addition, many inference procedures rely on orthogonality restrictions, for example that the regressors be uncorrelated with the error term. This meaning of orthogonality derives from the mathematical one, because orthogonal vectors are linearly independent.
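A short Python/NumPy sketch with synthetic data (the coefficients and sample size are arbitrary choices) illustrates the point: with orthogonal regressors, fitting each factor separately recovers the same coefficients as the joint fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)            # factor 1
x2 = rng.normal(size=n)            # factor 2, independent of x1
y = 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)  # synthetic response

# Fitting each factor separately...
b1 = np.dot(x1, y) / np.dot(x1, x1)
b2 = np.dot(x2, y) / np.dot(x2, x2)
# ...agrees (up to sampling noise) with the joint least-squares fit.
joint, *_ = np.linalg.lstsq(np.column_stack([x1, x2]), y, rcond=None)
print(b1, b2)  # close to 2 and 3
print(joint)   # close to [2, 3]
```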

Taxonomy

In taxonomy, an orthogonal classification is one in which no item is a member of more than one group, that is, the classifications are mutually exclusive.

Combinatorics

In combinatorics, two n×n Latin squares are said to be orthogonal if their superimposition yields all n^2 possible combinations of entries. There is also a more general notion of combinatorial orthogonality.
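A Python check of this definition for a standard pair of 3×3 Latin squares (chosen here only for illustration):

```python
# Two 3x3 Latin squares over the symbols {0, 1, 2}.
A = [[0, 1, 2],
     [1, 2, 0],
     [2, 0, 1]]
B = [[0, 1, 2],
     [2, 0, 1],
     [1, 2, 0]]

# Superimpose: collect the ordered pair of entries at each cell.
pairs = {(A[i][j], B[i][j]) for i in range(3) for j in range(3)}
print(len(pairs) == 9)  # True: all n^2 pairs occur, so A and B are orthogonal
```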

Quantum mechanics

In quantum mechanics, two eigenstates of a Hermitian operator, ψ_m and ψ_n, are orthogonal whenever they correspond to different eigenvalues; degenerate eigenstates can always be chosen to be orthogonal as well. In Dirac notation, this means that \langle \psi_m | \psi_n \rangle = 0 unless m = n, in which case \langle \psi_m | \psi_n \rangle = 1 because the wavefunctions are normalized.
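A finite-dimensional numeric sketch (Python/NumPy; the Hermitian matrix is an arbitrary choice) of the same fact, since eigenvectors of a Hermitian matrix form an orthonormal set:

```python
import numpy as np

H = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])  # real symmetric, hence Hermitian

eigenvalues, psi = np.linalg.eigh(H)  # columns of psi are the eigenstates

# Entry (m, n) of this matrix is <psi_m | psi_n>; it should be delta_mn.
overlaps = psi.conj().T @ psi
print(np.allclose(overlaps, np.eye(3)))  # True: an orthonormal set
```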

References

  1. ^ Lincoln Mark VIII speed-sensitive suspension (MPEG video). Retrieved on September 15, 2006.