Cluster decomposition theorem
From Wikipedia, the free encyclopedia
In physics, the cluster decomposition theorem guarantees locality in quantum field theory. According to this theorem, the vacuum expectation value of a product of many operators, each localized either in region A or in region B, where A and B are widely spacelike separated, asymptotically equals the expectation value of the product of the operators in A times the expectation value of the product of the operators in B. Consequently, sufficiently separated regions behave independently.
If A1, ..., An are n operators each localized in a bounded region and U(a) represents the unitary operator actively translating the Hilbert space by the vector a, then picking some subset of the n operators to translate gives

    lim_{|a| → ∞} ⟨Ω| A_1 ⋯ A_j U(a) A_{j+1} ⋯ A_n U(a)^{-1} |Ω⟩ = ⟨Ω| A_1 ⋯ A_j |Ω⟩ ⟨Ω| A_{j+1} ⋯ A_n |Ω⟩,

where |Ω⟩ is the vacuum state, provided a is a spacelike vector.
Expressed in terms of connected correlation functions, the theorem means that if some of the arguments of a connected correlation function are shifted to large spacelike separation from the rest, the function goes to zero.
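As a toy illustration (ordinary probability, not an actual quantum field theory), operators in far-separated regions can be modeled as statistically independent random variables; their connected correlator ⟨AB⟩ − ⟨A⟩⟨B⟩ then vanishes up to sampling noise, mirroring how connected correlation functions go to zero at large spacelike separation. The sample size and the Gaussian parameters below are arbitrary choices for the sketch.

```python
import random

random.seed(0)

# Model "operators" in two far-separated regions as independent
# Gaussian random variables A and B (toy stand-ins, not field operators).
N = 100_000
A = [random.gauss(1.0, 0.5) for _ in range(N)]
B = [random.gauss(-2.0, 0.3) for _ in range(N)]

mean_A = sum(A) / N
mean_B = sum(B) / N
mean_AB = sum(a * b for a, b in zip(A, B)) / N

# The connected correlator <AB> - <A><B> vanishes for independent
# variables, up to Monte Carlo noise of order 1/sqrt(N).
connected = mean_AB - mean_A * mean_B
print(connected)
```

The full correlator ⟨AB⟩ itself is far from zero here (it is close to ⟨A⟩⟨B⟩ = −2); it is only the connected part that clusters to zero.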
This theorem only holds if the vacuum is a pure state. If the vacuum is degenerate, so that the physical state is a mixture of vacua, the cluster decomposition theorem fails.
If the theory has a mass gap m > 0, then there is a value a0 such that for |a| > a0 the connected correlation function is bounded in absolute value by c e^(−m|a|), where c is some coefficient and |a| is the length of the vector a.
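This exponential decay can be checked in a minimal sketch, assuming a free Euclidean field with mass gap m on a one-dimensional periodic lattice, whose two-point function has the Fourier representation G(x) = (1/N) Σ_k cos(kx) / (m² + 2 − 2 cos k). On such a lattice the decay rate is μ = arccosh(1 + m²/2), which approaches m in the continuum limit; the lattice size N and the mass m below are arbitrary choices for the illustration.

```python
import math

def lattice_propagator(x, m, N=4096):
    """Two-point function of a free massive field on a 1D periodic
    lattice of N sites, from its discrete Fourier representation."""
    total = 0.0
    for n in range(N):
        k = 2 * math.pi * n / N
        total += math.cos(k * x) / (m * m + 2 - 2 * math.cos(k))
    return total / N

m = 0.5
# Lattice decay rate implied by the pole of 1/(m^2 + 2 - 2 cos k):
# cosh(mu) = 1 + m^2/2, so mu -> m as the lattice spacing shrinks.
mu = math.acosh(1 + m * m / 2)

# The ratio G(x+1)/G(x) approaches e^{-mu} at large separation,
# i.e. the correlator decays exponentially, as the mass-gap bound predicts.
r = lattice_propagator(21, m) / lattice_propagator(20, m)
print(r, math.exp(-mu))
```

The exponential bound c e^(−m|a|) of the theorem is the continuum statement of this behavior; the lattice rate μ differs from m only by discretization effects.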