Seismic moment
From Wikipedia, the free encyclopedia
Seismic moment is a quantity used by earthquake seismologists to measure the size of an earthquake. The scalar seismic moment M0 is defined by the equation M0 = μAu, where
- μ is the shear modulus of the rocks involved in the earthquake, typically 30 gigapascals[1]
- A is the area of the rupture along the geologic fault where the earthquake occurred, and
- u is the average displacement (slip) on A.
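The defining equation M0 = μAu can be sketched as a short calculation. The function name and the example fault dimensions (a 1000 km² rupture with 1 m of average slip) are hypothetical illustrations, not values from the article; only the 30 GPa shear modulus comes from the text above.

```python
# Scalar seismic moment: M0 = mu * A * u  (result in newton-metres, N·m)
def seismic_moment(shear_modulus_pa: float, rupture_area_m2: float,
                   average_slip_m: float) -> float:
    """M0 = mu * A * u, with SI inputs."""
    return shear_modulus_pa * rupture_area_m2 * average_slip_m

# Hypothetical example: mu = 30 GPa (typical crustal value from the text),
# rupture area = 1000 km^2 = 1e9 m^2, average slip = 1 m.
m0 = seismic_moment(30e9, 1e9, 1.0)
print(f"M0 = {m0:.2e} N·m")  # → M0 = 3.00e+19 N·m
```

Because the three factors simply multiply, doubling any one of them doubles the moment, which is why large rupture areas dominate the moments of great earthquakes.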
The seismic moment of an earthquake is typically estimated from whatever information is available to constrain its factors. For modern earthquakes, moment is usually estimated from recordings of ground motion known as seismograms. For earthquakes that occurred before modern instruments were available, moment may be estimated from geologic estimates of the size of the fault rupture and the displacement.
Seismic moment is the basis of the moment magnitude scale introduced by Hiroo Kanamori, which is often used to compare the size of different earthquakes and is particularly useful for comparing the sizes of very large (great) earthquakes.
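The article does not state the conversion from moment to magnitude; a minimal sketch follows, assuming the standard Hanks–Kanamori relation Mw = (2/3)(log10 M0 − 9.1) with M0 in newton-metres. The constants are the commonly published values, not taken from this article:

```python
import math

def moment_magnitude(m0_newton_metres: float) -> float:
    """Moment magnitude Mw from scalar seismic moment M0 (N·m),
    using the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# A tenfold increase in M0 raises Mw by exactly 2/3 of a magnitude unit:
print(round(moment_magnitude(3.0e19), 2))  # M0 = 3e19 N·m  → Mw ≈ 6.92
print(round(moment_magnitude(3.0e20), 2))  # ten times larger → Mw ≈ 7.58
```

The logarithmic form is what makes the scale suitable for great earthquakes: moments spanning many orders of magnitude map onto a compact range of Mw values.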
- See also: Richter magnitude scale
References
- Aki, Keiiti; Richards, Paul G. (2002). Quantitative Seismology, 2nd ed. University Science Books. ISBN 0-935702-96-2.
- Fowler, C. M. R. (1990). The solid earth. Cambridge, UK: Cambridge University Press. ISBN 0-521-38590-3.