Gutenberg–Richter law

In seismology, the Gutenberg–Richter law[1] expresses the relationship between the magnitude and total number of earthquakes in any given region and time period.

\log_{10} N = A - bM

or

N = 10^{A - bM}

Where:

  • N is the number of events having a magnitude greater than or equal to M
  • M is magnitude
  • A and b are constants
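
A brief numerical sketch may help. The following Python snippet is not part of the original article, and the values A = 5.0 and b = 1.0 are purely illustrative rather than fitted to any real catalogue; it simply evaluates N = 10^(A − bM) for a few magnitudes:

    # Sketch: evaluating the Gutenberg–Richter relation N = 10**(A - b*M).
    # A and b are region-specific constants; these values are illustrative only.
    def gutenberg_richter(M, A=5.0, b=1.0):
        """Expected number of events with magnitude >= M."""
        return 10.0 ** (A - b * M)

    for M in (2.0, 3.0, 4.0, 5.0):
        print(f"M >= {M}: N = {gutenberg_richter(M):,.0f}")

With b = 1.0, each unit increase in magnitude reduces N by a factor of ten, which is the 10:1 ratio discussed below.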

The relationship was first proposed by Beno Gutenberg and Charles Francis Richter. It is surprisingly robust and does not vary significantly from region to region or over time.

The constant b is typically close to 1.0. This means that for every magnitude 4.0 event there are about 10 earthquakes of magnitude 3.0 and 100 of magnitude 2.0. A notable exception occurs during earthquake swarms, when the b-value can rise to as high as 2.5, indicating an unusually large proportion of small earthquakes relative to large ones. A b-value significantly different from 1.0 may instead suggest a problem with the data set, e.g. that it is incomplete or contains errors in the calculated magnitudes. The "roll off" of the b-value at low magnitudes is an indicator of the completeness of the data set at that end of the magnitude range.
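
In practice, b is estimated from an earthquake catalogue. One common choice, not described in this article, is the maximum-likelihood estimator of Aki (1965), b = log10(e) / (M̄ − Mc), where M̄ is the mean magnitude of events at or above the magnitude of completeness Mc, i.e. the level below which the catalogue "rolls off". A minimal Python sketch, assuming a plain list of magnitudes and a user-supplied Mc:

    import math

    def estimate_b_value(magnitudes, completeness_mag):
        """Aki (1965) maximum-likelihood b-value estimate:
        b = log10(e) / (mean(M) - Mc), using only events with M >= Mc."""
        usable = [m for m in magnitudes if m >= completeness_mag]
        if len(usable) < 2:
            raise ValueError("too few events at or above the completeness magnitude")
        mean_mag = sum(usable) / len(usable)
        return math.log10(math.e) / (mean_mag - completeness_mag)

    # Hypothetical magnitudes, for illustration only:
    print(estimate_b_value([2.0, 2.1, 2.3, 2.2, 2.6, 3.0, 2.4, 3.4, 2.1, 2.8],
                           completeness_mag=2.0))

For this made-up sample the estimate comes out near 0.9, i.e. close to the typical value of 1.0 noted above.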

The constant A (the a-value) is of less scientific interest; it simply indicates the total seismicity rate of the region.

Modern attempts to understand the law involve theories of self-organized criticality or self-similarity.

  1. ^ B. Gutenberg and C.F. Richter, Seismicity of the Earth and Associated Phenomena, 2nd ed. (Princeton, N.J.: Princeton University Press, 1954).