Magnitude (astronomy)

In astronomy, magnitude is a logarithmic measure of the brightness of an object, measured in a specific wavelength range or passband, usually in the optical or near-infrared. The scale runs in reverse, so brighter objects have smaller magnitudes, and each step of one magnitude corresponds to a brightness ratio of about 2.512, the fifth root of 100; a 1st-magnitude star is therefore 100 times as bright as a 6th-magnitude star (the relation is written out after the list below). Two specific types of magnitude distinguished by astronomers are:

  • Apparent magnitude, the brightness of an object as it appears to an observer. For example, Alpha Centauri appears brighter than Betelgeuse, and so has a lower apparent magnitude, because it is much closer to the Earth.
  • Absolute magnitude, which measures the luminosity of an object (or the reflected light, for non-luminous objects such as asteroids); it is the object's apparent magnitude as it would appear from a standard reference distance, which for stars is 10 parsecs (32.6 light-years). Betelgeuse has a much lower (brighter) absolute magnitude than Alpha Centauri, because it is far more luminous.
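
The underlying relation, writing m_1 and m_2 for the magnitudes of two objects and F_1 and F_2 for their measured fluxes, is

\[
m_1 - m_2 = -2.5 \, \log_{10}\!\left(\frac{F_1}{F_2}\right)
\]

so a difference of 5 magnitudes corresponds to a flux ratio of exactly 100, and a difference of 1 magnitude to a ratio of 100^{1/5} ≈ 2.512.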

Usually only apparent magnitude is mentioned, because it can be measured directly; absolute magnitude can be derived from apparent magnitude and distance using the distance modulus.
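
Written explicitly, with m the apparent magnitude, M the absolute magnitude, and d the distance in parsecs, the distance modulus relation is

\[
m - M = 5 \, \log_{10}\!\left(\frac{d}{10\ \mathrm{pc}}\right)
\]

For example, a star at 100 pc appears 5 magnitudes fainter than it would at the 10 pc reference distance, so its absolute magnitude is M = m − 5.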