Beam divergence
The beam divergence of an electromagnetic beam is an angular measure of the increase in beam diameter with distance from the aperture from which the beam emerges; the diameter is measured in any plane that intersects the beam axis.
Beam divergence is usually used to characterize electromagnetic beams in the optical regime, that is, cases in which the aperture from which the beam emerges is very large compared with the wavelength.
Beam divergence usually refers to a beam of circular cross section, but this need not be the case. A beam may, for example, have an elliptical cross section, in which case the orientation of the beam divergence must be specified, for example with respect to the major or minor axis of the ellipse.
To calculate the divergence of a beam, only three measurements are needed: the beam diameters Di and Df at two separate points along the beam, and the distance l between those points. The increase in diameter, Df − Di, divided by twice the separation gives the tangent of the half-angle divergence, so the full-angle divergence is:
\[
\Theta = 2\arctan\!\left(\frac{D_f - D_i}{2l}\right)
\]
For the small angles typical of optical beams, \(\arctan x \approx x\), and this reduces to \(\Theta \approx (D_f - D_i)/l\), the simple ratio of the diameter increase to the separation.
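As a numerical check, the calculation can be written in a few lines. The following is a minimal Python sketch; the function name and the sample values are illustrative and not taken from the article:

```python
import math

def beam_divergence(d_i: float, d_f: float, distance: float) -> float:
    """Full-angle beam divergence in radians, given the beam diameters
    at two points and the distance between them (all in the same units)."""
    return 2 * math.atan((d_f - d_i) / (2 * distance))

# Example: the diameter grows from 2 mm to 4 mm over 1 m (1000 mm).
theta = beam_divergence(2.0, 4.0, 1000.0)
print(f"{theta * 1e3:.3f} mrad")  # ~2.000 mrad, matching (D_f - D_i)/l
```

The printed value agrees with the small-angle approximation (2 mm growth over 1000 mm is 2 mrad), confirming that the exact arctangent form and the simple ratio coincide for small divergences.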
Like all electromagnetic beams, lasers are subject to divergence, which is measured in milliradians (mrad) or degrees. For many applications, a lower-divergence beam is preferable. Neglecting divergence due to poor beam quality, the divergence of a laser beam is proportional to its wavelength and inversely proportional to the diameter of the beam at its narrowest point. For example, an ultraviolet laser emitting at 308 nm will have lower divergence than an infrared laser at 808 nm if both have the same minimum beam diameter.
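For an ideal diffraction-limited Gaussian beam (the "neglecting poor beam quality" case above), the full-angle divergence is \(\Theta = 4\lambda/(\pi D)\), where \(D\) is the waist diameter. The sketch below, with an assumed 1 mm waist for illustration, reproduces the 308 nm versus 808 nm comparison:

```python
import math

def diffraction_limited_divergence(wavelength: float, waist_diameter: float) -> float:
    """Full-angle divergence (radians) of a diffraction-limited Gaussian
    beam: theta = 4 * lambda / (pi * D)."""
    return 4 * wavelength / (math.pi * waist_diameter)

# Same assumed 1 mm waist diameter, two wavelengths (all lengths in metres):
for wl in (308e-9, 808e-9):
    theta = diffraction_limited_divergence(wl, 1e-3)
    print(f"{wl * 1e9:.0f} nm -> {theta * 1e3:.3f} mrad")
# 308 nm -> 0.392 mrad; 808 nm -> 1.029 mrad
```

As the formula predicts, the divergence scales linearly with wavelength: the 808 nm beam diverges roughly 2.6 times faster than the 308 nm beam at the same waist diameter.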