A Shack–Hartmann (or Hartmann–Shack) wavefront sensor (SHWFS) is an optical instrument used to characterize an imaging system. It is a wavefront sensor commonly used in adaptive optics systems. It consists of an array of lenses (called lenslets) of the same focal length, each focused onto a photon sensor (typically a CCD array or quad-cell). The local tilt of the wavefront across each lenslet can then be calculated from the position of the focal spot on the sensor. Any phase aberration can be approximated by a set of discrete tilts. By sampling the wavefront with an array of lenslets, all of these tilts can be measured and the whole wavefront approximated.
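As a minimal sketch of this principle (not a description of any particular instrument), the Python example below converts measured spot-centroid displacements into local wavefront slopes by dividing the physical displacement of each spot by the lenslet focal length. The lenslet count, focal length, and pixel pitch are hypothetical values chosen purely for illustration.

```python
import numpy as np

# Hypothetical parameters for illustration: a 10x10 lenslet array,
# lenslet focal length of 5 mm, detector pixel pitch of 10 microns.
N_LENSLETS = 10
FOCAL_LENGTH_M = 5e-3
PIXEL_PITCH_M = 10e-6

def local_slopes(spot_positions_px, reference_positions_px):
    """Convert measured spot centroids to local wavefront slopes.

    Each spot displacement (in pixels) relative to the reference
    (unaberrated) spot position is converted to a physical displacement
    and divided by the lenslet focal length, giving the average
    wavefront slope over that lenslet.
    """
    displacement_m = (spot_positions_px - reference_positions_px) * PIXEL_PITCH_M
    return displacement_m / FOCAL_LENGTH_M  # slope = dW/dx ~ displacement / f

# Example: simulate a uniform x-tilt of 1e-4 rad across the whole pupil.
reference = np.zeros((N_LENSLETS * N_LENSLETS, 2))       # reference centroids (px)
measured = reference.copy()
measured[:, 0] += 1e-4 * FOCAL_LENGTH_M / PIXEL_PITCH_M  # shift every x-centroid

slopes = local_slopes(measured, reference)
print(slopes.mean(axis=0))  # approximately [1e-4, 0.0]
```

In a full sensor the per-lenslet slopes would then be fed to a zonal or modal reconstructor to recover the wavefront itself.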
Since only tilts are measured, the Shack–Hartmann sensor cannot detect discontinuous steps in the wavefront.
The design of this sensor was based on an aperture array that had been developed in 1900 by Johannes Franz Hartmann as a means to trace individual rays of light through the optical system of a large telescope, thereby testing the quality of the image.[1] In the late 1960s, Roland Shack and Ben Platt modified the Hartmann screen by replacing the apertures in an opaque screen with an array of lenslets.[2][3] The terminology proposed by Shack and Platt was "Hartmann screen". The fundamental principle appears to have been documented even before Huygens by the Jesuit philosopher Christopher Scheiner in Austria.[4]
Shack–Hartmann sensors are used to characterize eyes for corneal treatment of complex refractive errors.[5][6] Recently, Pamplona et al.[7] developed an inverse of the Shack–Hartmann system to measure the aberrations of the eye's lens. While Shack–Hartmann sensors measure the localized slope of the wavefront error from spot displacement in the sensor plane, Pamplona et al. have the user shift the spots until they are aligned. Knowledge of this shift provides the data needed to estimate first-order parameters such as the radius of curvature, and hence the error due to defocus and spherical aberration.
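As a rough illustration of how a first-order parameter such as the radius of curvature can be recovered from slope data (this is only a sketch, not the actual procedure of Pamplona et al.), the example below fits a single spherical defocus term to a set of local slope measurements. It relies on the paraxial relation that a spherical wavefront W(x, y) = (x^2 + y^2) / (2R) has slopes dW/dx = x/R and dW/dy = y/R; the grid size, pupil extent, and noise level are assumed values.

```python
import numpy as np

def fit_defocus(x, y, slope_x, slope_y):
    """Least-squares fit of a spherical (defocus) wavefront to slope data.

    For W(x, y) = (x^2 + y^2) / (2 R), the local slopes are x / R and
    y / R.  Stacking both slope components gives a linear system in the
    single unknown c = 1 / R, solved here in closed form.
    """
    A = np.concatenate([x, y])            # single-column design "matrix"
    b = np.concatenate([slope_x, slope_y])
    c = np.dot(A, b) / np.dot(A, A)       # one-parameter least squares
    return 1.0 / c                        # radius of curvature

# Example: lenslet centres on a small grid, slopes generated from a
# wavefront with a 2 m radius of curvature plus a little noise.
rng = np.random.default_rng(0)
coords = np.linspace(-5e-3, 5e-3, 8)      # metres across the pupil
X, Y = np.meshgrid(coords, coords)
R_true = 2.0
sx = X.ravel() / R_true + rng.normal(0.0, 1e-6, X.size)
sy = Y.ravel() / R_true + rng.normal(0.0, 1e-6, Y.size)

print(fit_defocus(X.ravel(), Y.ravel(), sx, sy))  # approximately 2.0
```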
In September 2010, a research group at the MIT Media Lab demonstrated an application of the inverse Shack–Hartmann wavefront sensor coupled with a high-resolution mobile phone display to perform refractive eye exams on a mobile phone. This solution involves a display set at very close range to the human eye. The laser that is usually shone into the patient's eye is replaced by user interaction: the subject looks into the display and aligns patterns that pass through different visual regions, giving a measure of optical distortions as well as of myopia, hyperopia, and astigmatism.[8]