A micrometer (pronounced /maɪˈkrɒmɪtər/, US dict: mī·krŏm′·ĭ·tər), sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw, used widely for the precise measurement of small distances in mechanical engineering and machining, as well as in most mechanical trades, alongside other metrological instruments such as dial, vernier, and digital calipers. Micrometers are often, but not always, in the form of calipers.
Colloquially, the word micrometer is often shortened to mike (IPA: /maɪk/; enPR: mīk).
The image shows three common types of micrometer; the names are based on their application:
- Outside micrometer (also called a micrometer caliper), used to measure wires, spheres, shafts, and blocks.
- Inside micrometer, used to measure the diameter of holes.
- Depth micrometer, used to measure the depths of slots and steps.
Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular measuring tasks. For example, the anvil may be shaped as a segment of screw thread, as a v-block, or as a large disc, among other forms.
Universal micrometer sets come with interchangeable anvils: flat, spherical, spline, disk, blade, point, knife-edge, etc. The term universal micrometer may also refer to a type of micrometer whose frame has modular components, allowing one micrometer to function as an outside mic, depth mic, step mic, etc. (often known by the brand names Mul-T-Anvil and Uni-Mike).
Blade mics have a matching set of narrow tips (blades). They allow, for example, the measuring of a narrow o-ring groove.
Pitch-diameter mics have a matching set of thread-shaped tips for measuring the pitch diameter of screw threads.
Limit mics have two anvils and two spindles, and are used like a snap gauge. The part being checked must pass through the first gap and must stop at the second gap in order to be within specification.
Micrometer stops are essentially inside mics that are mounted on the table of a manual milling machine or other machine tool, in place of simple stops. They help the operator to position the table precisely.
Micrometers use the principle of a screw to amplify small distances (too small to measure directly) into large rotations of the screw that are big enough to read from a scale. The accuracy of a micrometer derives from the accuracy of the threadform at its heart. The basic operating principles of a micrometer are as follows:
- The amount of rotation of an accurately made screw can be directly and precisely correlated to a certain amount of axial movement (and vice versa), through the constant known as the screw's lead: the distance the screw moves forward axially with one complete turn (360°).
- With an appropriate lead and major diameter, a given amount of axial movement is amplified into a much larger circumferential movement.
For example, if the lead of a screw is 1 mm, but the major diameter (here, outer diameter) is 10 mm, then the circumference of the screw is 10π, or about 31.4 mm. Therefore, an axial movement of 1 mm is amplified (magnified) to a circumferential movement of 31.4 mm. This amplification allows a small difference in the sizes of two similar measured objects to correlate to a larger difference in the position of a micrometer's thimble.
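To make the arithmetic concrete, here is a minimal sketch in Python (the function name and units are illustrative, not part of any micrometer specification):

```python
import math

def amplification_ratio(lead_mm: float, major_diameter_mm: float) -> float:
    """Ratio of circumferential thimble travel to axial spindle travel per turn."""
    circumference = math.pi * major_diameter_mm  # distance a point on the screw's surface travels in one turn
    return circumference / lead_mm               # the spindle advances exactly one lead per turn

# The example from the text: 1 mm lead, 10 mm major diameter
print(round(amplification_ratio(1.0, 10.0), 1))  # 31.4, i.e. 1 mm of travel becomes 31.4 mm of arc
```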
In older micrometers the position of the thimble is read directly from scale markings on the thimble and shaft. A vernier scale is usually included, which allows the position to be read to a fraction of the smallest scale mark. In newer digital micrometers, an electronic readout displays the length on an LCD.
A micrometer is composed of:
- a frame, the C-shaped body that holds the anvil and barrel in fixed relation to each other;
- an anvil, the stationary face against which the workpiece rests;
- a spindle, the moving face driven by the screw;
- a sleeve (or barrel), carrying the main scale;
- a thimble, carrying the rotating scale;
- and, on many models, a lock and a ratchet or friction stop.
The spindle of an inch-system micrometer has 40 threads per inch, so that one turn moves the spindle axially 0.025 inch (1 ÷ 40 = 0.025), equal to the distance between two graduations on the frame. The 25 graduations on the thimble allow the 0.025 inch to be further divided, so that turning the thimble through one division moves the spindle axially 0.001 inch (0.025 ÷ 25 = 0.001). Thus, the reading is given by the number of whole divisions that are visible on the scale of the frame, multiplied by 25 (the number of thousandths of an inch that each division represents), plus the number of that division on the thimble which coincides with the axial zero line on the frame. The result will be the diameter expressed in thousandths of an inch. As the numbers 1, 2, 3, etc., appear below every fourth sub-division on the frame, indicating hundreds of thousandths, the reading can easily be taken mentally.
Suppose the thimble were screwed out so that graduation 2, and three additional sub-divisions, were visible (as shown in the image), and that graduation 1 on the thimble coincided with the axial line on the frame. The reading then would be 0.200 + 0.075 + 0.001 = 0.276 inch.
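This calculation can be expressed as a short sketch (a hypothetical helper, assuming the 40-threads-per-inch, 25-division layout described above):

```python
def inch_reading(frame_divisions: int, thimble_division: int) -> float:
    """Inch-micrometer reading: each frame division is 0.025" and each
    thimble division is 0.001"."""
    return frame_divisions * 0.025 + thimble_division * 0.001

# Worked example: graduation 2 (8 divisions) plus 3 subdivisions = 11 divisions,
# with thimble graduation 1 on the axial line
print(round(inch_reading(11, 1), 3))  # 0.276
```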
The spindle of an ordinary metric micrometer has 2 threads per millimetre, and thus one complete revolution moves the spindle through a distance of 0.5 millimetre. The longitudinal line on the frame is graduated with 1 millimetre divisions and 0.5 millimetre subdivisions. The thimble has 50 graduations, each being 0.01 millimetre (one-hundredth of a millimetre). Thus, the reading is given by the number of millimetre divisions (including any exposed 0.5 millimetre subdivision) visible on the scale of the sleeve, plus the particular division on the thimble which coincides with the axial line on the sleeve.
Suppose that the thimble were screwed out so that graduation 5, and one additional 0.5 subdivision were visible (as shown in the image), and that graduation 28 on the thimble coincided with the axial line on the sleeve. The reading then would be 5.00 + 0.5 + 0.28 = 5.78 mm.
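The same reading in sketch form (a hypothetical helper, assuming the 0.5 mm pitch and 50-division thimble described above):

```python
def metric_reading(mm_divisions: int, half_mm_visible: bool, thimble_division: int) -> float:
    """Metric-micrometer reading in millimetres: whole-millimetre sleeve
    divisions, an optional 0.5 mm subdivision, and 0.01 mm thimble divisions."""
    return mm_divisions + (0.5 if half_mm_visible else 0.0) + thimble_division * 0.01

# Worked example: 5 mm plus one 0.5 mm subdivision, thimble at 28
print(round(metric_reading(5, True, 28), 2))  # 5.78
```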
Some micrometers are provided with a vernier scale on the sleeve in addition to the regular graduations. These permit measurements within 0.001 millimetre to be made on metric micrometers, or 0.0001 inches on inch-system micrometers.
The additional digit of these micrometers is obtained by finding the line on the sleeve vernier scale which exactly coincides with one on the thimble. The number of this coinciding vernier line represents the additional digit.
Thus, the reading for metric micrometers of this type is the number of whole millimetres (if any), plus the number of hundredths of a millimetre, as with an ordinary micrometer, plus the number of thousandths of a millimetre given by the coinciding vernier line on the sleeve vernier scale.
For example, a measurement of 5.783 millimetres would be obtained by reading 5.5 millimetres on the sleeve, and then adding 0.28 millimetre as determined by the thimble. The vernier would then be used to read the 0.003 (as shown in the image).
Inch micrometers are read in a similar fashion.
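Extending the earlier metric sketch, the vernier digit simply adds thousandths of a millimetre (the helper and its argument names are illustrative):

```python
def metric_vernier_reading(mm_divisions: int, half_mm_visible: bool,
                           thimble_division: int, vernier_line: int) -> float:
    """Metric vernier-micrometer reading: the coinciding sleeve vernier
    line (0-9) contributes the thousandths digit."""
    base = mm_divisions + (0.5 if half_mm_visible else 0.0) + thimble_division * 0.01
    return base + vernier_line * 0.001

# Worked example: 5.5 mm on the sleeve, thimble at 28, vernier line 3
print(round(metric_vernier_reading(5, True, 28, 3), 3))  # 5.783
```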
Note: 0.01 millimetre = 0.000393 inch, and 0.002 millimetre = 0.000078 inch (78 millionths); or alternatively, 0.0001 inch = 0.00254 millimetres. Therefore, metric micrometers provide smaller measuring increments than comparable inch-unit micrometers: the smallest graduation of an ordinary inch-reading micrometer is 0.001 inch, while the vernier type has graduations down to 0.0001 inch (0.00254 mm). When using either a metric or inch micrometer without a vernier, smaller readings than those graduated may of course be obtained by visual interpolation between graduations.
An additional feature of many micrometers is the inclusion of a torque-limiting device on the thimble—either a spring-loaded ratchet or a friction sleeve. Without this device, workers may overtighten the micrometer on the work, causing the mechanical advantage of the screw to squeeze the material or tighten the screw threads, giving an inaccurate measurement. However, with a thimble that will ratchet or friction slip at a certain torque, the micrometer will not continue to advance once sufficient resistance is encountered. This results in greater accuracy and repeatability of measurements—most especially for low-skilled or semi-skilled workers, who may not have developed the light, consistent touch of a skilled user.
A standard one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of ±0.0001 inch.[1] Both the measuring instrument and the object being measured should be at room temperature for an accurate measurement; dirt, abuse, and operator skill are the main sources of error.[2]
The accuracy of micrometers is checked by using them to measure gauge blocks, rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.7500" (±0.00005"), then the micrometer should measure it as 0.7500". If the micrometer measures 0.7516", then it is out of calibration.
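A minimal sketch of this acceptance check (the tolerance shown is the rated accuracy quoted above, not a value from any particular calibration standard):

```python
def out_of_calibration(measured: float, nominal: float, tolerance: float = 0.0001) -> bool:
    """True if a measurement of a known standard deviates by more than the tolerance."""
    return abs(measured - nominal) > tolerance

# The example from the text: a 0.7500" gauge block measured as 0.7516"
print(out_of_calibration(0.7516, 0.7500))  # True: the micrometer is out of calibration
```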
The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard, such as are maintained in measurement standards laboratories.
The word micrometer is a neoclassical coinage from Greek micros, "small", and metron, "measure". The Merriam-Webster Collegiate Dictionary[3] says that English got it from French and that its first known appearance in English writing was in 1670. Neither the metre nor the micrometre nor the micrometer (device) as we know them today existed at that time. However, humans of that time did have much need for, and interest in, the ability to measure small things, and small differences; the word no doubt was coined in reference to this endeavor, even if it did not refer specifically to its present-day senses.
The first micrometric screw was invented by William Gascoigne in the 17th century, as an enhancement of the vernier; it was used in a telescope to measure angular distances between stars. Its adaptation for the precise measurement of handheld objects was made by Jean Laurent Palmer of Paris in 1848;[4] the device is therefore often called palmer in French, and tornillo de Palmer ("Palmer screw") in Spanish. (Those languages also use the micrometer cognates: micromètre, micrómetro.) The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867,[5] allowing the instrument's use to spread into the average machine shop. Brown & Sharpe were inspired by several earlier devices, one of them being Palmer's design. In 1888, Edward Williams Morley added to the precision of micrometric measurements and proved their accuracy in a complex series of experiments.