
1.0 Length

1.1 The Meter

At the turn of the 19th century there were two distinct major length systems. The metric unit
of length was the meter, originally defined as 1/10,000,000 of the great arc from the pole to the
equator through Paris. Data from a very precise measurement of part of that great arc were used
to define an artifact meter bar, which became the practical, and later legal, definition of the meter.
The English system of units was based on a yard bar, another artifact standard [6].

These artifact standards were used for over 150 years. The problem with an artifact standard for
length is that nearly all materials are slightly unstable and change length with time. For example,
repeated measurements showed that the British yard standard was slightly unstable. As a
consequence, the British inch (1/36 yard) shrank [7], as shown in Table 1.1.

Table 1.1. Measured length of the British inch

    Year    Inch (mm)
    1895    25.399978
    1922    25.399956
    1932    25.399950
    1947    25.399931

The first step toward replacing the artifact meter was taken by Albert Michelson, at the request of
the International Committee of Weights and Measures (CIPM). In 1892 Michelson measured the
meter in terms of the wavelength of red light emitted by cadmium. This wavelength was chosen
because it has high coherence, that is, it will form fringes over a reasonable distance. Despite the
work of Michelson, the artifact standard was kept until 1960 when the meter was finally
redefined in terms of the wavelength of light, specifically the red-orange light emitted by excited
krypton-86 gas.

Even as this definition was accepted, the newly invented helium-neon laser was beginning to be
used for interferometry. By the 1970s the wavelengths of a number of stabilized lasers were
considered much better sources of light than the krypton red-orange line for the definition of
the meter. Since there were several equally qualified candidates, the CIPM decided not to
adopt any particular wavelength, but to make a
change in the measurement hierarchy. The solution was to define the speed of light in vacuum as
exactly 299,792,458 m/s, and make length a derived unit. In theory, a meter can be produced by
anyone with an accurate clock [8].
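Because the speed of light is now exact by definition, the clock-based picture of the meter reduces to simple arithmetic; a minimal sketch:

```python
# Speed of light in vacuum, m/s, exact by definition (SI, 1983).
C = 299_792_458

# Time for light to travel exactly one meter, in nanoseconds.
t_ns = 1e9 / C
print(f"light travels 1 m in {t_ns:.4f} ns")  # about 3.3356 ns
```

An "accurate clock" for this purpose must therefore resolve times of a few nanoseconds, which is why the time-of-flight method is impractical for bench-scale work.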

In practice, the time-of-flight method is impractical for most measurements, and the meter is
measured using known wavelengths of light. The CIPM lists a number of laser and atomic
sources and recommended frequencies for the light. Given the defined speed of light, the
wavelength of the light can be calculated, and a meter can be generated by counting wavelengths
of the light. Methods for this measurement are discussed in the chapter on interferometry.
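The arithmetic behind this wavelength-counting scheme is direct: the vacuum wavelength of a recommended source follows from its frequency and the defined speed of light. In the sketch below, the frequency is approximately that of the iodine-stabilized helium-neon laser; treat the exact value as illustrative rather than a quoted CIPM figure.

```python
C = 299_792_458  # speed of light in vacuum, m/s (exact by definition)

def vacuum_wavelength_m(frequency_hz: float) -> float:
    """Vacuum wavelength corresponding to a given optical frequency."""
    return C / frequency_hz

# Iodine-stabilized HeNe laser, roughly 473.612 THz (illustrative value).
f_hene = 473.612e12
lam = vacuum_wavelength_m(f_hene)
print(f"wavelength: {lam * 1e9:.3f} nm")          # about 633 nm
print(f"wavelengths per meter: {1.0 / lam:,.0f}")  # roughly 1.58 million
```

Counting on the order of 1.6 million fringes per meter is exactly the job the interferometers of the later chapter perform.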

1.2 The Inch

In 1866, the United States Surveyor General decided to base all geodetic measurements on an
inch defined from the international meter. This inch was defined such that there were exactly
39.37 inches in the meter. England continued to use the yard bar to define the inch. These
different inches continued to coexist for nearly 100 years until quality control problems during
World War II showed that the various inches in use were too different for completely
interchangeable parts from the English speaking nations. Meetings were held in the 1950's and
in 1959 the directors of the national metrology laboratories of the United States, Canada,
England, Australia and South Africa agreed to define the inch as 25.4 millimeters, exactly [9].
This definition was a compromise: the English inch had been somewhat longer, and the U.S. inch
somewhat smaller. The old U.S. inch is still used for commercial land surveying in the form of the
"surveyor's foot," which is 12 old U.S. inches.

2.0 Gauge Blocks

2.1 A Short History of Gauge Blocks

By the end of the nineteenth century the idea of interchangeable parts begun by Eli Whitney had
been accepted by industrial nations as the model for industrial manufacturing. One of the
drawbacks to this new system was that in order to control the size of parts numerous gauges were
needed to check the parts and set the calibrations of measuring instruments. The number of
gauges needed for complex products, and the effort needed to make and maintain the gauges was
a significant expense. The major step toward simplifying this situation was made by C.E.
Johannson, a Swedish machinist.

Johannson's idea, first formulated in 1896 [10], was that a small set of gauges that could be
combined to form composite gauges could reduce the number of gauges needed in the shop. For
example, if four gauges of sizes 1 mm, 2 mm, 4 mm, and 8 mm could be combined in any
combination, all of the millimeter sizes from 1 mm to 15 mm could be made from only these four
gauges. Johannson found that if two opposite faces of a piece of steel were lapped very flat and
parallel, two blocks would stick together when they were slid together with a very small amount
of grease between them. The width of this "wringing" layer is about 25 nm, so small compared
with the tolerances needed at the time that block lengths could be added together with no
correction for the interface thickness. Eventually the wringing layer was defined as part of the
length of the block, allowing an unlimited number of wrings without correction for the size of
the wringing layer.
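The combinatorics of the four-block example above can be checked in a few lines; a sketch, assuming integer millimeter blocks and no wringing-layer correction:

```python
from itertools import combinations

blocks_mm = [1, 2, 4, 8]

# Every length obtainable by wringing together a nonempty subset of the blocks.
sizes = sorted({sum(subset)
                for r in range(1, len(blocks_mm) + 1)
                for subset in combinations(blocks_mm, r)})
print(sizes)  # every integer size from 1 mm to 15 mm
```

Because the block sizes are powers of two, the 15 subset sums are exactly the integers 1 through 15, which is the economy Johannson's scheme exploits: a modest set of blocks covers a dense range of gauge lengths.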

In the United States, the idea was enthusiastically adopted by Henry Ford, and from his example
the use of gauge blocks was eventually adopted as the primary transfer standard for length in
industry. By the beginning of World War I, the gauge block was already so important to industry
that the Federal Government had to take steps to ensure the availability of blocks. At the
outbreak of the war, the only supply of gauge blocks was from Europe, and this supply was
interrupted.

In 1917 inventor William Hoke came to NBS proposing a method to manufacture gauge blocks
equivalent to those of Johannson [11]. Funds were obtained from the Ordnance Department for
the project and 50 sets of 81 blocks each were made at NBS. These blocks were cylindrical and
had a hole in the center, the hole being the most prominent feature of the design. The current
generation of square cross-section blocks retains this hole, and such blocks are referred to as
"Hoke blocks."

2.2 Gauge Block Standards (U.S.)

There are two main American standards for gauge blocks, the Federal Specification GGG-G-15C
[12] and the American National Standard ANSI/ASME B89.1.9M [13]. There are very few
differences between these standards, the major ones being the organization of the material and
the listing of standard sets of blocks given in the GGG-G-15C specification. The material in the
ASME specification that is pertinent to a discussion of calibration is summarized below.
