Vertical section of the Cavendish balance
In 1798, the English physicist and chemist Henry Cavendish was the first to measure Newton's universal gravitational constant (G), using a spectacularly ingenious method that has scarcely been improved upon since. The method was devised by John Michell, who died before he could carry it out, so Cavendish performed the experiment. In fact, his objective was not to measure the constant, but the mass of the Earth; the value of the constant, however, could be inferred from the result.
Cavendish’s instrument was a torsion balance from which two identical lead balls hung. Next to these balls, one on each side, hung two much larger lead spheres, 175 kg each, which attracted the first two, producing a slight twist of the balance. Cavendish observed this twist by means of a small telescope located outside the enclosure, to avoid observer interference. He thus detected a displacement of about 4 mm, which he measured with a precision of ¼ mm. This allowed him to calculate that the density of the Earth is 5.448 times that of water, from which it is possible to deduce the mass of the Earth and the value of G:
G = 6.674×10⁻¹¹ N·m²/kg²
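As a back-of-the-envelope check (my own sketch, not Cavendish's calculation), this deduction can be reproduced numerically: the density gives the Earth's mass through its volume, and Newton's law applied at the surface then gives G. The radius and surface gravity below are assumed modern values.

```python
import math

# Assumed modern values (not Cavendish's own data):
R = 6.371e6      # mean radius of the Earth, in m
g = 9.81         # surface gravity, in m/s^2
rho = 5.448e3    # Cavendish's density result, in kg/m^3 (5.448 x water)

# Mass of the Earth from its mean density and volume
M = rho * (4 / 3) * math.pi * R**3

# Newton's law at the surface: g = G*M/R^2  =>  G = g*R^2/M
G = g * R**2 / M

print(f"M = {M:.3e} kg")            # ~5.90e24 kg
print(f"G = {G:.3e} N m^2/kg^2")    # ~6.75e-11, about 1% high, because
                                    # the modern density figure is 5.514
```

The result comes out roughly 1% above the official value, consistent with the 1% uncertainty of Cavendish's experiment mentioned below.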
This is the official value, which is known with rather low accuracy (about 1 part in 10,000) compared with other universal constants.
Cavendish’s experiment is still being used to measure the universal gravitational constant. Two recent experiments performed in China by a team directed by Luo Jun, using steel balls and vacuum chambers to prevent interference, have obtained the following results:
G = 6.674184×10⁻¹¹ ± 11.64 ppm
G = 6.674484×10⁻¹¹ ± 11.61 ppm
where ppm means parts per million. The uncertainty of these two results is the lowest obtained so far in measurements of G.
The value of G accepted
previously, based on experiments carried out during the last 40 years, is the
following:
G = 6.67408×10⁻¹¹ ± 47 ppm
The two values obtained in the new experiments are therefore slightly above the generally accepted value, but have a much smaller uncertainty (about four times lower). The minimum uncertainty previously obtained among all the measurements taken (there have been many) was 13.7 ppm, slightly worse than that of the new experiments. By comparison, the uncertainty of Cavendish's original experiment was 1%.
To understand the meaning of these numbers, we must recall three different statistical concepts used in measurements:
• Accuracy: the distance between the measured value and the true value.
• Precision: the ability of an instrument to give the same result in repeated measurements.
• Uncertainty: applied to an instrument, we speak of calibration uncertainty; applied to a specific measurement, it quantifies the dispersion of the values obtained when the same experiment is performed several times.
Accuracy and precision of a measurement
Note that measurements can be very precise but not very accurate, and vice versa.
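As a toy illustration (hypothetical numbers, not real data), the following sketch simulates two instruments measuring a quantity whose true value is 10: instrument A is precise but inaccurate (a systematic bias with little noise), while instrument B is accurate but imprecise (no bias, large noise).

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 10.0

# Instrument A: precise but inaccurate (systematic bias, little noise)
a = [TRUE_VALUE + 0.5 + random.gauss(0, 0.01) for _ in range(1000)]
# Instrument B: accurate but imprecise (no bias, large noise)
b = [TRUE_VALUE + random.gauss(0, 0.5) for _ in range(1000)]

for name, data in [("A (precise, inaccurate)", a),
                   ("B (accurate, imprecise)", b)]:
    mean = statistics.mean(data)
    spread = statistics.stdev(data)
    print(f"{name}: mean = {mean:.3f} "
          f"(offset {mean - TRUE_VALUE:+.3f}), spread = {spread:.3f}")
```

Instrument A returns almost the same (wrong) number every time; instrument B scatters widely around the right one.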
This mismatch explains why the different measurements made of this constant, including the last two made by the Chinese team, do not agree with one another, in the sense that, if we convert them into intervals, they don't overlap. Thus, the two new measurements, converted to uncertainty intervals, would be:
(6.674106, 6.674262) and (6.674407, 6.674561)
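Converting a ±ppm uncertainty into an interval is simple arithmetic: the half-width is the central value times the ppm figure times 10⁻⁶. This sketch reproduces the two intervals above and checks them against the interval of the previously accepted value:

```python
def interval(value, ppm):
    """Convert a central value and a +/- ppm uncertainty into an interval."""
    delta = value * ppm * 1e-6
    return (value - delta, value + delta)

# Central values in units of 1e-11 N m^2/kg^2
new_1 = interval(6.674184, 11.64)   # first Chinese result
new_2 = interval(6.674484, 11.61)   # second Chinese result
accepted = interval(6.67408, 47)    # previously accepted value

for name, (low, high) in [("new 1", new_1), ("new 2", new_2),
                          ("accepted", accepted)]:
    print(f"{name}: ({low:.6f}, {high:.6f})")

# Overlap check: new 1 lies inside the accepted interval, new 2 does not
for name, (low, high) in [("new 1", new_1), ("new 2", new_2)]:
    inside = accepted[0] <= low and high <= accepted[1]
    print(f"{name} inside accepted interval: {inside}")
```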
It can be seen that both intervals are above the commonly accepted value, although one of them (the one with the smaller central value) is totally included in the most probable interval considered above, while the other lies outside and above that interval.
An
article in Science News shows an explanatory graph that makes it possible to
compare the results (accuracy and uncertainty) of the two new experiments with
those previously made. It can be seen that the dispersion of the results is
quite large, and that the intervals rarely overlap.
All this means that we must keep performing
experiments.
The same post in Spanish
Thematic thread on Standard Cosmology: Preceding Next
Manuel Alfonseca