[time-nuts] Re: Derivation of time from celestial sight

Steve Allen sla at ucolick.org
Tue Dec 28 17:05:16 UTC 2021


On Tue 2021-12-28T14:40:43+0000 Poul-Henning Kamp hath writ:
> You want a bright star as close to your latitude's zenith as possible,
> to get maximum apparent transit velocity.

The most precise time determinations in history were made by meridian
telescopes, which had only one axis of motion, not counting the ability
to reverse the tube as a way of checking for systematic errors.
Stars near the zenith were best for the meridian telescopes.
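
To make the principle concrete: at upper transit the local apparent
sidereal time equals the star's right ascension, so the clock
correction is just RA minus the clock reading at the moment of transit.
A minimal sketch in Python, with the star and clock values invented
for illustration:

def clock_correction_hours(star_ra_hours: float, clock_at_transit_hours: float) -> float:
    """Return (RA - sidereal clock reading) wrapped into [-12 h, +12 h)."""
    dt = (star_ra_hours - clock_at_transit_hours) % 24.0
    return dt - 24.0 if dt >= 12.0 else dt

# Example: a star with RA 18h 36m 56s crosses the meridian when the
# sidereal clock reads 18h 36m 58.3s, so the clock is about 2.3 s fast.
ra = 18 + 36/60 + 56.0/3600
clock = 18 + 36/60 + 58.3/3600
print(f"clock correction: {clock_correction_hours(ra, clock)*3600:+.1f} s")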

The Danjon "Impersonal" Astrolabe with its prism and mercury bath
allowed selection of targets looking around the sky rather than
waiting for something near overhead.  The Danjon Astrolabes were
widely deployed as part of preparations for the International
Geophysical Year.  That allowed dozens of folks to take up
paid residence in funky places like Curacao for a year.
The data from the IGY then took a decade to reduce and publish.
By the time the optical astronomy measures of the IGY were
published, all of the techniques which had been used were obsolete.
(That was true of much of the rest of the IGY data because of
the advancement of technology during the late 1950s and early
1960s.)

The optical measurements suffered from the "personal equation", a
systematic offset reflecting how much each observer tended to be early
or late.  Many of the optical measurements were done by pulling a
trigger.  Some instruments had a screw to be turned to match the motion
of the star across the zenith, with the timing then measured from the
chart recording of the potentiometer.
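
For what it's worth, the correction for a personal equation is
conceptually simple: estimate each observer's mean earliness or
lateness against a reference series and subtract it.  A hedged sketch,
with the residuals invented for illustration:

from statistics import mean

# Offsets (seconds) of one observer's transit timings relative to a
# reference series; this observer tends to be about 0.2 s late.
observed_minus_reference = [0.21, 0.18, 0.25, 0.19, 0.22]
personal_equation = mean(observed_minus_reference)

def corrected(raw_timing_s: float) -> float:
    """Remove the observer's systematic lateness from a raw timing."""
    return raw_timing_s - personal_equation

print(f"personal equation: {personal_equation:+.2f} s")
print(f"raw 12.43 s -> corrected {corrected(12.43):.2f} s")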

Stars essentially at the zenith were required for Photographic Zenith
Tubes.  PZTs could not compare their results directly with anyone
else's because each one had a unique list of target stars, yet they
were the most accurate optical measure of earth rotation.

The evolution of the precision of time measured by optical
observatories was plotted by the Stoykos as they prepared to retire
from the BIH.

https://www.ucolick.org/~sla/leapsecs/BHsHn05p142.html

Note that the mean difference between the best observatories never
fell as low as a millisecond, and that was after the BIH had removed
the persistent systematic differences which resulted from the
observatories using conventional longitudes that were not
self-consistent.

Note that the random error fell to about 0.4 millisecond for the best
observatories.

The longitudes and latitudes of everything changed in 1968 as the new
recommendations for the terrestrial reference frame were implemented.
The USNO actually, and finally, removed the roughly 0.03 s offset to UT
that had been in all US measures of time since the inception of the
USNO.

Of course, for non-stationary observers, the 18th- through
20th-century method of determining the offset of the chronometer was to
use a sextant to shoot a lunar and determine the time from the angle
between the Moon and a star.
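
A rough sketch of the time-recovery step of a lunar, assuming the
sextant angle has already been "cleared" for refraction and parallax:
compare the geocentric Moon-star distance against almanac values
tabulated at 3-hour intervals of Greenwich time and interpolate.  The
table entries below are invented for illustration:

almanac = [            # (Greenwich time in hours, geocentric lunar distance in degrees)
    (12.0, 42.135),
    (15.0, 43.762),    # the Moon moves roughly half a degree per hour against the stars
    (18.0, 45.388),
]

def greenwich_time_from_distance(cleared_distance_deg: float) -> float:
    """Linearly interpolate Greenwich time from the cleared lunar distance."""
    for (t0, d0), (t1, d1) in zip(almanac, almanac[1:]):
        lo, hi = sorted((d0, d1))
        if lo <= cleared_distance_deg <= hi:
            return t0 + (t1 - t0) * (cleared_distance_deg - d0) / (d1 - d0)
    raise ValueError("distance outside tabulated range")

gmt = greenwich_time_from_distance(43.20)   # measured and cleared distance
chronometer_reading = 13.95                 # hours, what the chronometer shows
print(f"deduced Greenwich time: {gmt:.3f} h")
print(f"chronometer offset: {(chronometer_reading - gmt)*3600:+.0f} s")

Because the Moon moves only about half a degree per hour relative to
the stars, even a small error in the cleared distance turns into a
sizeable error in the deduced time, which is why lunars were so
demanding in practice.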

--
Steve Allen                    <sla at ucolick.org>              WGS-84 (GPS)
UCO/Lick Observatory--ISB 260  Natural Sciences II, Room 165  Lat  +36.99855
1156 High Street               Voice: +1 831 459 3046         Lng -122.06015
Santa Cruz, CA 95064           https://www.ucolick.org/~sla/  Hgt +250 m



