[time-nuts] When did computer clocks get so bad?
Alec Teal
alec at unifiedmathematics.com
Wed Sep 29 18:35:19 UTC 2021
Hi there,
I have a question and I cannot think of anyone better to ask. For a
project we need to time some things which are connected to a computer,
using NTP and later a GPS receiver over Bluetooth serial ports, and in
doing so we have discovered that computer clocks are terrible.
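(If anyone wants to see how terrible on their own Linux box, here is a
rough sketch. It assumes Linux's CLOCK_MONOTONIC_RAW, which is the
hardware clock with no NTP correction applied, versus CLOCK_MONOTONIC,
which the kernel slews; comparing how far each advances over the same
interval shows the frequency correction the kernel has settled on,
which is roughly the raw clock's drift once NTP has converged.)

#include <stdio.h>
#include <time.h>
#include <unistd.h>

/* Rough sketch: compare the undisciplined hardware clock
 * (CLOCK_MONOTONIC_RAW) against the NTP-slewed CLOCK_MONOTONIC
 * over the same interval to see the correction being applied. */

static double ts_sec(struct timespec t)
{
    return t.tv_sec + t.tv_nsec / 1e9;
}

int main(void)
{
    struct timespec raw0, adj0, raw1, adj1;

    clock_gettime(CLOCK_MONOTONIC_RAW, &raw0);
    clock_gettime(CLOCK_MONOTONIC, &adj0);
    sleep(600); /* a longer interval gives a finer estimate */
    clock_gettime(CLOCK_MONOTONIC_RAW, &raw1);
    clock_gettime(CLOCK_MONOTONIC, &adj1);

    double raw = ts_sec(raw1) - ts_sec(raw0);
    double adj = ts_sec(adj1) - ts_sec(adj0);
    printf("kernel correction: %+.1f ppm\n", (adj / raw - 1.0) * 1e6);
    return 0;
}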
If you remove a linear drift (for example, assuming it ticks at 1.00026
seconds per second, i.e. 260 ppm fast, about 22 seconds per day) it gets
much less terrible, and Linux can do this. But the computer clock clearly
doesn't expose this coefficient to the OS to let it compensate; it must
be found empirically (e.g. through NTP). Any ideas why?
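(Linux does at least keep the coefficient once NTP has found it. A
minimal sketch to read it back, assuming glibc's adjtimex(2) wrapper on
Linux, where modes = 0 means read-only:

#include <stdio.h>
#include <sys/timex.h>

int main(void)
{
    struct timex tx = { 0 }; /* modes == 0: just read the state */

    if (adjtimex(&tx) == -1) {
        perror("adjtimex");
        return 1;
    }
    /* tx.freq holds the frequency correction in ppm scaled by
     * 2^16, so a value of 65536 means a 1 ppm slew. */
    printf("frequency correction: %.3f ppm\n", tx.freq / 65536.0);
    return 0;
}

On my understanding the daemon writes this via ADJ_FREQUENCY, but the
point stands: the hardware ships no calibration constant of its own.)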
But more concretely: my watch is actually pretty good. It's off by < 3
seconds and probably hasn't been set this year (I don't tend to bother
with the DST change, not for any reason, I just never get round to it).
When I was growing up, and even now, wall clocks were never so terrible
that I had to fix them as routinely as NTP fixes computers.
My theory is that super cheap, crappy quartz oscillators are now used in
things which can reasonably be expected to be online most of the time,
and thus use NTP; my watch cannot (and probably has temperature
compensation too, given the varied temperatures it is exposed to). Any
truth to this?
This is a very open-ended question, I understand, but if clocks were
always as terrible as those in every computer and device I've checked
recently, why don't I remember setting wall clocks at least once a week?
Alec