[time-nuts] Visual clock comparison

David J Taylor david-taylor at blueyonder.co.uk
Mon Apr 20 06:12:06 UTC 2015


From: Chris Albertson

I think the question really was "How close must two visual clock
displays be to be perceived as being exactly in sync?".  Some people
(but not me) can see a 1/10-second difference, and to me a one-second
difference is obvious.  The answer is likely between 1.0 and 0.1
seconds.  But if you add a "tick" every second, then a 1/10-second
difference is very easy to hear, whereas most people can't hear a 1/40-second
difference.
[]
=========================================

For HDTV, the BBC report that an offset of 20 milliseconds between sound and
vision can be detected by most viewers, and that they have achieved 2
milliseconds in tests.

  https://tech.ebu.ch/docs/techreview/trev_2009-Q1_HD-Audio-Delays.pdf
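As a rough summary of the thresholds mentioned in this thread, here is a small sketch that maps a given clock offset to how noticeable it would likely be.  The threshold values (visual ~0.1-1.0 s, audible tick ~1/40 s, HDTV lip-sync ~20 ms) are taken from the discussion above; the function name and category labels are my own.

```python
# Sketch only: classify an offset between two clock displays against the
# perception thresholds discussed in this thread.  The numbers come from
# the posts above; the function and labels are illustrative, not standard.

def perceptibility(offset_s: float) -> str:
    """Return a rough label for how noticeable an offset (in seconds) is."""
    offset = abs(offset_s)
    if offset >= 1.0:
        return "obvious visually to almost everyone"
    if offset >= 0.1:
        return "visible to some observers; easy to hear with a 1 Hz tick"
    if offset >= 1.0 / 40.0:   # ~25 ms: audible limit for most people
        return "audible to most with a tick, hard to see"
    if offset >= 0.020:        # BBC HDTV lip-sync detection figure
        return "at the audio/video sync detection limit"
    return "below common perception thresholds"

print(perceptibility(0.5))
print(perceptibility(0.005))
```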

Cheers,
David
-- 
SatSignal Software - Quality software written to your requirements
Web: http://www.satsignal.eu
Email: david-taylor at blueyonder.co.uk 



