Discussion:
Does Australia have an Atomic Clock?
Alasdair Baxter
2004-01-04 19:37:41 UTC
I was looking for a radio-controlled watch which works all over the
world. Does anyone know if such a beast exists? According to my
information there are 4 standard time signals in the whole world and
these are UK, Germany, United States and Japan. I find it hard to
accept that developed countries like Australia, New Zealand, Canada
and Russia don't have an atomic clock which transmits a standard time
signal.

Can anyone enlighten me?
--

Alasdair Baxter, Nottingham, UK. Tel +44 115 9705100; Fax +44 115 9423263

"It's not what you say that matters but how you say it.
It's not what you do that matters but how you do it"
Kent Betts
2004-01-04 21:19:12 UTC
"Alasdair Baxter"
Post by Alasdair Baxter
I find it hard to
accept that developed countries like Australia... don't have an atomic
clock

Has anyone tried using the US time signal to keep a watch synched in
Australia? Seems like a 20 kW or 50 kW signal at 60 kHz should make it
down there at least a few days a week.
John Rowland
2004-01-04 21:42:02 UTC
Post by Alasdair Baxter
I was looking for a radio-controlled
watch which works all over the world.
Does anyone know if such a beast exists?
AFAIK the Casio GPS watch is the only such beast. I have worn it for a few
minutes, and IMO it's too big to be a useful watch, and too small to be a
useful GPS unit.
Post by Alasdair Baxter
According to my information there are 4 standard
time signals in the whole world and
these are UK, Germany, United States and Japan.
AFAIK there are five long-wave time signals suitable for watch reception
(there are 2 in different parts of Japan, on 40 kHz and 60 kHz).

There are approx 50 or 100 short wave transmitters, but I don't understand
much more about them. They are listed on the web somewhere, but I can't find
the details now. Someone here can probably explain why watches don't use
them.

Have you seen this? http://www.hy-gain.com/man/mfjpdf/MFJ-890.pdf . I don't
understand why this system can't be used to keep a watch on time anywhere in
the world, even though the watch would have to be manually initialised to
the right time and date after the battery was inserted. Maybe a watch using
this system could be comfortably small, unlike a GPS watch.
Post by Alasdair Baxter
I find it hard to accept that developed countries
like Australia, New Zealand, Canada and Russia
don't have an atomic clock which transmits
a standard time signal.
IIRC they all have short wave time signals.
--
John Rowland - Spamtrapped
Transport Plans for the London Area, updated 2001
http://www.geocities.com/Athens/Acropolis/7069/tpftla.html
A man's vehicle is a symbol of his manhood.
That's why my vehicle's the Piccadilly Line -
It's the size of a county and it comes every two and a half minutes
Jack Denver
2004-01-05 01:58:37 UTC
Almost all industrialized countries broadcast some sort of audible radio
time signal based on an atomic clock - "at the tone the time will be X,
beep". However, self-synchronizing radio-controlled watches require a
special digitally encoded low-frequency signal. This is a relatively new
technology - for example, the current US system (though run on an
experimental basis much earlier) has been a viable (high powered) national
signal only since 1999 (and still does not cover 100% of the country,
especially during daytime). So the radio controlled watch is a relatively
new thing and you'll have to be patient before such signals blanket the
whole globe and even more patient before a single watch receives all the
incompatible broadcasts. Some countries may never implement a system of
their own as they can receive another country's signal for free - for
example, during the night all of Mexico and Central America and probably 99%
of the population of Canada is within range of the WWVB signal from the US
transmitter at Ft. Collins, Colorado.
Post by Alasdair Baxter
I was looking for a radio-controlled watch which works all over the
world. Does anyone know if such a beast exists? According to my
information there are 4 standard time signals in the whole world and
these are UK, Germany, United States and Japan. I find it hard to
accept that developed countries like Australia, New Zealand, Canada
and Russia don't have an atomic clock which transmits a standard time
signal.
Can anyone enlighten me?
--
Alasdair Baxter, Nottingham, UK. Tel +44 115 9705100; Fax +44 115 9423263
"It's not what you say that matters but how you say it.
It's not what you do that matters but how you do it"
R.L. Horn
2004-01-09 04:19:14 UTC
Post by Jack Denver
However, self-synchronizing radio-controlled watches require a
special digitally encoded low-frequency signal. This is a relatively new
technology
Well, 39 years old. Which, I suppose, is pretty darn new by horological
standards. It's true that the ca. 1999 upgrades did a lot to make receivers
with dinky little ferrite rod antennas viable.
Post by Jack Denver
So the radio controlled watch is a relatively new thing and you'll have to
be patient before such signals blanket the whole globe
I can't see that happening. Low-bitrate (1 bps!) LF services like WWVB are
extremely big, expensive, and technologically (though not practically)
obsolete.
Jack Denver
2004-01-09 16:40:33 UTC
I was pretty clear in my original post that WWVB existed earlier, but only
since the '99 upgrade has the power been sufficient to serve inexpensive
consumer devices like watches and alarm clocks. The signal may have been out
there (in weak form) for almost 40 years but the clocks and watches have
only been widely available or practical for the last 4 years, and so these
devices are relatively new to the market in any terms.

I don't think WWVB is particularly expensive to operate, if you consider
that this single station covers most of N. America. NIST also put the system
together using bits and pieces - surplus Navy LF transmitters, reusing the
antenna from defunct WWVL etc. so I don't think they spent a huge amount of
money on this, at least in government terms where they spend billions on
various other things. When you divide the operating cost up among 300
million-plus people, it comes to a tiny fraction of a cent per day. Perhaps a
half-dozen more such transmitters would put 90% of the world's population in
range of a time signal, so it would not be a huge expense.

I agree that in this age of broadband and fiber optics, 1 bps seems like a
ridiculously low bitrate, but the bitrate is sufficient for the purpose of
transmitting a time signal. So what if it takes a full minute to get a
complete time signal? You only need to get one signal/day for most purposes,
so that gives you 1,440 chances/day to synch. You can build very
inexpensive receivers around this signal (I see the clocks now for as little
as $12 retail). All you have to do is distinguish, once per second,
through all the noise and static, whether a bit is "high" or "low" - a very
simple circuit can figure this out (hell, you could take it down with a
crystal set, headphones, and paper and pencil). Only LF has the ability to
propagate across a whole continent with a single transmitter, and the
bandwidth of LF is inherently limited anyway (not to 1 bps, but you can't
modulate a huge amount of data onto a 60 kHz signal and have it received
100% reliably 1500 miles away). By putting all the cost at the transmitting
end, you make the receiving end very cheap. A higher-bitrate service would
require fancier, more expensive receivers, or perhaps multiple transmitters
or satellites, and you don't need the bandwidth anyway just to send a time
signal. Perhaps if they were starting fresh they'd do it differently, but
the current setup is at least adequate and as I said before, we are just in
the early days of this. Eventually most consumer products with a clock
(clock radios, VCRs, coffeemakers with timer, car clocks, etc.) will probably
offer RC as it only adds a couple of $ (eventually pennies) to add this
feature to something that has a quartz clock already. How many times have
you seen a VCR flashing 12:00? How many clocks do you have to reset when
daylight savings goes in/out or after a power failure? Has your alarm clock
ever failed to work because there was a power failure during the night?
Even more than high precision, just not having to reset clocks manually is a
major boon to consumers. Given that there are now hundreds of thousands of
consumer devices with receivers out there and soon there will be millions, I
see the WWVB service as continuing for many decades, so it is not "obsolete"
at all, but really in its infancy.
Post by R.L. Horn
Post by Jack Denver
However, self-synchronizing radio-controlled watches require a
special digitally encoded low-frequency signal. This is a relatively new
technology
Well, 39 years old. Which, I suppose, is pretty darn new by horological
standards. It's true that the ca. 1999 upgrades did a lot to make receivers
with dinky little ferrite rod antennas viable.
Post by Jack Denver
So the radio controlled watch is a relatively new thing and you'll have to
be patient before such signals blanket the whole globe
I can't see that happening. Low-bitrate (1 bps!) LF services like WWVB are
extremely big, expensive, and technologically (though not practically)
obsolete.
Paul Cooper
2004-01-09 21:45:02 UTC
On Fri, 9 Jan 2004 11:40:33 -0500, "Jack Denver"
Post by Jack Denver
crystal set, headphones and paper and pencil). Only LF has the ability to
propagate across a whole continent with a single transmitter and the
bandwidth of LF is inherently limited anyway (not to 1 bps, but you can't
modulate a huge amount of data onto a 60 kHz signal and have it received
100% reliably 1500 miles away). By putting all the cost at the transmitting
end, you make the receiving end very cheap.
The main problem with LF is that to be effective you need a large
antenna - typically half the wavelength - for efficient reception.
Now, obviously it is feasible to put it in a watch or clock, but it
will be hideously inefficient, requiring very high transmitted power.
Indeed, I'd quite like to know how they build a LF receiver in the
confines of a clock or watch.

Paul
Russell W. Barnes
2004-01-09 23:07:02 UTC
Post by Paul Cooper
On Fri, 9 Jan 2004 11:40:33 -0500, "Jack Denver"
Post by Jack Denver
crystal set, headphones and paper and pencil). Only LF has the ability to
propagate across a whole continent with a single transmitter and the
bandwidth of LF is inherently limited anyway (not to 1 bps, but you can't
modulate a huge amount of data onto a 60 kHz signal and have it received
100% reliably 1500 miles away). By putting all the cost at the transmitting
end, you make the receiving end very cheap.
The main problem with LF is that to be effective you need a large
antenna - typically half the wavelength - for efficient reception.
Now, obviously it is feasible to put it in a watch or clock, but it
will be hideously inefficient, requiring very high transmitted power.
Indeed, I'd quite like to know how they build a LF receiver in the
confines of a clock or watch.
You use a ferrite rod antenna, which puts a high-permeability core inside a
wound antenna inductance and concentrates the flux. It does, however,
become directional with the strongest signals perpendicular to the length of
the ferrite rod, and the null-points aligned along the length (so don't keep
your watch-arm pointing the same way all the time....)
:o)
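For a feel for the numbers, the rod's winding is normally resonated with a parallel capacitor at the carrier frequency. A quick sketch, where the 1 mH coil inductance is an assumed, plausible value for a small ferrite rod rather than a measured one:

```python
import math

def tuning_capacitance(freq_hz, inductance_h):
    """Capacitance resonating a coil at freq_hz, from f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / ((2 * math.pi * freq_hz) ** 2 * inductance_h)

# Assumed 1 mH ferrite-rod winding, tuned to the 60 kHz carrier
c = tuning_capacitance(60e3, 1e-3)
print(f"{c * 1e9:.1f} nF")  # about 7.0 nF
```

The resulting L-C combination is itself quite narrow-band, which is part of where the selective tuning comes from.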

Probably such an antenna is coupled with very selective tuning and
pre-amplification. Interestingly, the transmitter station where I work
receives the 60 kHz MSF signal from Rugby, UK (about 200 miles away), using a
pre-amplified ferrite antenna. Now some of the transmitters on our site are
PWM (pulse-width modulated) using free-running (adjustable individually and
not locked to any reference) switching frequencies of... 60 kHz!

Despite the 'shash' floating about the place generated by the extreme
proximity of several 28 kV supplies dumping tens of amps at 60 kHz into
whopping big inductor/capacitor smoothing circuits (and all the noise
generated by the derived 60 kHz clockery which controls the modulators), the
MSF time-keeping (used for the station control system) is very robust and
reliable.

The LF time-stations were not originally designed for reception by consumer
clock and watch electronics, and I suppose the watch and clock manufacturers
build, design and sell their products on the back of the service provided by
LF time-stations - which just happened to be around anyway.

They'll be doing GPS watches next - if they don't already do so!
--

Regds,

Russell W. B.
http://www.huttonrow.co.uk
John Rowland
2004-01-09 23:09:24 UTC
Post by Russell W. Barnes
They'll be doing GPS watches next -
if they don't already do so!
Here's the manual for one...

http://www.casio.co.jp/ww-e/pdf/qw2240.pdf
--
John Rowland - Spamtrapped
Transport Plans for the London Area, updated 2001
http://www.geocities.com/Athens/Acropolis/7069/tpftla.html
A man's vehicle is a symbol of his manhood.
That's why my vehicle's the Piccadilly Line -
It's the size of a county and it comes every two and a half minutes
Jack Denver
2004-01-09 23:26:57 UTC
Same way that you can have an AM (medium frequency) receiver that doesn't
have a 250 m long antenna, even though that is 1/2 wavelength for a station
at 600 kHz. The usual solution is a ferrite coil. Some of the watches have
the antenna in the band, as a metal case will tend to block the signal.
Simple 1/2-wavelength long-wire receiving antennas for medium and low
frequencies haven't been used since the crystal set days, since they are
obviously impractical at the huge wavelengths involved. The other part of the
answer is that the antenna doesn't have to be that efficient - you aren't
trying to listen to static-free classical music with this - all you are
trying to do is determine, once per second, whether the transmitter is ON or
OFF. The proof that this is feasible is that such devices actually exist and
are available very cheaply.
Post by Paul Cooper
On Fri, 9 Jan 2004 11:40:33 -0500, "Jack Denver"
Post by Jack Denver
crystal set, headphones and paper and pencil). Only LF has the ability to
propagate across a whole continent with a single transmitter and the
bandwidth of LF is inherently limited anyway (not to 1 bps, but you can't
modulate a huge amount of data onto a 60 kHz signal and have it received
100% reliably 1500 miles away). By putting all the cost at the transmitting
end, you make the receiving end very cheap.
The main problem with LF is that to be effective you need a large
antenna - typically half the wavelength - for efficient reception.
Now, obviously it is feasible to put it in a watch or clock, but it
will be hideously inefficient, requiring very high transmitted power.
Indeed, I'd quite like to know how they build a LF receiver in the
confines of a clock or watch.
Paul
R.L. Horn
2004-01-10 07:23:00 UTC
Post by Jack Denver
I was pretty clear in my original post that WWVB existed earlier,
Yeah, but I couldn't resist a tongue-in-cheek comment about a 40-year-old
technology being new for horologists.
Post by Jack Denver
I don't think WWVB is particularly expensive to operate, if you consider
that this single station covers most of N. America. NIST also put the
system together using bits and pieces - surplus Navy LF transmitters,
Had surplus parts not been available, it's unlikely NIST could have gotten
the funding for the upgrade. Moreover, the equipment requires a good deal
of maintenance. There's also a fair amount of real estate involved.
Fortunately, NIST already had the parts, labor, and land.

No way the government would fund WWVB (or WWV) from scratch these days,
particularly as they're pumping billions into GPS.

Likewise, why would a cash-strapped South American or African government
spend money on something as trivial as consumer-level time synchronization?
Anyone who really needs the correct time can get it from GPS or via
shortwave.

Of course, this doesn't really answer your original question. Seems to me
like a great bloody blowtorch of an LF timecode station would be something
that Aussies could get excited about. Preferably one powerful enough that
you could tell time with your fillings.
Post by Jack Denver
I agree that in this age of broadband and fiber optics, 1 BPS seems like a
ridiculously low bitrate, but the bitrate is sufficient for the purpose of
transmitting a time signal.
A higher, though still modest, bitrate would probably be more reliable, but
that's not the big problem.

The big problem is that there's no provision for error detection (there may
be a parity bit, but I can't find it in any of the NIST documentation).
That being the case, several WWVB frames must be read to ensure that you
have the correct time. Even if only half the frames are corrupt, it may be
impossible for a particularly stupid receiver to resolve the time. With a
really iffy signal, even a really clever machine may take hours to get
there.

By adding some form of digest to the bitstream it would be possible to
simply read the frames until you get a good one (or even correct the errors
in a corrupt frame).

It's moot, of course, since we're stuck with what we have.
Post by Jack Denver
Eventually most consumer products with a clock (clock radios, VCRs,
coffeemakers with timer, car clocks, etc.) will probably offer RC as it
only adds a couple of $
I don't know why RC clocks haven't been marketed more aggressively. While
there's little consumer demand for a really accurate clock, the convenience
would seem to be a major selling point.

A serious lack of OTS components is probably a factor. I recently wanted to
build a digital RC clock into a product but couldn't obtain parts at a
reasonable price.

A curious effect of this has been that analog clocks (which anyone can
assemble using movements that cost no more than $20) tend to be cheaper
than simpler digital clocks.
--
If you can see the FNORD, remove it to reply by email.
John Rowland
2004-01-10 09:40:26 UTC
Post by R.L. Horn
I don't know why RC clocks haven't been marketed
more aggressively. While there's little consumer
demand for a really accurate clock, the convenience
would seem to be a major selling point.
The selling point is not improved accuracy, but automatic switchover when
the clocks change.
--
John Rowland - Spamtrapped
Transport Plans for the London Area, updated 2001
http://www.geocities.com/Athens/Acropolis/7069/tpftla.html
A man's vehicle is a symbol of his manhood.
That's why my vehicle's the Piccadilly Line -
It's the size of a county and it comes every two and a half minutes
R.L. Horn
2004-01-11 01:11:07 UTC
On Sat, 10 Jan 2004 09:40:26 -0000, John Rowland
Post by John Rowland
Post by R.L. Horn
I don't know why RC clocks haven't been marketed
more aggressively. While there's little consumer
demand for a really accurate clock, the convenience
would seem to be a major selling point.
The selling point is not improved accuracy, but automatic switchover when
the clocks change.
I was thinking more about not having to manually set the clock (which,
itself, couldn't be 100% worldwide since there are so many oddball local
time districts).

DST/Summer Time is a whole other can of worms. Some cheap clocks and
watches require manual intervention if you happen to live in Indiana,
Arizona, or Hawaii (and that's just in the U.S.).
--
If you can see the FNORD, remove it to reply by email.
Jack Denver
2004-01-11 03:51:58 UTC
I have a Timex pager watch that changes timezones by itself as it travels
and always displays the correct local time (or at least that of the nearest
paging tower - I suppose if you were near the border of 2 zones it might get
confused).

One of the problems with the WWVB clocks is that they make lousy travel
clocks. There is but one national signal broadcast (UTC) and you have to
manually tell the clock which timezone you are in and whether that
timezone observes DST.
Post by R.L. Horn
On Sat, 10 Jan 2004 09:40:26 -0000, John Rowland
Post by John Rowland
Post by R.L. Horn
I don't know why RC clocks haven't been marketed
more aggressively. While there's little consumer
demand for a really accurate clock, the convenience
would seem to be a major selling point.
The selling point is not improved accuracy, but automatic switchover when
the clocks change.
I was thinking more about not having to manually set the clock (which,
itself, couldn't be 100% worldwide since there are so many oddball local
time districts).
DST/Summer Time is a whole other can of worms. Some cheap clocks and
watches require manual intervention if you happen to live in Indiana,
Arizona, or Hawaii (and that's just in the U.S.).
--
If you can see the FNORD, remove it to reply by email.
R.L. Horn
2004-01-11 05:07:20 UTC
Post by Jack Denver
I have a Timex pager watch that changes timezones by itself as it travels
and always displays the correct local time
How well does it keep time? I've been closely watching the clock on my cell
phone (Sprint) and have found it to be wildly inaccurate (off by anything
from .5 to 16 seconds). You'd think that the telco guys would at least be
running ntp...
--
If you can see the FNORD, remove it to reply by email.
Jack Denver
2004-01-11 15:21:12 UTC
It is very accurate, as far as I can tell, vs. my RC clock. The watch uses
the Skytel network and Skytel claims to acquire UTC from the GPS system. If
there is any deviation it is less than I am able to measure - less than 1
second. The one downside of the watch is it has a very high power draw
because of the pager functions. It runs on hearing aid batteries (which are
cheap zinc-air cells) and chews through one every couple of months. There is
a separate battery hatch so you don't have to open the whole back (I wish
more watches had such a hatch, but I suspect watch sellers like the battery
changing trade). You can control the hours that the paging (and therefore
time synch) operates to increase battery life.

Many cell phones will soon have GPS as part of the E911 system and I assume
will be able to pull the correct time from it.
Post by R.L. Horn
Post by Jack Denver
I have a Timex pager watch that changes timezones by itself as it travels
and always displays the correct local time
How well does it keep time? I've been closely watching the clock on my cell
phone (Sprint) and have found it to be wildly inaccurate (off by anything
from .5 to 16 seconds). You'd think that the telco guys would at least be
running ntp...
--
If you can see the FNORD, remove it to reply by email.
Jack Denver
2004-01-10 17:41:46 UTC
Post by R.L. Horn
Likewise, why would a cash-strapped South American or African government
spend money on something as trivial as consumer-level time
synchronization?
Post by R.L. Horn
Anyone who really needs the correct time can get it from GPS or via
shortwave.
Of course, this doesn't really answer your original question. Seems to me
like a great bloody blowtorch of an LF timecode station would be something
that Aussies could get excited about. Preferably one powerful enough that
you could tell time with your fillings.
Seems to me that if you are running a civilized country and not some third
world hellhole, consumer-level time synch is an excellent use of tax
money - exactly the kind of collective good that governments are supposed to
provide, like roads and parks. If you figure the growth in the economy from
all the RC products sold and the taxes paid on those goods, it is probably a
net moneymaker for the government. My share of the WWVB budget amounts to a
fraction of a penny and I am more than glad to have my tax dollars
(pennies?) be used for that purpose. Probably the best spent taxes that I
pay, in terms of value received per $ paid. GPS or shortwave doesn't quite
fill the bill yet, though if GPS receiver costs keep coming down (a stripped
down GPS that just gets the time) I suppose it would do.
Post by R.L. Horn
Post by Jack Denver
I agree that in this age of broadband and fiber optics, 1 BPS seems like a
ridiculously low bitrate, but the bitrate is sufficient for the purpose of
transmitting a time signal.
A higher, though still modest, bitrate would probably be more reliable, but
that's not the big problem.
I think that's wrong - the lower the bitrate, the more reliable, which is
one reason why it is low to begin with (that and the fact that it's 1960's
technology). I agree that if they sped up the bitrate in order to add
error detection, the reliability would be higher, but an increase in speed
by itself would actually cut reliability. But as you say we are stuck with
the present bitrate so it is moot.
Post by R.L. Horn
The big problem is that there's no provision for error detection (there may
be a parity bit, but I can't find it in any of the NIST documentation).
That being the case, several WWVB frames must be read to ensure that you
have the correct time. Even if only half the frames are corrupt, it may be
impossible for a particularly stupid receiver to resolve the time. With a
really iffy signal, even a really clever machine may take hours to get
there.
By adding some form of digest to the bitstream it would be possible to
simply read the frames until you get a good one (or even correct the errors
in a corrupt frame).
It's moot, of course, since were stuck with what we have.
No error detection that I can see. Here is the timecode:
http://www.boulder.nist.gov/timefreq/stations/wwvbtimecode.htm

However, notice that there are some unused bits in between the significant
data. If you were clever, you might be able to squeeze in some checkbits and
still keep the signal backward compatible, as I assume the existing
receivers disregard the state of the unused bits. If you wanted to upgrade
the service, there are other things you could do to make it backward
compatible. For example, you can play with the timing of the signal - the
current code calls for a .5-second pulse as 1 and .2 seconds as 0. There
must be greater or shorter lengths that the existing receivers would
continue to resolve as 1 or 0 (e.g. .4, .1) but which newer receivers could
recognize as being distinct (e.g. .5 = 1 1, .4 = 1 0). This would double the
bandwidth, which would be more than ample for an error-correction scheme.
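Those published pulse widths make the decode step easy to sketch in software terms. A minimal sketch, assuming the receiver has already measured each second's reduced-power interval (0.2 s = binary 0, 0.5 s = binary 1, 0.8 s = position marker, per the NIST timecode page linked above; the tolerance value is an arbitrary choice):

```python
def classify_pulse(width_s, tol=0.1):
    """Map one second's measured reduced-power duration to a WWVB symbol."""
    for nominal, symbol in ((0.2, "0"), (0.5, "1"), (0.8, "M")):
        if abs(width_s - nominal) <= tol:
            return symbol
    return "?"  # static: width matches no nominal value, so flag it

# One simulated measurement per second, jittered as a real receiver might see
widths = [0.78, 0.21, 0.49, 0.52, 0.19, 0.35]
print("".join(classify_pulse(w) for w in widths))  # M0110?
```

A receiver for the doubled-up scheme proposed here would just need tighter tolerances, so that 0.4 and 0.5 resolve to different symbols.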
Post by R.L. Horn
Post by Jack Denver
Eventually most consumer products with a clock (clock radios, VCRs,
cofeemakers with timer, car clocks, etc.) will probably offer RC as it
only adds a couple of $
I don't know why RC clocks haven't been marketed more aggressively. While
there's little consumer demand for a really accurate clock, the convenience
would seem to be a major selling point.
A serious lack of OTS components is probably a factor. I recently wanted to
build a digital RC clock into a product but couldn't obtain parts at a
reasonable price.
A curious effect of this has been that analog clocks (which anyone can
assemble using movements that costs no more than $20) tend to be cheaper
than simpler digital clocks.
I see the digitals as low as $12 for a complete alarm clock with LCD
display. Worst case you could buy one of these and hack it apart for parts
for your project.
Post by R.L. Horn
--
If you can see the FNORD, remove it to reply by email.
R.L. Horn
2004-01-11 04:28:55 UTC
Post by Jack Denver
Seems to me that if you are running a civilized country and not some third
world hellhole, consumer-level time synch is an excellent use of tax
money
I was addressing the idea of a worldwide LF timebase network. A majority of
governments fall more-or-less into the hellhole category (and they're not
all in the third world). Even in the so-called "civilized" countries,
getting everyone on board for a common (or at least semi-compatible)
standard would be a major diplomatic coup.
Post by Jack Denver
though if GPS receiver costs keep coming down (a stripped down GPS that
just gets the time) I suppose it would do.
GPS may already be cheaper for some applications. For example, the cheapest
WWVB clock with a computer interface I've been able to find is $169, which
is more than some NMEA-compliant GPS receivers (though how well the latter
function as timekeepers is an open question).
Post by Jack Denver
Post by R.L. Horn
A higher, though still modest, bitrate would probably be more reliable,
but that's not the big problem.
I think that's wrong - the lower the bitrate, the more reliable,
Depends on the temporal nature of the noise affecting the signal. 1 bps
just feels sub-optimal to me, but I could be wrong, seeing as I've never
dealt with anything quite as odd as WWVB.
Post by Jack Denver
I agree that if they speeded up the bitrate in order to add error
detection, the reliability would be higher
It seems as though WWVB bandwidth is underutilized. How many folks are
actually using the time interval service?
Post by Jack Denver
No error detection that I can see.
There are some position bits, which help, but that's it.
Post by Jack Denver
However, notice that there are some unused bits in between the significant
data.
I saw that. 11 of them, in fact. Enough for a pretty beefy checksum.
Post by Jack Denver
I see the digitals as low as $12 for a complete alarm clock with LCD
display. Worst case you could buy one of these and hack it apart for parts
for your project.
I thought about that, but decided it was a waste of time to do anything that
couldn't be put into production.
--
If you can see the FNORD, remove it to reply by email.
Chris Malcolm
2004-01-11 13:42:14 UTC
Post by R.L. Horn
Post by Jack Denver
Post by R.L. Horn
A higher, though still modest, bitrate would probably be more reliable,
but that's not the big problem.
I think that's wrong - the lower the bitrate, the more reliable,
Depends on the temporal nature of the noise affecting the signal. 1 bps
just feels sub-optimal to me, but I could be wrong, seeing as I've never
dealt with anything quite as odd as WWVB.
The 1 bps rate also has another function: providing a 1 bps
beat. That provides a very simple method of keeping a clock in
synchronisation if it can be known that it won't have drifted as much
as half a second since the last timecheck, i.e., the last time it decoded
the entire time signal.

The reason for the lack of error checking is that time signals aren't
context-free, they have a regular development. Certain poor
implementations don't utilise this, and so sometimes lose a bit and
get themselves wrong by some power of two, e.g., 2, 4, 8
etc. minutes. The regular progression of the time is supposed to be
used for error checking.
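That progression check can be sketched directly. This is illustrative only, not any particular receiver's logic, with decoded frames reduced to (hour, minute) tuples:

```python
def advances_one_minute(prev, curr):
    """True when curr is exactly one minute after prev; both are (hour, minute)."""
    hour, minute = prev
    return curr == ((hour + (minute + 1) // 60) % 24, (minute + 1) % 60)

# Two clean back-to-back decodes agree with the expected progression...
assert advances_one_minute((13, 59), (14, 0))
# ...while a decode that lost a bit (wrong by a power of two) is rejected
assert not advances_one_minute((13, 59), (14, 2))
```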

--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
Jack Denver
2004-01-11 15:27:06 UTC
I lost you here. Regarding 1 bps, are you saying that the clock can take its
beat directly from the time signal instead of the internal crystal? You'd
have to receive every single beat for this to work.

And regarding the "regular progression", I'm not following you. I know that
there are marker bits which help you to know where in the sequence you are,
but this is not the same thing as error checking. You could know correctly
that you are in the "year" sequence and yet without a checksum you don't
have a clue whether the number you got is the correct one or not.
Chris Malcolm
2004-01-11 21:08:04 UTC
Permalink
Post by R.L. Horn
Post by Chris Malcolm
Post by R.L. Horn
Post by Jack Denver
Post by R.L. Horn
A higher, though still modest, bitrate would probably be more
reliable,
Post by Chris Malcolm
Post by R.L. Horn
Post by Jack Denver
Post by R.L. Horn
but that's not the big problem.
I think that's wrong - the lower the bitrate, the more reliable,
Depends on the temporal nature of the noise affecting the signal. 1 bps
just feels sub-optimal to me, but I could be wrong, seeing as I've never
dealt with anything quite as odd as WWVB.
The 1bps rate also has another function, of providing a 1bps
beat. That provides a very simple method of keeping a clock in
synchronisation if it can be known that it won't have drifted as much
as half a sec since last timecheck, i.e., last time it decoded the
entire time signal.
The reason for the lack of error checking is that time signals aren't
context-free, they have a regular development. Certain poor
implementations don't utilise this, and so sometimes lose a bit and
get themselves wrong by some power of two, e.g., 2, 4, 8
etc. minutes. The regular progression of the time is supposed to be
used for error checking.
I lost you here. Regarding 1bps are you saying that the clock can take its
beat directly from the time signal instead of the internal crystal? You'd
have to receive every single beat for this to work.
No, what is intended is that if, for example, taking time corrections
every hour or so, then since in less than an hour the clock will have
drifted less than 1/2 sec, it can be corrected by moving to the
nearest 1bps beat, without needing to decode the whole time signal
sequence.
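As a concrete sketch (in Python, with illustrative names not taken from any real receiver firmware), the correction amounts to rounding the internal clock to the nearest whole second defined by the received ticks:

```python
# Sketch of the "snap to the nearest beat" correction described above.
# Valid only while accumulated drift since the last full decode is
# under 0.5 s; beyond that the clock could snap to the wrong beat and
# end up a whole second out.

def snap_to_beat(internal_seconds: float) -> float:
    """Round the internal clock to the nearest 1 bps tick (whole second)."""
    return float(round(internal_seconds))

# A clock reading 3599.7 s snaps forward to 3600.0 s; one reading
# 3600.3 s snaps back to 3600.0 s.
```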
Post by R.L. Horn
And regarding the "regular progression", I'm not following you. I know that
there are marker bits which help you to know where in the sequence you are,
but this is not the same thing as error checking. You could know correctly
that you are in the "year" sequence and yet without a checksum you don't
have a clue whether the number you got is the correct one or not.
No, what I mean by regular progression is that 3 o'clock follows 2
o'clock, etc., so that the clock can be alerted to the likelihood of
errors in reception by unexpectedly large jumps in time/date.

So a clock or watch could synchronise itself on the 1bps tickrate
hourly, and only bother to receive and decode the entire time signal
once a day.

If it could rely on better than 1/2 sec drift a day in its internal
clock, it could synchronise on the bitrate once a day, and only bother
to decode the entire signal once a week, or once a month.

Of course the software also has to take into account occasional poor
conditions which don't permit synchronisation, because of being inside
a metallic building, badly oriented, severe fog, bad local
interference, etc.

Since the receiver function is much more expensive in battery use than
timekeeping, use of these features allows much more economical use of
the battery.

It's clear that certain clocks and watches do not exploit these
features of the time signal, and use very crude software which not
infrequently in poor conditions ends up with the wrong time. That's
not because the specification of the signal is old-fashioned and lacks
error correction. The theory of bitrate and error and error correction
was very well understood when these low bitrate LW time signals were
designed. The problem is cheap ignorant crude software in the watches
or clocks.
--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
Jack Denver
2004-01-12 00:21:27 UTC
Permalink
Ah, I'm following you now.

Regarding staying "on beat", I suppose if you drift more than 1/2 second how
would the clock know which beat to synch to...should it jump forward or
back? AFAIK, the clocks I've seen do not use this feature but decode the
whole signal (generally 1x/day or actually night), depending on the internal
crystal (which sometimes has a "learning" trimmer) in between.

Regarding what you call progression, that makes a lot of sense and should be
relatively easy to implement in software. The signal should never, under
normal circumstances, require a large change in the time. The clock could
easily be programmed to reject as erroneous (except for the initial synch
and DST changes) any synch that calls for the time to change by more than X
seconds (or better yet, X times the number of days since the last synch, X
being a low value such as 1) from the value given by the internal crystal,
or to synch a couple of more times to be really sure. I suppose
multiple synchs are a form of error correction in themselves, though each
extra full synch wastes power.
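That rejection rule is easy to state in code. A sketch (the function and parameter names are mine, not from any actual product):

```python
# Sketch of the sanity check proposed above: accept a decoded time
# only if it disagrees with the internal clock by no more than
# X seconds per day elapsed since the last accepted synch.

def synch_plausible(decoded: float, internal: float,
                    days_since_synch: float, x_per_day: float = 1.0) -> bool:
    """True if the decoded time lies within the expected drift budget."""
    budget = x_per_day * max(days_since_synch, 1.0)  # allow at least one day's worth
    return abs(decoded - internal) <= budget

# A 0.4 s disagreement after one day passes; a 120 s jump is rejected
# as a likely decoding error.
```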
Kent Betts
2004-01-14 07:26:07 UTC
Permalink
On a practical note, I placed two WWVB style clocks on the wall and expected
to observe up to a half-second difference at some point. (Figuring that the
correction mechanism would allow that much error before bothering to make a
correction.)

In fact the difference between the two clocks varied from zero to about a
quarter second. This implementation was in a low-priced product from Asia.
R.L. Horn
2004-01-16 09:08:59 UTC
Permalink
On Sun, 11 Jan 2004 13:42:14 +0000 (UTC), Chris Malcolm
The 1bps rate also has another function, of providing a 1bps beat.
True.
The reason for the lack of error checking is that time signals aren't
context-free,
True also. This was the rationale behind my optimum bitrate comment, viz.
the tradeoffs between (S/N)o and bitrate as concerns redundant or otherwise
predictable bitstreams. Unfortunately, I didn't elucidate that very well
and may end up being taken to the cleaners as a result.

I'm wearing my asbestos pants just in case.
--
If you can see the FNORD, remove it to reply by email.
Kent Betts
2004-01-14 07:16:04 UTC
Permalink
Post by Jack Denver
Seems to me that if you are running a civilized country
Well, that leaves Australia right out. But seriously, there is no need to
rely on gov't service. A regular broadcast station could be re-tuned for
two hours a night to deliver the time info.
(though how well the [NMEA-compliant GPS receivers]
function as timekeepers is an open question).
The GPS system relies on microsecond-accurate time synchronization to
operate as a position finder. If they are not a reliable source of the
current time, then I am missing something.
Chris Malcolm
2004-01-14 16:18:37 UTC
Permalink
Post by Kent Betts
Post by Jack Denver
Seems to me that if you are running a civilized country
Well, that leaves Australia right out. But seriously, there is no need to
rely on gov't service. A regular broadcast station could be re-tuned for
two hours a night to deliver the time info.
(though how well the [NMEA-compliant GPS receivers]
function as timekeepers is an open question).
The GPS system relies on microsecond-accurate time synchronization to
operate as a position finder. If they are not a reliable source of the
current time, then I am missing something.
What you're missing is, first, that there is currently about 13 secs diff
between the GPS time standard and UT (GMT), because GPS time does not
observe the empirically derived leap seconds, and unless the system is
specifically intended to provide a source of external standard clock
timing, it may not bother doing this subtraction.
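The subtraction in question is trivial once the leap-second count is known; a sketch (13 s was the offset current in early 2004, and it grows each time a new leap second is inserted):

```python
# Sketch of the GPS-to-UTC correction described above. GPS time does
# not observe leap seconds, so it runs ahead of UTC by the number of
# leap seconds inserted since the GPS epoch (6 Jan 1980).

GPS_UTC_LEAP_SECONDS = 13  # the value current in January 2004

def gps_to_utc(gps_seconds: float, leap: int = GPS_UTC_LEAP_SECONDS) -> float:
    """Subtract accumulated leap seconds from a GPS timestamp."""
    return gps_seconds - leap

# A receiver that skips this step displays a time about 13 s fast.
```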

The second thing you're missing is that most consumer GPSRs consider
the display of the time to the user to be a low priority computing
task, and given that there are very high urgent computational demands
on these systems, it is commonplace for them to be 0.5 second out, and
some can be as much as two seconds out, by the time the system gets
around to telling the user what it is (was when it started off).

If you want a reliable high-accuracy time from a GPSR, then you need
one which has been specifically designed to provide that.

--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
Jack Denver
2004-01-14 18:14:05 UTC
Permalink
What you are saying may have been true of some early receivers, though I
think that in more up to date units, the delays you speak of no longer occur
and the leap second correction is taken into account automatically in the
time display. Also as you say, a dedicated time only GPS synched clock or
watch would certainly deal with leap seconds and could put priority on time
computation (or even skip position location completely), and so would be a
very accurate time source and one that is available worldwide. The Casio
GPS "watch" is a first crude implementation - bulky, heavy, expensive,
hardly qualifying as a wristwatch at all. Removing the location functions
might help a little, but the need for a satellite antenna and power draw may
preclude ever making a truly compact receiver that would be acceptable as a
watch for most people. GPS clocks should be no problem and as costs for GPS
chipsets drop, perhaps will become competitive with RC clocks. Also, a
"smart" receiver would use the computed position and the date to decide what
the correct local time was in relation to GPS time, precluding the need to
manually change the clock as you travel from zone to zone as the RC clocks
now require.
Chris Malcolm
2004-01-15 12:27:14 UTC
Permalink
Post by Jack Denver
Post by Chris Malcolm
Post by Kent Betts
The GPS system relies on microsecond-accurate time synchronization to
operate as a position finder. If they are not a reliable source of the
current time, then I am missing something.
What you're missing is that a) there is currently about 13 secs diff
between the GPS time standard and UT (GMT), because GPS time does not
observe the empirically derived leap seconds, and unless the system is
specifically intended to provide a source of external standard clock
timing, then it may not bother doing this subtraction.
The second thing you're missing is that most consumer GPSRs consider
the display of the time to the user to be a low priority computing
task, and given that there are very high urgent computational demands
on these systems, it is commonplace for them to be 0.5 second out, and
some can be as much as two seconds out, by the time the system gets
around to telling the user what it is (was when it started off).
If you want a reliable high-accuracy time from a GPSR, then you need
one which has been specifically designed to provide that.
What you are saying may have been true of some early receivers, though I
think that in more up to date units, the delays you speak of no longer occur
and the leap second correction is taken into account automatically in the
time display.
I've only checked a few GPSRs, those that friends have, but they're
all quite recent models, and none displays time more accurately than to
the nearest quarter second.
Post by Jack Denver
Also as you say, a dedicated time only GPS synched clock or
watch would certainly deal with leap seconds and could put priority on time
computation (or even skip position location completely), and so would be a
very accurate time source and one that is available worldwide.
The problem is that leap seconds are derived empirically, because
there are unpredictable minute wobbles in the way the planet is
slowing down. The GPS satellite system is not informed of these, and
there's nowhere in the signal format for them, so there's no way a
GPSR can take them into account. What many do take into account is the
standard predictive model of leap seconds, but the problem is that the
planet doesn't always adhere to that model. This was an unexpected
problem which will probably be addressed soon by a change in
standards, but it currently is a problem.
Post by Jack Denver
The Casio
GPS "watch" is a first crude implementation - bulky, heavy, expensive,
hardly qualifying as a wristwatch at all. Removing the location functions
might help a little, but the need for a satellite antenna and power draw may
preclude ever making a truly compact receiver that would be acceptable as a
watch for most people. GPS clocks should be no problem and as costs for GPS
chipsets drop, perhaps will become competitive with RC clocks. Also, a
"smart" receiver would use the computed position and the date to decide what
the correct local time was in relation to GPS time, precluding the need to
manually change the clock as you travel from zone to zone as the RC clocks
now require.
But GPS signals don't penetrate buildings (except in some special
cases). You'd need an external aerial. Not financially competitive
with the LF signals.

If you wanted to avoid the LF signals you'd probably be better off
using the time signals available from mobile phone relay stations,
which penetrate buildings better than GPS, but not nearly as well as
the LF time signals. The mobile phone networks might want you to pay
for that, however.

--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
Jack Denver
2004-01-15 17:39:35 UTC
Permalink
You make some good points. That the satellite system does not penetrate
buildings is somewhat of a problem. For home units, you could have a
wireless external antenna similar to the sending unit I have for my remote
thermometer (in fact you could put the outdoor temp sensor and antenna in
one, but this adds costs). For a travel clock, it's a problem.

The diff. between GPS and UTC represents the leap seconds inserted since 6
Jan 1980. I did not realize that the GPS signal does not give the leap
second count. I assume the data stream could at some point be improved to
give the correction factor. If nothing else, given how rarely it changes,
you could manually set a counter for the correction factor on your clock.

On my Timex pager watch, even if your subscription to the paging service
lapses, the time signal continues to be received for free. Likewise, even
unsubscribed cell phones still give the time (and call 911). At least at
present, there is no charge for these signals and I think they may be stuck
giving them away for free as changing the encoding would obsolete all
existing equipment. The problem with cellular/paging signals is that they
are regional and do not penetrate some remote areas (unlike GPS). Also,
standards differ from country to country.

In addition to the cellular network, there are more and more wireless
hotspots which have open internet connections. You could, I suppose design a
device that accesses NTP services wirelessly.
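Such a device is not hard to sketch with today's tools. A minimal SNTP-style query in Python (pool.ntp.org is an assumed public server; the packet layout follows the standard NTP format, with the server's transmit timestamp at byte offset 40):

```python
# Minimal sketch of the wireless NTP lookup imagined above: send a
# bare SNTP client request over UDP and decode the server's transmit
# timestamp. Illustrative only, with no retries or sanity checks.
import socket
import struct

NTP_EPOCH_OFFSET = 2208988800  # seconds from 1900 (NTP epoch) to 1970 (Unix)

def parse_transmit_time(reply: bytes) -> float:
    """Decode the 64-bit transmit timestamp at byte offset 40 as Unix seconds."""
    secs, frac = struct.unpack("!II", reply[40:48])
    return secs - NTP_EPOCH_OFFSET + frac / 2**32

def sntp_time(server: str = "pool.ntp.org", timeout: float = 2.0) -> float:
    """Ask an NTP server for the current time (ignores network delay)."""
    request = b"\x1b" + 47 * b"\0"  # first byte: LI=0, version=3, mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(request, (server, 123))
        reply, _ = sock.recvfrom(512)
    return parse_transmit_time(reply)
```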
Jack Denver
2004-01-15 18:06:55 UTC
Permalink
PS Upon further research, you are wrong about the GPS/UTC offset not being
transmitted. See this page for a further explanation:

http://www.seis.com.au/TechNotes/TN199901A_GPS_UTC.html

"The difference between GPS time and UTC ....is referred to as the UTC
offset. This value is obtained by the GPS receiver from the satellites in a
"navigation message".

The navigation message is transmitted by the satellite on the L1 data link
at a rate of 50 bits per second. The message structure uses a basic format
of a 1500 bit long frame (30 seconds) made up of the five subframes
mentioned in TN199812A with each subframe being 30 bits long. Subframes 4
and 5 are repeated 25 times each, so that a complete message will require
the transmission of 25 full frames (12 1/2 minutes). The UTC offset is only
transmitted once per frame, so it is only transmitted once every 12 1/2
minutes. Depending where you were in the cycle when the GPS receiver is
first switched on, it can thus be up to 12 1/2 minutes until the UTC offset
value is transmitted from a satellite."


It may be that some implementations of GPS receivers apply a stored or
assumed correction until they obtain the transmitted offset. But any device
worth its salt can extract the offset eventually, which is definitely part
of the current GPS signal (and I think always has been). It may take a while
though since it comes by only once every 12.5 minutes.
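The arithmetic behind that worst case, worked through (constants taken straight from the quoted tech note):

```python
# Worked numbers from the quoted tech note: at 50 bps a 1500-bit
# frame takes 30 s, and the UTC offset appears once per 25-frame
# cycle, so a cold-started receiver may wait up to 12.5 minutes
# before the offset comes around.

BITS_PER_SECOND = 50
FRAME_BITS = 1500
FRAMES_PER_CYCLE = 25

frame_seconds = FRAME_BITS / BITS_PER_SECOND                # 30.0 s per frame
worst_wait_minutes = FRAMES_PER_CYCLE * frame_seconds / 60  # 12.5 minutes
```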

Again, I think you have a muddled view of what GPS is capable of - 1/4
second is much, much more than the GPS error factor. It may well be that the
consumer navigation devices that you saw were wrong by this much because
they just plain don't care about displaying time correctly, being navigation
oriented, but that's a problem with the device, not the signal. According to
this, the GPS time signal currently transmits the correct time vis a vis UTC
within 10 nanoseconds, which is more than good enough for almost any
conceivable use. Even 1/4 second is much better than most of us need.

http://tycho.usno.navy.mil/gpstt.html
Chris Malcolm
2004-01-16 13:41:17 UTC
Permalink
Post by Jack Denver
Post by Chris Malcolm
Post by Chris Malcolm
The second thing you're missing is that most consumer GPSRs consider
the display of the time to the user to be a low priority computing
task, and given that there are very high urgent computational demands
on these systems, it is commonplace for them to be 0.5 second out, and
some can be as much as two seconds out, by the time the system gets
around to telling the user what it is (was when it started off).
If you want a reliable high-accuracy time from a GPSR, then you need
one which has been specifically designed to provide that.
I've only checked a few GPSRs, those that friends have, but they're
all quite recnt models, and none displays time more accurately than to
the nearest quarter second.
Again, I think you have a muddled view of what GPS is capable of - 1/4
second is much, much more than the GPS error factor. It may well be that the
consumer navigation devices that you saw were wrong by this much because
they just plain don't care about displaying time correctly, being navigation
oriented, but that's a problem with the device, not the signal.
Which is exactly what I wrote, above. They have the correct time to
nanoseconds internally, and use it, but user time display is scheduled
at low priority in a device which has a lot of intensive time-critical
calculation to do every second.
Post by Jack Denver
According to
this, the GPS time signal currently transmits the correct time vis a vis UTC
within 10 nanoseconds, which is more than good enough for almost any
conceivable use.
But which, as I pointed out, is often *not* available to the user in a
consumer navigational device. They get a time display which for purely
computational scheduling reasons can sometimes be as much as two
seconds out, and is often 1/4 sec out.
Post by Jack Denver
Even 1/4 second is much better than most of us need.
Which is what the manufacturers presumed. But the original questioner
supposed that because GPSRs got and used a time signal correct to
nanoseconds, a *very* accurate time reference would be available
to the user.

The answer is *could* be, but not in at least most of the GPSRs
currently on sale.

--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
R.L. Horn
2004-01-18 04:36:47 UTC
Permalink
On Fri, 16 Jan 2004 13:41:17 +0000 (UTC), Chris Malcolm
Post by Chris Malcolm
Which is exactly what I wrote, above. They have the correct time to
nanoseconds internally, and use it, but user time display is scheduled at
low priority in a device which has a lot of intensive time-critical
calculation to do every second.
I'm not sure that the display latency is terribly important. More important
to me would be the latencies encountered over a computer interface, i.e.
using a GPS receiver as an NTP reference clock.

Unfortunately, there doesn't seem to be much in the way of information
available on the subject. I suppose you just have to buy a receiver and see
if it works out (or spend a bundle on a GPS clock).

Probably simpler to just build an LF receiver. If I can figure out how to
buy stuff from Galleon I plan on doing just that.
--
If you can see the FNORD, remove it to reply by email.
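Using a GPS receiver as an NTP reference clock usually means parsing NMEA
sentences off a serial line. A minimal sketch of pulling the UTC time field
out of a `$GPRMC` sentence (canned example sentence with made-up fields, no
checksum validation; the serial-line latency worry above is exactly why
serious setups pair the sentence with a PPS pulse):

```python
def parse_rmc_time(sentence):
    """Extract (hours, minutes, seconds) from a $GPRMC sentence.

    Simplified sketch: no checksum validation, and the time field
    arrives with whatever serial-line latency the receiver adds.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("RMC") or len(fields) < 2 or not fields[1]:
        return None
    hhmmss = fields[1]
    return int(hhmmss[0:2]), int(hhmmss[2:4]), float(hhmmss[4:])

# Canned example sentence (hypothetical fix, made-up fields):
print(parse_rmc_time("$GPRMC,134117.00,A,5557.00,N,00310.00,W,0.0,,160104,,,A*7F"))
```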
Chris Malcolm
2004-01-16 13:50:49 UTC
Permalink
Post by Jack Denver
PS Upon further research, you are wrong about the GPS/UTC offset not being
transmitted. See:
http://www.seis.com.au/TechNotes/TN199901A_GPS_UTC.html
"The difference between GPS time and UTC ....is referred to as the UTC
offset. This value is obtained by the GPS receiver from the satellites in a
"navigation message".
The navigation message is transmitted by the satellite on the L1 data link
at a rate of 50 bits per second. The message structure uses a basic format
of a 1500 bit long frame (30 seconds) made up of the five subframes
mentioned in TN199812A, with each subframe being 300 bits long. Subframes 4
and 5 are repeated 25 times each, so that a complete message will require
the transmission of 25 full frames (12 1/2 minutes). The UTC offset is only
transmitted once per complete message, so it is only transmitted once every
12 1/2 minutes. Depending where you are in the cycle when the GPS receiver
is first switched on, it can thus be up to 12 1/2 minutes until the UTC
offset value is transmitted from a satellite."
It may be that some implementations of GPS receivers apply a stored or
assumed correction until they obtain the transmitted offset. But any device
worth its salt can extract the offset eventually, which is definitely part
of the current GPS signal (and I think always has been). It may take a while
though since it comes by only once every 12.5 minutes.
You're right. Note, however, that some folk for battery saving reasons
operate their GPSRs in occasional mode, switching them on for a minute
or two until they get a position fix, and then switching them
off. Depending on how the internal software works, this can lead to an
inaccurate time display. On sci.geo.satellite-nav, you will find some
users reporting occasional strange time displays on their units. The
reasons can get very technical and model-specific, but the main point
is this: if you want a highly accurate time reference, you won't
necessarily get one from a consumer GPSR unit which does not
specifically claim to provide such a signal/display. It *could*
theoretically do so, but it may not have been designed to do so.
--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
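The frame arithmetic quoted above is easy to sanity-check. A small sketch
(the constants come from the quoted tech note; the function name is my own):

```python
BIT_RATE_BPS = 50            # navigation message data rate
FRAME_BITS = 1500            # one frame
FRAME_SECONDS = FRAME_BITS / BIT_RATE_BPS             # 30 s per frame
FRAMES_PER_MESSAGE = 25      # subframes 4 and 5 cycle through 25 pages
MESSAGE_SECONDS = FRAME_SECONDS * FRAMES_PER_MESSAGE  # 750 s = 12.5 min

def wait_for_utc_offset(seconds_into_cycle):
    """Worst-case seconds until the next UTC-offset page, given how far
    into the 12.5-minute message cycle the receiver switched on."""
    return (MESSAGE_SECONDS - seconds_into_cycle) % MESSAGE_SECONDS

# Switch on just after the offset page went by: wait nearly a full cycle.
print(wait_for_utc_offset(30))   # 720.0 seconds, i.e. 12 minutes
```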
Jack Denver
2004-01-14 17:41:50 UTC
Permalink
Retuning a regular station nightly would not work. Time synch works best on
low frequency (typically 60 kHz) signals for the reasons discussed in this
thread. Ordinary broadcast transmitters and antennas, which start at about
500 kHz, are not suitable for such low frequencies.
Post by Kent Betts
Post by Jack Denver
Seems to me that if you are running a civilized country
Well, that leaves Australia right out. But seriously, there is no need to
rely on gov't service. A regular broadcast station could be re-tuned for
two hours a night to deliver the time info.
(though how well the [NMEA-compliant GPS receivers]
function as timekeepers is an open question).
The GPS system relies on microsecond-accurate time synchronization to
operate as a position finder. If they are not a reliable source of the
current time, then I am missing something.
Walter Spector
2004-01-11 04:57:24 UTC
Permalink
...
Post by R.L. Horn
Of course, this doesn't really answer your original question. Seems to me
like a great bloody blowtorch of an LF timecode station would be something
that Aussies could get excited about. Preferably one powerful enough that
you could tell time with your fillings.
The Aussies had a time station, on shortwave, until about a year ago. Then
it was shut down. (See: http://tufi.alphalink.com.au/time/time_hf.html)
... GPS or shortwave doesn't quite
fill the bill yet, though if GPS receiver costs keep coming down (a stripped
down GPS that just gets the time) I suppose it would do.
Note that the old Heathkit company marketed a "Most Accurate Clock" kit for
years before going defunct. It listened to one of the shortwave freqs for
WWV and was pretty reasonably priced. However most folks would rather spend
the same dollars for a cheap SW radio and listen to the signals.
Post by R.L. Horn
Post by Jack Denver
I agree that in this age of broadband and fiber optics, 1 BPS seems like a
ridiculously low bitrate, but the bitrate is sufficient for the purpose of
transmitting a time signal.
A higher, though still modest, bitrate would probably be more reliable, but
that's not the big problem.
I think that's wrong - the lower the bitrate, the more reliable, which is
one reason why it is low to begin with (that and the fact that it's 1960's
technology). I agree that if they speeded up the bitrate in order to add
error detection, the reliability would be higher, but an increase in speed
by itself would actually cut reliability. But as you say we are stuck with
the present bitrate so it is moot.
Receiving a very long wavelength with a short antenna, tiny in the case of an
R/C watch, is a compromise. And a major compromise is that the antenna has a
very narrow bandwidth. Narrow bandwidth = low bit rate (ref: Shannon).
Post by R.L. Horn
The big problem is that there's no provision for error detection (there may
be a parity bit, but I can't find it in any of the NIST documentation).
Yeah - I have an Eton clock/radio. Every now and then it is 8 minutes off...
Have not had this problem with my watches yet.

Walt
-...-
Walt Spector
(w6ws at earthlink dot net)
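To put rough numbers on the narrow-bandwidth point above: Shannon's capacity
formula C = B * log2(1 + S/N) caps the achievable bit rate. The figures
below are illustrative guesses, not measurements of any real watch antenna:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A tiny ferrite-bar antenna tuned to 60 kHz passes only a few hertz
# of bandwidth; even at a decent SNR the capacity is tiny.
print(shannon_capacity(10, 3))     # 10 Hz at SNR 3 -> 20.0 bps
print(shannon_capacity(10, 0.1))   # weak signal -> under 1.4 bps
```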
Chris Malcolm
2004-01-10 11:23:39 UTC
Permalink
Post by Jack Denver
I agree that in this age of broadband and fiber optics, 1 BPS seems like a
ridiculously low bitrate, but the bitrate is sufficient for the purpose of
transmitting a time signal.
The important thing about a low bitrate is that it allows lower
transmitter power to be used for a given quality of reception at a
given distance. Using a higher bit rate than strictly necessary,
especially in long distance wide coverage broadcasts, is very
expensive folly. It's got nothing to do with the age of the
technology. GPS signals are very low bitrate for the same reason.

--
Chris Malcolm ***@infirmatics.ed.ac.uk +44 (0)131 651 3445 DoD #205
IPAB, Informatics, JCMB, King's Buildings, Edinburgh, EH9 3JZ, UK
[http://www.dai.ed.ac.uk/homes/cam/]
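The power argument above can be made concrete: with transmitter power fixed,
the energy per bit is Eb = P/R, so cutting the bit rate R raises the per-bit
energy (and hence the Eb/N0 link margin) proportionally. A sketch with
made-up numbers:

```python
import math

def eb_n0_db(received_power_w, bit_rate_bps, noise_density_w_per_hz):
    """Eb/N0 in decibels: energy per bit over noise spectral density."""
    energy_per_bit = received_power_w / bit_rate_bps
    return 10 * math.log10(energy_per_bit / noise_density_w_per_hz)

# Same received power, two bit rates (all values illustrative):
slow = eb_n0_db(1e-15, 1, 1e-20)    # 1 bps
fast = eb_n0_db(1e-15, 50, 1e-20)   # 50 bps
print(round(slow - fast, 1))        # ~17.0 dB advantage for the slow signal
```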
Kent Betts
2004-01-06 21:39:06 UTC
Permalink
I think a superior approach would be to standardize at least the data part
of the time services, send it tone encoded, and let the watch update
acoustically via an internal microphone and an external radio and speaker.
That way the watch would not need a receiver inside the watch, and it would
be possible to use a better antenna (on the radio). And the radio could be
tuned to whatever signal was available.

The downside would be you would have to carry a radio around to set your
watch.
Jack Denver
2004-01-07 00:02:10 UTC
Permalink
Given the growing ubiquity of internet connections, I think a better
solution would be to have a watch that synched to the internet via some
common wireless method e.g. Bluetooth. A Bluetooth USB adapter is about the
size of a house key and can plug into just about any computer, PC or Mac.
Eventually, most computers will come standard with Bluetooth so that
printers, etc. can be connected without cables. You'd also need some kind of
software to broadcast the time over Bluetooth, but if such watches became
common then the signal would be present in every office, every cafe with a
wireless hotspot, every library, university, train station etc.

I'm not sure whether the form factor and power needs of Bluetooth chips
would allow this at present but given the size of the Bluetooth USB adapter
it would not be a major task to shrink it to fit inside a watch.
Post by Kent Betts
I think a superior approach would be to standardize at least the data part
of the time services, send it tone encoded, and let the watch update
acoustically via an internal microphone and an external radio and speaker.
That way the watch would not need a receiver inside the watch, and it would
be possible to use a better antenna (on the radio). And the radio could be
tuned to whatever signal was available.
The downside would be you would have to carry a radio around to set your
watch.
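For comparison, the internet-synch idea above amounts to an SNTP query.
Below is a minimal sketch of the wire format (client-mode request, parsing
only the transmit-timestamp seconds; `pool.ntp.org` is just a stand-in
server name):

```python
import socket
import struct

NTP_EPOCH_OFFSET = 2208988800  # seconds between the 1900 NTP epoch and 1970 Unix epoch

def build_sntp_request():
    """48-byte SNTP request: LI=0, version=3, mode=3 (client)."""
    return b"\x1b" + 47 * b"\x00"

def parse_transmit_time(reply):
    """Unix time from the transmit-timestamp seconds field (bytes 40-43)."""
    seconds_since_1900 = struct.unpack("!I", reply[40:44])[0]
    return seconds_since_1900 - NTP_EPOCH_OFFSET

def sntp_time(server="pool.ntp.org", timeout=2.0):
    """One UDP round trip to port 123; raises a timeout error on no reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(build_sntp_request(), (server, 123))
        reply, _ = s.recvfrom(48)
    return parse_transmit_time(reply)
```

A real watch would of course need the fractional-seconds field and some
round-trip-delay compensation as well; this only shows the basic exchange.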
Kent Betts
2004-01-07 06:38:17 UTC
Permalink
"Jack Denver"
I think a better
solution would be to have a watch that synched...via..Bluetooth.
Yeah that could work. When you set up Bluetooth in your machine it could
display a msg "Do you want Bluetooth to transmit time signals Y/N?"
j***@gmail.com
2016-08-26 23:27:14 UTC
Permalink
Post by Alasdair Baxter
I was looking for a radio-controlled watch which works all over the
world. Does anyone know if such a beast exists? According to my
information there are 4 standard time signals in the whole world and
these are UK, Germany, United States and Japan. I find it hard to
accept that developed countries like Australia, New Zealand, Canada
and Russia don't have an atomic clock which transmits a standard time
signal.
Can anyone enlighten me?
--
Alasdair Baxter, Nottingham, UK.Tel +44 115 9705100; Fax +44 115 9423263
"It's not what you say that matters but how you say it.
It's not what you do that matters but how you do it"
A full summary of all the options available to those in Australia wanting to synchronise their atomic time watches:

http://www.twentytwoten.com/666/will-atomic-time-watch-sync-australia/

There are quite a few options including Apps that are helpful and can be used all over the world!