HomeCinemaBanter forum » Home cinema newsgroups » High definition TV
Clock accuracy & auto setting ?



 
 
  #1  
Old April 19th 06, 03:50 AM posted to alt.tv.tech.hdtv
external usenet poster
 
Posts: n/a
Default Clock accuracy & auto setting ?

I think my Sanyo HDTV switches to analog to set its clock.

ATSC time is always late?


  #2  
Old April 19th 06, 04:19 AM posted to alt.tv.tech.hdtv
Default Clock accuracy & auto setting ?

Brass Hopper wrote:
I think my Sanyo HDTV switches to analog to set its clock.

ATSC time is always late?


How late is late? 1 second? 1 minute? I can imagine it being late a few
seconds because of processing delays.

GG

  #3  
Old April 21st 06, 07:49 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

ATSC and DVB-T (DVB in general) are devoid of a 64-bit (or 80-bit) clock
packet (based on NTP and 'Unix Time').

Forcing people to rely upon GNSS (GLONASS, GPS, Galileo) for a time
signal is rather immoral when TV transmitters (and radio too, remember RDS)
pump out many megawatts of signal each day (globally).

I think my Sanyo HDTV switches to analog to set its clock.

ATSC time is always late?



  #4  
Old April 21st 06, 09:40 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

On Thu, 20 Apr 2006 22:49:52 -0700, "Max Power" wrote:
I think my Sanyo HDTV switches to analog to set its clock.

ATSC time is always late?


ATSC and DVB-T (DVB in general) are devoid of a 64-bit (or 80-bit) clock
packet (based on NTP and 'Unix Time').

Forcing people to rely upon GNSS (GLONASS, GPS, Galileo) for a time
signal is rather immoral when TV transmitters (and radio too, remember RDS)
pump out many megawatts of signal each day (globally).


Immoral? That's a bit strong, innit? You don't NEED to rely on satellites for
time signals. There are many terrestrial radio time signals, such as WWV, WWVB,
WWVH, CHU, MSF, DCF77, and others. CDMA cell phone towers broadcast a time signal.

  #5  
Old April 26th 06, 02:09 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

In North America, the FM RDS time service is of very low quality.
In Europe (I understand) this is not the case with RDS.
Stations running RDS should be mandated by law to provide a quality
service -- based on transmitter power and coverage area.
Over time RDS's time service should be uniform.
DRM (on MW and SW) time service is of a lower quality than RDS -- but could
be upgraded with a specialized "80 bit" NTP-UNIX time packet.

ATSC and DVB-T (& DVB-H/M) need a uniform ~"80 bit"...~"128 bit" time packet
service that is well thought out.
Futureproofing is important, so probably 128 bits or more is preferable.

LF time services are OK, and are necessary over large transnational
regions -- like Sub-Saharan Africa, Australasia and South America ... but
any new LF service needs to be more technologically advanced than WWVB, MSF
or DCF77 and its Swiss twin. In these regions 10 LF frequencies need to be
allocated, but the signal to be transmitted needs to be more modern than
WWVB or DCF77 -- maybe using some form of low-complexity PSK or
low-complexity QAM and 240 Hz to 480 Hz of bandwidth. The signal must be
futureproofed -- as above.

I wish I had all of the email addresses of the Canadian National Research
Council -- so that I could email them my CHU upgrade proposal at
* http://CBC.am/CHU.htm
* This proposal, if implemented, would probably require a signal upgrade
to account for 2 transmitters on all of the frequencies used.
* A properly designed signal upgrade could make CHU a more powerful
[technological design] than WWV[H] -- but this would probably require 2 or 3
years of experimentation. I would love to see support for high-speed ECC
polytone and MD63 (MD?) with its Walsh coding.

ATSC time is always late?

========================
ATSC and DVB-T (DVB in general) are devoid of a 64-bit (or 80-bit) clock
packet (based on NTP and 'Unix Time').

Forcing people to rely upon GNSS (GLONASS, GPS, Galileo) for a time
signal is rather immoral when TV transmitters (and radio too, remember RDS)
pump out many megawatts of signal each day (globally).

/////////////////////////////////////////////////
Immoral? That's a bit strong, innit? You don't NEED to rely on satellites for
time signals. There are many terrestrial radio time signals, such as WWV, WWVB,
WWVH, CHU, MSF, DCF77, and others. CDMA cell phone towers broadcast a time signal.



  #6  
Old April 27th 06, 06:24 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

In article "Max Power" writes:
In North America, the FM RDS time service is of very low quality.
In Europe (I understand) this is not the case with RDS.
Stations running RDS should be mandated by law to provide a quality
service -- based on transmitter power and coverage area.
Over time RDS's time service should be uniform.
DRM (on MW and SW) time service is of a lower quality than RDS -- but could
be upgraded with a specialized "80 bit" NTP-UNIX time packet.

ATSC and DVB-T (& DVB-H/M) need a uniform ~"80 bit"...~"128 bit" time packet
service that is well thought out.
Futureproofing is important, so probably 128 bits or more is preferable.


128 bits? What do you want - to specify the time of the heat death of the
universe (long after our sun dies) to nanosecond resolution, then be able to
tell the time of the death of the next universe (if there is one)?

64 bit NTP is probably quite adequate, and definitely enough bits if
one doesn't insist on the NTP nanosecond precision.

However, the time information doesn't need to be NTP, or IP based.
By the way, NTP is not a part of unix.
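The 64-bit NTP format in question packs 32 bits of whole seconds since 1900 next to 32 bits of binary fraction (units of 2^-32 s, roughly 233 picoseconds). A minimal Python sketch of the conversion -- the function names are illustrative, not from any standard library:

```python
NTP_UNIX_OFFSET = 2_208_988_800  # seconds from 1900-01-01 to 1970-01-01

def ntp_to_unix(ntp64):
    """Split a 64-bit NTP timestamp and rebase it to the Unix epoch."""
    seconds = ntp64 >> 32           # upper 32 bits: whole seconds since 1900
    fraction = ntp64 & 0xFFFFFFFF   # lower 32 bits: units of 2**-32 s (~233 ps)
    return (seconds - NTP_UNIX_OFFSET) + fraction / 2**32

def unix_to_ntp(unix_time):
    """Inverse: Unix seconds (possibly fractional) into one 64-bit NTP word."""
    seconds = int(unix_time) + NTP_UNIX_OFFSET
    fraction = int((unix_time - int(unix_time)) * 2**32)
    return (seconds << 32) | fraction
```

Note that the 32-bit seconds field itself wraps in 2036 (the NTP era rollover), which is part of why extra bits keep getting proposed.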


Presently, the analog TV stations transmitting time don't even seem
to consider it worth keeping the clocks set accurately -- some use a
PC's clock, with no external source to deal with its drift.

The time sent with ATSC seems to be random as well. Some stations
seem to have gone ahead with DST, but the only way to get the program
data to be correct is to manually force (and set) the receiver to use
local standard time.

Of course, not using UTC is astoundingly stupid -- folks who live next
to a time zone boundary are out of luck if some stations are on each side.


However, counting on broadcasters to get the time right is a fantasy.
They cannot even get their program guide information correct in the data
stream.


Alan
  #7  
Old April 29th 06, 04:14 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

In North America, the FM RDS time service is of very low quality.
In Europe (I understand) this is not the case with RDS.
Stations running RDS should be mandated by law to provide a quality
service -- based on transmitter power and coverage area.
Over time RDS's time service should be uniform.
DRM (on MW and SW) time service is of a lower quality than RDS -- but could
be upgraded with a specialized "80 bit" NTP-UNIX time packet.

ATSC and DVB-T (& DVB-H/M) need a uniform ~"80 bit"...~"128 bit" time packet
service that is well thought out.
Futureproofing is important, so probably 128 bits or more is preferable.


No, I am not suggesting this.
64 extra bits can help futureproof a signal, however...
================
128 bits? What do you want - to specify the time of the heat death of the
universe (long after our sun dies) to nanosecond resolution, then be able to
tell the time of the death of the next universe (if there is one)?


Turning NTP into a very high precision signal is possible -- but I advocate
split versions of the signal.
Otherwise, for consumers the extra 64 bits could be used for other
time-related services and futureproofing.
================
64 bit NTP is probably quite adequate, and definitely enough bits if
one doesn't insist on the NTP nanosecond precision.


NTP does have its origins with Unix time, and Unix / Linux / Minix ... need
to be upgraded to account for 64-bit time [to help avert the 2038 crisis] --
but not NTP time. However, Unix time and NTP have had near-perfect
interoperability for at least 2 decades...
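The 2038 limit is easy to check: a signed 32-bit time_t counts at most 2^31 - 1 seconds past the 1970 epoch. A quick illustration in Python:

```python
from datetime import datetime, timezone

MAX_INT32 = 2**31 - 1  # largest value a signed 32-bit time_t can hold

# The last representable instant before a signed 32-bit counter wraps:
rollover = datetime.fromtimestamp(MAX_INT32, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```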
================
However, the time information doesn't need to be NTP, or IP based.
By the way, NTP is not a part of unix.


This 'time ignorance' is socially acceptable now, but bad public policy.
Lack of use of UTC is bad policy, but this is an ATSC problem -- not a
DVB-T problem.
HDTV is very late beta -- but when HDTV is omnipresent, the lack of accurate
time availability should not be the case.
Your HDTV set should be able to sync its own clock after being turned on for
5 minutes.
HDTV sets should be able to set the clocks of other devices [VCRs, DVDs,
set-top boxes] using whatever [TV] connection technology exists in
future.
Commercial stations not providing an accurate time signal [to within 1.0 s]
should be punished (based on transmitter power) -- the ethics of this being
that keeping broadcast clock accuracy requires virtually no cost
overhead and provides a public service function. A lot of ATSC transmission
energy goes into sending 'empty packets' or 'nulls' -- replacing 1% {per
hour} of nulls with time signal packets would yield instant time sync.
I am considering the UK, Eire, Australia, NZ and Canada in my point -- not
just the US.
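The 1%-of-nulls arithmetic is easy to check, assuming the standard ATSC payload rate of about 19.39 Mbit/s and 188-byte MPEG-2 transport packets:

```python
ATSC_BITRATE = 19_392_658      # ATSC 8-VSB payload rate, bits per second
TS_PACKET_BITS = 188 * 8       # one MPEG-2 transport stream packet

packets_per_second = ATSC_BITRATE / TS_PACKET_BITS  # ~12,894 packets/s total
time_packets = 0.01 * packets_per_second            # ~129 packets/s at 1%
```

So diverting even 1% of the multiplex would put over a hundred time packets on air every second -- far more than a receiver needs for sub-second sync.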
================
Presently, the analog TV stations transmitting time don't even seem
to consider it worth keeping the clocks set accurately -- some use a
PC's clock, with no external source to deal with its drift.

The time sent with ATSC seems to be random as well. Some stations
seem to have gone ahead with DST, but the only way to get the program
data to be correct is to manually force (and set) the receiver to use
local standard time.

Of course, not using UTC is astoundingly stupid -- folks who live next
to a time zone boundary are out of luck if some stations are on each side.

However, counting on broadcasters to get the time right is a fantasy.
They cannot even get their program guide information correct in the data
stream.



  #8  
Old April 29th 06, 06:25 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...


Max Power wrote:
[SNIP]

However, counting on broadcasters to get the time right is a fantasy.
They cannot even get their program guide information correct in the data
stream.


You actually think broadcasters DON'T know what time it is? They know
better than nearly anyone else precisely what time it is. The fact that
they don't make an effort for the 25 of us actually watching the
digital feeds means what? You don't remember last month's thread about
DTV being a sideline?

http://groups.google.com/group/alt.t...35cf5b6d6f8c77

Be patient. They'll get to it as it gets closer to analog cutoff. I'm
sure you can sort out what time it is on your own. Maybe this will help
in the meantime.

http://www.time.gov/timezone.cgi?Pacific/d/-8/java

Perhaps the fines should be on the people out here who can't figure out
what time it is. Just what broadcasters need. More regulations.

GG

  #9  
Old May 1st 06, 11:33 AM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

On Tue, 25 Apr 2006 17:09:33 -0700, Max Power wrote:

[SNIP]

LF time services are OK, and are necessary over large transnational
regions -- like Sub-Saharan Africa, Australasia and South America ... but
any new LF service needs to be more technologically advanced than WWVB, MSF
or DCF77 and its Swiss twin. In these regions 10 LF frequencies need to be
allocated, but the signal to be transmitted needs to be more modern than
WWVB or DCF77 -- maybe using some form of low-complexity PSK or
low-complexity QAM and 240 Hz to 480 Hz of bandwidth. The signal must be
futureproofed -- as above.


OK? What other technology can provide such accurate signals at such a low
cost? The simple amplitude-modulated nature of the LW signals makes for
such low-cost implementations. A PSK or QAM modulation is going to put the
cost of hardware to decode the signal up too much to be useful.

If you need greater accuracy than a LW signal can provide then
GPS/Glonass/Galileo is the best way to go.

Furthermore, since 1983 the DCF77 signal has been phase modulated in
addition to amplitude modulated, further improving the time signal while
being 100% backwards compatible.

What I will admit is that all LW time signals are hugely wasteful of the
available bits, as they all seem to use BCD to encode the time. It would
be much better to use straight binary, as then you need just 30 bits to
encode the date/time to the nearest minute for the next 1000 years while
keeping the time/day/year separate for simple decoding (11 bits for the
minute of the day, 9 bits for the day of the year and 10 bits for the
year). Throw in another five bits for DUT1, another bit for daylight
savings and something for signalling leap seconds, and you still have
20+ bits for error checking and correction and future growth.
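That layout (11 + 9 + 10 + 5 + 1 = 36 bits) can be sketched directly; the field order below is an arbitrary choice for illustration:

```python
def pack_time(minute_of_day, day_of_year, year_offset, dut1, dst):
    """Pack the proposed straight-binary fields into one 36-bit word."""
    assert 0 <= minute_of_day < 2**11   # 0..1439 fits in 11 bits
    assert 0 <= day_of_year < 2**9      # 0..365 fits in 9 bits
    assert 0 <= year_offset < 2**10     # a 1000-year span fits in 10 bits
    assert 0 <= dut1 < 2**5             # 5 bits for DUT1
    word = minute_of_day
    word = (word << 9) | day_of_year
    word = (word << 10) | year_offset
    word = (word << 5) | dut1
    word = (word << 1) | (1 if dst else 0)
    return word  # 36 bits used; a 60-bit minute frame would leave 24 for ECC

def unpack_time(word):
    """Recover the five fields from a packed 36-bit word."""
    dst = word & 1
    dut1 = (word >> 1) & 0x1F
    year_offset = (word >> 6) & 0x3FF
    day_of_year = (word >> 16) & 0x1FF
    minute_of_day = (word >> 25) & 0x7FF
    return minute_of_day, day_of_year, year_offset, dut1, dst
```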

What advantages do the SW time signals have over the LW ones? As far as I
can see, between LW signals and satellite systems there is only a tiny
market left for any other time signal system.


JAB.

--
Jonathan A. Buzzard Email: jonathan (at) buzzard.me.uk
Northumberland, United Kingdom. Tel: +44 1661-832195

  #10  
Old May 1st 06, 02:42 PM posted to alt.satellite.gps,alt.tv.tech.hdtv,comp.protocols.time.ntp
Default Clock accuracy & auto setting : digital television does a crap job of providing time services...

LF time services are OK, and are necessary over large transnational
regions -- like Sub-Saharan Africa, Australasia and South America ... but
any new LF service needs to be more technologically advanced than WWVB, MSF
or DCF77 and its Swiss twin. In these regions 10 LF frequencies need to be
allocated, but the signal to be transmitted needs to be more modern than
WWVB or DCF77 -- maybe using some form of low-complexity PSK or
low-complexity QAM and 240 Hz to 480 Hz of bandwidth. The signal must be
futureproofed -- as above.

================================
OK? What other technology can provide such accurate signals at such a low
cost? The simple amplitude-modulated nature of the LW signals makes for
such low-cost implementations. A PSK or QAM modulation is going to put the
cost of hardware to decode the signal up too much to be useful.


For consumer purposes (save cellular telephony) GNSS technology is too
costly.
NTP is OK for computers, but has a learning curve.
===============
If you need greater accuracy than a LW signal can provide then
GPS/Glonass/Galileo is the best way to go.


I am unaware of WWVB doing this.
If WWVB does this, its signalling must be very quaint.
Typical 1983 signalling methods (bitfields transmitted) would probably not
pass modern muster.
==============
Further more since 1983 the DCF77 signal has been phase modulated in
addition to amplitude modulated further improving the time signal while
being 100% backwards compatible.


BCD is the universal flaw of LW time stations.
NTP could fix the problem, but NTP is not perfect for this [without
tweaking].
BCD could be said to use odd signalling in not using {seconds} within a day,
albeit seconds are not suitable for LW signalling rates. I assume both NTP
and BCD signalling methods are amenable to 'bit averaging' for the LW time
services' slow signal rate and error correction conditions.
==============
What I will admit is that all LW time signals are hugely wasteful of the
available bits, as they all seem to use BCD to encode the time. It would
be much better to use straight binary, as then you need just 30 bits to
encode the date/time to the nearest minute for the next 1000 years while
keeping the time/day/year separate for simple decoding (11 bits for the
minute of the day, 9 bits for the day of the year and 10 bits for the
year). Throw in another five bits for DUT1, another bit for daylight
savings and something for signalling leap seconds, and you still have
20+ bits for error checking and correction and future growth.
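The BCD overhead described above is easy to quantify with a naive nibble-per-digit sketch (real LW formats trim a few bits, but the gap to straight binary remains):

```python
def bcd_bits(value):
    """Bits used when each decimal digit of `value` gets its own 4-bit nibble."""
    return 4 * len(str(value))

def binary_bits(max_value):
    """Bits needed to cover 0..max_value in straight binary."""
    return max_value.bit_length()

# Minute-of-hour 0..59: two BCD nibbles vs six binary bits.
print(bcd_bits(59), binary_bits(59))  # 8 6
```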


DRM on MW and SW only sends the Julian date and time.
There is some BCD coding similar to WWVB and DCF77 -- for time and date.
However, there is no [universal] NTP or Unix time packet in DRM.
SW signals do cover areas LW cannot reach (like Antarctica) -- and with
adequate frequency diversity provide a higher time resolution vs LW. DRM can
cope with ionospheric delay by indicating TX (and at the receiver RX) coords.
LW time stations are trapped in late-1960s-style signalling.
============
What advantages do the SW time signals have over the LW ones? As far as I
can see between LW signals and satellite systems there is only a tiny
market left for any other time signal system.



 



