Why interlaced HDTV?



 
 
#1 - Staiger - August 16th 05, 09:29 PM

In a discussion today with a colleague I argued that it was illogical to
carry interlacing forward into the forthcoming HD standards. After some
debate it became clear that neither of us understood what we were talking
about!

I believe that interlacing was introduced decades ago to provide a
flicker-free image whilst still requiring only 25 (or 30 in USA) frames to
be broadcast per second. In other words, a primitive way of controlling
bandwidth requirements. Is this right or wrong? And is there any more to
it?

But now that we have 100Hz TVs, digital transmissions, and various amounts
of digital processing at both the broadcaster and inside a modern TV, I
can't understand what interlacing brings to the party, apart from extra
complications.

Backward compatibility doesn't seem a very strong argument, as the HD
interlaced standard appears to be higher definition than 'legacy' interlaced
TVs can manage anyway.

The people who design these standards aren't stupid, so obviously I'm
missing something. Can anyone elucidate?

Thanks!

Staiger


#2 - DAB sounds worse than FM - August 16th 05, 10:47 PM

Staiger wrote:
> In a discussion today with a colleague I argued that it was illogical
> to carry interlacing forward into the forthcoming HD standards. After
> some debate it became clear that neither of us understood what
> we were talking about!
>
> I believe that interlacing was introduced decades ago to provide a
> flicker-free image whilst still requiring only 25 (or 30 in USA)
> frames to be broadcast per second. In other words, a primitive way
> of controlling bandwidth requirements. Is this right or wrong? And
> is there any more to it?



FM was introduced about 50 years ago, but it remains the highest quality
source of radio in the UK.

The argument between 720p and 1080i is this:

720p (720 lines progressive) provides better motion portrayal and
doesn't suffer from interline twitter.

1080i has a higher static resolution.

For example:

Resolution of 720p is:

1280 x 720 = 921,600 pixels

Resolution of 1080i is (the 0.741 factor allows for the reduced effective
vertical resolution of an interlaced picture):

1920 x 1080 x 0.741 ≈ 1,536,000 pixels

Therefore, 1080i has a 67% higher static resolution than 720p.
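
A quick check of that arithmetic in Python (a minimal sketch; the 0.741
interlace factor is simply the figure quoted above, and published estimates
of the effective resolution of an interlaced picture vary):

# Static-resolution comparison using the figures from this post.
def static_pixels(width, height, interlace_factor=1.0):
    """Effective static resolution in pixels."""
    return width * height * interlace_factor

p720 = static_pixels(1280, 720)           # progressive: full resolution
i1080 = static_pixels(1920, 1080, 0.741)  # interlaced: reduced vertical detail

print(f"720p:  {p720:,.0f} pixels")       # 921,600
print(f"1080i: {i1080:,.0f} pixels")      # 1,536,538, i.e. roughly 1,536,000
print(f"1080i advantage: {100 * (i1080 / p720 - 1):.0f}%")  # ~67%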




--
Steve - www.digitalradiotech.co.uk - Digital Radio News & Info

Find the cheapest Freeview, DAB & MP3 Player Prices:
http://www.digitalradiotech.co.uk/fr..._receivers.htm
http://www.digitalradiotech.co.uk/da...tal_radios.htm
http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm
http://www.digitalradiotech.co.uk/mp...e_capacity.htm


#3 - Agamemnon - August 17th 05, 12:13 AM


"Staiger" wrote in message
...
In a discussion today with a colleague I argued that it was illogical to
carry interlacing forward into the forthcoming HD standards. After some
debate it became clear that neither of us understood what we were talking
about!

I believe that interlacing was introduced decades ago to provide a
flicker-free image whilst still requiring only 25 (or 30 in USA) frames to
be broadcast per second. In other words, a primitive way of controlling
bandwidth requirements. Is this right or wrong? And is there any more to
it?

But now that we have 100Hz TVs, digital transmissions, and various amounts
of digital processing at both the broadcaster and inside a modern TV, I
can't understand what interlacing brings to the party, apart from extra
complications.


Ahhhh, but you forgot one thing. Interlaced is actually 50Hz, and the motion
looks more natural than non-interlaced at 25Hz, except when showing the bend
on an athletics track, in which case the entire picture breaks up.

Don't believe the crap they give out about film at 24 fps being enough to
deceive the human eye. Film at 12 fps, which they use for cartoons, can do
that as well, but neither of them looks natural. 50 fps is the bare minimum
which can fool your brain into thinking you are watching natural-looking
motion (just as long as it's not showing interlaced bends on athletics
tracks).


> Backward compatibility doesn't seem a very strong argument, as the HD
> interlaced standard appears to be higher definition than 'legacy'
> interlaced TVs can manage anyway.
>
> The people who design these standards aren't stupid, so obviously I'm
> missing something. Can anyone elucidate?
>
> Thanks!
>
> Staiger



#4 - Roderick Stewart - August 17th 05, 12:28 AM

In article , Staiger wrote:
> In a discussion today with a colleague I argued that it was illogical to
> carry interlacing forward into the forthcoming HD standards. After some
> debate it became clear that neither of us understood what we were talking
> about!
>
> I believe that interlacing was introduced decades ago to provide a
> flicker-free image whilst still requiring only 25 (or 30 in USA) frames to
> be broadcast per second. In other words, a primitive way of controlling
> bandwidth requirements. Is this right or wrong? And is there any more to
> it?
>
> But now that we have 100Hz TVs, digital transmissions, and various amounts
> of digital processing at both the broadcaster and inside a modern TV, I
> can't understand what interlacing brings to the party, apart from extra
> complications.
>
> Backward compatibility doesn't seem a very strong argument, as the HD
> interlaced standard appears to be higher definition than 'legacy' interlaced
> TVs can manage anyway.
>
> The people who design these standards aren't stupid, so obviously I'm
> missing something. Can anyone elucidate?


Interlace gives more than just a reduction in flicker, i.e. a doubling of the
frequency at which brightness variations occur. It also doubles the frequency
at which picture information is updated, which makes moving objects appear to
move in a much smoother and more lifelike way. Even though only half the
picture lines are updated each field, they are updated twice as often as if
interlace were not used, and this is enough to give the smoothing effect.
Vertical edges moving sideways are more ragged, because they are depicted
using half as many lines as when standing still, but this is similar to the
blurring of moving objects which occurs naturally in real life.
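
To put rough numbers on this, a minimal Python sketch (assuming 576 active
lines per frame, as in the current 625-line system; the figures are purely
illustrative):

# Same line rate, two ways: 25 full frames/s progressive versus
# 50 half-frames (fields)/s interlaced.
ACTIVE_LINES = 576  # active lines per full frame in the 625-line system

systems = {
    "25 fps progressive": (25, ACTIVE_LINES),        # updates/s, lines each
    "50 fields/s interlaced": (50, ACTIVE_LINES // 2),
}

for name, (updates, lines) in systems.items():
    print(f"{name}: {updates} updates/s x {lines} lines "
          f"= {updates * lines} lines/s")

# Both transmit 14,400 lines/s, but the interlaced system refreshes
# brightness and motion information twice as often.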

Rod.

#5 - Agamemnon - August 17th 05, 12:31 AM


"DAB sounds worse than FM" wrote in message
...
Staiger wrote:
In a discussion today with a colleague I argued that it was illogical
to carry interlacing forward into the forthcoming HD standards. After
some debate it became clear that neither of us understood what
we were talking about!

I believe that interlacing was introduced decades ago to provide a
flicker-free image whilst still requiring only 25 (or 30 in USA)
frames to be broadcast per second. In other words, a primitive way
of controlling bandwidth requirements. Is this right or wrong? And
is there any more to it?



FM was introduced about 50 years ago, but it remains the highest quality
source of radio in the UK.

The argument between 720p and 1080i is this:

720p (720 lines progressive) provides better motion portrayal and


Only when showing sport or very fast action, and even then it still doesn't
look as natural as 720i (50Hz, i.e. 50 full 720-line frames per second) when
showing regular-speed motion, since at that speed the refresh rate is
equivalent to 100Hz.

> doesn't suffer from interline twitter.
>
> 1080i has a higher static resolution.


And also provides more natural looking motion at the same frame rate, but
only at 504 lines resolution. But considering the Americans had to put up
with that standard for years, it's not bad. The only trouble is that 1080i is
a 25Hz system, so 720p at 50Hz will surpass it on most content. If 1080i were
at 50Hz then it would outdo 720p.

Now what I don't understand is why the idiots who designed the MPEG-4 system
didn't combine progressive and interlaced encoding, so that when fast action
was being shown it would switch to progressive at half the frame rate to
eliminate twitter; when normal-speed action was shown it would switch to
interlaced at 50Hz to give natural-looking motion; and when still frames or
very slow motion were being shown it would switch to progressive again to
improve resolution.
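
As a purely hypothetical sketch of that kind of switching, in Python (the
thresholds and mode labels are invented for illustration; nothing like this
exists in the MPEG-4 standard):

# Hypothetical scan-mode selector along the lines suggested above.
def choose_mode(motion_level):
    """Pick a coding mode from a 0.0-1.0 estimate of scene motion."""
    if motion_level > 0.7:
        # fast action: progressive at half the frame rate to avoid twitter
        return "progressive @ 25 fps"
    if motion_level > 0.1:
        # ordinary movement: interlaced for smoother motion portrayal
        return "interlaced @ 50 fields/s"
    # near-static scenes: progressive for full vertical resolution
    return "progressive @ 25 fps (full static resolution)"

for m in (0.0, 0.4, 0.9):
    print(f"motion {m:.1f} -> {choose_mode(m)}")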


> For example:
>
> Resolution of 720p is:
>
> 1280 x 720 = 921,600 pixels
>
> Resolution of 1080i is:
>
> 1920 x 1080 x 0.741 ≈ 1,536,000 pixels
>
> Therefore, 1080i has a 67% higher static resolution than 720p.




> [snip signature]



#6 - Dave Plowman (News) - August 17th 05, 12:52 AM

In article ,
DAB sounds worse than FM wrote:
> FM was introduced about 50 years ago, but it remains the highest quality
> source of radio in the UK.


You never miss a turn, do you? ;-)

--
*Why are they called apartments, when they're all stuck together? *

Dave Plowman London SW
To e-mail, change noise into sound.
#7 - tony sayer - August 17th 05, 01:11 AM

In article , Dave Plowman (News) writes:
> In article ,
> DAB sounds worse than FM wrote:
>> FM was introduced about 50 years ago, but it remains the highest quality
>> source of radio in the UK.
>
> You never miss a turn, do you? ;-)


Tis true though.....


--
Tony Sayer

#8 - T1000 - August 17th 05, 01:32 AM

Agamemnon wrote:
> "DAB sounds worse than FM" wrote in message ...
>> Staiger wrote:
>>> In a discussion today with a colleague I argued that it was illogical
>>> to carry interlacing forward into the forthcoming HD standards. After
>>> some debate it became clear that neither of us understood what
>>> we were talking about!
>>>
>>> I believe that interlacing was introduced decades ago to provide a
>>> flicker-free image whilst still requiring only 25 (or 30 in USA)
>>> frames to be broadcast per second. In other words, a primitive way
>>> of controlling bandwidth requirements. Is this right or wrong? And
>>> is there any more to it?
>>
>> FM was introduced about 50 years ago, but it remains the highest quality
>> source of radio in the UK.
>>
>> The argument between 720p and 1080i is this:
>>
>> 720p (720 lines progressive) provides better motion portrayal and
>
> Only when showing sport or very fast action, and even then it still doesn't
> look as natural as 720i (50Hz, i.e. 50 full 720-line frames per second) when
> showing regular-speed motion, since at that speed the refresh rate is
> equivalent to 100Hz.


I think you are mistaken. The American 720p standard used for sport is
720p60 (i.e. full progressive 720-line frames, at 60 frames per second).



>> doesn't suffer from interline twitter.
>>
>> 1080i has a higher static resolution.
>
> And also provides more natural looking motion at the same frame rate, but
> only at 504 lines resolution. But considering the Americans had to put up
> with that standard for years, it's not bad. The only trouble is that 1080i
> is a 25Hz system, so 720p at 50Hz will surpass it on most content. If 1080i
> were at 50Hz then it would outdo 720p.


The American 1080i system (used for everything other than sport) is
usually 1080i60, i.e. 60 fields per second. So it's a 60Hz system, not
25Hz. The screen is updated with new information 60 times per second
(though only every other line is sent, each 60th of a second). It's
similar to the way current interlaced TV works, but at a higher resolution.
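
A minimal Python sketch of that interleaving, a simple 'weave' of two fields
into one frame (real deinterlacers must also cope with motion between the
fields; this only shows the line ordering):

# Weave two interlaced fields back into one full frame.
def weave(top_field, bottom_field):
    """Interleave the top field's lines (0, 2, 4, ...) with the bottom's."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even-numbered line
        frame.append(bottom_line)  # odd-numbered line, sent 1/60 s later
    return frame

top = [f"line {n}" for n in range(0, 6, 2)]     # field 1
bottom = [f"line {n}" for n in range(1, 6, 2)]  # field 2
print(weave(top, bottom))  # ['line 0', 'line 1', ..., 'line 5']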



> [snip]




#9 - WDino - August 17th 05, 03:23 AM

This is a UK newsgroup, not US.


T1000 wrote:
> [snip]




#10 - Stephen - August 17th 05, 03:33 AM

Staiger wrote:
> In a discussion today with a colleague I argued that it was illogical to
> carry interlacing forward into the forthcoming HD standards. [snip]
> The people who design these standards aren't stupid, so obviously I'm
> missing something. Can anyone elucidate?

I think the main reason for interlace is that the higher number of lines
sounds better and will sell better, just as a 3GHz computer processor will
sell better than a 2GHz one, even if everything else about it is worse. The
"headline figure" of 1080 is what makes the difference, more than any real
advantage in the perceived quality of the picture over 720p. We might get
1080 progressive in the future, which would be the best of both worlds, but
once the technology is up to the task it will also be possible to do 2000
lines interlaced, and so on, so I'm afraid we will be stuck with interlace
for a long time, just because it makes the numbers bigger.


 



