A Home cinema forum. HomeCinemaBanter



Why interlaced HDTV?



 
 
  #51  
Old August 24th 05, 10:48 PM
Roger R
external usenet poster
 
Posts: n/a
Default


"Stan The Man" wrote in message
...
This September, Philips will be releasing their 1920x1080 "True HD"
sets. Check out the 37PF9830 for example. Around 4000 euro retail here
on the continent.


Link please



http://www.digitaldirectuk.com/produ...p?product_id=9142


Thanks for the link.
For some reason I'd thought this special reference was to a CRT set,
but of course it's another flat panel.

Roger


  #52  
Old August 24th 05, 11:21 PM
JC

On Wed, 24 Aug 2005 10:24:09 +0100, Kennedy McEwen
wrote:

After watching an LCD I'm now like this even with some 100 Hz TVs.


You need to eat more vegetables then, especially carrots!


Sorry, watching TV and healthy eating seem to be mutually exclusive.
;-)

The fact that they sell to the unwashed masses means very little in
terms of image quality - slow rise time LCD panels without any colour
management have been selling in volume too, but the picture quality is
complete crap.


I think we've already established that the great unwashed will buy
substandard products. What I'm saying is that one of the advantages of
LCD and 100 Hz TVs IS a noticeable reduction in flicker. If there was
no benefit at all I don't think 100 Hz TVs would have been on sale for
the last 15 years. I notice the difference.

Yes, if you ignore the fact that interlace is already 50% data
compression to start with! This has been debated extensively on this
group.


Not compression, removal. 576i and 1080i have half the temporal
resolution of 720p.

Coupled with
the fact that 720p has double the temporal resolution of 1080i


No it doesn't! 720p has exactly the same temporal resolution as 1080i,
they both have 50Hz refresh rates. The fact is that 1080i provides
additional information in that frame time, in the form of significantly
enhanced horizontal resolution and additional vertical resolution
through the interlace.


720p will only have the same apparent temporal resolution as 1080i if
the 1080i source and display are non-progressive. The 1080i field
resolution will only be 540 lines, though, compared to 720 with 720p.
1080i provides half the FIELD rate of 720p.
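To put some rough numbers on the two formats being argued over, here's a quick back-of-envelope sketch in Python (my own arithmetic, not anything from the thread; it counts raw luma-site pixels only, ignoring blanking and chroma subsampling):

```python
# Rough raw pixel-rate comparison of 1080i50 and 720p50.
# No blanking intervals or chroma subsampling are counted.

def pixel_rate(width, height, rate_hz):
    """Pixels delivered per second for a given raster and refresh rate."""
    return width * height * rate_hz

# 1080i50: 1920x540 fields, 50 fields/s (i.e. 25 full frames/s)
rate_1080i = pixel_rate(1920, 540, 50)

# 720p50: 1280x720 full frames, 50 frames/s
rate_720p = pixel_rate(1280, 720, 50)

print(f"1080i50: {rate_1080i / 1e6:.2f} Mpixel/s")  # 51.84
print(f"720p50:  {rate_720p / 1e6:.2f} Mpixel/s")   # 46.08
```

The raw rates are within about 12% of each other, which is why the argument comes down to how each format spends that budget (spatial detail vs full frames), not to one side simply having more data.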

making
it the only real choice for any material with movement


So all those sports stations that have adopted 720p for its better
motion rendering have got things the wrong way round?

What crap - interlace sources have been the format of choice for
movement for over half a century.


Because it's a convenient form of lossy compression in the analogue
world. It's the technology of the 1930s. We've moved on from there.

That is exactly the point: nobody would need them - if high quality
backwards compatibility were delivered. Whilst that is certainly
possible, indeed just as simple to achieve, it isn't what most flat
panels provide. Consequently the push for progressive standards
alienates about half a century of existing video heritage.


I don't follow this.


That is clear, from your previous posts. Perhaps you need to consider
it more.


1080p production converts nicely to all current and proposed
interlaced and progressive standards. Choosing a progressive format
for HD broadcast is a natural progression of this. I don't follow how
this is in any way less backwards compatible than any other new HD
standard.

The push for progressive standards is as much
about moving forward with technology as the move to high definition.

But progressive is NOT a move forward with technology in itself,
particularly when the option is between 720p and 1080i. Both formats
provide similar vertical resolution in the presence of motion, but the
interlaced option provides much higher horizontal resolution in all


You keep repeating this but it doesn't make it true. 720p has twice
as many FRAMES as 1080i.

This is flawed logic. 25 frames per second, but 50 fields per second:
the temporal resolution is exactly the same as a 50 frame per second
system. Yes, the full *spatial* resolution is not available
simultaneously with the full temporal resolution, but that is no worse
than most digital codecs, which drop horizontal and vertical resolution
when full temporal resolution is required.


25 frames displayed as 50 half fields. In future both camera and
display will be natively progressive. This will show the true limits
of a 25 FRAME interlaced system.

This no more alienates existing
video heritage than any other part of HD. HD may consign much of what
you and I are familiar with to a museum, but that's progress for you.

If it means that it cannot be rebroadcast without introduction of
artefacts then it is not progress, it is anarchy.


1080p production allows easy conversion to all lower standards. To fix
on 1080i as a broadcast standard would be to immediately throw away
half the information while introducing significant display artifacts.
720p may not have the static resolution of 1080i but wipes the floor
with it in all other respects.

I really agree that we should be looking to 1080p but as production is
moving in that area anyway it would appear logical to adopt the
broadcast standard that's the closest technical match.

Which is 1080i - simply drop every other line in alternate fields: no
interpolative downsampling (with the consequential loss of resolution
inherent in all interpolation techniques) required.


...and then try to reconstruct them in the display's memory from the
two time-different half fields, while the nature of the interlace also
causes more artifacts for a given bandwidth at the digital compression
stage. 1080i is 25 frames per second. I do not believe this to be
enough for a modern broadcast system.
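To make the line bookkeeping concrete, here's a small Python sketch (my own illustration, not from any broadcast spec) of the two downconversions being argued about. Extracting 1080i fields from a 1080p frame is pure line selection, while 1080 to 720 is a non-integer 3:2 rescale in which every other output line falls between two source lines and must be interpolated:

```python
# A "frame" here is just a list of line indices; we only track which
# source lines each conversion touches.

frame_1080p = list(range(1080))

# 1080p -> 1080i: field extraction is pure line selection, no filtering
top_field    = frame_1080p[0::2]   # lines 0, 2, 4, ... (540 lines)
bottom_field = frame_1080p[1::2]   # lines 1, 3, 5, ... (540 lines)

# 1080p -> 720p: output line i maps to source position i * 1080/720
# = i * 1.5, so every other output line lands between source lines.
positions = [i * 1080 / 720 for i in range(720)]
needs_interpolation = [p for p in positions if p != int(p)]

print(len(top_field), len(bottom_field))  # 540 540
print(len(needs_interpolation), "of 720 output lines need interpolation")
```

This is the asymmetry Kennedy's "integer spatial division" point rests on: the field split is lossless line selection, whereas the 720 rescale filters (and thus softens) every interpolated line.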

With 720p you have a downsampling of the source from 1080 to 720
at the broadcaster


Losing vertical resolution in the process - unlike sampling a 1080i
field from a 1080p frame, getting a 720p frame is not an integer spatial
division!


So how does a 540 line half field system beat a 720 line full field
system when the two half fields are effectively taken at different
points in time, resulting in major interlace artifacts when combined
to create a 1080 line full frame?

With 1080i you first have to throw away half the temporal information,


Wrong - and this is the mistake that seems to underpin most of the
"progressive is superior to interlace" logic! You do NOT throw away
half of the temporal information. What you throw away is the
information which requires *BOTH* the full temporal and the full spatial
resolution. That is very much *LESS* than a quarter of the information
in a real world image - and, given the temporal response of the eye, is
even further reduced at the point of viewing. This is why interlace was
adopted in the first place - halving the transmission bandwidth resulted
in a very small amount of the perceivable information being lost.


Half the frames are removed and those that remain are split into two
time-separated fields. This not only throws away half the available
information but introduces significant interlace artifacts on what
remains.

As I've said, interlace was a 1930s lossy compression system to
solve a problem that not only no longer exists, but actually causes
more problems with modern equipment.

transmit this at 1080i and then the domestic TV has to reconstruct a
1080 line progressive frame in memory from two 540 line halves of the
interlace each taken 1/50 of a second apart.


As already explained, that is trivial to accomplish if done correctly.


You can't create information that's not there. Some of the best
systems apparently average between the two fields and try to guess
what should have been present, but at the end of the day, if you have
two half FRAMES taken 1/50 sec apart with different content, there is
no accurate way to guess what was in the other half of each field.
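The two strategies being alluded to here can be sketched in a few lines of Python (illustrative only, not any particular set's algorithm): "weave" interleaves the two fields as captured, which is perfect for static content but combs on motion, while "bob" line-doubles a single field, avoiding combing at the cost of only field-height resolution.

```python
# A "field" here is just a list of lines. Names "weave" and "bob" are
# the common informal terms for these two deinterlacing strategies.

def weave(top, bottom):
    """Interleave two fields into one full-height frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

def bob(field):
    """Line-double a single field: each line shown twice."""
    return [line for line in field for _ in (0, 1)]

# Two 3-line fields captured 1/50 s apart; the scene moved in between,
# so "A" lines and "B" lines show different content.
top    = ["A0", "A1", "A2"]   # earlier field
bottom = ["B0", "B1", "B2"]   # later field

print(weave(top, bottom))  # ['A0', 'B0', 'A1', 'B1', 'A2', 'B2']
print(bob(top))            # ['A0', 'A0', 'A1', 'A1', 'A2', 'A2']
```

The weave output alternates lines from two different instants, which is exactly the "two half frames with different content" problem: no amount of averaging between them recovers what a single progressive exposure would have captured.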

Since these frames may
have completely different information (depending on the amount of
movement etc) this then leads to an apparent quality reduction.


Back to your error - these frames (fields actually) do not have
*completely* different information. Even in a high motion content
scene, the vast majority of the information is identical in both fields.


If the scene is static they'll have the same information. Where there
is the slightest movement they will differ. I regularly see this on
material that has been captured from conventional TV. There are some
very good software de-interlacing routines around but they always add
other artifacts.

The spatial resolution of a 1080 format is more than double that of a
720 format. Your logic appears to argue that since the temporal
resolution of interlace is half that of progressive this cancels out the
spatial advantage - but the temporal resolution of progressive is NOT
twice that of interlace. The difference between the two lies only in


When the interlaced material is converted back to progressive, 1080i
effectively contains half the temporal resolution of 720p. Yes, in the
original 1080i signal there are 50 time-separated half fields, but when
merged to create the 1080 line picture this is lost, causing a
reduction of visual resolution.

Which domestic display is currently available that has a 1080 line
vertical resolution?


Plenty. The Samsung LW46G15W springs to mind as being one of the newer
larger sets:

http://www.samsung.com/he/presscente...0000042331.asp


Rgds
Jonathan

  #53  
Old August 24th 05, 11:28 PM
JC

On Wed, 24 Aug 2005 12:47:34 +0100, Roderick Stewart
wrote:

Calling a non-interlaced signal "progressive" is simply a rhetorical
use of words to give the impression of progress to those who don't
understand what is really being described.


Oh dear. It's progressive because the scan starts at the top and
progresses down a line at a time. It's no worse a description than
interlace.

This isn't an argument over system names. It could be called Craporama
scan for all I care, I still believe (in 720p over 1080i terms) it to
be the better system.

Unfortunately the ignorant are numerous, so those who extol their
inferior system behind this particular banner may eventually win, and
the realism of television pictures will be the worse for it.


No, this is the manufacturers trying to foist the substandard 1080i
system over the generally better 720p (or 1080p) system just because
it's a larger number.

Rgds
Jonathan

  #54  
Old August 24th 05, 11:30 PM
Roderick Stewart

In article , Jc wrote:
I'm not sure how simple I have to make this but with a progressive
standard it's easy to refresh the screen as many or as few times as
you like without introducing additional artifacts. With interlace, any
change from the native refresh leads to artifacts. A progressive
broadcast on a modern pixel based screen can lead to the highest field
rate with no flicker or other artifacts.


Maybe we're talking at cross-purposes, but I feel I'm struggling to
simplify something too. *Flicker* and *intermittency* (or "jerkiness" of
moving objects) are not the same thing. Flicker in a display can be
absolutely eliminated, but the rate at which pictures are updated cannot
be increased beyond what was properly sampled by the camera, and
maintained throughout the system.

Rod.

  #55  
Old August 24th 05, 11:30 PM
Roderick Stewart

In article , Jc wrote:
Imagine taking two separate photographs at half resolution, a short
time apart of something that's moving and then trying to merge them to
get double resolution. The moving object will be in a different
position on the second field to the first.


That's right. That's the way it's been for the past 70 years and despite
the fact that you can't easily derive a full resolution still image from
it (something television was never designed for), it shows movement far
more naturally than a full resolution non-interlaced signal using the
same bandwidth.

Now imagine rapidly
flicking between these two images or showing them merged, before
moving on to the next two.


I'm not sure why you would want to do that. Why not just show them as
they are output from the camera?

With a progressive system you just reshow the same FRAME multiple
times or in the case of a memory type display such as LCD, just change
the pixels as required.


Show them as many times as you like, the picture information will only
change as often as it was scanned by the camera. Increasing the display
scanning rate without increasing the camera scanning rate or the
transmission bandwidth will only result in groups of successive displays
of the same picture information, and movement will be no less jerky than
before.

Rod.

  #56  
Old August 24th 05, 11:45 PM
Roderick Stewart

In article , Jc wrote:
Calling a non-interlaced signal "progressive" is simply a rhetorical
use of words to give the impression of progress to those who don't
understand what is really being described.


Oh dear. It's progressive because the scan starts at the top and
progresses down a line at a time. It's no worse a description than
interlace.

This isn't an argument over system names. It could be called Craporama
scan for all I care, I still believe (in 720p over 1080i terms) it to
be the better system.


Unfortunately names can sell things to the masses. I couldn't comment on
1080i versus 720p because I haven't had the opportunity to make this
comparison, but if either emerges as a "winner" it won't be technical
considerations that decide the matter.

Rod.

  #57  
Old August 25th 05, 12:09 AM
JC

On Wed, 24 Aug 2005 22:30:45 +0100, Roderick Stewart
wrote:

Maybe we're talking at cross-purposes, but I feel I'm struggling to
simplify something too. *Flicker* and *intermittency* (or "jerkiness" of
moving objects) are not the same thing. Flicker in a display can be
absolutely eliminated, but the rate at which pictures are updated cannot
be increased beyond what was properly sampled by the camera, and
maintained throughout the system.


I understand the broadcast world to be moving to progressive
production technology. That's ideally 1080p50, with 50 full frames per
second.

My point was that with a progressive standard it's easier to reshow
the same frame multiple times for higher refresh rates without motion
or other artifacts to eliminate flicker. On pixel based displays such
as LCD, flicker is not an issue but smooth motion still is. In this
case 50 frames per second would again appear to be better than 25.

Rgds
Jonathan



  #58  
Old August 25th 05, 12:16 AM
JC

On Wed, 24 Aug 2005 22:30:45 +0100, Roderick Stewart
wrote:

That's right. That's the way it's been for the past 70 years and despite
the fact that you can't easily derive a full resolution still image from
it (something television was never designed for), it shows movement far
more naturally than a full resolution non-interlaced signal using the
same bandwidth.


But deriving a full resolution image is what you need to do on any
large progressive display. The common interlaced standards have half
the frame rate of the progressive standards, which is worse for
movement in this case.

Now imagine rapidly
flicking between these two images or showing them merged, before
moving on to the next two.


I'm not sure why you would want to do that. Why not just show them as
they are output from the camera?


Because it's the only way to increase the display refresh rate for a
given broadcast rate. On a progressive or memory based screen the two
half images have to be merged for display.

Rgds
Jonathan

  #59  
Old August 25th 05, 12:17 AM
JC

On Wed, 24 Aug 2005 22:45:30 +0100, Roderick Stewart
wrote:

Unfortunately names can sell things to the masses. I couldn't comment on
1080i versus 720p because I haven't had the opportunity to make this
comparison, but if either emerges as a "winner" it won't be technical
considerations that decide the matter.


I certainly have to agree here...

Rgds
Jonathan

 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
Copyright ©2004-2021 HomeCinemaBanter.
The comments are property of their posters.