#11
Judging from the replies to your query, very few people actually understand the difference, and what the real visual effects are. When you consider all of the effects of the brain's processing of the visual information from the human eye, you realise that there is no simple answer. Understand that the visual display varies with the person, the brightness levels, the speed of the motion, where the eye is looking in relation to the motion, and even the content of the picture.

In general, interlacing produces a higher vertical resolution in the same bandwidth, regardless of whether it is an analogue or digital picture. For some people there is a visible artifact with some moving pictures. However, when the bulk of the population is considered, along with the amount of time that that type of artifact would be present, the argument is overwhelmingly in favour of the use of interlacing.

Now there is a further point to consider. The majority of digital TV sets (particularly plasma and LCD) convert any interlaced image to progressive anyway. The artifacts then present are different to those from an analogue TV set, and the visual difference between progressive and interlaced images is reduced.

Most of the people who say that progressive is better than interlaced are people who sit close to computer monitors, particularly those who play fast-action computer games. Unfortunately the conditions they face are quite different to those for the average TV viewer. We need to keep the two situations separate.

Staiger wrote:
> In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about! I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in the USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it?
>
> But now that we have 100Hz TVs, digital transmissions, and various amounts of digital processing both at the broadcaster and inside a modern TV, I can't understand what interlacing brings to the party, apart from extra complications. Backward compatibility doesn't seem a very strong argument, as the HD interlaced standard appears to be higher definition than 'legacy' interlaced TVs can manage anyway. The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate? Thanks!
>
> Staiger
#12
In article , Stephen wrote:
> > In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. [snip] The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate?
>
> I think the main reason for interlace is that the higher number of lines sounds better and will sell better, just as a 3GHz computer processor will sell better than 2GHz, even if everything else about it is worse.

Surely you're joking?! All other things being equal, interlaced video looks unquestionably better. Have you ever seen a 25Hz display?

To transmit a non-interlaced image with the same picture update rate as an interlaced one (i.e. 50Hz), and the same number of lines in each picture (i.e. 575), you would need twice the bandwidth. Vertical resolution affects picture sharpness, but picture update rate affects movement portrayal. Interlace is a clever way of accommodating both requirements reasonably well without complex technology and without doubling the bandwidth requirement.

Converting video from one frame rate to another (something that wasn't thought of when interlace was invented) is more complicated if the video is interlaced, which is why many advocates of new video standards are suggesting they should not use interlace. Also, computer displays don't benefit from it because they normally show static information and are used at a much smaller viewing distance, and many people think that computer displays and television displays ought to use the same standards.

I think the word "progressive" is used because it "sounds better": all it really means is "non-interlaced", and without increasing the available bandwidth, the only "progress" it offers is towards a less realistic portrayal of moving objects.

Rod.
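Rod's "twice the bandwidth" claim is simple arithmetic. A minimal sketch (raw line throughput only, ignoring blanking intervals and other analogue details, and using the 575 active lines he quotes):

```python
# Lines that must be transmitted per second for a 50Hz picture update
# rate, interlaced vs non-interlaced. Ignores blanking intervals.

LINES = 575          # active lines per full picture, as quoted above
UPDATE_RATE = 50     # picture updates per second seen by the viewer

# Interlaced: each 50Hz update is a field carrying half the lines.
interlaced = (LINES / 2) * UPDATE_RATE      # 14375.0 lines/s

# Non-interlaced at the same update rate: every update carries all lines.
progressive = LINES * UPDATE_RATE           # 28750 lines/s

print(progressive / interlaced)             # 2.0 - twice the bandwidth
```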
#13
Roderick Stewart wrote:
> I think it is the use of the word "progressive" which is used because it "sounds better", because all it really means is "non-interlaced", and without increasing the available bandwidth, the only "progress" it offers is towards a less realistic portrayal of moving objects.

Don't you remember the long and tedious thread we had discussing 720p50 (50 1280x720 progressive frames per second) vs 1080i25 (25 1920x1080 interlaced frames, i.e. 50 1920x540 fields per second) vs 1080p50 (50 1920x1080 progressive frames per second)?

What I got from that discussion was that, if codecs were genuinely intelligent, then interlacing would be redundant, but current codecs are _not_ so intelligent as to render interlacing redundant.

1080i25 has a slightly higher _raw_ datarate than 720p50. 1080p50 has roughly double the _raw_ datarate of either. Going from 1080p50 to 1080i25 is trivial, while "deinterlacing" 1080i25 to 1080p50 (e.g. for viewing via a progressive display) is impossible to perfect, but possible to do very well. It's often done very poorly!

Currently 720p50 requires a slightly lower bitrate (MPEG-2) than 1080i25 to encode at a given quality wrt the original, while 1080p50 requires nearly double the bitrate. However, at a given (high-ish) bitrate, 1080i25 looks better than 720p50 for much content.

When the source is film or progressive 25-frames-per-second material, 1080i25 will actually carry a 1080p25 signal; this is trivial to deinterlace and will look better than 720p50 (which will be carrying a 720p25 signal!) because the resolution is double. However, the bitrate required to encode 720p25 well will be about half that for 1080p25.

It would be ideal to dump interlacing, and to allow video codecs to use it internally on all or part of the image where beneficial - but video codecs don't yet do this, so for now interlacing buys a useful bandwidth advantage, but comes with some disadvantages, not least that most progressive displays don't deinterlace very well!
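Why deinterlacing is "impossible to perfect" comes down to the two fields of an interlaced frame being captured 1/50s apart. A toy sketch of the two classic strategies, weave and bob (function names and the naive line duplication are illustrative only; real deinterlacers are motion-adaptive and far smarter):

```python
# Two fields of an interlaced frame sample different moments in time,
# so no single reconstruction is correct for both static and moving scenes.

def weave(top_field, bottom_field):
    """Interleave two fields into one frame. Perfect for static or
    progressive-sourced material; produces combing on motion."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.append(top)
        frame.append(bottom)
    return frame

def bob(field):
    """Double each field line into a full frame. No combing, but
    halves vertical resolution (naive duplication; real bob interpolates)."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)
    return frame

# Static scene: weave reconstructs the original frame exactly.
original = ["line0", "line1", "line2", "line3"]
top = original[0::2]      # lines 0, 2 (captured at time t)
bottom = original[1::2]   # lines 1, 3 (captured at time t + 1/50s)
assert weave(top, bottom) == original
# With motion, the two fields disagree and weave combs - hence the
# "possible to do very well, impossible to perfect" verdict above.
```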
FWIW 50Hz progressive content displayed on a CRT can flicker badly (depending on the phosphors) unless it's interlaced to 100Hz, and there's an argument for using more than 50fps anyway, to reduce flicker on CRTs and motion blur on LCDs.

Cheers,
David.
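The raw datarate comparisons in this post are easy to check. A short sketch (raw active-pixel throughput only; blanking intervals and chroma subsampling ignored):

```python
# Raw (uncompressed) pixel throughput of the three formats discussed.
# Active pixels only; blanking and chroma subsampling are ignored.

p720p50  = 1280 * 720  * 50   # 50 full progressive frames per second
p1080i25 = 1920 * 1080 * 25   # 25 interlaced frames/s (50 fields of 1920x540)
p1080p50 = 1920 * 1080 * 50   # 50 full progressive frames per second

print(p720p50)                # 46080000
print(p1080i25)               # 51840000 - slightly higher than 720p50
print(p1080p50)               # 103680000 - exactly double 1080i25
print(p1080i25 / p720p50)     # 1.125
```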
#14
WDino wrote:
> Judging from the replies to your query, very few people actually understand the difference, and what the real visual effects are. When you consider all of the effects of the brain's processing of the visual information from the human eye, you realise that there is no simple answer. Understand that the visual display varies with the person, the brightness levels, the speed of the motion, where the eye is looking in relation to the motion and even the content of the picture. In general, interlacing produces a higher vertical resolution in the same bandwidth, regardless of whether it is an analogue or digital picture. For some people there is a visible artifact with some moving pictures. However, when the bulk of the population is considered, along with the amount of time that that type of artifact would be present, then the argument is overwhelmingly in favour of the use of interlacing.

It looks very much like the BBC are favouring 720p. I think they've already started rigged demonstrations trying to prove that it's better for HDTV. In reality, 720p allows them to use a lower bit rate, so they're favouring it because it's cheaper for them to transmit.

This all stems from the EBU's support of 720p, which is also motivated by the fact that 720p requires a lower bit rate and so is cheaper to transmit. And the head of the EBU Technical Department is an ex-Head of BBC R&D. So, it looks very much like we're going to get the inferior format.

-- Steve - www.digitalradiotech.co.uk - Digital Radio News & Info
Find the cheapest Freeview, DAB & MP3 Player Prices:
http://www.digitalradiotech.co.uk/fr..._receivers.htm
http://www.digitalradiotech.co.uk/da...tal_radios.htm
http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm
http://www.digitalradiotech.co.uk/mp...e_capacity.htm
#15
In article , DAB sounds worse than FM wrote:
> It looks very much like the BBC are favouring 720p. I think they've already started rigged demonstrations trying to prove that it's better for HDTV. In reality, 720p allows them to use a lower bit rate, so they're favouring it because it's cheaper for them to transmit. This all stems from the EBU's support of 720p, which is also motivated by the fact that 720p requires a lower bit rate and so is cheaper to transmit. And the head of the EBU Technical Department is an ex-Head of BBC R&D. So, it looks very much like we're going to get the inferior format.

What difference would it make to HDTV specs if the screen was required only to display 720 lines optimally rather than 1080? And what are current 'HD-ready' TVs optimised for? Are they over-specced?

Stan
#16
Stan The Man wrote:
> What difference would it make to HDTV specs if the screen was required only to display 720 lines optimally rather than 1080? And what are current 'HD-ready' TVs optimised for? Are they over-specced?

I await anyone who knows if *ANY* of the current "HD Ready" TVs can resolve 1080 lines!! Most VERY recent ones can typically do ~1368x768 at best. I think that for the "HD Ready" label they just need to do >= 720 lines.
#17
Brian McIlwrath wrote:
> Stan The Man wrote:
> > What difference would it make to HDTV specs if the screen was required only to display 720 lines optimally rather than 1080? And what are current 'HD-ready' TVs optimised for? Are they over-specced?
>
> I await anyone who knows if *ANY* of the current "HD Ready" TVs can resolve 1080 lines!! Most VERY recent ones can typically do ~1368x768 at best. I think that for the "HD Ready" label they just need to do >= 720 lines.

I won't be buying one that is less than 1920x1080.

-- Adrian A
#18
Adrian wrote:
> I won't be buying one that is less than 1920x1080.

You will have a long wait!
#19
Brian McIlwrath wrote:
> Adrian wrote:
> > I won't be buying one that is less than 1920x1080.
>
> You will have a long wait!

I don't mind waiting.
#20
Agamemnon,

Do you know the MPEG-4 Part 10 (aka H.264) spec? There are features like PAFF and MBAFF, i.e. Picture-Adaptive Frame/Field and Macroblock-Adaptive Frame/Field. I have seen one encoder already doing PAFF. Doing it at the macroblock level is more complex, so it will come in future releases, and it is also heavier on the decoder side (more expensive DSPs = more expensive set-top boxes).

Vladimir

Agamemnon wrote:
> "DAB sounds worse than FM" wrote in message ...
> > Staiger wrote:
> > > In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about! I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in the USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it?
> >
> > FM was introduced about 50 years ago, but it remains the highest quality source of radio in the UK. The argument between 720p and 1080i is this: 720p (720 lines progressive) provides better motion portrayal and
>
> Only when showing sport or very fast action, and even then it still doesn't look as natural as 720i (50Hz, i.e. 50 full 720-line frames per second) when showing regular-speed motion, since at that speed the refresh rate is equivalent to 100Hz.
>
> > doesn't suffer from interline twitter. 1080i has a higher static resolution.
>
> And it also provides more natural looking motion at the same frame rate, but only at 504 lines resolution. But considering the Americans had to put up with that standard for years, it's not bad. The only trouble is that 1080i is a 25Hz system, so 720p at 50Hz will surpass it on most content. If 1080i were at 50Hz then it would outdo 720p.
>
> Now what I don't understand is why the idiots who designed the MPEG-4 system didn't combine progressive and interlaced encoding, so that when fast action was being shown it would switch to progressive at half the frame rate to eliminate twitter; when normal-speed action was shown it would switch to interlaced at 50Hz to give natural looking motion; and when still frames or very slow motion were being shown it would switch to progressive again to improve resolution.
>
> > For example: Resolution of 720p is 1280 x 720 = 921,600 pixels. Resolution of 1080i is 1920 x 1080 x 0.741 = 1,536,000 pixels. Therefore, 1080i has a 67% higher static resolution than 720p.
> >
> > -- Steve - www.digitalradiotech.co.uk - Digital Radio News & Info
> > Find the cheapest Freeview, DAB & MP3 Player Prices:
> > http://www.digitalradiotech.co.uk/fr..._receivers.htm
> > http://www.digitalradiotech.co.uk/da...tal_radios.htm
> > http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm
> > http://www.digitalradiotech.co.uk/mp...e_capacity.htm
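The pixel arithmetic quoted at the end of this post can be verified directly; note that the 0.741 interlace (Kell-type) factor used to discount 1080i's effective vertical resolution is the poster's own assumption:

```python
# Verifying the static-resolution comparison quoted above.
# 0.741 is the poster's assumed interlace factor for 1080i.

p720_pixels  = 1280 * 720            # 921600
i1080_pixels = 1920 * 1080 * 0.741   # 1536537.6 (the post rounds to 1,536,000)

advantage = i1080_pixels / p720_pixels - 1
print(round(advantage * 100))        # 67 - i.e. ~67% higher, as claimed
```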