Why interlaced HDTV?
In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about!

I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in the USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it?

But now that we have 100Hz TVs, digital transmissions, and various amounts of digital processing both at the broadcaster and inside a modern TV, I can't understand what interlacing brings to the party, apart from extra complications. Backward compatibility doesn't seem a very strong argument, as the HD interlaced standard appears to be higher definition than 'legacy' interlaced TVs can manage anyway.

The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate? Thanks!

Staiger
Staiger wrote:
In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. [snip] Is this right or wrong? And is there any more to it?

FM was introduced about 50 years ago, but it remains the highest quality source of radio in the UK.

The argument between 720p and 1080i is this: 720p (720 lines progressive) provides better motion portrayal and doesn't suffer from interline twitter. 1080i has a higher static resolution. For example:

Resolution of 720p is 1280 x 720 = 921,600 pixels
Resolution of 1080i is 1920 x 1080 x 0.741 = approx. 1,536,000 pixels (the 0.741 being an interlace factor to allow for the vertical resolution lost to interlacing)

Therefore, 1080i has a 67% higher static resolution than 720p.

--
Steve - www.digitalradiotech.co.uk - Digital Radio News & Info
Find the cheapest Freeview, DAB & MP3 Player Prices:
http://www.digitalradiotech.co.uk/fr..._receivers.htm
http://www.digitalradiotech.co.uk/da...tal_radios.htm
http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm
http://www.digitalradiotech.co.uk/mp...e_capacity.htm
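Those figures can be reproduced with a quick calculation (a minimal Python sketch; the 0.741 interlace factor is simply the value assumed in the post above, not an agreed constant):

# Static-resolution comparison from the post above.
# The 0.741 interlace factor is the value that post assumes,
# not a universally agreed figure.
INTERLACE_FACTOR = 0.741

p720 = 1280 * 720                       # progressive: full vertical resolution
i1080 = 1920 * 1080 * INTERLACE_FACTOR  # interlaced: vertical resolution discounted

print(f"720p  static resolution: {p720:,.0f} pixels")
print(f"1080i static resolution: {i1080:,.0f} pixels")
print(f"1080i advantage: {100 * (i1080 / p720 - 1):.0f}%")  # ~67%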
"Staiger" wrote in message ... In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about! I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it? But now that we have 100Hz TVs, digital transmissions, and various amounts of digital processing at both the broadcaster and inside a modern TV, I can't understand what interlacing brings to the party, apart from extra complications. Ahhhh, but you forgot one thing. Interlaced is actually 50Hz and the motion looks more natural than non-interlaced at 25Hz, except when showing the bend on an athletics track in which case the entire picture breaks up. Don't believe the crap they give out about film at 24 fps being enough to deceive the human eye. Film at 12 fps which they use for cartons can do that as well but neither of them look natural. 50 fps is the bare minimum which can fool your brain into thinking you are watching natural looking motion (just as long as its not showing interlaced bends on athletics tracks). Backward compatibility doesn't seem a very strong argument, as the HD interlaced standard appears to be higher definition than 'legacy' interlaced TVs can manage anyway. The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate? Thanks! Staiger |
In article , Staiger wrote:
In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. [snip] The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate?

Interlace gives more than just a reduction in flicker, i.e. a doubling of the frequency at which brightness variations occur. It also doubles the frequency at which picture information is updated, which makes moving objects appear to move in a much smoother and more lifelike way. Even though only half the picture lines are updated each field, they are updated twice as often as if interlace were not used, and this is enough to give the smoothing effect. Vertical edges moving sideways are more ragged because they are depicted using half as many lines as when standing still, but this is similar to the blurring of moving objects which occurs naturally in real life.

Rod.
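A worked example of the trade-off Rod describes (a minimal Python sketch, assuming the 625-line system's 576 active lines; the exact line counts are illustrative):

# Lines transmitted per second for three hypothetical 576-line schemes.
# Interlace delivers the update rate of 50Hz progressive for the
# transmission cost of 25Hz progressive.
active_lines = 576

p25_lines_per_sec = active_lines * 25         # 25 full frames/s
i50_lines_per_sec = (active_lines // 2) * 50  # 50 half-line fields/s
p50_lines_per_sec = active_lines * 50         # 50 full frames/s

print(f"576p25: {p25_lines_per_sec:,} lines/s, 25 picture updates/s")
print(f"576i50: {i50_lines_per_sec:,} lines/s, 50 picture updates/s")
print(f"576p50: {p50_lines_per_sec:,} lines/s, 50 picture updates/s")
# 576i50 matches 576p25's line rate while doubling the update rate.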
"DAB sounds worse than FM" wrote in message ... Staiger wrote: In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about! I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it? FM was introduced about 50 years ago, but it remains the highest quality source of radio in the UK. The argument between 720p and 1080i is this: 720p (720 lines progressive) provides better motion portrayal and Only when showing sport or very fast action, and even then it still doesn't look as natural as 720i (50 Hz ie. 50 full 720 line frames per second) when showing regular speed motion since at that speed the refresh rate is equivalent to 100Hz. doesn't suffer from interline twitter. 1080i has a higher static resolution. And also provides more natural looking motion at the same frame rate but only at 504 lines resolution. But considering the Americans had to put up with that standard for years its not bad. The only trouble is that 1080i is a 25Hz system so 720p at 50Hz will surpass it on most content. If 1080i were at 50Hz then it would out do 720p. Now what I don't understand is why the idiots who designed the MPEG-4 system didn't combine progressive and interlaced encoding so that when fast action was being shown it would switch to progressive at half the frame rate to eliminate twitter, when normal speed action was shown it would switch to interlaced at 50Hz to give natural looking motion and when still frames or very slow motion was being shown it would switch to progressive again to improve resolution. For example: Resolution of 720p is 1280 x 720 = 921,600 pixels Resolution of 1080i is 1920 x 1080 x 0.741 = 1536000 pixels Therefore, 1080i has a 67% higher static resolution than 720p. -- Steve - www.digitalradiotech.co.uk - Digital Radio News & Info Find the cheapest Freeview, DAB & MP3 Player Prices: http://www.digitalradiotech.co.uk/fr..._receivers.htm http://www.digitalradiotech.co.uk/da...tal_radios.htm http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm http://www.digitalradiotech.co.uk/mp...e_capacity.htm |
In article , DAB sounds worse than FM wrote:

> FM was introduced about 50 years ago, but it remains the highest
> quality source of radio in the UK.

You never miss a turn, do you? ;-)

--
*Why are they called apartments, when they're all stuck together?*

Dave Plowman London SW
To e-mail, change noise into sound.
In article , Dave Plowman (News) writes:

> > FM was introduced about 50 years ago, but it remains the highest
> > quality source of radio in the UK.
>
> You never miss a turn, do you? ;-)

'Tis true though.....

--
Tony Sayer
Agamemnon wrote:
"DAB sounds worse than FM" wrote in message ... Staiger wrote: In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. After some debate it became clear that neither of us understood what we were talking about! I believe that interlacing was introduced decades ago to provide a flicker-free image whilst still requiring only 25 (or 30 in USA) frames to be broadcast per second. In other words, a primitive way of controlling bandwidth requirements. Is this right or wrong? And is there any more to it? FM was introduced about 50 years ago, but it remains the highest quality source of radio in the UK. The argument between 720p and 1080i is this: 720p (720 lines progressive) provides better motion portrayal and Only when showing sport or very fast action, and even then it still doesn't look as natural as 720i (50 Hz ie. 50 full 720 line frames per second) when showing regular speed motion since at that speed the refresh rate is equivalent to 100Hz. I think you are mistaken. The American 720p standard used for sport is 720p60 (ie. 720 full, progressive frames, at 60 frames per second). doesn't suffer from interline twitter. 1080i has a higher static resolution. And also provides more natural looking motion at the same frame rate but only at 504 lines resolution. But considering the Americans had to put up with that standard for years its not bad. The only trouble is that 1080i is a 25Hz system so 720p at 50Hz will surpass it on most content. If 1080i were at 50Hz then it would out do 720p. The American 1080i system (used for everything other than sport) is usually 1080i60, ie. 60 fields per second. So it's a 60hz system, not 25hz. The screen is updated with new information 60 times per second (though only every other line is sent, each 60th of a second). It's similar to the way current interlaced tv works, but at a higher resolution. Now what I don't understand is why the idiots who designed the MPEG-4 system didn't combine progressive and interlaced encoding so that when fast action was being shown it would switch to progressive at half the frame rate to eliminate twitter, when normal speed action was shown it would switch to interlaced at 50Hz to give natural looking motion and when still frames or very slow motion was being shown it would switch to progressive again to improve resolution. For example: Resolution of 720p is 1280 x 720 = 921,600 pixels Resolution of 1080i is 1920 x 1080 x 0.741 = 1536000 pixels Therefore, 1080i has a 67% higher static resolution than 720p. -- Steve - www.digitalradiotech.co.uk - Digital Radio News & Info Find the cheapest Freeview, DAB & MP3 Player Prices: http://www.digitalradiotech.co.uk/fr..._receivers.htm http://www.digitalradiotech.co.uk/da...tal_radios.htm http://www.digitalradiotech.co.uk/mp...rs_1GB-5GB.htm http://www.digitalradiotech.co.uk/mp...e_capacity.htm |
This is a UK newsgroup, not US.
T1000 wrote:

> The American 1080i system (used for everything other than sport) is
> usually 1080i60, i.e. 60 fields per second. So it's a 60Hz system, not
> 25Hz.

[snip]
In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. [snip] The people who design these standards aren't stupid, so obviously I'm missing something. Can anyone elucidate?

I think the main reason for interlace is that the higher number of lines sounds better and will sell better, just as a 3GHz computer processor will sell better than a 2GHz one, even if everything else about it is worse. The "headline figure" of 1080 is what makes the difference, more than any real advantage in the perceived quality of the picture over 720p. We might get 1080 progressive in the future, which would be the best of both worlds, but once the technology is up to the task it will also be possible to do 2000 lines interlaced and so on, so I'm afraid we will be stuck with interlace for a long time, just because it makes the numbers bigger.
Judging from the replies to your query, very few people actually understand the difference, and what the real visual effects are. When you consider all of the effects of the brain's processing of the visual information from the human eye, you realise that there is not a simple answer. The visual effect varies with the person, the brightness levels, the speed of the motion, where the eye is looking in relation to the motion, and even the content of the picture.

In general, interlacing produces a higher vertical resolution in the same bandwidth, regardless of whether it is an analogue or digital picture. For some people there is a visible artifact with some moving pictures. However, when the bulk of the population is considered, along with the amount of time that that type of artifact would be present, the argument is overwhelmingly in favour of the use of interlacing.

Now there is a further point to consider. The majority of digital TV sets (particularly plasma and LCD) convert any interlaced image to progressive anyway. The artifacts then present are different from those of an analogue TV set, and the visual difference between progressive and interlaced images is reduced.

Most of the people who say that progressive is better than interlaced are people who sit close to computer monitors, particularly those who play fast action computer games. Unfortunately the conditions they face are quite different from those of the average TV viewer. We need to keep the two situations separate.

Staiger wrote:

In a discussion today with a colleague I argued that it was illogical to carry interlacing forward into the forthcoming HD standards. [snip]
In article , Stephen wrote:
> I think the main reason for interlace is that the higher number of
> lines sounds better and will sell better, just as a 3GHz computer
> processor will sell better than a 2GHz one, even if everything else
> about it is worse.

Surely you're joking?! All other things being equal, interlaced video looks unquestionably better. Have you ever seen a 25Hz display? To transmit a non-interlaced image with the same picture update rate as an interlaced one (i.e. 50Hz), and the same number of lines in each picture (i.e. 575), you would need twice the bandwidth. Vertical resolution affects picture sharpness, but picture update rate affects movement portrayal. Interlace is a clever way of accommodating both requirements reasonably well without complex technology and without doubling the bandwidth requirement.

Converting video from one frame rate to another (something that wasn't thought of when interlace was invented) is more complicated if the video is interlaced, which is why many advocates of new video standards are suggesting they should not use interlace. Also, computer displays don't benefit from it because they normally show static information and are used at a much smaller viewing distance, and many people think that computer displays and television displays ought to use the same standards.

I think it is the word "progressive" that is used because it "sounds better", because all it really means is "non-interlaced", and without increasing the available bandwidth, the only "progress" it offers is towards a less realistic portrayal of moving objects.

Rod.
Roderick Stewart wrote:
> I think it is the word "progressive" that is used because it "sounds
> better", because all it really means is "non-interlaced", and without
> increasing the available bandwidth, the only "progress" it offers is
> towards a less realistic portrayal of moving objects.

Don't you remember the long and tedious thread we had discussing 720p50 (50 1280x720 progressive frames per second) vs 1080i25 (25 1920x1080 interlaced frames, i.e. 50 1920x540 fields per second) vs 1080p50 (50 1920x1080 progressive frames per second)?

What I got from that discussion was that, if codecs were genuinely intelligent, then interlacing would be redundant, but current codecs are _not_ so intelligent as to render interlacing redundant.

1080i25 has a slightly higher _raw_ data rate than 720p50. 1080p50 has roughly double the _raw_ data rate of either. Going from 1080p50 to 1080i25 is trivial, while "deinterlacing" 1080i25 to 1080p50 (e.g. for viewing on a progressive display) is impossible to perfect, but possible to do very well. It's often done very poorly!

Currently 720p50 requires a slightly lower bitrate (MPEG-2) than 1080i25 to encode at a given quality wrt the original, while 1080p50 requires nearly double the bitrate. However, at a given (high-ish) bitrate, 1080i25 looks better than 720p50 for much content. When the source is film or progressive 25 frames per second material, 1080i25 will actually carry a 1080p25 signal; this is trivial to deinterlace and will look better than 720p50 (which will be carrying a 720p25 signal!) because the resolution is double. However, the bitrate required to encode 720p25 well will be about half that for 1080p25.

It would be ideal to dump interlacing, and to allow video codecs to use it internally on all or part of the image where beneficial - but video codecs don't yet do this, so for now interlacing buys a useful bandwidth advantage, but comes with some disadvantages, not least that most progressive displays don't deinterlace very well!

FWIW, 50Hz progressive content displayed on a CRT can flicker badly (depending on the phosphors) unless it's interlaced to 100Hz, and there's an argument for using more than 50fps anyway, to reduce flicker on CRTs and motion blur on LCDs.

Cheers,
David.
WDino wrote:
> In general, interlacing produces a higher vertical resolution in the
> same bandwidth, regardless of whether it is an analogue or digital
> picture.

[snip]

It looks very much like the BBC are favouring 720p. I think they've already started rigged demonstrations trying to prove that it's better for HDTV. In reality, 720p allows them to use a lower bit rate, so they're favouring it because it's cheaper for them to transmit.

This all stems from the EBU's support of 720p, which is also motivated by the fact that 720p requires a lower bit rate and so is cheaper to transmit. And the head of the EBU Technical Department is an ex-Head of BBC R&D. So it looks very much like we're going to get the inferior format.

--
Steve - www.digitalradiotech.co.uk - Digital Radio News & Info
In article , DAB sounds worse than FM wrote:

> It looks very much like the BBC are favouring 720p. [snip] So it looks
> very much like we're going to get the inferior format.

What difference would it make to HDTV specs if the screen was required only to display 720 lines optimally rather than 1080? And what are current 'HD-ready' TVs optimised for? Are they over-specced?

Stan
Stan The Man wrote:
: What difference would it make to HDTV specs if the screen was required
: only to display 720 lines optimally rather than 1080? And what are
: current 'HD-ready' TVs optimised for? Are they over-specced?

I await anyone who knows if *ANY* of the current "HD Ready" TVs can resolve 1080 lines!! Most VERY recent ones can do typically ~1368x768 at best. I think that for the "HD Ready" label they just need to do >= 720 lines.
Brian McIlwrath wrote:
: I await anyone who knows if *ANY* of the current "HD Ready" TVs can
: resolve 1080 lines!! Most VERY recent ones can do typically ~1368x768
: at best.

I won't be buying one that is less than 1920x1080.

--
Adrian A
Adrian wrote:
: I won't be buying one that is less than 1920x1080

You will have a long wait!
Brian McIlwrath wrote:
: : I won't be buying one that is less than 1920x1080
:
: You will have a long wait!

I don't mind waiting.
Agamemnon,
Do you know the MPEG-4 Part 10, aka H.264, spec? There are features like PAFF and MBAFF, i.e. Picture Adaptive Frame/Field and Macroblock Adaptive Frame/Field. I have seen one encoder already doing PAFF. Doing it at the macroblock level is more complex, so it will come in future releases, and it is also heavier on the decoder side (more expensive DSPs = more expensive set-tops).

Vladimir

Agamemnon wrote:
> Now what I don't understand is why the idiots who designed the MPEG-4
> system didn't combine progressive and interlaced encoding so that when
> fast action was being shown it would switch to progressive at half the
> frame rate to eliminate twitter [snip]
H264 goes the world wrote:
> Do you know the MPEG-4 Part 10, aka H.264, spec? There are features
> like PAFF and MBAFF, i.e. Picture Adaptive Frame/Field and Macroblock
> Adaptive Frame/Field. I have seen one encoder already doing PAFF.

Indeed. This is described on pages 7-9 in
http://www.cs.ubc.ca/~krasic/cpsc538...c-overview.pdf

e.g.:

"To provide high coding efficiency, the H.264/AVC design allows encoders to make any of the following decisions when coding a frame:
1) To combine the two fields together and to code them as one single coded frame (frame mode).
2) To not combine the two fields and to code them as separate coded fields (field mode).
3) To combine the two fields together and compress them as a single frame, but when coding the frame to split the pairs of two vertically adjacent macroblocks into either pairs of two field or frame macroblocks before coding them."

and:

"If a frame consists of mixed regions where some regions are moving and others are not, it is typically more efficient to code the nonmoving regions in frame mode and the moving regions in the field mode."

> Doing it at the macroblock level is more complex, so it will come in
> future releases, and it is also heavier on the decoder side (more
> expensive DSPs = more expensive set-tops).

Surely decoders will include this functionality anyway, and it's just a case of waiting for encoders to implement it?

--
Steve - www.digitalradiotech.co.uk - Digital Radio News & Info
"Adrian" wrote in message ... Brian McIlwrath wrote: Adrian wrote: I won't be buying one that is less than 1920x1080 You will have a long wait! I don't mind waiting. Almost exactly a year ago there was some discussion of this issue in this group under the thread 'HDTV sets available now', where Stephen Neal had some useful comments to make. At that time the same question was being asked, are the sets on the market capable of the required resolution? The problem being that the shadow mask had to have a very fine dot pitch similar to that of a computer monitor if the screen was of modest size, say around 20 inch. The industry has moved forward and large screens are the order of the day, but a year ago Stephen Neal wrote: Quote "AIUI the only direct view CRT on sale in the US that fully resolves the 1920x1080 standard is a 34 or 36" Sony - and it is apparently quite a lot dimmer than the softer models. A larger screen means a coarser aperture grille can be used whilst still retaining the resolution across the whole screen area." End quote. I don't know if any Sony, Panasonic or other CRT's are offering the native 1920 x 1080 resolution, but if there are any I'd be interested in model numbers so I can have a look in store to see if they are still dimmer than their softer counterparts. Unfortuntely all HDTV demo's are using plasma screens, presumably because the industry wants to associate HDTV with the latest type of display and not with what joe public might perceive as old fashioned tellys. Roger |
"Brian McIlwrath" wrote in message ... Adrian wrote: : I won't be buying one that is less than 1920x1080 You will have a long wait! This September, Philips will be releasing their 1920x1080 "True HD" sets. Check out the 37PF9830 for example. Around 4000 euro retail here on the continent. |
On 17 Aug 2005 02:49:06 -0700, " " wrote:

> FWIW, 50Hz progressive content displayed on a CRT can flicker badly
> (depending on the phosphors) unless it's interlaced to 100Hz, and
> there's an argument for using more than 50fps anyway, to reduce flicker
> on CRTs and motion blur on LCDs.

At launch in this country practically nobody will be using an HD CRT set to view HD material, and within a few years the CRT percentage will be even lower. We are moving to a world where all displays (LCD, plasma etc) will be natively progressive and interlaced material will have to be frame-stored within the set. IMHO, to spec an interlaced system for HD which is then going to have to be de-interlaced using the (variable quality) hardware of the TV is insane. On flat progressive displays 720p looks the same if not better than 1080i, and has the advantage of better rendering of movement. 1080p would of course be preferable to both....

Rgds

Jonathan
In article , Jc wrote:
> At launch in this country practically nobody will be using an HD CRT
> set to view HD material, and within a few years the CRT percentage will
> be even lower. [snip] IMHO, to spec an interlaced system for HD which
> is then going to have to be de-interlaced using the (variable quality)
> hardware of the TV is insane.

Would you suggest the same policy for gamma correction? As it's a pre-distortion applied in the camera to compensate for the characteristics of a CRT, we shouldn't, in theory, need it in a flat-panel broadcasting world, and doing without would make post-production colour correction much simpler.

How many established standards and practices do you think it would be wise to abandon all in one go?

Rod.
In article , Roderick Stewart writes:

> Would you suggest the same policy for gamma correction? As it's a
> pre-distortion applied in the camera to compensate for the
> characteristics of a CRT, we shouldn't, in theory, need it in a
> flat-panel broadcasting world, and doing without would make
> post-production colour correction much simpler.

Whilst gamma was originally intended to compensate for the response characteristics of a typical CRT, by one of those amazing coincidences it is almost exactly the correct function to compensate for the perceptual response of the human eye - the eye is close to the inverse response of the CRT. Without gamma we need about 18 bits of linear intensity-coded video to produce the same dynamic range that an 8-bit gamma 2.2 picture is capable of producing (ignoring the contrast limitations of the display itself). Even then, most of the data that 18-bit signal would carry is redundant, particularly in the mid tones and highlight regions, because we only need the full precision of that bit depth in the shadows.

In other words, if gamma were not already in use, we would need to invent something very similar to it even for linear response displays, unless we were to adopt very high bit-depth video encoding. See http://www.poynton.com/notes/colour_.../GammaFAQ.html

So, whilst your claim that in theory we don't need gamma is correct, without gamma we would have to use something much more cumbersome, and it certainly wouldn't make post production or colour correction any simpler.

--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
Kennedy McEwen wrote in :

> Whilst gamma was originally intended to compensate for the response
> characteristics of a typical CRT, by one of those amazing coincidences
> it is almost exactly the correct function to compensate for the
> perceptual response of the human eye - the eye is close to the inverse
> response of the CRT.

I was brought up in them days, even to messing with the transfer characteristic of image orthicons [snip explanation], but what I don't understand is why flat screens seem so black-crushed (which is a bit of a laugh, because my Philips abortion, among its other lacks, can't get anywhere near black anyway). But on a grey background, I seem to be totally unable to get anything between alleged black and low mid tones.

Is this a function of all LCDs? And will it always be? IOW, I guess I'm asking if future pictures will always be as bad?

mike
"rookie" wrote in message ... "Brian McIlwrath" wrote in message ... Adrian wrote: : I won't be buying one that is less than 1920x1080 You will have a long wait! This September, Philips will be releasing their 1920x1080 "True HD" sets. Check out the 37PF9830 for example. Around 4000 euro retail here on the continent. Link please I can only find 37PF9830 on the Philips Netherlands site, but the details are in Dutch, and the translation engines don't work for Dutch/English. I tried the search in the French and German sites, (that do work in the language translators) but they just return 'no results'. Roger |
On 22 Aug 2005 10:52:27 GMT, mike ring wrote:

> ...what I don't understand is why flat screens seem so black-crushed.

I have a feeling that it's probably marketing. If you look at the default settings of most modern TVs, the contrast is way too high and the brightness too low, giving a nasty combination of crushed blacks and blown-out highlights; however, it gives an initial impression of sharp bright pictures to the uncritical eye. Cf. the Sony Trinitron effect: when Sony introduced the Trinitron, most ordinary viewers said that the picture was better than normal TVs', because the tube could give a brighter image. In other respects, like resolution, the average Trinitron was worse than a standard dotty shadowmask.

A couple of years ago I bought a projector. The factory setup gives terrible black crushing; in order to get a decent picture you have to get into the menus and tweak the gamma to the opposite end of the range from the factory setting. Again, I assume it was set like that to give an initial impression of sharp contrasty pictures.

> Is this a function of all LCDs? And will it always be? IOW, I guess I'm
> asking if future pictures will always be as bad?

I haven't seen a good flat panel display yet, but they do seem to be improving, and I would expect them to eventually reach an acceptable quality level.

Bill
In article , Kennedy McEwen wrote:
>> Would you suggest the same policy for gamma correction? As it's a
>> pre-distortion applied in the camera to compensate for the
>> characteristics of a CRT, we shouldn't, in theory, need it in a
>> flat-panel broadcasting world, and doing without would make
>> post-production colour correction much simpler.
>
> Whilst gamma was originally intended to compensate for the response
> characteristics of a typical CRT, by one of those amazing coincidences
> it is almost exactly the correct function to compensate for the
> perceptual response of the human eye - the eye is close to the inverse
> response of the CRT.

If what you say were true, i.e. if our eyes really did correct for the CRT's characteristic, why would it be necessary also to include electronic correction for it in the camera? That would be two lots of correction, wouldn't it?

Actually my previous posting was intended to highlight the unwisdom of abandoning technical standards in broadcasting just because of the invention of one new piece of equipment. A system such as broadcasting, which involves a lot of equipment owned by a lot of people, needs standards that will not be changed overnight, even if a better system might hypothetically have resulted from scrapping the entire system and starting again. Interlace, gamma correction, and various other technical features of television have been in use for about seventy years and are now embodied in millions of items of equipment all over the world, so changing any of them would have enormous financial consequences and cause widespread confusion. Look at the number of wrongly adjusted TV pictures resulting from the simple decision to change something no more complicated or obscure than the picture's shape, something that you would think anyone could see and understand easily.

Rod.
On Mon, 22 Aug 2005 06:58:44 +0100, Roderick Stewart wrote:

> Would you suggest the same policy for gamma correction? [snip] How many
> established standards and practices do you think it would be wise to
> abandon all in one go?

I don't feel qualified to answer this, but I will say that if you have a new system, in which you know that there are no (or very minimal) legacy devices, is it right to compromise the system for the next few decades just in case there are a few people who may get sub-optimal quality on a very small number of legacy displays at launch?

Would you have suggested that all 625-line TVs support 405 lines? That was, after all, the established standard. How about teletext: would you suggest that the number of active teletext lines shouldn't have been increased in the 80s because it might have caused problems for people with 15-year-old TVs?

Rgds

Jonathan
On Mon, 22 Aug 2005 09:09:34 +0100, Kennedy McEwen wrote:

> But it is infinitely easier to display interlaced video correctly, and
> without a frame store, on a progressive screen than it is to display
> progressive video correctly on an interlaced screen.

All current progressive screens are by their nature a frame store. By displaying interlaced material you are effectively displaying two fields shot at different times simultaneously. This has the effect of degrading the apparent resolution of the system.

> In the former case, all that is necessary to do the job correctly is to
> store a line of video at a time, write that onto the line of pixels on
> the progressive screen and then blank off the pixels on the subsequent
> line. In the next field, simply blank the pixels on the first line
> (which were written on the previous field) and write the pixels in the
> next line (which were blanked on the previous field). This reproduces
> the interlace structure with the exact time latency and spatial
> structure of the original format.

This is not, however, how LCD and plasma screens work, and to try to replicate it would create all the traditional TV flicker that these screens are so good at eliminating, while introducing other artifacts.

> You can't do that the other way round, if the display is interlaced by
> default, without destroying the latency between adjacent lines.

It's always a compromise. Even in today's broadcasting world I'd prefer interlaced material to be deinterlaced by an expensive box at the broadcaster than by a £5-or-less chip in my TV. In future most new productions will be natively progressive, so the problem will be eliminated. To take progressive material, interlace it and then de-interlace it would of course be even more crazy.

Rgds

Jonathan
In article , JC writes:

>> But it is infinitely easier to display interlaced video correctly, and
>> without a frame store, on a progressive screen than it is to display
>> progressive video correctly on an interlaced screen.
>
> All current progressive screens are by their nature a frame store. By
> displaying interlaced material you are effectively displaying two
> fields shot at different times simultaneously. This has the effect of
> degrading the apparent resolution of the system.

That is indeed how most of them work, but it doesn't have to be that way. A progressive display can be configured to display interlaced material perfectly correctly. I have actually done something similar to this on a progressive OLED display where I had access to the raw panel drive, and I can confirm that it does completely eliminate the motion artefacts that you get when doing the conventional frame-store reconstruction that most progressive displays use. Actually, what I did was more like the scheme described below, because of the way the OLED interface worked.

> This is not, however, how LCD and plasma screens work, and to try to
> replicate it would create all the traditional TV flicker that these
> screens are so good at eliminating

You don't get owt for nowt. The price of that flicker elimination has to come from somewhere, and the conventional wisdom of the manufacturers appears to be that it comes at the expense of motion tearing, through complete reconstruction of interlaced fields into a single frame for progressive display. That isn't the only possible trade-off. As described, it is possible to design a progressive display that will eliminate motion tearing with interlaced video. That particular trade-off results in 50Hz flicker, but no worse than on a conventional interlaced CRT.

However, there is a third trade-off which maintains the flicker elimination of these screens. Instead of blanking the lines which are not present in each field, just repeat the data from the previous field. Again, the latency is consistent on all pixels; however, the hold time on each pixel is now doubled, which is analogous to a very long persistence CRT phosphor. Consequently this time you are trading some motion blur to eliminate both flicker and motion tearing. You don't get owt for nowt, but the trade space available is multidimensional - the manufacturers just restrict the options available to you. I am sure that there are other options, but these are a couple that I have played with which I know work.

> while introducing other artifacts.

What other artefacts are introduced? You eliminate motion artefacts by achieving consistent latency, and the flicker is no worse than on a conventional interlaced CRT (it's actually a little better, because these panels don't have the emission decay of a CRT phosphor). Based on the OLED system I did this on, I doubt if you would see any flicker at all, but you can fix that as described.

> You can't do that the other way round, if the display is interlaced by
> default, without destroying the latency between adjacent lines. It's
> always a compromise.

It *is* a compromise - it doesn't *have* to be one. You can display interlaced images on a progressive screen correctly, and I suspect that the reason it isn't done has more to do with manufacturers' agendas than a limitation of interlace per se. You can't display progressive images on an interlaced screen, though, without artefacts or throwing half the information away, which defeats any advantage the system has.
I was looking at some 720p a couple of days ago, and wasn't overly impressed. Yes, better than 625i, but not dramatically so. When I saw 1080i a while ago, I was completely bowled over by it. That extra horizontal resolution isn't worthless, you know. The equipment was different of course, and perhaps my expectations have changed with time, but that was my impression viewing both options. As always, YMMV.

--
Kennedy
In article , Roger R wrote:

> Link please? I can only find the 37PF9830 on the Philips Netherlands
> site, but the details are in Dutch, and the translation engines don't
> work for Dutch/English.

http://www.digitaldirectuk.com/produ...p?product_id=9142

Stan
On Tue, 23 Aug 2005 01:05:30 +0100, Kennedy McEwen wrote:

> That isn't the only possible trade-off. As described, it is possible to
> design a progressive display that will eliminate motion tearing with
> interlaced video. That particular trade-off results in 50Hz flicker,
> but no worse than on a conventional interlaced CRT.

But conventional 50Hz CRT flicker is horrible, especially on larger screens. Hence the number of 100Hz TVs on the market, and all the artifacts that their frame stores cause.

> However, there is a third trade-off which maintains the flicker
> elimination of these screens. Instead of blanking the lines which are
> not present in each field, just repeat the data from the previous
> field. [snip] Consequently this time you are trading some motion blur
> to eliminate both flicker and motion tearing.

This is effectively what any good broadcast interlace-to-progressive converter would do, and as I said, I'd rather it was done by the broadcaster than by a sub-£5 chip in my TV. The blurring artifacts within the picture reduce the visible resolution to, it would appear, similar to 720p levels. However, the adoption of a progressive broadcast system allows the migration to all-progressive production over time, eliminating this problem completely.

> ...I suspect that the reason it isn't done has more to do with
> manufacturers' agendas than a limitation of interlace per se. You can't
> display progressive images on an interlaced screen, though, without
> artefacts or throwing half the information away, which defeats any
> advantage the system has.

But nobody's going to be using an interlaced screen for HD. Even an HD CRT should be capable of 50 or 100Hz progressive refresh without interlace, and I'd be very surprised to see an HD CRT set in Dixons etc in a year's time. Production is already moving to progressive formats, and of course film is natively progressive. Film has been converted to interlaced SD TV for decades without problems (as far as the interlace goes) and, ironically, often looks better on progressive sets because of this than natively interlaced material. Progressive production and display is the future. To tie our HD broadcast standards to legacy interlace is even more crazy than tying DAB to Layer 2.

> I was looking at some 720p a couple of days ago, and wasn't overly
> impressed. Yes, better than 625i, but not dramatically so. When I saw
> 1080i a while ago, I was completely bowled over by it. [snip]

Was the 1080i and 720p displayed on a progressive or interlaced screen? I have to say some of the HD material I've seen has been a bit of a disappointment, but still a worthwhile improvement. One thing I can say is that, in my experience, 720p beats 1080i hands down for what I would call "normal" TV material.

Rgds

Jonathan
In article , JC writes:

> But conventional 50Hz CRT flicker is horrible, especially on larger
> screens.

No it isn't; it is something you filter out very quickly, and the size of the screen has nothing to do with it - how many kids from the 60's and 70's sat a couple of feet from their 26" CRTs watching TV while their parents told them "don't sit too close, Johnny, your eyes will go square!"? None of them saw any flicker, yet the angular screen size was far bigger than anything viewed at normal distance. Ever had American visitors to your home for a week or two? At first they complain that TV flickers in this country, but by the time they go home they are marvelling at the picture quality.

> Hence the number of 100Hz TVs on the market, and all the artifacts that
> their frame stores cause.

Marketing. Looks good in the showroom when you have a bank of TVs stretching out to the extreme periphery of your vision, where you have most sensitivity to flicker because you haven't regularly watched it and learned to filter it out.

>> However, there is a third trade-off which maintains the flicker
>> elimination of these screens. Instead of blanking the lines which are
>> not present in each field, just repeat the data from the previous
>> field. [snip]
>
> This is effectively what any good broadcast interlace-to-progressive
> converter would do

Would do, but they don't, hence the motion tearing on most progressive screens when fed an interlaced source - that is, to my mind, the worst option, but for some reason it is what most of them do.

> and as I said, I'd rather it was done by the broadcaster than by a
> sub-£5 chip in my TV.

What difference does that make? It is just manipulation of digital data, so it doesn't make a ha'penny of difference whether it is implemented in a £50,000 Quantel box or in a £5 chip.

> The blurring artifacts within the picture reduce the visible resolution
> to, it would appear, similar to 720p levels.

No it doesn't - the blurring is simply the maximum temporal bandwidth the interlaced structure is capable of, with minimal temporal aliasing. The normal interlace display on a CRT, where each pixel is only present for a very short period of the frame time, is actually temporally undersampling - which artificially exaggerates the limitations of interlace.

> However, the adoption of a progressive broadcast system allows the
> migration to all-progressive production over time, eliminating this
> problem completely.

But the problem doesn't need to be present - there is no reason why a flat panel progressive screen cannot display an interlaced signal accurately. There is nothing intrinsically superior about a progressive source which has the same overall bandwidth as the alternative interlaced system, and generally it has inferior resolution, as demonstrated by the 1080i/720p debate.

> But nobody's going to be using an interlaced screen for HD. Even an HD
> CRT should be capable of 50 or 100Hz progressive refresh without
> interlace, and I'd be very surprised to see an HD CRT set in Dixons etc
> in a year's time.

That is exactly the point: nobody would need them - if high quality backwards compatibility were delivered. Whilst that is certainly possible, indeed just as simple to achieve, it isn't what most flat panels provide. Consequently the push for progressive standards alienates about half a century of existing video heritage.

> Progressive production and display is the future. To tie our HD
> broadcast standards to legacy interlace is even more crazy than tying
> DAB to Layer 2.

I would agree if we were discussing a comparison of 1080p versus 1080i, but the option is 720p versus 1080i, so what you are calling for is to tie our HD broadcast standard to little more than the legacy static resolution of the interlaced system we have had for the past half century, when we could be quadrupling it! So much for the term "progressive" - "marginally incremental" would be more appropriate!

> Was the 1080i and 720p displayed on a progressive or interlaced screen?

1080i was on a projection DMD system at the TI plant in Dallas where they are made. That used a progressive DMD, but driven using an algorithm similar to those I was explaining. The main difference is that the brightness of each pixel in a DMD is achieved by time multiplexing throughout the available frame period, but the phasing of the interlaced fields was accurate - that was the point of the TI demo. 720p was on 720p-compatible plasmas and LCDs - lots of different types.

> I have to say some of the HD material I've seen has been a bit of a
> disappointment, but still a worthwhile improvement. One thing I can say
> is that, in my experience, 720p beats 1080i hands down for what I would
> call "normal" TV material.

As I say, that isn't my experience. 1080i is superb when displayed correctly - which is often not even attempted, because it requires a much higher resolution screen to do it properly.

--
Kennedy
Roderick Stewart wrote:
: Would you suggest the same policy for gamma correction? As it's a
: pre-distortion applied in the camera to compensate for the
: characteristics of a CRT, we shouldn't, in theory, need it in a flat
: panel broadcasting world, and doing without would make post-production
: colour correction much simpler.

It might make some kinds of colour correction easier, but if you code video as 'linear light' you need many more bits. Whilst 8 bits (for luminance) are normally reckoned to be sufficient for non-linear (gamma-corrected) video, you need something like 12-14 bits for linear video. So even if CRTs had never existed, we would have had to invent gamma to allow for efficient digital coding and transmission.

Richard.
http://www.rtrussell.co.uk/
To reply by email change 'news' to my forename.
Kennedy McEwen wrote in :

: : But on a grey background, I seem to be totally unable to get anything
: : between alleged black and low mid tones. Is this a function of all
: : LCDs? And will it always be? IOW, I guess I'm asking if future
: : pictures will always be as bad?
:
: The blacks being crushed and little differentiation between mid tones
: and black is symptomatic of the effect I was describing on a linear
: display - and an LCD is fairly linear.

[snip rest of excellent info]

Thanks for that, Kennedy. I did wonder, in this best of all possible worlds, why they couldn't do a bit of gamma correction, but if the system won't support it, that's that. At least I will know now whether the fundamental problem is being addressed or the adverts for future sets are another crock like the one I fell for.

mike
In article , Kennedy McEwen wrote:
> > But conventional 50Hz CRT flicker is horrible, especially on larger
> > screens.
>
> No it isn't; it is something you filter out very quickly, and the size
> of the screen has nothing to do with it - how many kids from the 60's
> and 70's sat a couple of feet from their 26" CRTs watching TV while
> their parents told them "don't sit too close, Johnny, your eyes will go
> square!"? None of them saw any flicker, yet the angular screen size was
> far bigger than anything viewed at normal distance.

Exactly. I've been watching television with a 50Hz flicker rate since the Coronation and it's never bothered me, yet all of a sudden it's supposed to be such a problem that we need to spend lots of money on 100Hz displays. Meanwhile, another faction is proposing a *reduction* in picture intermittency rate from 50Hz to 25Hz (the most noticeable effect of so-called "progressive" scanning), and somehow this isn't a problem at all!

Rod.
In article , wrote:
: : Would you suggest the same policy for gamma correction? [snip]
:
: It might make some kinds of colour correction easier, but if you code
: video as 'linear light' you need many more bits. [snip] So even if
: CRTs had never existed, we would have had to invent gamma to allow for
: efficient digital coding and transmission.

I think you missed my real point here (maybe I made it too subtle), which was about changing part of a well-established television standard supported by a vast amount of equipment owned by millions of people. The argument for abandoning the use of interlace seems purely to do with ease of handling the signals in computer equipment, or converting to and from cinematograph film, and nothing to do with how the television pictures look to the eye on a screen, and a similarly specious argument could be put forward for abandoning gamma correction.

However, the argument about gamma correction looks more convincing, as there is some logic that actually has something to do with pictures rather than computers. Cathode ray tubes have non-linear amplitude characteristics, but camera tubes and chips are fairly linear, so the linear signals from cameras have to be pre-distorted to match CRTs. Anywhere in the signal chain would do, but it was decided early in the history of television (for very practical reasons) that it should be done in the camera. Thus gamma correction, or "CRT non-linearity compensation" as it could be called, has become part of the standard specification of all broadcast video signals everywhere in the world. (An incidental effect of applying gamma correction early in the signal chain is a reduction in the visibility of noise in dark picture areas, but this is not the principal reason for applying it.)

Recently we've invented flat panel displays which are inherently linear, so they don't need CRT compensation, but as it's incorporated into all the video signals they will be required to display, it is necessary to include something in the display circuitry to undo it. Both types of display are in use today, but one day they will probably all be flat panel types with linear characteristics and extra circuitry to correct for gamma-corrected signals. When this happens, we will have the odd situation that all television displays contain circuitry to undo pre-distortion that is applied to all video signals to compensate for a type of display that is no longer in use. We'll probably still call it "gamma correction" even though there will be nothing in the system with an innate gamma characteristic to correct for!

Rod.