On Mon, 22 Aug 2005 06:58:44 +0100, Roderick Stewart wrote:

> Would you suggest the same policy for gamma correction? As it's a pre-distortion applied in the camera to compensate for the characteristics of a CRT, we shouldn't, in theory, need it in a flat panel broadcasting world, and doing without would make post-production colour correction much simpler. How many established standards and practices do you think it would be wise to abandon all in one go?

I don't feel qualified to answer this, but I will say that if you have a new system, in which you know there are no (or very few) legacy devices, is it right to compromise the system for the next few decades just in case a few people get sub-optimal quality on a very small number of legacy displays at launch? Would you have suggested that all 625-line TVs support 405 lines? That was, after all, the established standard. How about teletext: would you suggest that the number of active teletext lines should not have been increased in the 80s because it might have caused problems for people with 15-year-old TVs?

Rgds

Jonathan
On Mon, 22 Aug 2005 09:09:34 +0100, Kennedy McEwen wrote:

> But it is infinitely easier to display interlaced video correctly, and without a frame store, on a progressive screen than it is to display progressive video correctly on an interlaced screen.

All current progressive screens are by their nature a frame store. By displaying interlaced material you are effectively displaying two fields shot at different times simultaneously. This has the effect of degrading the apparent resolution of the system.

> In the former case, all that is necessary to do the job correctly is to store a line of video at a time, write that onto the line of pixels on the progressive screen and then blank off the pixels on the subsequent line. In the next field, simply blank the pixels on the first line (which were written on the previous field) and write the pixels in the next line (which were blanked on the previous field). This reproduces the interlace structure with the exact time latency and spatial structure of the original format.

This is not, however, how LCD and plasma screens work, and trying to replicate it would recreate all the traditional TV flicker that these screens are so good at eliminating, while introducing other artifacts.

> You can't do that the other way round, if the display is interlaced by default, without destroying the latency between adjacent lines.

It's always a compromise. Even in today's broadcasting world I'd prefer interlaced material to be deinterlaced by an expensive box at the broadcaster than by a £5-or-less chip in my TV. In future most new productions will be natively progressive, so the problem will be eliminated. To take progressive material, interlace it and then de-interlace it would of course be even more crazy.

Rgds

Jonathan
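For concreteness, here is a minimal Python sketch of the line-blanking scheme quoted above, in which each field is written to its own raster lines and the other field's lines are blanked. The raster size, field order and the idea of a per-refresh frame buffer are illustrative assumptions, not details of any poster's actual hardware.

import numpy as np

HEIGHT, WIDTH = 576, 720  # assumed PAL-like raster, purely illustrative

def blanked_refresh(field, field_is_top):
    """Map one interlaced field onto a progressive panel refresh.

    The field's lines go to their true raster positions; the lines that
    belong to the other field are blanked, so every pixel keeps the
    temporal position it had in the interlaced source."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)  # blank everything first
    start = 0 if field_is_top else 1
    frame[start::2, :] = field                          # write only this field's lines
    return frame

# Fields arrive alternately at 50 Hz; each refresh shows one field in its
# correct lines and black in the other field's lines.
top = np.full((HEIGHT // 2, WIDTH), 200, dtype=np.uint8)
bottom = np.full((HEIGHT // 2, WIDTH), 60, dtype=np.uint8)
refreshes = [blanked_refresh(f, is_top) for f, is_top in ((top, True), (bottom, False))]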
In article , JC writes:

> On Mon, 22 Aug 2005 09:09:34 +0100, Kennedy McEwen wrote:
>
>> But it is infinitely easier to display interlaced video correctly, and without a frame store, on a progressive screen than it is to display progressive video correctly on an interlaced screen.
>
> All current progressive screens are by their nature a frame store. By displaying interlaced material you are effectively displaying two fields shot at different times simultaneously. This has the effect of degrading the apparent resolution of the system.

That is indeed how most of them work, but it doesn't have to be that way. A progressive display can be configured to display interlaced material perfectly correctly. I have actually done something similar to this on a progressive OLED display where I had access to the raw panel drive, and I can confirm that it completely eliminates the motion artefacts you get from the conventional frame-store reconstruction that most progressive displays use. Actually, what I did was more like the scheme described below, because of the way the OLED interface worked.

> This is not however how LCD and Plasma screens work and to try and replicate this would create all the traditional TV flicker that these screens are so good at eliminating

You don't get owt for nowt. The price of that flicker elimination has to come from somewhere, and the conventional wisdom of the manufacturers appears to be that it comes at the expense of motion tearing, through complete reconstruction of interlaced fields into a single frame for progressive display. That isn't the only possible trade-off. As described, it is possible to design a progressive display that will eliminate motion tearing with interlaced video. That particular trade-off results in 50Hz flicker, but no worse than on a conventional interlaced CRT.

However, there is a third trade-off which maintains the flicker elimination of these screens. Instead of blanking the lines which are not present in each field, just repeat the data from the previous field. Again, the latency is consistent on all pixels; however, the hold time on each pixel is now doubled, which is analogous to a very long persistence CRT phosphor. Consequently this time you are trading some motion blur to eliminate both flicker and motion tearing. You don't get owt for nowt, but the trade space available is multidimensional - the manufacturers just restrict the options available to you. I am sure that there are other options, but these are a couple that I have played with which I know work.

> while introducing other artifacts.

What other artefacts are introduced? You eliminate motion artefacts by achieving consistent latency, and the flicker is no worse than on a conventional interlaced CRT (it's actually a little better, because they don't have the emission decay that a CRT phosphor has). Based on the OLED system I did this on, I doubt you would see any flicker at all, but you can fix that as described.

>> You can't do that the other way round, if the display is interlaced by default, without destroying the latency between adjacent lines.
>
> It's always a compromise.

It *is* a compromise - it doesn't *have* to be one. You can display interlaced images on a progressive screen correctly, and I suspect that the reason it isn't done has more to do with manufacturers' agendas than with any limitation of interlace per se. You can't display progressive images on an interlaced screen, though, without artefacts or throwing half the information away, which defeats any advantage the system has.

I was looking at some 720p a couple of days ago, and wasn't overly impressed. Yes, better than 625i, but not dramatically so. When I saw 1080i a while ago, I was completely bowled over by it. That extra horizontal resolution isn't worthless, you know. Equipment was different of course, and perhaps my expectations have changed with time, but that was my impression viewing both options. As always, YMMV.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers
(replace 'nospam' with 'kennedym' when replying)
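A matching Python sketch of the "third trade-off" described above, where the panel holds the previous field's lines instead of blanking them, so the hold time doubles and flicker disappears at the cost of some motion blur. The raster size and class name are again illustrative assumptions only.

import numpy as np

HEIGHT, WIDTH = 576, 720  # illustrative raster size

class FieldHoldPanel:
    """Progressive panel model that overwrites only the incoming field's
    lines and holds the other field's lines from the previous refresh."""

    def __init__(self):
        self.frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

    def refresh(self, field, field_is_top):
        start = 0 if field_is_top else 1
        self.frame[start::2, :] = field   # new lines replace only their own positions
        return self.frame.copy()          # the rest are held, not blanked

panel = FieldHoldPanel()
top = np.full((HEIGHT // 2, WIDTH), 200, dtype=np.uint8)
bottom = np.full((HEIGHT // 2, WIDTH), 60, dtype=np.uint8)
first = panel.refresh(top, True)       # bottom lines still black on the very first refresh
second = panel.refresh(bottom, False)  # top lines held over from the previous field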
In article , Roger R wrote:

> "rookie" wrote in message ...
>> "Brian McIlwrath" wrote in message ...
>>> Adrian wrote:
>>>> I won't be buying one that is less than 1920x1080
>>>
>>> You will have a long wait!
>>
>> This September, Philips will be releasing their 1920x1080 "True HD" sets. Check out the 37PF9830 for example. Around 4000 euro retail here on the continent.
>
> Link please

http://www.digitaldirectuk.com/produ...p?product_id=9142

Stan
On Tue, 23 Aug 2005 01:05:30 +0100, Kennedy McEwen wrote:

> progressive display. That isn't the only possible trade-off. As described, it is possible to design a progressive display that will eliminate motion tearing with interlaced video. That particular trade-off results in 50Hz flicker, but no worse than on a conventional interlaced CRT.

But conventional 50 Hz CRT flicker is horrible, especially on larger screens. Hence the number of 100 Hz TVs on the market and all the artifacts that their frame stores cause.

> However there is a third trade-off which maintains the flicker elimination of these screens. Instead of blanking the lines which are not present in each field, just repeat the data from the previous field. Again, the latency is consistent on all pixels, however the hold time on each pixel is now doubled, which is analogous to a very long persistence CRT phosphor. Consequently this time you are trading some motion blur to eliminate both flicker and motion tearing.

This is effectively what any good broadcast interlace-to-progressive converter would do and, as I said, I'd rather it was done by the broadcaster than by a sub-£5 chip in my TV. The blurring artifacts within the picture reduce the visible resolution to, it would appear, something similar to 720p levels. However, the adoption of a progressive broadcast system allows the migration to all-progressive production over time, eliminating this problem completely.

> manufacturers' agendas than a limitation of interlace per se. You can't display progressive images on an interlaced screen though, without artefacts or throwing half the information away, which defeats any advantage the system has.

But nobody's going to be using an interlaced screen for HD. Even an HD CRT should be capable of 50 or 100 Hz progressive refresh without interlace, and I'd be very surprised to see an HD CRT set in Dixons etc in a year's time. Production is already moving to progressive formats, and of course film is natively progressive. Film has been converted to interlaced SD TV for decades without problems (as far as the interlace goes) and, because of this, ironically often looks better on progressive sets than natively interlaced material does. Progressive production and display is the future. To tie our HD broadcast standards to legacy interlace is even more crazy than tying DAB to Layer 2.

> I was looking at some 720p a couple of days ago, and wasn't overly impressed. Yes, better than 625i, but not dramatically so. When I saw 1080i a while ago, I was completely bowled over by it. That extra horizontal resolution isn't worthless, you know. Equipment was different of course, and perhaps my expectations have changed with time, but that was my impression viewing both options. As always, YMMV.

Were the 1080i and 720p displayed on a progressive or interlaced screen? I have to say some of the HD material I've seen has been a bit of a disappointment, but still a worthwhile improvement. One thing I can say is that in my experience, 720p beats 1080i hands down for what I would call "normal" TV material.

Rgds

Jonathan
In article , JC writes:

> But conventional 50 Hz CRT flicker is horrible, especially on larger screens.

No it isn't; it is something you filter out very quickly, and the size of the screen has nothing to do with it - how many kids from the 60's and 70's sat a couple of feet from their 26" CRTs watching TV while their parents told them "don't sit too close, Johnny, your eyes will go square!"? None of them saw any flicker, yet the angular screen size was far bigger than anything viewed at normal distance. Ever had American visitors to your home for a week or two? At first they complain that TV flickers in this country, but by the time they go home they are marvelling about the picture quality.

> Hence the number of 100 Hz TVs on the market and all the artifacts that their frame stores cause.

Marketing. It looks good in the showroom when you have a bank of TVs stretching out to the extreme periphery of your vision, where you have most sensitivity to flicker because you haven't regularly watched it and learned to filter it out.

>> However there is a third trade-off which maintains the flicker elimination of these screens. Instead of blanking the lines which are not present in each field, just repeat the data from the previous field. Again, the latency is consistent on all pixels, however the hold time on each pixel is now doubled, which is analogous to a very long persistence CRT phosphor. Consequently this time you are trading some motion blur to eliminate both flicker and motion tearing.
>
> This is effectively what any good broadcast interlace to progressive converter would do

Would do, but they don't - hence the motion tearing on most progressive screens when fed an interlaced source. That is, to my mind, the worst option, but for some reason it is what most of them do.

> and as I said, I'd rather it was done by the broadcaster than a sub £5 chip in my TV.

What difference does that make? It is just manipulation of digital data, so it doesn't make a ha'penny of difference whether it is implemented in a £50,000 Quantel box or in a £5 chip.

> The blurring artifacts within the picture reduce the visible resolution to, it would appear, similar to 720p levels.

No it doesn't - the blurring is simply the maximum temporal bandwidth the interlaced structure is capable of, with minimal temporal aliasing. The normal interlaced display on a CRT, where each pixel is only present for a very short period of the frame time, is actually temporally undersampling - which artificially exaggerates the limitations of interlace.

> However the adoption of a progressive broadcast system allows the migration to all progressive production over time eliminating this problem completely.

But the problem doesn't need to be present - there is no reason why a flat panel progressive screen cannot display an interlaced signal accurately. There is nothing intrinsically superior about a progressive source which has the same overall bandwidth as the alternative interlaced system, and generally it has inferior resolution, as demonstrated by the 1080i/720p debate.

>> manufacturers' agendas than a limitation of interlace per se. You can't display progressive images on an interlaced screen though, without artefacts or throwing half the information away, which defeats any advantage the system has.
>
> But nobody's going to be using an interlaced screen for HD. Even an HD CRT should be capable of 50 or 100 Hz progressive refresh without interlace and I'd be very surprised to see an HD CRT set in Dixons etc in a year's time.

That is exactly the point: nobody would need them - if high quality backwards compatibility were delivered. Whilst that is certainly possible, indeed just as simple to achieve, it isn't what most flat panels provide. Consequently the push for progressive standards alienates about half a century of existing video heritage.

> Progressive production and display is the future. To tie our HD broadcast standards to legacy interlace is even more crazy than tying DAB to Layer 2.

I would agree if we were discussing a comparison of 1080p versus 1080i, but the option is 720p versus 1080i, so what you are calling for is to tie our HD broadcast standard to little more than the legacy static resolution of the interlaced system we have had for the past half century, when we could be quadrupling it! So much for the term "progressive" - "marginally incremental" would be more appropriate!

>> I was looking at some 720p a couple of days ago, and wasn't overly impressed. Yes, better than 625i, but not dramatically so. When I saw 1080i a while ago, I was completely bowled over by it. That extra horizontal resolution isn't worthless, you know. Equipment was different of course, and perhaps my expectations have changed with time, but that was my impression viewing both options. As always, YMMV.
>
> Were the 1080i and 720p displayed on a progressive or interlaced screen?

1080i was on a projection DMD system at the TI plant in Dallas where they are made. That used a progressive DMD, but driven using an algorithm similar to those I was explaining. The main difference is that the brightness of each pixel in a DMD is achieved by time-multiplexing throughout the available frame period, but the phasing of the interlaced fields was accurate - that was the point of the TI demo. 720p was on 720p-compatible plasmas and LCDs - lots of different types.

> I have to say some of the HD material I've seen has been a bit of a disappointment but still a worthwhile improvement. One thing I can say is that in my experience, 720p beats 1080i hands down for what I would call "normal" TV material.

As I say, that isn't my experience. 1080i is superb when displayed correctly - which is often not even attempted, because it requires a much higher resolution screen to do it properly.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers
(replace 'nospam' with 'kennedym' when replying)
Roderick Stewart wrote:

: Would you suggest the same policy for gamma correction? As it's a
: pre-distortion applied in the camera to compensate for the
: characteristics of a CRT, we shouldn't, in theory, need it in a flat
: panel broadcasting world, and doing without would make post-production
: colour correction much simpler.

It might make some kinds of colour correction easier, but if you code video as 'linear light' you need many more bits. Whilst 8 bits (for luminance) are normally reckoned to be sufficient for non-linear (gamma corrected) video, you need something like 12-14 bits for linear video. So even if CRTs had never existed we would have had to invent gamma to allow for efficient digital coding and transmission.

Richard.
http://www.rtrussell.co.uk/
To reply by email change 'news' to my forename.
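A back-of-the-envelope Python sketch of the bit-depth argument. The 2% visibility threshold and the 1%-of-peak shadow level are illustrative assumptions rather than figures from the post, but they land in the same 12-14 bit region for linear-light coding.

import math

WEBER_FRACTION = 0.02   # assumed just-noticeable relative luminance step
DARKEST_LEVEL = 0.01    # assumed darkest shadow of interest, as a fraction of peak white

# Linear-light coding: one uniform step must stay below 2% of the darkest level,
# so the full 0..1 range needs roughly this many bits.
linear_step = WEBER_FRACTION * DARKEST_LEVEL          # 0.0002 of peak
linear_bits = math.ceil(math.log2(1.0 / linear_step)) # about 13

# For comparison, an 8-bit *linear* coding has a step of 1/255 of peak, which
# at the 1% shadow level would be a jump of roughly 39% - gross banding.
shadow_jump_8bit_linear = (1 / 255) / DARKEST_LEVEL

print(f"linear-light coding needs about {linear_bits} bits")
print(f"8-bit linear step at 1% of peak: {shadow_jump_8bit_linear:.0%}")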
Kennedy McEwen wrote in:

>> yway) But on the grey background, I seem to be totally unable to get anything between alleged black and low mid tones. Is this a function of all LCDs? And will it always be? IOW, I guess I'm asking if future pictures will always be as bad?
>
> The blacks being crushed and little differentiation between mid tones and black is symptomatic of the effect I was describing on a linear display - and an LCD is fairly linear.

<snip rest of excellent info>

Thanks for that, Kennedy. I did wonder, in this best of all possible worlds, why they couldn't do a bit of gamma correction, but if the system won't support it, that's that. At least I will now know whether the fundamental problem is being addressed or whether the adverts for future sets are another crock like the one I fell for.

mike
In article , Kennedy McEwen wrote:

>> But conventional 50 Hz CRT flicker is horrible, especially on larger screens.
>
> No it isn't, it is something you filter out very quickly and the size of the screen has nothing to do with it - how many kids from the 60's and 70's sat a couple of feet from their 26" CRTs watching TV while their parents told them "don't sit too close, Johnny, your eyes will go square!"? None of them saw any flicker, yet the angular screen size was far bigger than anything viewed at normal distance.

Exactly. I've been watching television with a 50Hz flicker rate since the coronation and it's never bothered me, yet all of a sudden it's supposed to be such a problem that we need to spend lots of money on 100Hz displays. Meanwhile, another faction is proposing a *reduction* in picture intermittency rate from 50Hz to 25Hz (the most noticeable effect of so-called "progressive" scanning), and somehow this isn't a problem at all!

Rod.
In article , wrote:

> : Would you suggest the same policy for gamma correction? As it's a
> : pre-distortion applied in the camera to compensate for the
> : characteristics of a CRT, we shouldn't, in theory, need it in a flat
> : panel broadcasting world, and doing without would make post-production
> : colour correction much simpler.
>
> It might make some kinds of colour correction easier, but if you code video as 'linear light' you need many more bits. Whilst 8 bits (for luminance) are normally reckoned to be sufficient for non-linear (gamma corrected) video, you need something like 12-14 bits for linear video. So even if CRTs had never existed we would have had to invent gamma to allow for efficient digital coding and transmission.

I think you missed my real point here (maybe I made it too subtle), which was about changing part of a well-established television standard supported by a vast amount of equipment owned by millions of people. The argument for abandoning the use of interlace seems purely to do with ease of handling the signals in computer equipment, or converting to and from cinematograph film, and nothing to do with how the television pictures look to the eye on a screen, and a similarly specious argument could be put forward for abandoning gamma correction.

However, the argument about gamma correction looks more convincing, as there is some logic that actually has something to do with pictures rather than computers. Cathode ray tubes have non-linear amplitude characteristics, but camera tubes and chips are fairly linear, so the linear signals from cameras have to be pre-distorted to match CRTs. Anywhere in the signal chain would do, but it was decided early in the history of television (for very practical reasons) that it should be done in the camera. Thus gamma correction, or "CRT non-linearity compensation" as it could be called, has become part of the standard specification of all broadcast video signals everywhere in the world. (An incidental effect of applying gamma correction early in the signal chain is a reduction in the visibility of noise in dark picture areas, but this is not the principal reason for applying it.)

Recently we've invented flat panel displays which are inherently linear, so they don't need CRT compensation, but as it's incorporated into all the video signals they will be required to display, it is necessary to include something in the display circuitry to undo it. Both types of display are in use today, but one day they will probably all be flat panel types with linear characteristics and extra circuitry to correct for gamma-corrected signals. When this happens, we will have the odd situation that all television displays contain circuitry to undo pre-distortion that is applied to all video signals to compensate for a type of display that is no longer in use. We'll probably still call it "gamma correction" even though there will be nothing in the system with an innate gamma characteristic for which to correct!

Rod.
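To make the "extra circuitry to undo it" concrete, here is a small Python sketch of the kind of lookup table a linear flat panel might apply to gamma-corrected broadcast signals. The pure power-law exponent and the 10-bit internal panel precision are illustrative assumptions; real sets use more complicated transfer curves.

import numpy as np

GAMMA = 2.2        # assumed overall decoding exponent
PANEL_BITS = 10    # assumed internal drive precision

# 8-bit gamma-coded broadcast values in, approximately linear drive levels out.
codes = np.arange(256) / 255.0
degamma_lut = np.round((codes ** GAMMA) * (2 ** PANEL_BITS - 1)).astype(np.uint16)

def to_panel(frame_8bit):
    """Undo the camera's gamma pre-distortion for a linear display."""
    return degamma_lut[frame_8bit]

# A gamma-coded value of 32 (about 12.5% of full code) decodes to roughly 1% of
# peak light; without this step a linear panel would show it far too bright.
frame = np.array([[16, 32, 128, 235]], dtype=np.uint8)
print(to_panel(frame))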