#1
Hi All,

Does anyone know of a company that plans to manufacture (at any price) a TRUE HDTV projection system? I have looked at RCA, Sony, Hitachi, Panasonic, JVC, and other websites, only to find that EVERY system has a vertical resolution of just 720 lines (they merely downconvert the 1080i signal). RCA briefly produced a CRT-style set that could display 1080 lines for real, but since then not a single manufacturer has produced a set (of any type, including plasma) with a display resolution of 1920 x 1080 pixels. My PC monitor can display this resolution, so why the hell can't we get REAL 1080i High Def on a consumer set?

Dave
#2
You should have said 1080p instead of 1080i. But yeah, I would be curious how much a CRT HDTV that can do 1920x1080 would cost. If anyone knows which models, if any, can do that right now, I would like to know, just because I'm curious. I'm not going to buy one, because I'm a young adult and don't have the need or the money.

1920x1080 is full HDTV resolution. I know the HDTV rear projection my parents have can only do 1080i, which is a less expensive format that still qualifies as HDTV. It's funny that people sometimes say 1080i is the highest HDTV resolution, but 1080p at 1920x1080 is. UPN even said Star Trek Enterprise would be in the highest HDTV format, which was some format based on 1080i. I realize the compression to do HDTV isn't really even here yet. The H.264 codec is really needed to get HDTV resolution (full or not) without using too much bandwidth.

On another side note, I think some PC games should be developed with HDTV in mind. If console games can be built with HDTV in mind, and consoles cost less than PCs while HDTV is very expensive, then people willing to spend money on a PC are MORE likely to have the money to spend on an HDTV. This doesn't consider whether a game would even be well suited to a large screen with high resolution but lower sharpness or dpi.
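The bandwidth point above can be put in rough numbers. Here is a minimal Python sketch, assuming 8-bit 4:2:0 video (12 bits per pixel) and the 19.39 Mbit/s ATSC broadcast channel; the compression ratios it prints are back-of-the-envelope estimates, not codec benchmarks:

```python
# Back-of-the-envelope look at why HDTV needs aggressive compression.
# Assumptions: 8-bit 4:2:0 chroma subsampling (12 bits per pixel) and
# the 19.39 Mbit/s payload of one ATSC broadcast channel.

def raw_bitrate_mbps(width, height, fps, bits_per_pixel=12):
    """Uncompressed bitrate in Mbit/s for 8-bit 4:2:0 video."""
    return width * height * fps * bits_per_pixel / 1e6

ATSC_CHANNEL_MBPS = 19.39

for label, (w, h, fps) in {
    "1080i (30 full frames/s)": (1920, 1080, 30),
    "720p60": (1280, 720, 60),
    "1080p60": (1920, 1080, 60),
}.items():
    raw = raw_bitrate_mbps(w, h, fps)
    ratio = raw / ATSC_CHANNEL_MBPS
    print(f"{label}: raw {raw:.0f} Mbit/s, needs ~{ratio:.0f}:1 compression")
```

1080p60 needs roughly twice the compression ratio of 1080i to squeeze into the same channel, which is the gap a more efficient codec like H.264 is meant to close.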
#3
wrote in message m...

> On another side note, I think some PC games should be developed with HDTV
> in mind. If console games can be built with HDTV in mind, and consoles
> cost less than PCs while HDTV is very expensive, then people willing to
> spend money on a PC are MORE likely to have the money to spend on an HDTV.

Most PC games for the last 5 years or more have supported high resolutions (not necessarily a particular HD format or 16:9, but high resolution). Nowadays it's standard for PC games to support various resolutions from 640x480 to around 1600x1200 (800x600, 1024x768, 1280x1024, etc.), and good PC graphics cards can really drive them at these resolutions with decent performance (and often with full-screen anti-aliasing, anisotropic filtering, etc. for a really beautiful picture).

Plus, don't forget that these are all progressive-scan resolutions, not the half resolution you'd get with interlace, and that the color is full resolution, not the half-or-less resolution in each direction you get with TV or even HDTV. 1024x768 is sort of a standard base resolution these days (no reason to go any lower unless you have a really old graphics card). PC games also tend to include significantly higher-resolution textures than console games (otherwise they would look blurry at the high resolutions they are generally played at). Modern 3D graphics cards generally have 2-4x as much memory as an entire console (64-256MB on the graphics card alone is typical these days), very high fill rates, and high color quality. In general, PC games are much MORE suited to high resolutions than console games, and are far more likely to have such support built in.

I'm sure in the future some PC games might start explicitly supporting 16:9 HDTV modes, but most people don't have their PC hooked up to a TV, and wouldn't want to play using mouse and keyboard while sitting on the sofa! (And if you're going to degrade the image by displaying it on a TV, and use controls that are inferior for many game types, why not just play it on a console in the first place?)
#4
1080i is the highest broadcast resolution under the ATSC standards. You could have 1080p at 30 or 24 fps, but that is not higher resolution... it IS a lower frame rate and more susceptible to flicker/artifacts. There is NO 1080p/60fps standard for broadcast television as adopted by the FCC... and you will not likely live long enough to ever see such a standard... it eats up WAY too much bandwidth.

For spatial resolution, 1080i is the highest. For temporal resolution, 720p has the highest resolution (and it is at 60p).

wrote in message m...

> You should have said 1080p instead of 1080i. [...] It's funny that
> sometimes people say that 1080i is the highest HDTV resolution, but
> 1080p at 1920x1080 is.
#5
"Curmudgeon" wrote in message ...

> 1080i is the highest broadcast resolution under the ATSC standards. You
> could have 1080p at 30 or 24 fps, but that is not higher resolution... it
> IS a lower frame rate and more susceptible to flicker/artifacts.

This isn't really true. Yes, 24p or 30p is not going to show motion as well, because the lower framerate will make things more jerky (though if the source is a movie, it is already only 24fps). However, it will look noticeably sharper and clearer than 60i, and have significantly fewer artifacts. It will also not affect flicker, since the screen is still refreshing at 60Hz.

> There is NO 1080p/60fps standard for broadcast television as adopted by
> the FCC...

This is a shame, and part of the problem with HDTV: the standards were both too ambitious and not ambitious enough. They didn't really get it "just right" on anything. Almost everything done in HDTV is significantly, and unnecessarily, compromised in one way or another. It is possible to get pretty good quality, but at a higher price than should be needed to achieve that level of quality. And for the price, had they waited just a few years for the technology to catch up, yes, we COULD have 1080 x 60p right now.

> and you will not likely live long enough to ever see such a standard...
> it eats up WAY too much bandwidth.

Only assuming they stick with an obsolete compression standard. It is possible today, but not using MPEG-2.

> For spatial resolution, 1080i is the highest. For temporal resolution,
> 720p has the highest resolution (and it is at 60p).

This is a disingenuous way to put it. In fact, 1080i does not always have the highest effective "spatial" resolution. Any area experiencing motion (particularly vertical motion) displays only half the effective vertical resolution. Much of the time this gives an effective vertical resolution of only about 540 lines. And 1080i in real life only uses 1440 pixels of horizontal resolution. So in many circumstances 720p DOES have higher spatial resolution (1280x720 compared to 1440x540).

1080i does have the highest spatial resolution for absolutely still material, but any motion will defeat this advantage. In addition, it presents an easily visible (and highly annoying, in my opinion) "interlace flicker" characteristic even in still material (which is what makes it look so "TV"- or "video"-like: it looks like the screen is alive or something, not at all the "like looking through a window" characteristic that we'd all like to see).

Interlace just has too many tradeoffs: you can eliminate the interlace artifacts entirely, but only by halving both the resolution and the framerate (display 1080i with only 540 scan lines at 30fps progressive scan, and all interlace artifacts should disappear). (You can also eliminate them by backing up far enough from the screen that you wouldn't be able to tell the difference between 1440x1080 and 720x540...)
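The 1280x720 vs 1440x540 comparison above works out like this in raw pixel counts. A small sketch; the "effective" 1080i figures are this post's own estimates (1440-wide broadcast sampling, ~540 usable lines during motion), not official numbers:

```python
# Comparing total pixel counts for the modes discussed in the thread.
# The 1440x1080 and 1440x540 "effective 1080i" figures are the poster's
# estimates for real-world broadcasts, not part of any standard.

def pixels(width, height):
    """Total pixel count for a given mode."""
    return width * height

modes = {
    "720p (1280x720)": pixels(1280, 720),
    "1080i, still scene (1440x1080)": pixels(1440, 1080),
    "1080i, in motion (1440x540)": pixels(1440, 540),
    "full 1080p (1920x1080)": pixels(1920, 1080),
}

for name, count in modes.items():
    print(f"{name}: {count:,} pixels")

# The post's claim: during motion, 720p carries more effective detail.
assert pixels(1280, 720) > pixels(1440, 540)
```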
#6
On Sun, 02 Nov 2003 05:49:15 GMT, "Matthew Vaughan" wrote:

> 1080i does have the highest spatial resolution for absolutely still
> material, but any motion will defeat this advantage. In addition, it
> presents an easily visible (and highly annoying, in my opinion)
> "interlace flicker" characteristic even in still material (which is what
> makes it look so "TV"- or "video"-like: it looks like the screen is alive
> or something, not at all the "like looking through a window"
> characteristic that we'd all like to see).

This is very subjective. I have never experienced any flicker.

Thumper

To reply drop XYZ in address
#7
In article , Thumper writes:

> > 1080i does have the highest spatial resolution for absolutely still
> > material, but any motion will defeat this advantage. In addition, it
> > presents an easily visible (and highly annoying, in my opinion)
> > "interlace flicker" characteristic even in still material [...]
>
> This is very subjective. I have never experienced any flicker.

I am VERY sensitive to flicker, and the flicker due to properly encoded 1080i should be nil. All too often, neophytes make the mistake of assuming that interlace filtering must always be constant. Interlace filtering NEED NOT be a sledgehammer. It should actually be a dynamic scheme in which twitter is reduced but the vertical resolution isn't severely impacted either. Such DYNAMIC filtering is commonly done with the dynamic combs in NTSC (and PAL) TV sets, and is done in other applications as well. Even a good MPEG encoder will do various kinds of filtering to avoid artifacts.

If there is 'flicker' on a 1080i presentation, it means that the equipment isn't being used correctly, the equipment might be older with less technology, or other 'excuses' might apply. Given good 'processing', the stairstepping effects coming from the natural aliasing of the sampled system should be approximately as distracting as the interlace twitter.

People also forget that 720p systems ALSO need vertical filtering to avoid ugly aliasing effects. The amount of artifact-free resolution isn't as high as the 720p designation might superficially imply.

John
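To illustrate the kind of vertical filtering described above (a toy sketch, not any TV set's or encoder's actual algorithm): a one-line-high detail lives entirely in one field, so it blinks at the field rate; a simple [1/4, 1/2, 1/4] vertical filter spreads it across neighboring lines so both fields carry part of it, trading a little vertical resolution for less twitter.

```python
# Toy illustration of interlace "twitter" and vertical filtering.
# A single bright scan line lands entirely in one field of an interlaced
# display, so it flashes at the field rate. A [0.25, 0.5, 0.25] vertical
# filter spreads its energy across both fields, reducing the twitter.

def vertical_filter(column):
    """Apply a [0.25, 0.5, 0.25] filter down one column of pixel values."""
    out = []
    for i in range(len(column)):
        above = column[i - 1] if i > 0 else column[i]
        below = column[i + 1] if i < len(column) - 1 else column[i]
        out.append(0.25 * above + 0.5 * column[i] + 0.25 * below)
    return out

def even_field(column):
    return column[0::2]  # scan lines 0, 2, 4, ...

def odd_field(column):
    return column[1::2]  # scan lines 1, 3, 5, ...

# One bright horizontal line on a dark background (classic twitter case):
column = [0, 0, 0, 255, 0, 0, 0]

# Unfiltered: all the energy sits in the odd field, so the line blinks.
print(sum(even_field(column)), sum(odd_field(column)))      # 0 vs 255

# Filtered: both fields now carry half the energy, so twitter is reduced.
filtered = vertical_filter(column)
print(sum(even_field(filtered)), sum(odd_field(filtered)))  # 127.5 vs 127.5
```

The cost is visible in the filtered column itself: the line is now three pixels tall and dimmer, which is exactly the resolution-for-stability trade the post says a dynamic filter should make only where it is needed.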
#8
"Thumper" wrote in message ...

> > presents an easily visible (and highly annoying, in my opinion)
> > "interlace flicker" characteristic even in still material [...]
>
> This is very subjective. I have never experienced any flicker.

You may have a low visual sensitivity to motion and detail. I believe that I have a very high sensitivity to visual motion, because low computer monitor refresh rates, as well as interlace, both bother me a lot more than they seem to bother most people.

Or you may watch TV from far enough away that you would not be able to tell the difference if it were progressive scan with half the resolution (that would be the only way for me to stop noticing it). Or you may have an LCD, plasma, or DLP that de-interlaces everything before display anyway (that doesn't remove all interlace artifacts, but it substantially reduces the flicker in still scenes).
#9
"John S. Dyson" wrote in message ...

> If there is 'flicker' on a 1080i presentation, it means that the
> equipment isn't being used correctly, the equipment might be older with
> less technology, or other 'excuses' might apply. Given good 'processing',
> the stairstepping effects coming from the natural aliasing of the sampled
> system should be approximately as distracting as the interlace twitter.

I agree with your other points, except to say that the display itself causes a form of small-scale flicker that can't be removed (unless the phosphors have a longer decay time, in which case you'd probably get increased smearing).
#10
Matthew Vaughan wrote:

> "Matthew L. Martin" wrote in message s.com...
>
> > Please do some research. 1080i is not equivalent to 540p. 1080i has
> > considerably more vertical resolution than 540p even in fast action
> > sequences.
>
> This is not really true. With motion (particularly vertical motion), the
> effective vertical resolution of an interlaced image is cut nearly in
> half.

I have never seen a claim that the Kell factor drops as low as .50 in any scene. Do you have a source that supports that claim?

Matthew

--
http://www.mlmartin.com/bbq/
Thermodynamics For Dummies: You can't win. You can't break even. You can't get out of the game.
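For readers following the Kell-factor exchange above, here is a hedged sketch of the arithmetic both sides are implicitly doing. The 0.7 values are commonly cited ballpark figures (a Kell factor around 0.7 for sampled displays, and a further "interlace factor" of similar size), not measurements from any particular set:

```python
# Rough sketch of the effective-vertical-resolution argument.
# Assumptions (ballpark figures only, not measurements): a Kell factor
# of ~0.7 for any sampled display, and an additional "interlace factor"
# of ~0.7 estimating the further perceptual loss from interlacing.

def effective_lines(scan_lines, kell=0.7, interlace_factor=1.0):
    """Perceived vertical resolution after Kell and interlace losses."""
    return scan_lines * kell * interlace_factor

print(effective_lines(1080))                        # 1080p: ~756 lines
print(effective_lines(1080, interlace_factor=0.7))  # 1080i: ~529 lines
print(effective_lines(720))                         # 720p:  ~504 lines
```

On these assumptions 1080i (~529 lines) and 720p (~504 lines) land close together, which is why the thread can argue either side: push the interlace factor toward 0.5 and 720p wins; push it toward 1.0 and 1080i wins.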