Do you really like the way HDTV looks?
Dave Oldridge wrote:
Bob Miller wrote in ink.net: Dave Oldridge wrote: Bob Miller wrote in link.net: Dave Oldridge wrote: "HiC" wrote in ink.net:

Went into a local Circuit City and took a good long look at their HDTV selections. They had several, including two 1080p sets that I was told were set up correctly, so that what I was seeing was as good as it gets: everything HD from the cameras to the screen. Both 1080p sets were running off some sort of hard-drive unit, not off a broadcast. I've been hearing how amazing HDTV is. Well... while there's a certain "pow" when you first see them, I get the sense it's due to some artificially induced phenomena. The colors seem vivid, but in an enhanced, i.e. forced, way. There seems to be an excessive "whiteness" to the image that adds a certain kind of sparkle/sharpness, but again it seems artificial. The real world as viewed by eyeballs doesn't seem that "sharp" or vivid. The demos that were showing were clearly intended to take advantage of this: all these closeups of brightly colored flowers, snowboarders on glaring snow, etc. I don't believe a sky exists anywhere the shade of blue they were depicting in that demo. I see all kinds of artifacts in the images. Yeah, okay, they're not meant to be viewed from six inches away, but when I back off to 8-10 feet I still see this odd graininess, especially when the image pans, plus all these other odd things that happen to the image. Overall I find it harder on my eyes than a sharp picture on a good analog TV. As I understand it, in a few years we're getting all digital whether we like it or not. Is the whole HDTV thing just a bill of goods we got sold/crammed down our throats?

When I bought an HDTV-ready TV, I bought a CRT model. CRT and rear-projection CRT are proven technologies that can reproduce signals at these resolutions; they've been in use for some time in the computer industry, doing just that. The difference is not HUGE, but my SD signals are often actually received at EDTV resolution from a satellite, so what I'm actually comparing is the line-doubled 480p signal from the satellite to the 1080i signal from the same source. My estimate is that the picture clarity is 3 dB better on the HDTV signals, especially the good ones. That's about twice as good as the SDTV signals.

Might that suggest that if the EDTV signal were actually true 480p, and had been captured with a good 720p camera, it might be as good as the 1080i signal?

Actually, you might suggest it, but it runs counter to my actual experience. I see material that is converted from HD cameras all the time and, while it is 1000% better than regular SDTV signals, it is still about 3 dB short of a 1080i or 720p production over the 1080i path from my satellite. Even the best DVD films are about 3 dB worse. For example, I have the entire LotR trilogy in anamorphic widescreen. It is good, but it still has that 3 dB clarity loss compared to the 1080i version broadcast by my movie supplier.

That was a question. I was following your math and maybe misunderstood it. You were saying "line-doubled 480p," which I interpreted as 480i information. I was then suggesting that if it were true 480p from a very good source, since it has twice the information of the line-doubled 480i version, might it not be as good as the 1080i you were comparing it to, since you said the 1080i was only twice as good as what I took to be 480i? Wouldn't 480p then equal your 1080i?

I think you misunderstand something. The "i" in 1080i implies not that the resolution is any less, but that the raster is scanned twice to get the full frame. What you actually see depends on the vertical refresh rate of the mode, which I'll assume is 30 fps. So you lose some resolution along the time axis in trade for resolution in space. The picture is still 1920x1080 pixels, but, due to the interlace, it's a little blurry where it's moving. Usually the eye doesn't see this, and most often it is obscured by motion blur in the original film source. Even on live baseball it looks OK to me.

Scanned twice to get a different half frame if everything is moving. Scanned twice to get a full frame, which would then be called 1080p, if you are doing a movie and have the luxury of scanning each frame twice, or a still image where little moves. It works great for still images. Baseball can be pretty still most of the time, but how about basketball or other active sports where more of the image is in chaotic motion? I understand that 1080i also introduces artifacts due to interlace that would not be present with progressive. I like progressive and can't wait till interlace leaves the scene altogether. You talk of dB as to image quality; I am taking that colloquially to mean half as good on the way down or twice as good on the way up. Is that how you are using it?

Bob Miller
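For anyone who wants to see the mechanics Dave is describing, here is a minimal sketch in Python with numpy (the function names and the static-image test are illustrative, nothing from the thread): a 1080i frame arrives as two 540-line fields, and weaving them back into a full raster is lossless only when nothing moved between the two field captures.

import numpy as np

HEIGHT, WIDTH = 1080, 1920

def split_fields(frame):
    # Odd field: lines 0, 2, 4, ... (540 lines); even field: lines 1, 3, 5, ...
    return frame[0::2, :], frame[1::2, :]

def weave(odd_field, even_field):
    # Reassemble a full 1920x1080 raster by interleaving the two fields.
    # In real 1080i the fields are captured ~1/60 s apart, so anything that
    # moved between captures lands on alternating lines: combing/blur.
    frame = np.empty((HEIGHT, WIDTH), dtype=odd_field.dtype)
    frame[0::2, :] = odd_field
    frame[1::2, :] = even_field
    return frame

# A static scene survives the round trip exactly; only motion degrades.
still = np.random.default_rng(0).integers(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)
odd, even = split_fields(still)
assert np.array_equal(weave(odd, even), still)

On the dB question: taken as a power ratio, 3 dB is 10^(3/10), which is almost exactly 2, so Dave's "3 dB better" and Bob's "about twice as good" are the same reading.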
Do you really like the way HDTV looks?
big snip
But most TV series, even the 4:3 ones, are shot on 35mm film and then transferred to video modes, often separately for the DVD releases. -- Dave Oldridge+ ICQ 1800667

Back when Lucy and Desi were making series that was true, and it is still true of dramas, but for some reason sitcoms are often shot on video, some even on Super 16. Given the choice of a less expensive medium, the producer will usually take it.
Do you really like the way HDTV looks?
In article , Ray S wrote: rz wrote:

I think you need to see a quality set, set up properly. May be out of budget, though.

What's with all the "quality set, set up properly" talk? You'd think that only engineers and compulsive tweakers bought TVs and watched DVDs. Here's how average people do it: they go to the store, buy a TV, hook up their DVD player using the nice color-coded cables, plunk in a DVD, and watch it. If DVD did in fact dissolve into pixel clouds and display numerous artifacts like the original poster claimed, nobody would have moved on from VHS tapes. Now, if you stick in that copy of Finding Nemo that's on the floor of the kids' room, covered with Cheez Whiz and scratches...

What ticks me off is the amazing variation of quality broadcast on the supposedly "high-def" cable channels. When the FCC gave away all that broadcast spectrum for HD, then backed off time and time again on requiring the stations to actually put HD up, it led to this current disaster. Out of the 15 HD channels my cable company provides, I get about two that seem to understand how to deliver an actual HD signal (Discovery and iNHD on my cable provider, and both of THEM have picture quality problems now and then). Virtually everything else is a 70/30 split of god-awful pillarboxed repurposed SD network junk, or "kinda HD" that's clearly been compressed and decompressed from satellite. It's easily identifiable because it LOOKS decent until some producer uses a cross dissolve between scenes, and the dissolve disintegrates into a blocky mess while the codec tries to catch up.

Personally, I hope next year Apple's iTV HDMI interface does to the HDTV broadcast industry what iTunes did to the vinyl record and plastic CD distribution industry: pushes it rapidly towards oblivion. Sigh.
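The cross-dissolve breakdown has a simple mechanical explanation: motion-compensated codecs like the MPEG-2 used on cable predict each block from a previous frame, and during a dissolve no block in either source scene is a good predictor, so residual data spikes and a fixed-bitrate encoder has to throw away detail. A toy numpy sketch of that effect (the frame data and the 50/50 blend are made-up illustrations, not broadcast measurements):

import numpy as np

rng = np.random.default_rng(0)
scene_a = rng.integers(0, 256, (64, 64)).astype(float)
scene_b = rng.integers(0, 256, (64, 64)).astype(float)

# Pan: the next frame is the same scene shifted two pixels. Once the
# codec's motion search finds that shift, the prediction residual is zero.
panned = np.roll(scene_a, 2, axis=1)
pan_residual = np.abs(panned - np.roll(scene_a, 2, axis=1)).mean()

# Dissolve: the next frame is a 50/50 blend of two scenes. The best
# predictor from either reference still leaves a large residual, which
# is what blows the bit budget and turns the picture blocky.
dissolve = 0.5 * scene_a + 0.5 * scene_b
dissolve_residual = min(np.abs(dissolve - scene_a).mean(),
                        np.abs(dissolve - scene_b).mean())

print(f"pan residual: {pan_residual:.1f}, dissolve residual: {dissolve_residual:.1f}")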
Do you really like the way HDTV looks?
Hey Elmo, how did that grammar lesson go? Did you understand what I was saying?
Chip -- -------------------- http://NewsReader.Com/ -------------------- Usenet Newsgroup Service $9.95/Month 30GB |
Do you really like the way HDTV looks?
William Davis wrote:
big snip

When the FCC gave away all that broadcast spectrum for HD, then backed off time and time again in requiring the stations to actually put HD up - it led to this current disaster.

Congress gave the extra channels to broadcasters for a digital transition, and they have to give them back at the end of that transition in 2009. And while members of Congress may have spouted off something about HD being part of the deal, it wasn't. We are a nation of laws, and the law that broadcasters agreed to mentions only ONE SD program in the free and clear as being required. Broadcasters can do anything they want with the rest of the spectrum after they deliver that ONE SD program, including delivering and selling you an HD program. And if you don't think they plan on doing that, let me tell you about a bridge I could sell you cheap.

Bob Miller
Do you really like the way HDTV looks?
Bob Miller wrote:
big snip

And if you don't think they plan on doing that, let me tell you about a bridge I could sell you cheap. Bob Miller

Oh yeah, Bob, like I am going to listen to a business deal you have to offer...

Chip -- -------------------- http://NewsReader.Com/ -------------------- Usenet Newsgroup Service $9.95/Month 30GB
Do you really like the way HDTV looks?
What ticks me off is the amazing variation of quality broadcast on the supposedly "high-def" cable channels. When the FCC gave away all that broadcast spectrum for HD, then backed off time and time again in requiring the stations to actually put HD up - it led to this current disaster.

The FCC has not mandated HD at all, only that in 2009 analog OTA goes away in favor of digital. HD is a side technical benefit of the switch to digital, but it requires the OTA broadcaster to turn off 4 of his SD digital subs to send it out. Broadcasters do not have to send out any HD at all, and many do not. The FCC has nothing to do with cable or satellite; both are free to send out anything they want, or can charge for, to their subscribers. That said, I agree: the digital picture quality on cable and satellite is bad - way overcompressed. Instead of giving us 50 channels of pristine SD or even HD pictures, they give us 500 channels of pixels. Not likely to change, either.
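The "4 SD subs" figure follows from the channel's bit budget. A rough back-of-the-envelope sketch in Python - the 19.39 Mbps ATSC transport payload is the standard figure, but the per-stream bitrates below are typical assumed values, not numbers from the post:

# One ATSC 8-VSB channel carries a 19.39 Mbps MPEG-2 transport stream.
ATSC_PAYLOAD_MBPS = 19.39
HD_STREAM_MBPS = 13.0   # assumed: a reasonably clean 1080i/720p MPEG-2 encode
SD_STREAM_MBPS = 3.5    # assumed: one SD subchannel

sd_subs_without_hd = int(ATSC_PAYLOAD_MBPS // SD_STREAM_MBPS)                   # 5
sd_subs_with_hd = int((ATSC_PAYLOAD_MBPS - HD_STREAM_MBPS) // SD_STREAM_MBPS)   # 1
print(sd_subs_without_hd - sd_subs_with_hd, "SD subchannels given up for HD")   # 4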
Do you really like the way HDTV looks?
Dave Oldridge wrote:
.... I think you misunderstand something. The i in 1080i implies not that the resolution is any less, but that the raster is scanned twice to get the full frame. ...

All the same, there's some reason to think you get some sort of quality premium from a progressive signal. I don't know why - it doesn't seem to be just the absence of motion blur. Owners of 480-line plasma displays seem to think they look very good showing downconverted HD signals; some people think 720p looks at least as good as 1080i; and GameCube owners seem to agree that a 480p picture looks much better than a 480i picture.

Greg
Do you really like the way HDTV looks?
On Fri, 15 Sep 2006 19:27:38 GMT, Dave Oldridge wrote:

I think you misunderstand something. The i in 1080i implies not that the resolution is any less, but that the raster is scanned twice to get the full frame. What you actually see depends on the vertical refresh rate of the mode, which I'll assume is 30 fps. So you lose some resolution along the time axis in trade for resolution in space. The picture is still 1920x1080 pixels, but, due to the interlace, it's a little blurry where it's moving. Usually the eye doesn't see this, and most often it is obscured by motion blur in the original film source. Even on live baseball it looks OK to me.

At least with computer video cards that have interlaced output for TV, there is deliberate filtering in the vertical direction to avoid 25/30 Hz flicker. If you display an object that is only one scan line high on a 50/60 Hz interlaced display, that object appears in only one of the two fields, i.e. it is updated only 25 or 30 times a second, causing unpleasant flicker on such CRTs. The vertical filtering stretches the object onto adjacent lines, thus partially updating it in both fields.

The technical descriptions of some HD cameras seem to indicate that in the camera array, which is progressive by nature, two or more adjacent rows are summed (possibly with unequal weights) to generate the interlaced signal and increase the camera's signal-to-noise ratio. This of course reduces the vertical resolution. When viewing the interlaced signal even of a static object, the display does not show the camera pixel rows independently, but rather some kind of weighted sum of the neighbouring pixel rows. With equal weights the odd and even fields would be identical, and thus the 1080i would effectively be 540p. With unequal weights, the odd and even fields are more distinct.

How common is this summing of pixel rows for interlaced signals in current CCD studio cameras?

Paul
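A minimal numpy sketch of the row summing Paul describes - the row pairing and the weights are illustrative assumptions, not taken from any camera spec sheet:

import numpy as np

def sensor_to_fields(frame, w0, w1):
    # Blend each output line from two adjacent sensor rows, then split the
    # result into the two interlaced fields. w0/w1 are illustrative weights.
    padded = np.vstack([frame, frame[-1:]])       # repeat last row as padding
    blended = w0 * padded[:-1] + w1 * padded[1:]  # line i = w0*row i + w1*row i+1
    return blended[0::2], blended[1::2]           # odd-line field, even-line field

# Worst case for vertical resolution: detail alternating on every sensor row.
pattern = np.zeros((1080, 1920))
pattern[1::2, :] = 1.0

# Equal weights: every blended line is ~0.5, the two fields come out nearly
# the same, and the one-row detail is gone - effectively 540p, as Paul says.
odd, even = sensor_to_fields(pattern, 0.5, 0.5)
print(np.abs(odd - even).mean())   # ~0.0

# Unequal weights: the fields stay distinct, preserving more of the original
# vertical detail (at the cost of less noise averaging between rows).
odd, even = sensor_to_fields(pattern, 0.8, 0.2)
print(np.abs(odd - even).mean())   # ~0.6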