Do you really like the way HDTV looks?
"yea right" wrote in message
... On Tue, 12 Sep 2006 11:37:19 +0000, Bob Miller wrote:

> Jim Mack wrote: [quoted text muted] The combination of MPEG2, 1080i and 8-VSB
> is going to kill free OTA TV for channels 2-51, IMO. Bob Miller

Dude! Where do you buy your dope? I've been asking the COFDM Clown about this for years, but he never answers.
Do you really like the way HDTV looks?
"Bob Miller" wrote
Discovery HD is mostly composed of slow pans and virtually still shots. They also have the luxury of encoding in non-real time. What I remember of the Olympics was divers coming off the high board and turning into so many pixels. Wherever there was intense action there was macroblocking. When they hit the water it looked like a pixelated blob. Postcard shots are great in 1080i with MPEG2 at 18 Mbps stuffed in a 6 MHz channel, but it can't handle action.

Countries like China, the UK and France will have a better codec, MPEG4, better modulation such as DVB-T/H or DMB-T/H, and will also, hopefully, make their OTA receivers upgradeable so that they are not obsolete in a few years like ours are now. The Olympics in 2008 will look great in China but probably not while being watched from China in the US. http://www.hometheaterfocus.com/blog...05/29/207.aspx

Bob Miller

Perhaps you need to learn the basics regarding composite connections, etc. You know, just for starters. :-) Then you can learn all about the more complicated stuff, like s-video.
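Bob's complaint about 1080i MPEG-2 falling apart on action can be put in rough numbers. A back-of-envelope sketch (the 18 Mbps figure is the rate mentioned above; the actual ATSC payload is about 19.39 Mbps including audio and overhead):

```python
# Back-of-envelope: average coded bits available per pixel for 1080i
# MPEG-2 video at the ~18 Mbps rate mentioned above. Figures are
# illustrative, not a statement about any particular broadcast.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average coded bits available per pixel per frame."""
    pixels_per_second = width * height * fps
    return bitrate_bps / pixels_per_second

# 1080i delivers 30 full frames (60 fields) per second:
bpp = bits_per_pixel(18e6, 1920, 1080, 30.0)
print(f"{bpp:.3f} bits/pixel")  # roughly 0.29 bits per pixel
```

With well under a third of a bit per pixel, the encoder has to lean heavily on temporal prediction from neighboring frames. Slow pans and still shots ("postcard shots") predict beautifully; a diver hitting the water defeats prediction, the quantizer coarsens, and the 16x16 macroblock grid becomes visible.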
Do you really like the way HDTV looks?
Your comments are really interesting, and I now wish I had the opportunity to do a direct comparison as you have. It may be my nostalgic and inaccurate memory of the Japanese MUSE system which now leads me to believe it is inherently superior, but I can accept and believe that ATSC, if done correctly, can appear equal or possibly superior. The bit rate is an unfortunate wild card.

"Done right", as you say, is the "gotcha". The wisdom of allowing the commercial marketplace to trade quality for programming capacity gives the broadcaster a new and extremely self-serving option, since they can (and do) trade high bit rate HD for multi-channel advertising profits. It's American politics and legislation at its worst, IMHO.

I do believe that flat panel displays exacerbate the problem, since LCDs in particular have poor gamma and dynamic range, and are sluggish compared to phosphors. On the bright side, there is vast opportunity for improvement if and when the next generation of "super high def" technology emerges. Not unlike the audio industry, which has attempted to obsolete CDs with DVD-Audio and SACD, the video and TV industry will eventually make another leap forward in quality at some future time, keenly observing that the present high def systems are, after all, full of problems.

Smarty

"R Sweeney" wrote in message ...

"Smarty" wrote in message ...

My first exposure to high definition TV was in the Shinjuku subway station in Tokyo, Japan in 1991, when Sony had deployed an analog HD system long before the advent of MPEG, digital broadcasting, or flat panel TVs. It was called the MUSE system, and was installed in train/subway stations to attract consumer attention and build market demand. It blows away anything subsequently introduced, based on my recollection. The analog modulation scheme did not rely on macroblocks, compression, or other digital conveniences. The CRTs, extremely fine pitch Trinitrons, were wide aspect ratio, gorgeous displays, which make current LCDs look like the non-linear, smeared displays they truly are.
ATSC and the engineering efforts associated with bringing digital broadcasting to the U.S. have made a lot of great achievements, but unfortunately, delivering a truly superb quality end product is not among them, IMHO. There are enough improvements over standard definition TV that most people, myself included, still buy into the upgrades for lack of better options.

Smarty

In the late 1990s I saw reference ATSC vs. MUSE comparisons at Sarnoff, MIT, CableLabs and other development centers (including TV manufacturers' corporate labs in Japan) using the same fine pitch, professional quality Trinitrons, showing the same kinds of full resolution, highest quality NHK programming you refer to. Done right, ATSC was (and is) as beautiful as the analog MUSE Hi-Vision system it replaced. Done right --- the gotcha. When you subtract overcompression, low dynamic range flat panel displays, and poor media quality in the first place, the comparison suffers. Poor image quality is not the fault of ATSC - nor is it always the case. If people would just crank up the bits, quality would improve.
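The "trade quality for programming capacity" point above is simple arithmetic: the payload of a 6 MHz ATSC channel is fixed at roughly 19.39 Mbps, so every multicast subchannel comes straight out of the main HD feed's bit budget. A sketch (the subchannel rates below are hypothetical examples, not any broadcaster's actual allocation):

```python
# Illustrative arithmetic for the bit-budget trade-off described above.
# The ATSC payload in a 6 MHz channel is fixed (~19.39 Mbps); adding
# multicast subchannels reduces what is left for the primary HD service.
# Subchannel rates are hypothetical.

ATSC_PAYLOAD_MBPS = 19.39

def hd_budget(subchannel_rates_mbps: list[float]) -> float:
    """Mbps left for the primary HD service after multicast carve-outs."""
    return ATSC_PAYLOAD_MBPS - sum(subchannel_rates_mbps)

print(f"{hd_budget([]):.2f} Mbps")          # full payload for HD: 19.39 Mbps
print(f"{hd_budget([3.0, 2.0]):.2f} Mbps")  # two SD subchannels: 14.39 Mbps
```

Carving out even two modest SD subchannels cuts the HD feed's budget by a quarter, which is exactly the "crank up the bits" lever being argued over here.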
Do you really like the way HDTV looks?
Bob Miller wrote:
> What I remember of the Olympics was divers coming off the high board and
> turning into so many pixels. Wherever there was intense action there was
> macroblocking. When they hit the water it looked like a pixelated blob.

Funny you should mention the divers (2004 Athens)... I remember it exactly as you describe, only I wasn't watching in anything near HDTV. Just regular old analog broadcast. I would conclude that the digital artefacts were introduced somewhere upstream in the signal chain, before the standard broadcast and HD signals diverge... Maybe an overloaded satellite link between Europe & North America?
Do you really like the way HDTV looks?
On Tue, 12 Sep 2006 03:42:53 GMT, "HiC" wrote:
> Went into a local Circuit City and took a good long look at their HDTV
> selections. They had several, including 2 1080p sets that I was told were
> set up correctly, and what I was seeing was as good as it gets. Everything
> HD from the cams to the screen. Both the 1080p's were running some sort of
> hard drive unit, not off a broadcast. I've been hearing how amazing HDTV
> is. Well....while there's a certain "pow" when you first see them, I get
> the sense it's due to some artificially induced phenomena. The colors seem
> vivid, but it seems to me in an enhanced - i.e. forced - way. There seems
> to be an excessive "whiteness" to

That depends on how the set is "set up". Color saturation, brightness, and contrast all need to be set.

> the image that adds a certain kind of sparkle/sharpness, but again it seems
> artificial. The real world as viewed by eyeballs doesn't seem that "sharp"

I have a 40" LCD that looks normal to me.

> or vivid. The demos that were showing were clearly intended to take
> advantage of this: all these closeups of brightly colored flowers,
> snowboarders on glaring snow, etc. I don't believe a sky exists anywhere
> the shade of blue they were depicting in that demo.

Mountain, winter, and tropical sky can be a very deep blue. I have photos from Key West and from the top of my tower here in frozen Michigan where the sky is a very deep blue.

> I see all kinds of artifacts in the images. Yeah, okay, they're not meant
> to be viewed from 6 inches away. But when I back off to 8 - 10 feet, I
> still see this odd graininess, especially when the image pans. Plus all
> these

This could have to do with the way whoever configured/set up the sets did the work.

> other odd things that happen to the image. Overall I find it harder on my
> eyes than a sharp picture on a good analog tv.

I just don't see these things. Sure, I see some *stuff* if I get within inches, but not at normal viewing distances.

> As I understand it, in a few years we're getting all digital whether we
> like it or not. Is the whole HDTV thing just a bill of goods we got
> sold/crammed down our throats?

I think it's great. It's head and shoulders above the best SD analog signals I've seen. I have a very nice 27" CRT in the basement that has an excellent picture. The 40" HDTV is far better, whether off the air or satellite. Cable? That's for the Internet connection.

Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
Do you really like the way HDTV looks?
Bob Miller wrote in
ink.net: Dave Oldridge wrote: Bob Miller wrote in link.net: Dave Oldridge wrote: "HiC" wrote in ink.net: [quoted text muted]

When I bought an HDTV-ready TV, I bought a CRT model. CRT and rear projection CRT are proven technologies that can reproduce signals at these resolutions. They've been in use for some time in the computer industry, doing just that.
The difference is not HUGE, but my SD signals are actually received, often, at EDTV resolution from a satellite, so what I'm actually comparing is the line-doubled 480p signal from the satellite to the 1080i signal from the same source. My estimate is that the picture clarity is 3 dB better on the HDTV signals, especially the good ones. That's about twice as good as the SDTV signals.

Might that suggest that if the EDTV signal was actually true 480P, and had been captured with a good 720P camera, it might be as good as the 1080i signal?

Actually, you might suggest it, but it runs counter to my actual experience. I see materials that are converted from HD cameras all the time and, while they are 1000% better than regular SDTV signals, they are still about 3 dB short of a 1080i or 720p production over the 1080i path from my satellite. Even the best DVD films are about 3 dB worse. For example, I have the entire LotR trilogy in anamorphic widescreen. It is good, but it still has that 3 dB clarity loss from the 1080i version broadcast by my movie supplier.

That was a question. I was following your math and maybe misunderstood it. You were saying "line-doubled 480P", which I interpreted as 480i information. And I was then suggesting that if it were true 480P from a very good source, since it has twice the information of the 480i line-doubled version, might it not be as good as the 1080i you were comparing it to, since you said the 1080i was only twice as good as what I took to be 480i? Wouldn't 480P then equal your 1080i?

I don't have any actual 480i sources....well, actually I do--I have an S-Video line running from my computer, but it can be replaced with a component line if I need that. The 480i is inferior to MOST of the SD stuff off my satellite (which is sending 480p to the TV). A lot depends on the original production, but I'm talking about a well-transmitted channel without too much compression on it, from a source using a digital uplink. For example, today's Jays-Mariners game.

--
Dave Oldridge+
ICQ 1800667
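Dave's "3 dB better ... about twice as good" tracks the standard decibel convention for power ratios. "Picture clarity in dB" is an informal metric here; the sketch below only shows why 3 dB is shorthand for a factor of two:

```python
import math

def db_to_ratio(db: float) -> float:
    """Convert a decibel figure to a power ratio: ratio = 10**(dB/10)."""
    return 10 ** (db / 10)

def ratio_to_db(ratio: float) -> float:
    """Inverse conversion: dB = 10*log10(ratio)."""
    return 10 * math.log10(ratio)

print(db_to_ratio(3))   # ~1.995, i.e. "about twice as good"
print(ratio_to_db(2))   # ~3.01 dB
```

So a signal "3 dB better" carries almost exactly twice the power, which is why the two phrasings are used interchangeably in the exchange above.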
Do you really like the way HDTV looks?
Dave Oldridge wrote:
[quoted text muted] For example, today's Jays-Mariners game.

So what is the 480P being captured with? 720P, 1080i, or a native 480P camera, if there is such a thing? I thought you were saying that the 480P was line-doubled 480i, which would mean it is really just 480i info.

Bob Miller
Do you really like the way HDTV looks?
In alt.tv.tech.hdtv Mark Crispin wrote:
| Anyone with any sense goes through a proper set adjustment once they get
| the set at home. There are various DVDs available to help you do it (at
| least do the basic stuff with color bars!). It's much more important with
| a large screen HDTV than with a small standard definition TV. In modern
| TVs, the factory settings in "standard" (NOT "vivid"!) mode are usually
| pretty good, but are rarely exactly right.
|
| If you have an installer do it for you, watch what they do. If they don't
| put up some test patterns and hold up a blue filter to their eye, they're
| not doing a proper set adjustment.

The problem is, you can't judge which set will produce the better or best picture at this kind of adjustment point. Like the OP, I want realistic, not garish. I see many sets where the point of realism isn't even within their range of settings.

Back in the 1970s and 1980s, I set up many an analog TV to get as close to realism as I could. Many could get there and many could not. One of the best sets I ever worked with required a lot of work, including replacement of a few tubes. It was an old Zenith brand tube-type set from the late 1960s that my grandmother originally bought, and we got when she passed away. I set this up in 1975. I was working for a TV station at the time, so I had some references to compare with (e.g. a Tektronix monitor). I got it to the point that it appeared to look as good as the Tektronix, at least psychologically, because I didn't have the opportunity to run them side by side. But when I looked at it, it had that same feel of "this is exactly real".

I asked my father to come in and try it out. It was set up for a dark room, so I turned out all the lights and tuned in to the station I worked at (which did have the best signal there). He did say the contrast seemed a bit low. But he was used to high contrast, so that was expected. Then the program faded to black. The room turned absolutely dark. My father thought the power went out. Then some promos started to play and the picture was back.

That's one thing a lot of sets these days just can't do at all ... be dead black when the signal is at black level. Any lighter and you are just a tad off.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|
Do you really like the way HDTV looks?
In alt.tv.tech.hdtv Paul Keinanen wrote:
| On Tue, 12 Sep 2006 03:42:53 GMT, "HiC" wrote:
|
| I see all kinds of artifacts in the images. Yeah, okay, they're not meant
| to be viewed from 6 inches away. But when I back off to 8 - 10 feet, I
| still see this odd graininess, especially when the image pans.
|
| Most likely a compression artifact in the signal source, i.e. too low a
| bit rate was used.
|
| If the original source was from a (feature) film, which may have been
| upconverted from 24p to 1080i x 60 with 3:2 pulldown and then
| converted to 1080p x 60 in the display, you may observe some jerkiness
| when panned.

What I see when they pan, and in some other cases, is the compression artifact. It's the fact that a given 16x16 macroblock has too much change taking place and the block can't be encoded in full. I don't know if this is due to the MPEG interleaving and this isn't an I-frame, or if there is just not enough bit capacity. A lot of frames can compress well. If the set were to delay the picture long enough, it would be safe to have _some_ frames take _more_ bits than their frame time slice would account for. You just can't go over that for longer than a standard delay line, or else the picture really will get jerky. But I wouldn't trust the engineers to get the audio delay correctly matched.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|
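Phil's intuition about delaying the picture so some frames can take more than their share of bits is, in spirit, the decoder-buffer model MPEG defines (the VBV, Video Buffering Verifier). A toy sketch with hypothetical numbers, not a real rate-control implementation:

```python
# Toy sketch of the buffering idea described above: a constant-bit-rate
# channel fills a decoder buffer, and individual frames may spend more
# bits than one frame-time of channel capacity, as long as the buffer
# never runs dry. Numbers are hypothetical.

CHANNEL_RATE = 18_000_000                    # bits/second into the decoder
FPS = 30
BITS_PER_FRAME_TIME = CHANNEL_RATE // FPS    # 600,000 bits per frame slot

def buffer_ok(frame_sizes: list[int], startup_frames: int) -> bool:
    """True if the decoder buffer never underflows, given the decoder
    waits `startup_frames` frame-times before showing the first picture."""
    buffered = startup_frames * BITS_PER_FRAME_TIME
    for size in frame_sizes:
        if size > buffered:
            return False                     # underflow: picture would stall
        buffered -= size                     # decoder consumes the frame
        buffered += BITS_PER_FRAME_TIME      # channel keeps delivering
    return True

# A big I-frame (900k bits) followed by smaller predicted frames: this
# fails with no extra startup delay, but succeeds once the decoder
# buffers one additional frame-time before starting playback.
frames = [900_000, 300_000, 300_000, 900_000, 300_000, 300_000]
print(buffer_ok(frames, startup_frames=1))   # False
print(buffer_ok(frames, startup_frames=2))   # True
```

This is why real receivers impose a fixed startup delay: it buys headroom for oversized I-frames without the jerkiness Phil describes, at the cost of the audio/video delay matching he worries about.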
HomeCinemaBanter.com