#81
Arny Krueger wrote:
> "Scott Dorsey" wrote in message
>> William Sommerwerck wrote:
>>> Since the s-video output and the composite output are both NTSC,
>
> Only if the source is NTSC. Today we have many common video sources
> that exceed NTSC limits in many ways.
>
>>> it is impossible for either the s-video output or the composite
>>> output to have *more* output than the NTSC output -- they *ARE* NTSC
>>> outputs. This might be true in practice, but "it ain't necessarily
>>> so".
>>
>> How would they not be NTSC?
>
> Only broadcast video *must* be NTSC, right?

Well, in terms of the fact that the FCC will only come after you if your
broadcast waveform doesn't match the NTSC specs, yes. But in fact, just
about everything in use today meets the NTSC specs, other than VHS
machines, which need a time base corrector to meet timing specifications
and which are going away very fast.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#82
"Arny Krueger" wrote in message
... In the NTSC system, this difference shows up in the bandwidth of the color signals. The researchers determined that (for a 480-line, 30-frame system, on a 21" screen, presumably) you could see full red/green/blue-primaries color only up to 0.5MHz, Unfortunately, by the end of the NTSC era, 32 and 36 inch sets were mainstream, even average. 42 inch sets were common. NTSC looked like $#@!! on large screens -- barely tolerable on 32" sets. I remember the early 25" Sony consoles. They had really weak color, though I don't know why. However, I own a 32" Toshiba IDTV and and Sony 36" IDTV. They display spectacularly good NTSC images. Both digitally goose the luminance, and (as far as I know) both have full-bandwidth chroma demodulation. By the way, the original Advent projector had full-bandwidth color. But this has _nothing whatever_ to do with what I'm talking about. Right, you're talking about perception. No, I'm talking objective fact. Color-difference signals require less bandwidth than color-primary signals. Many objects, both artificial and natural, don't follow the "same paint" rule. But it's true for most objects, natural or artificial. If you don't believe this, try to find any colored object -- natural or artificial -- that is _not_ "constant saturation". By this you mean constant saturation of a given color hue, no? Yes. It would be meaningless to talk about different hues. And certain trees and rocks. Water with certain lighting and/or degrees of activity. Artificial objects with exposed frameworks. Artificial objects designed to be highly visible. Much text. Text? Are you referring to illuminated manuscripts? grin One point is that the DVD was one of the larger beginnings of the end for NTSC TV. I don't want to be too quick to defend NTSC, but it can be exceptionally good. It's not that NTSC is of lower quality than DVD, but rather that DVD is better. |
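The bandwidth claim above is easier to see in numbers. Here is a minimal
Python sketch of the standard NTSC RGB-to-YIQ transform; the matrix
coefficients are the published NTSC values, while the per-channel
bandwidth figures in the comments are the commonly cited NTSC
allocations, quoted only for context.

```python
# NTSC encodes color as one luminance signal and two color-difference
# signals, then gives each a different share of the channel bandwidth.
import numpy as np

def rgb_to_yiq(rgb):
    """Convert gamma-corrected R'G'B' (each 0..1) to Y', I, Q."""
    m = np.array([[0.299,  0.587,  0.114],   # Y': luminance, ~4.2 MHz
                  [0.596, -0.274, -0.322],   # I: orange-cyan axis, ~1.3 MHz
                  [0.211, -0.523,  0.312]])  # Q: green-magenta axis, ~0.5 MHz
    return m @ np.asarray(rgb, dtype=float)

# A saturated red patch: the signal lands mostly in Y' and I, which is
# why discarding Q bandwidth costs so little visible color detail.
print(rgb_to_yiq([1.0, 0.0, 0.0]))  # -> [0.299  0.596  0.211]
```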
#83
Arny Krueger wrote:
> "William Sommerwerck" wrote in message
>> "Arny Krueger" wrote in message ...
>>> "William Sommerwerck" wrote in message
>>>> Yes, it's brilliant. (It's one of the great 20th century
>>>> inventions.) But -- and I will keep repeating this ad nauseam --
>>>> the reason color TV systems (of all sorts) can "get away" with
>>>> reduced chroma bandwidth
>>>
>>> If we extrapolate this discussion to audio, then we have William
>>> Sommerwerck, MP3 advocate! ;-)
>>
>> God, no. I hate compressed audio. (Dolby Digital, at least.)
>
> (1) Dolby Digital is really old-old tech, predating MP3 by lots.

What difference does it make when it was created?
#84
Scott Dorsey wrote:
> William Sommerwerck wrote:
>> Since the s-video output and the composite output are both NTSC, it
>> is impossible for either the s-video output or the composite output
>> to have *more* output than the NTSC output -- they *ARE* NTSC
>> outputs. This might be true in practice, but "it ain't necessarily
>> so".
>
> How would they not be NTSC?
> --scott

A case for S-Video in preference to composite:

Let's consider the case of cable delivery of a 480i broadcast. The
signal originates from the station as a digital feed from the station to
the cable company. (In San Diego, Cox maintains fiber feeds from each
"must carry" station.) At this point, the signal is not subject to the
limitations of the NTSC spec, and Cox is receiving something better than
the OTA signal.

The cable company produces two distinct products:

1) A conventional NTSC analog signal that it delivers to the customer
(via format conversions as it travels through the cable infrastructure).
This RF signal is delivered directly to the customer's TV receiver, or
it is demodulated in the STB and presented to the customer as an RF
signal in NTSC format on channel 3/4; as a composite video signal,
essentially the baseband version of the NTSC signal; or as an S-Video
output of separate luminance and color. All three of these outputs are
limited in quality by the limitations inherent in NTSC.

2) A digital signal applied, along with one or more other signals, to an
RF channel compatible with the STB. This signal will have been
sufficiently compressed to fit in the allocated bandwidth. It is decoded
and made available to the customer on perhaps four outputs: RF (NTSC),
composite (NTSC baseband), component, and S-Video. The RF and composite
outputs are subject to the limitations inherent in the NTSC spec. The
S-Video and component outputs may be slightly superior, since NTSC
wasn't imposed between the originating station and the customer's STB.

I've also seen this work in reverse, where the cable company heavily
compressed the digital feeds for less popular media (to fit three or
four signals into a single RF slot). The analog signals (raw NTSC RF)
were superior to the output from the STB.
--
pj
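Both of pj's delivery paths end in NTSC-encoded color; the practical
difference between the composite and S-Video jacks is how luma and
chroma travel to the display. The Python sketch below is illustrative
only (the sample rate, test frequencies, and notch width are assumptions
chosen for the demo, not taken from any spec): composite sums luma and
the 3.58 MHz modulated chroma onto one wire, so the receiver must filter
them apart and loses luma detail near the subcarrier, while S-Video
keeps them on separate pins.

```python
import numpy as np

fs = 14.318e6                  # sample rate: 4x the NTSC color subcarrier
fsc = 3.579545e6               # NTSC color subcarrier frequency, Hz
t = np.arange(2048) / fs

luma = np.sin(2 * np.pi * 3.4e6 * t)        # fine luma detail near fsc
chroma = 0.3 * np.sin(2 * np.pi * fsc * t)  # modulated color signal

composite = luma + chroma      # one wire: the receiver must untangle these
svideo = (luma, chroma)        # two wires: nothing to untangle

# A crude notch filter at fsc -- roughly what an inexpensive composite
# input does -- removes the chroma but also the nearby luma detail,
# which shows up on screen as softened edges and lost texture.
spectrum = np.fft.rfft(composite)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)
spectrum[np.abs(freqs - fsc) < 0.3e6] = 0   # notch out fsc +/- 300 kHz
recovered = np.fft.irfft(spectrum, n=len(t))
print("peak luma error after notch:", np.max(np.abs(recovered - luma)))
```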
#85
"trotsky" wrote in message
news:[email protected]_s22 Arny Krueger wrote: "William Sommerwerck" wrote in message "Arny Krueger" wrote in message ... "William Sommerwerck" wrote in message Yes, it's brilliant. (It's one of the great 20th century inventions.) But -- and I will keep repeating this ad nauseum -- the reason color TV systems (of all sorts) can "get away" with reduced chroma bandwidth If we extrapolate this discussion to audio, then we have Willaim Sommerwerck, MP3 advocate! ;-) God, no. I hate compressed audio. (Dolby Digital, at least.) (1) Dolby Digital is really old-old tech, predating MP3 by lots. What difference does it make when it was created? Perceptual coding was and is a work in progress. Progress was pretty rapid at the time that DD was introduced and the decade following it. Dolby AC-3 AKA Dolby Digital was introduced in 1991. It is a proprietary standard, and has not changed a lot over the years. MP3 has remained a work in progress since 1989. The rate at which MP3 coders were improved slowed down quite a bit after ca. 1998, but improvement may still be possible. |
#86
Arny Krueger wrote:
> "trotsky" wrote in message news:[email protected]_s22
> [...]
>
> Perceptual coding was and is a work in progress. Progress was pretty
> rapid at the time that DD was introduced and in the decade following
> it. Dolby AC-3, AKA Dolby Digital, was introduced in 1991. It is a
> proprietary standard, and has not changed a lot over the years. MP3
> has remained a work in progress since 1989. The rate at which MP3
> coders were improved slowed down quite a bit after ca. 1998, but
> improvement may still be possible.

You're not making sense. Did Dolby do their homework and do sufficient
blind tests to "prove" that their codec was transparent to people? Maybe
you're a different Arny Krueger and have come to realize that these
blind tests are ineffective.
#87
In article , pj wrote:
> Scott Dorsey wrote:
>> William Sommerwerck wrote:
>>> Since the s-video output and the composite output are both NTSC, it
>>> is impossible for either the s-video output or the composite output
>>> to have *more* output than the NTSC output -- they *ARE* NTSC
>>> outputs. This might be true in practice, but "it ain't necessarily
>>> so".
>>
>> How would they not be NTSC?
>
> A case for S-Video in preference to composite:

Oh, there are many strong cases for S-Video over composite. But both are
NTSC. The S-Video output is also NTSC; it's just not RS-170.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#88
"trotsky" wrote in message
news:[email protected]_s21... You're not making sense. Did Dolby do their homework and do sufficient blind tests to "prove" that their codec was transparent to people? Maybe you're a different Arny Krueger and have come to realize that these blind tests are ineffective. It doesn't matter. Dolby Digital is so bad that you can hear its problems without comparing it with anything else. Before Arny objects... I was accustomed to listening to CD-format stereo from my LaserDisks. I was continually surprised and pleased with the great transparency, cleanliness, and "ease" of the sound. The first time I decoded a Dolby Digital signal ("The Incredibles") I could hear the difference -- flat, grainy, dry, blah sound. The audibility of lossy codecs varies with the quality of the playback system. Over my computer speakers (Monsoon planar magnetics), KUOW sounds fine. Not only is it clean and transparent, but I've never heard anything that I interpreted as an artifact. (This is the Microsoft codec.) |
#89
"trotsky" wrote in message
news:[email protected]_s21 Arny Krueger wrote: "trotsky" wrote in message news:[email protected]_s22 Arny Krueger wrote: "William Sommerwerck" wrote in message "Arny Krueger" wrote in message ... "William Sommerwerck" wrote in message Yes, it's brilliant. (It's one of the great 20th century inventions.) But -- and I will keep repeating this ad nauseum -- the reason color TV systems (of all sorts) can "get away" with reduced chroma bandwidth If we extrapolate this discussion to audio, then we have Willaim Sommerwerck, MP3 advocate! ;-) God, no. I hate compressed audio. (Dolby Digital, at least.) (1) Dolby Digital is really old-old tech, predating MP3 by lots. What difference does it make when it was created? Perceptual coding was and is a work in progress. Progress was pretty rapid at the time that DD was introduced and the decade following it. Dolby AC-3 AKA Dolby Digital was introduced in 1991. It is a proprietary standard, and has not changed a lot over the years. MP3 has remained a work in progress since 1989. The rate at which MP3 coders were improved slowed down quite a bit after ca. 1998, but improvement may still be possible. You're not making sense. Please clarify, because the questions that follow are not requests for clarification. Did Dolby do their homework and do sufficient blind tests to "prove" that their codec was transparent to people? AFAIK, Dolby never claimed that DD was perfectly transparent. The MPEG group coder tests in the late 1990s showed that Dolby Digital was not sonically transparent and generally inferior to other, more modern codecs. Maybe you're a different Arny Krueger Nope. Just older and wiser. ;-) and have come to realize that these blind tests are ineffective. How so? The fact that AC-3 was a substandard codec based on the MPEG Group's blind tests was pretty well publicized by the MPEG and the AES. This was no doubt a bit of an embarrassment to Dolby. Dolby has been doing their own blind tests for decades. Dolby subsequently came out with a new multimodal system for coding and decoding audio known as Dolby TrueHD. In some modes, TrueHD is definitely sonically transparent. |
#90
William Sommerwerck wrote:
> "trotsky" wrote in message news:[email protected]_s21...
>> You're not making sense. Did Dolby do their homework and do
>> sufficient blind tests to "prove" that their codec was transparent
>> to people? Maybe you're a different Arny Krueger and have come to
>> realize that these blind tests are ineffective.
>
> It doesn't matter. Dolby Digital is so bad that you can hear its
> problems without comparing it with anything else.
>
> Before Arny objects... I was accustomed to listening to CD-format
> stereo from my LaserDiscs. I was continually surprised and pleased
> with the great transparency, cleanliness, and "ease" of the sound.
> The first time I decoded a Dolby Digital signal ("The Incredibles"),
> I could hear the difference -- flat, grainy, dry, blah sound.

Agreed on all counts.

> The audibility of lossy codecs varies with the quality of the
> playback system. Over my computer speakers (Monsoon planar
> magnetics), KUOW sounds fine. Not only is it clean and transparent,
> but I've never heard anything that I interpreted as an artifact.
> (This is the Microsoft codec.)

What are you trying to say? Are you saying the lossiness of DD would be
audible over your computer speakers or not?