HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   Origins of PAL: 1956 radio engineering article from UK mag -- phase alternations (and effects) considered... (http://www.homecinemabanter.com/showthread.php?t=7913)

John S. Dyson October 16th 04 11:03 PM

In article ,
Roderick Stewart writes:
In article , John S. Dyson wrote:
Actually, in my recent visit to the UK, I saw more inaccurate color
in faces than I have recently seen in the US.

This has probably more to do with the way the pictures were lit and
shot, and the way the camera was adjusted, than anything to do with the
encoding system.

Bingo -- and I saw MORE difficulties with color in the UK (not less),
while the apparent NTSC color phase accuracy issues are now (and have been)
relatively insignificant in the US for quite a while. The color temperature
setup (and other aspects of camera setup) tends to be more troublesome than
"drifty tubes" and extreme multipath color problems (where the
luma is usually just as screwed up as any chroma problem.) Accurate color
hue while the picture is quite distorted in other aspects just isn't a
necessary design goal. A bad looking picture with accurate hue (but
necessarily distorted chroma level) is certainly a false advantage.

The real world usage issues WRT color accuracy are the same in PAL and NTSC,
where studio PAL shouldn't be distributed with built-in error either!!!
Improperly set-up cameras shouldn't be professionally used on PAL systems
or NTSC systems...

The "drifty tube" electronics that PAL's phase alternation solves are just
not a problem anymore here in the US :-). If you see it as an advantage, then
it might be true that you have more "drifty tube" TV sets than we have, even
though our 1955-1956 "drifty tube" TV sets will still render a full color
picture, but sometimes with "green face" problems. Not very many of those
TV sets are in use anymore :-).

Bottom line: NOWADAYS, the biggest differences between OTA composite PAL
and NTSC are the much more extravagant video bandwidth in PAL countries,
the less flicker on native NTSC, and the better consumer decodability of
NTSC. In both cases, the systems are becoming legacy, where 720p60 and
1080i30(frames/sec) (often shown as 1080i60 also) are replacing NTSC for
distributing new productions.
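
As a back-of-envelope check, the raster numbers behind that comparison can
be worked out directly (a minimal sketch in Python; the figures are the
standard System M and System I parameters, not anything measured in this
thread):

# Standard analog raster parameters: System M (NTSC) vs. System I (PAL).
NTSC  = {"total_lines": 525, "active_lines": 480, "fields_per_sec": 59.94,
         "luma_bw_mhz": 4.2}
PAL_I = {"total_lines": 625, "active_lines": 576, "fields_per_sec": 50.0,
         "luma_bw_mhz": 5.5}

for name, params in (("NTSC", NTSC), ("PAL-I", PAL_I)):
    frame_rate = params["fields_per_sec"] / 2       # two interlaced fields/frame
    line_rate = params["total_lines"] * frame_rate  # horizontal scan frequency
    print(f"{name}: {frame_rate:.2f} frames/s, {line_rate / 1e3:.3f} kHz "
          f"line rate, {params['luma_bw_mhz']} MHz luma bandwidth")

# NTSC:  29.97 frames/s, 15.734 kHz line rate, 4.2 MHz luma bandwidth
# PAL-I: 25.00 frames/s, 15.625 kHz line rate, 5.5 MHz luma bandwidth

The extra luma bandwidth and the 96 extra active lines are the PAL
advantages; the ten extra fields per second (hence less flicker) are the
NTSC one.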

Given the CULTURAL expectations in the US (believe it or not), 1080p24 US
productions will provide better images for both the US market and the
Euro-market (than the old NTSC production standards.)
1080i30 (frames per second) would actually NOT be desirable
for US production in most cases (except for sports, where 720p60 might
often be better.)

John


Roderick Stewart October 16th 04 11:30 PM

In article , John S. Dyson wrote:
PAL TV sets just cannot give an accurate green (period.)

Can you explain why not?

Look up the color gamut, and then recognize that 'grass green' is
much better reproduced with the original NTSC gamut.

Color bars are NOT an indicator of gamut, but are used to "tune-up"
a system.


The colour gamut is determined by the characteristics of the dichroic
filter system in the camera and the primary colours emitted by the CRT.
(The encode/decode system is completely transparent as far as colour
analysis is concerned and simply results in RGB to RGB with no signal
level changes). When the same cameras with the same filters are used to
produce both NTSC and PAL, and the results are viewed on the same
monitors, how can they produce different results?

Rod.


ivan October 16th 04 11:30 PM


"manitou910" wrote in message
...
John S. Dyson wrote:

Like we had quadraphonic radio broadcasts in the seventies. I would
guess that more than 99% of broadcast TV in the US is 525 line.

Nowhere near 99%. Prime time is about half true HDTV. Prime time
is 1/8 of the day, so 1/2 of 1/8 is 1/16, which is about 6%. There
are also HDTV broadcasts outside prime time, including much sports
(at some times on Sunday, as many as 10 different
HDTV broadcasts), one afternoon soap opera (yes, I know),
and Jay Leno, soon Letterman too.


During my recent visit to the UK (just returned today), I realized
that even our HDTV (16:9) shows are NOT necessarily shown in 16:9 and/or
PALplus in the UK.

Come to your senses, come to America, and look at our HDTV.

Your jaw will fall off.


PALplus looks okay (on a 100Hz TV), but it is far, far, far from HD. I had
also noticed that the 100Hz TV showed obtrusive scanlines (even on a
small-screen Loewe TV), yet I don't see visible scanlines on my own TV,
even displaying NTSC on the HDTV. Any kind of artifacting, including
visible scanlines, helps to confuse or distract the human vision system.


This would be because 576i (even at 100Hz) is considerably fewer scan
lines than 1080i or 720p at 60Hz.

OTOH, with Faroudja DCDi deinterlacing even 480p from NTSC sources
completely blows away anything I've seen from a PAL source.
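
Faroudja's DCDi algorithm itself is proprietary, but the motion-adaptive
deinterlacing idea it builds on can be sketched (a hypothetical minimal
version, assuming grayscale fields as float numpy arrays; DCDi adds
directional interpolation along edges on top of this):

import numpy as np

def deinterlace(cur_field, opp_field, thresh=12.0):
    """Motion-adaptive deinterlace (toy version).
    cur_field: newest field (its lines land on even output rows).
    opp_field: the other field (odd output rows, 1/50 or 1/60 s apart).
    Static pixels: 'weave' (keep opp_field's real lines -> full detail).
    Moving pixels: 'bob' (interpolate cur_field's lines -> no combing)."""
    h, w = cur_field.shape
    # Spatial estimate of the missing lines from cur_field alone.
    below = np.vstack([cur_field[1:], cur_field[-1:]])   # clamp bottom edge
    bob = (cur_field + below) / 2.0
    # Where the temporally woven line disagrees with the spatial estimate,
    # assume motion there and fall back to the bob estimate.
    moving = np.abs(opp_field - bob) > thresh
    frame = np.empty((2 * h, w), dtype=np.float64)
    frame[0::2] = cur_field
    frame[1::2] = np.where(moving, bob, opp_field)
    return frame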

The flicker still persists (on 50Hz TV sets), but the 100Hz display
(perhaps imperfect, however) makes the PAL video look generally better.
My travelling companion (a co-worker) initially thought I was very
wrong about the 'flickerfest' problem until I informed him that our
hotel (Aztec in Bristol) seemed to provide each of us with 100Hz TV
sets. Initially, I was somewhat worried about my reputation when I
looked at my first true UK TV set in 21 years, and it took a few
minutes (1-2 minutes) for me to realize that it was a 100Hz TV.
Star Trek: The Next Generation (even with NTSC composite 60i post
production) didn't really look bad. Apparently, they took advantage
of NTSC 3D combing to remove the NTSC artifacting, and did a good
quality conversion. To me, the TNG broadcasts looked little different
from pristine NTSC reception of the same thing using my HDTV.

Perhaps the worst case that I saw was a transcoded King of Queens. It
looked horrendous, perhaps even worse than the early Dr Who looked
here. (Given that the show is done in 24p, and that looked like a
moderately current episode, there is NO excuse for artifacting other
than doing something silly like

24p (original) -> 60i (US submaster) -> 50i/p (broadcast) -> 100i (TV)

There is almost NO excuse for using a 60i scan in the process.)
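
That chain goes wrong on paper before any hardware is involved. A sketch of
the field cadences (hypothetical frame numbering; 3:2 pulldown is the
standard way 24p becomes 60i in the US):

def pulldown_3_2(n_frames):
    """24p -> 60i: film frames alternately contribute 2 and 3 fields."""
    fields = []
    for i in range(n_frames):
        fields += [i] * (2 if i % 2 == 0 else 3)
    return fields

def pulldown_2_2(n_frames):
    """24p -> 50i the clean way: run the film 4% fast at 25fps, 2 fields each."""
    return [i for i in range(n_frames) for _ in range(2)]

print(pulldown_3_2(4))   # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]  uneven cadence
print(pulldown_2_2(4))   # [0, 0, 1, 1, 2, 2, 3, 3]        even cadence

A 60i submaster already carries the uneven 3:2 cadence, so a 60i-to-50i
standards converter must blend fields from different film frames;
converting the 24p original directly at 25fps sidesteps the whole problem.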

Also, some morning news shows are done in PAL, and the conversion
to digital is done without a 3D comb (there aren't any commodity PAL
3D combs, though there are a few SPECIALTY devices.) The color flashing,
even on non-moving subjects, still persisted, even on the digital version.
In the US, the concept of providing a moderately high end, composite
analog video capable TV set without a 3D comb would result in almost
a totally failed product.
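
For background (not from the posts themselves): an NTSC temporal comb is
cheap because the colour subcarrier completes an odd number of half-cycles
per frame, so on a static scene the chroma inverts between successive
frames while the luma repeats. A one-frame sum/difference then separates
them exactly (a minimal sketch, assuming two successive composite frames of
a motionless scene as numpy arrays):

import numpy as np

def frame_comb(frame0, frame1):
    """NTSC temporal (3D) comb, valid only where nothing moved:
    frame0 = Y + C and frame1 = Y - C, since chroma phase inverts
    from one frame to the next on a static NTSC picture."""
    luma = (frame0 + frame1) / 2.0    # chroma cancels
    chroma = (frame0 - frame1) / 2.0  # luma cancels
    return luma, chroma

Real 3D combs add motion detection and fall back to 2D line combing where
the static assumption fails. PAL's subcarrier relationship only repeats
over eight fields, which demands far more delay memory -- one reason
commodity PAL 3D combs never appeared.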


Were you able to compare Sony and Loewe 100Hz sets?




For what it's worth, I live about 10 minutes' drive away from where John was
staying (a pity he didn't let us know where he was going in the UK, as
perhaps I could have taken him on a tour of some of our local Somerset
hostelries!)

Five minutes' drive away from where he was staying is a shopping complex
called Cribbs Causeway. If he had paid a visit to John Lewis in the Shopping
Mall, or the Comet store, he would have seen literally dozens of large W/S
plasmas, LCDs and CRTs (hardly any 4:3 receivers on display). Even so,
there is such a very large disparity between the picture quality of
different manufacturers that I find it difficult to see how he can assess
the overall quality from a few different makes of receivers, and I can
assure him that IMHO Loewe are certainly not the very best receivers I've
ever seen.







C.




John S. Dyson October 16th 04 11:54 PM

In article ,
manitou910 writes:
John S. Dyson wrote:

Also, all too often, I had seen (this last week) that even shows
that are 16:9 in the US aren't provided as 16:9 in the UK.

Frankly, with a new 100Hz TV set, the perceivable detail and image
quality does seem better than on a 50Hz TV set...


It's not even as good, because interlace artifacts are exacerbated (I
had the opportunity to make side-by-side comparisons at The Sony Store
in Milan a few months ago). The only benefit is reduction of the flicker
associated with 50Hz.

I might (at least, partially) agree with you -- I did notice a kind
of 'twitter' (though less than normal interlace twitter) and also did
notice that the scanlines still persisted. The scanlines were something
that I hadn't ever seen on my own TV in my living room... (Of course,
I do have a small Sony TV, and a pro monitor, that do show the scanlines
also, because they don't do a progressive transformation.)


However, even the 50Hz flicker problem is minimal if a set is properly
calibrated and the white level kept low.

Also, I noticed that it was better in a very dark room. However, that
problem is already much much attenuated on 60Hz, given a normally lit
room.


It would be much better in PAL countries to sell high-end TVs with
Faroudja circuits to convert the signal to 576p and then show the image
at 75Hz -- which would eliminate both flicker and interlace artifacts.

Yes -- I can see that as an advantage. Also, with properly sized
scanlines on the CRT (assuming CRT), the scanline visibility in
progressive mode would diminish. Perhaps that would be a better
solution than the 100Hz Loewe in my hotel room. One other oddity was
the spurious mode changes, where the material (even American originated)
would elicit resizing. In many cases, the mode changes were appropriate,
but in other cases, it was ludicrous. I wonder if the TV sets were some
kind of 'reject', and perhaps had ROM errors. (Stuff like that is
often field-updatable, however.) When it changed modes to PALplus (one
show was some kind of Doris Day movie), the perceived picture quality
was like a slightly soft DVD on a progressive NTSC-style set, but with still
visible scanlines.

However, I would still claim that the 100Hz Loewe made the TV viewing
much more enjoyable than if it was still flickering at me. I generally
don't view TV in a totally darkened room. We had
to change hotels to be closer to Gatwick (because of a foolish itinerary
and flight schedule) on the last night, and the 50Hz TV was everything
that I remembered in the past. (Even after adjusting the TV, the flicker
was still too distracting unless the room was totally darkened.)

Actually, the room darkening was almost a fully successful experiment,
in the sense that my own flicker sensitivity was VERY dependent upon
the room lighting. If the room was too dark for me to safely walk (given
the TV is off), then when the TV was on, the amount of flicker perception
diminished to the point where it was NO significant distraction.

John

Doug McDonald October 17th 04 12:27 AM

J.Michael Davison wrote:
"John S. Dyson" wrote in message
...
snip

PAL TV sets just cannot give an accurate green (period.)


That must be bunk.
The mathematics for deriving the luminance and colour difference signals for
both the NTSC and PAL systems is the same; only the onward encoding for
broadcast is different.
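
That shared mathematics is just the luma weighting plus two scaled
colour-difference signals; NTSC's I/Q axes are merely a 33-degree rotation
of PAL's U/V (a sketch using the standard SD coefficients):

import math

def encode(r, g, b, system="PAL"):
    """Gamma-corrected R'G'B' (0..1) -> luma + colour difference pair."""
    y = 0.299 * r + 0.587 * g + 0.114 * b      # identical in both systems
    u = 0.492 * (b - y)                        # scaled B' - Y'
    v = 0.877 * (r - y)                        # scaled R' - Y'
    if system == "PAL":
        return y, u, v
    # NTSC I/Q: rotate the U/V plane by 33 degrees.
    a = math.radians(33)
    i = -u * math.sin(a) + v * math.cos(a)
    q = u * math.cos(a) + v * math.sin(a)
    return y, i, q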



It's not the math. It is that the OFFICIAL NTSC green
primary is GREEN, and the official PAL one is yellow-green.

True, it is not clear how well this applies in the real world
of non-official NTSC primaries. Nevertheless, the greens
produced on RP-LCD or RP-DLP sets are far superior to
CRT ones, because both the green and red hues are purer.
Their greens, being monochromatic 546nm, are not really where they
should be in the 530-535nm region, but are vastly superior to the rare earth
phosphor yellow-green usually seen.

Doug McDonald

Doug McDonald October 17th 04 12:49 AM

Roderick Stewart wrote:


The colour gamut is determined by the characteristics of the dichroic
filter system in the camera and the primary colours emitted by the CRT.
(The encode/decode system is completely transparent as far as colour
analysis is concerned and simply results in RGB to RGB with no signal
level changes). When the same cameras with the same filters are used to
produce both NTSC and PAL, and the results are viewed on the same
monitors, how can they produce different results?



They don't. What you don't understand is that the 1953 NTSC spec
for the color of the phosphors is DIFFERENT from PAL.

see http://www.aim-dtp.net/aim/technolog...yz/cie_xyz.htm

The green is MUCH greener ... look at the places on the CIE
chart. Note that the 1953 NTSC spec is very similar to the
1998 Adobe spec.
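
For reference, the published chromaticity coordinates (CIE 1931 x, y):

                  Red           Green         Blue          White
NTSC 1953:     (0.67, 0.33)  (0.21, 0.71)  (0.14, 0.08)  Illuminant C
PAL/EBU:       (0.64, 0.33)  (0.29, 0.60)  (0.15, 0.06)  D65
Adobe RGB 98:  (0.64, 0.33)  (0.21, 0.71)  (0.15, 0.06)  D65

The PAL/EBU green sits well inside the NTSC green, toward yellow, while
Adobe RGB (1998) reuses the NTSC 1953 green primary exactly -- presumably
the similarity meant above.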

Thus, Dyson's (and my) statements apply only to TV sets
that actually obey the 1953 specs. As I have said, modern
RP sets with mercury lamps are much nearer the NTSC 1953 specs
than most CRTs, and result in very gorgeous greens.

Doug McDonald

John S. Dyson October 17th 04 12:51 AM

In article ,
manitou910 writes:

Also, some morning news shows are done in PAL, and the conversion
to digital is done without a 3D comb (there aren't any commodity PAL
3D combs, though there are a few SPECIALTY devices.) The color flashing,
even on non-moving subjects, still persisted, even on the digital version.
In the US, the concept of providing a moderately high end, composite
analog video capable TV set without a 3D comb would result in almost
a totally failed product.


Were you able to compare Sony and Loewe 100Hz sets?

No, but if I had time during the trip, I certainly would have
done more 'research.' Nowadays, I work at a TV company, and I have
more than hobby interest in the subject. My 100Hz vs. NTSC comparisons
are from memory, but the comparison points are simple enough to clearly
remember. I am somewhat practiced at such comparisons.

Frankly, I am more critical than the average consumer,
but not for 'wine tasting and tube audio HIFI reasons.' I am more
of a nitty gritty perfectionist rather than an artful type.

Sometimes I am wrong (because of misinterpretation) -- for example,
there is the possibility of the 100Hz video actually being worse -- but it
made the video MUCH MUCH more comfortable to watch, with easier
perception of detail and less distraction. To me, watching the
100Hz Loewe felt a LOT like 'normal' Am. TV watching, where the issue
of 'scanrate' was totally attenuated... *IF* I were the 'joe average'
American consumer with a moderate to high end TV set (normally watching
OTA NTSC on a 3D comb TV at home) visiting the UK and watching
the Loewe, I probably wouldn't have noticed the PAL
color flashing due to the studio 2D comb conversion from analog
before digitization. I definitely would not have noticed more detail on
either the American or the UK video (given the Loewe TV.) If I were used
to 480p DTV (or upconverted component NTSC to 720p/1080i), I'd probably
notice that the picture at home was slightly sharper... However, that
could be due to the display device (Loewe SDTV vs. RCA 38" HDTV, in my
case), or due to differing studio configuration, or some other reason that
isn't controlled in the experiment.

Given the digital transmission, the detail wasn't hugely different
from what I would expect in the US, even with OTA NTSC -- which should be
NOTICEABLY (perhaps 10%-20% on an arbitrary, intuitive scale) less
detailed. NTSC is probably saved in this comparison by the 3D combing and
the maintaining of diagonal resolution, among other things. However,
given my very picky perception, component NTSC originated material
(normal US news gathering practice, except for remote locations) that
is immediately converted to 720p or 1080i before US broadcast (depending
upon the TV station) seemed to have noticeably (but not
lightyears) more detail and perceived sharpness than the Loewe with
the reception of the digital multiplex video. When component NTSC
(kind of an oxymoron) is upconverted to 720p (for example), and 12-15Mbps
is allocated for the signal, then the likelihood of any MPEG2 artifacting
is nil, and the MPEG encoding is likely totally transparent relative
to the original studio 480i or 480p signal.
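
The back-of-envelope arithmetic behind that (a sketch; the 12-15Mbps figure
is from the post, the raster math is standard):

# MPEG2 bits per pixel for upconverted-SD material carried in a 720p60 stream.
w, h, fps = 1280, 720, 60
for mbps in (12, 15):
    bpp = mbps * 1e6 / (w * h * fps)
    print(f"{mbps} Mbps -> {bpp:.2f} bits/pixel")

# 12 Mbps -> 0.22 bits/pixel; 15 Mbps -> 0.27 bits/pixel. Low for true 720p
# detail, but the source is upconverted 480i/480p, carrying far less real
# information than the 720p raster implies -- so the encoder has headroom.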

All in all, the qualitative differences in enjoyability (ignoring
flicker) are probably nil (assuming either SDTV or composite analog
transmission.) SDTV seems to flatten the differences (of course), and
the 100Hz TVs totally mitigate my own flicker aversion (but maybe with
other negative side effects, which didn't really bother me.) The 'equivalent'
to 100Hz in the Am. market might be judged to be the 'deinterlaced' TV sets,
where I haven't generally seen objectionable artifacting by those TVs, and
that truly does appear to be an improvement (e.g. disappearance of
scanlines.)

My visit was fun and educational. My comments are NOT meant to be divisive.

John

Doug McDonald October 17th 04 12:52 AM

J.Michael Davison wrote:
Wasn't the red phosphor particularly poor in the very early days?
Much was made about 'europium red phosphors' in the adverts in 'National
Geographic' in the '60s when that came along.



The original red and green phosphors were MUCH better than
modern CRT phosphors in color rendition. They were in fact
very close to the actual NTSC 1953 specs. The rare earth
phosphors are simply brighter.

Doug McDonald

John S. Dyson October 17th 04 01:39 AM

In article ,
"ivan" writes:

For what it's worth, I live about 10 minutes' drive away from where John was
staying (a pity he didn't let us know where he was going in the UK, as
perhaps I could have taken him on a tour of some of our local Somerset
hostelries!)

It was a business trip, and I tried to avoid thinking about it!!! (I totally
love the UK, and had a wonderful time meeting people, but I am also
someone who had previously promised himself never to travel again...)
When reading an opinion that doesn't specify measured or
carefully estimated quantities, one should assume that the claim is
made 'in my opinion.' Maybe I should have used that phrase more
often.


there is such a very large disparity between the picture quality of
different manufacturers that I find it difficult to see how he can assess
the overall quality from a few different makes of receivers, and I can
assure him that IMHO Loewe are certainly not the very best receivers I've
ever seen.

There are indeed NUMEROUS dimensions of quality, but there are aspects
of quality that are separable from each other. For example, even if
a display is imperfect (e.g. has color/tint problems, burn-in or even
bandwidth/peaking problems), I can still see many effects and capabilities
of the chroma/luma separation (for example.)

Even if the contrast/black level is set all wrong, you can still estimate
bandwidth, or even some aspects of chroma decoding. It can even be
interesting to vary the contrast while measuring/estimating the bandwidth.
So, even an improperly set-up TV set can still be used to evaluate
limited aspects of a TV system (assuming the attributes of the TV set
were well understood.) I kind-of know what to expect given numerous
apparent attributes and behaviors of a given TV set. I agree that the Loewe
isn't perfect, but no TV set is. I happen to be someone who
blasphemes and proclaims defects in the best consumer Sonys :-).

Too often, I see TV sets with too much horizontal peaking, too much
coring, too much vertical enhancement, ANY SVM (all SVM is evil :-)),
and I am seldom satisfied when my 'perfectionist mode' is enabled.

When I gave my own comparisons in other postings, they were based upon
experience and reasonableness -- NOT an agenda to prove that NTSC-land
does better TVs than PAL-land (or vice versa.) I wasn't even intending
to whine about 50Hz, but happy to see that the 100Hz made the viewing
much nicer :-).

Frankly, much as speakers sound different in a showroom environment than in
a home (and listening in the wrong environment can bias a choice towards
the suboptimal), the same thing is true for TV sets in a normally lighted
showroom. Unless one is experienced, an overly sharp picture (even slightly
overenhanced) can INITIALLY seem better than a TV with less pronounced
'enhancement.' In reality, too much 'sharpening' without careful crafting
of the frequency/transfer function, even if it isn't extreme, can obscure
true detail.

So, even though my judgement isn't perfect, it is DEFINITELY based upon
experience and understanding about the actual design of the video
circuitry and processing. I can/do visualize the design of video
processing when I see the artifacting/defects.

I am NOT perfect, so some of my observations might be incorrect,
but my overall viewing comparisons indicated that current SDTV
regimes in 50Hz(100Hz) land vs 60Hz land seem to produce roughly
equivalent results (ignoring the screen update time issue itself.)

WRT the issue of divisiveness -- I wasn't really talking about
true HDTV -- the only comment that I have about that is that
certain TV shows are significantly 'nicer' in HDTV (e.g.
CSI), while another cop show, Law and Order, is just imaged
normally (they don't really take much advantage of HDTV,
and the widescreen is really the major upgrade in presentation.)

For example, PALplus is probably asymptotically adequate for Law and
Order -- HDTV doesn't improve much. On the other hand, CSI almost
seems designed to take advantage of HDTV. In general, the 'CBS look'
seems to do a little better with HDTV, but I don't really know
what it is... It isn't just the 1080i system choice...

John


John S. Dyson October 17th 04 02:23 AM

In article ,
Roderick Stewart writes:
In article , John S. Dyson wrote:
PAL TV sets just cannot give an accurate green (period.)

Can you explain why not?

Look up the color gamut, and then recognize that 'grass green' is
much better reproduced with the original NTSC gamut.

Color bars are NOT an indicator of gamut, but are used to "tune-up"
a system.


The colour gamut is determined by the characteristics of the dichroic
filter system in the camera and the primary colours emitted by the CRT.
(The encode/decode system is completely transparent as far as colour
analysis is concerned and simply results in RGB to RGB with no signal
level changes). When the same cameras with the same filters are used to
produce both NTSC and PAL, and the results are viewed on the same
monitors, how can they produce different results?

Read below -- first, the limitations of the 'green' filter don't even
always apply for image capture. When they do, if you meet the NTSC
gamut, you can matrix it to be close to PAL. If you have a PAL gamut
to begin with, then there is little ability to reach into the NTSC
green.

The matrix is different between PAL and NTSC, and you are allowed to
produce a deeper, more realistic "green" on NTSC based upon the coordinates.
The gamut reaches deeper into green, and especially color realistic
phosphors or filter display devices have the possibility of providing
a more realistic rendition. Electronically, you can only create the
greenest green in the standard gamut, no matter the format. If that gamut
doesn't reach into a certain color region, then you cannot represent that
color.
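
That claim can be checked numerically: build each system's RGB-to-XYZ
matrix from its primaries, chain them into an NTSC-to-PAL matrix, and feed
it fully saturated NTSC green (a sketch, assuming linear light and ignoring
chromatic adaptation between the two white points):

import numpy as np

def rgb_to_xyz(primaries_xy, white_xy):
    """RGB->XYZ matrix from primary and white chromaticities (CIE x, y)."""
    def col(x, y):                       # XYZ column for chromaticity (x, y)
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    P = np.column_stack([col(*p) for p in primaries_xy])
    S = np.linalg.solve(P, col(*white_xy))   # scale so R=G=B=1 hits white
    return P * S

ntsc = rgb_to_xyz([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)], (0.310, 0.316))
pal  = rgb_to_xyz([(0.64, 0.33), (0.29, 0.60), (0.15, 0.06)], (0.3127, 0.3290))

ntsc_to_pal = np.linalg.inv(pal) @ ntsc
print(ntsc_to_pal @ np.array([0.0, 1.0, 0.0]))
# The red and blue components come out negative: saturated NTSC green lies
# outside the PAL/EBU triangle, so a PAL display can only clip toward it.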

Of course, if your phosphors do not match the specified gamut (which most
do not exactly match), then the colors will not match. A matrix can help
to approximate a conversion between a phosphor gamut and the specified
gamut, but the limit will hopefully :-) be approximately at the minimums
of the combined capability. (Electronically, you'll be unable
to force a visible green that is deeper than the phosphor :-).) For
example, standard P22 doesn't reach deep into NTSC green. That doesn't
mean that P22 cannot be used to represent 'green' on NTSC; it
just means that it isn't very accurate in some cases. However, I do
have a pro monitor that reaches deeper into green than the standard TVs in
my repertoire.

The same kind of thing happens in the color splitting in a TV camera. The
filters don't have to perfectly match the gamut of the transmission
scheme, and a matrix can/does make the match come much closer. An NTSC
green filter should be able to produce close to a valid PAL green,
but it requires some matrixing, perhaps nonlinear processing (I haven't
actually done a design of a matrix, so I am looking at the superficial
math and referring to a detailed/fat reference manual.) Even color
negative prints (the masked kind) can be directly scanned and
produce a standard NTSC signal. This means that the limitations of
the color of the green 'filter' don't even apply for negative film
scanning in the same way!!!

When comparing my consumer monitor with my pro monitor, the
'green difference' when sourcing with my pro camera (KY-D29) is quite
noticeable. AFAIR, I have seen differences on externally sourced material
also. My consumer monitor is similar to most of them, in that
green accuracy is attenuated, but that isn't really the same limitation
on non-phosphor display devices. The pro monitor can also distinguish
more subtle differences in green.

John

