HDTV reading between the lines



 
 
#11 - January 24th 04, 11:48 PM
Jeff Rife

Ernie Wright wrote in alt.tv.tech.hdtv:
> There aren't any exact figures, since they're not directly comparable
> and, more importantly, film is *not* better.

Uh, huh...right.

The internegative (which is what is usually used to make a transfer to HD)
can easily resolve the equivalent of 4000x3000 pixels. Even if a method
like Super35 is used, the actual 1.78:1 area can easily handle 4000x2000
pixels.
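
For scale, here is a quick comparison of those figures against the
1920x1080 HD raster (illustrative arithmetic only, taking the numbers
above at face value):

    # Pixel counts implied by the figures above vs. the HD raster.
    film_w, film_h = 4000, 2000    # claimed usable 1.78:1 area of Super35
    hd_w, hd_h = 1920, 1080
    print(film_w * film_h / (hd_w * hd_h))   # ~3.86x the pixel count
    print(film_w / hd_w, film_h / hd_h)      # ~2.08x / ~1.85x per axis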

People are often confused because things shot with HD cameras direct to
digital usually look "sharper" because of the way that the focus is held
throughout the entire frame.

--
Jeff Rife
For address harvesters: http://www.nabs.net/Cartoons/OverTheHedge/HDTV.gif
#12 - January 24th 04, 11:53 PM
Jeff Rife

Norris Watkins wrote in alt.tv.tech.hdtv:
> What I meant here was the grain etc. of the optical film. For instance
> I read somewhere that a 35 mm still photo is equivalent to a 6
> megapixel digital photograph.

A typical 35mm film frame can resolve 4000x3000 pixels, which would be
12 megapixels. But many "6 megapixel" cameras actually have only about
1600x1200-pixel CCDs...they just have three of them (one each for red,
green, and blue) and count every photosite, even though it takes all
three to make up the whole color picture. Counting film that same
"advertising megapixel" way (as opposed to the real way), 35mm film
would actually be like a 36 megapixel camera.
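
A quick back-of-the-envelope check of that arithmetic (Python, purely
illustrative):

    # Pixel counts from the claims above.
    film_w, film_h = 4000, 3000           # claimed resolvable 35mm detail
    film_mp = film_w * film_h / 1e6       # -> 12.0 "true" megapixels

    ccd_w, ccd_h, n_ccds = 1600, 1200, 3  # 3-CCD camera counting each sensor
    marketed_mp = ccd_w * ccd_h * n_ccds / 1e6  # -> 5.76, sold as "6 megapixel"

    # Counting film the same generous way (three color layers):
    film_advertising_mp = film_mp * 3     # -> 36.0 "advertising megapixels"
    print(film_mp, marketed_mp, film_advertising_mp)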

--
Jeff Rife | "Eternity with nerds. It's the Pasadena Star
For address harvesters: | Trek convention all over again."
| -- Nichelle Nichols, "Futurama"
|
|
#13 - January 25th 04, 02:32 AM
John S. Dyson

Jeff Rife writes:
> Ernie Wright wrote in alt.tv.tech.hdtv:
>> There aren't any exact figures, since they're not directly comparable
>> and, more importantly, film is *not* better.
>
> Uh, huh...right.
>
> The internegative (which is what is usually used to make a transfer to HD)
> can easily resolve the equivalent of 4000x3000 pixels. Even if a method
> like Super35 is used, the actual 1.78:1 area can easily handle 4000x2000
> pixels.
>
> People are often confused because things shot with HD cameras direct to
> digital usually look "sharper" because of the way that the focus is held
> throughout the entire frame.

The actual quality difference between HDTV and film needs to be
carefully qualified. An often-run print at a multiplex, for example,
will frequently look very bad, while an HDTV display will often look
better.

Few of us watch the internegative (or even first-generation prints in
pristine condition).

It is probably true that perfect HDTV vs. typically displayed
film will provide a favorable impression of HDTV. Film is still
better quality for 'mastering', when carefully handled.

All too often, techie types (and film types overzealous in justifying
film) will cite the limiting resolution of a color negative as the
final proof of film's superiority. But when film is passed through
several generations of processing, and given that optical compensation
for MTF rolloff isn't simple, it is plausible that the print delivered
to a movie theatre has MUCH, MUCH less resolution and much more grain
than the pristine first-generation negative.
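
Here is a toy model of that generation loss, assuming (purely for
illustration) that each optical generation behaves like a Gaussian MTF
and that cascaded MTFs multiply; the per-stage numbers are made up:

    def gaussian_mtf(f, f50):
        """Toy MTF: response falls to 0.5 at spatial frequency f50."""
        return 0.5 ** ((f / f50) ** 2)

    # Hypothetical 50%-response frequencies per stage, cycles/mm.
    stages = {"camera negative": 60, "interpositive": 50,
              "internegative": 50, "release print": 45}

    f = 30.0  # test frequency in cycles/mm (fine detail on 35mm)
    mtf = 1.0
    for name, f50 in stages.items():
        mtf *= gaussian_mtf(f, f50)       # cascaded MTFs multiply
        print(f"after {name:15s}: {mtf:.2f}")
    # Detail that starts at ~84% contrast on the camera negative ends
    # up under 40% on the print; grain, meanwhile, compounds each pass.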

On the other hand, it is much easier to provide a much flatter
resolution curve for video up to its limit. However, a hard rolloff at
that limit will tend to cause the hard-edged video look.

Perhaps the best approach for quality will be digital processing of
digitally captured film. Then the final prints can be properly
corrected for detail (not limited to optical processing), while grain
build-up is minimized.

The ability to properly compensate for lens losses (diffraction) and
other kinds of rolloff is natural with HDTV.

John

#14 - January 25th 04, 04:09 AM
Jeff Rife

John S. Dyson wrote in alt.tv.tech.hdtv:
> When film is passed through several generations of processing, and
> given that optical compensation for MTF rolloff isn't simple, it is
> plausible that the print delivered to a movie theatre has MUCH, MUCH
> less resolution and much more grain than the pristine first-generation
> negative.

Sure, but an HD telecine of a 35mm film IP/IN (which is what most HD
movies we see are made from) starts out far better than a final print.
The only reason uneducated people say that film has less resolution
than HD cameras is that most filmed movies end up "softer" than the
stark look an HD camera gives you. Softer != less resolution, though.

--
Jeff Rife
For address harvesters: http://www.nabs.net/Cartoons/OverThe...ricaOnline.gif
#15 - January 25th 04, 05:44 AM
John S. Dyson

Jeff Rife writes:
> John S. Dyson wrote in alt.tv.tech.hdtv:
>> When film is passed through several generations of processing, and
>> given that optical compensation for MTF rolloff isn't simple, it is
>> plausible that the print delivered to a movie theatre has MUCH, MUCH
>> less resolution and much more grain than the pristine first-generation
>> negative.
>
> Sure, but an HD telecine of a 35mm film IP/IN (which is what most HD
> movies we see are made from) starts out far better than a final print.
> The only reason uneducated people say that film has less resolution
> than HD cameras is that most filmed movies end up "softer" than the
> stark look an HD camera gives you. Softer != less resolution, though.


I wouldn't use the term 'stark' so much as say that film has a rolloff
that tends to start at about 300-500 lines equivalent. It is almost
impossible to optically correct that natural film rolloff, which is
why a hybrid approach would probably be best. HDTV can maintain its
MTF (up to its limit) much more easily. At higher amounts of detail
(higher frequencies), the SNR of film becomes pretty bad. Pro-level
video gear is amazingly noise-free. (It isn't like SVHS or Betamax,
or even as bad as DV25 WRT the contouring and edge noise.)

The perception of 'detail' is a combination of resolution and how the
frequency response rolls off. In a pure film system, that rolloff can
cause an OVERLY SOFT look. You can tolerate the extreme noise and loss
of resolution in normal-sized film prints because your eyes tend to
average the noise away. This is also the ONLY reason VHS was a
tolerable video format (still-framed VHS can be quite ugly, and a
still film frame is noisier than the noise-averaged moving image).
(35mm still photography can also provide more detail than the smaller
image area on 35mm movie stock; comparing the negative from a Nikon
with a movie negative might cause some disappointment in the quality
of the individual movie frames.)

Anything near the alleged 3000-4000 lines of film resolution will be
TOTALLY BURIED by the time the film sits in the projector, ready to be
displayed on the screen.

So, rather than seeing video as always looking 'stark', I tend to
perceive film as overly soft (due to the near impossibility of
correcting the rolloff of detail in the film domain). (Actually, it
CAN be corrected, but people are used to the overly soft imaging.)
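
To see the soft-vs-stark distinction concretely, here is a toy
simulation (all parameters made up) of the same step edge through a
gentle film-like rolloff and a flat-then-hard-cutoff video response:

    import numpy as np

    edge = np.zeros(64); edge[32:] = 1.0     # an ideal step edge

    # Film-like: gentle Gaussian blur (slow MTF rolloff -> soft ramp).
    x = np.arange(-8, 9)
    film_kern = np.exp(-(x / 3.0) ** 2); film_kern /= film_kern.sum()
    film_edge = np.convolve(edge, film_kern, mode="same")

    # Video-like: flat response, then a hard band limit (brick wall).
    spec = np.fft.rfft(edge)
    spec[12:] = 0                            # hypothetical hard cutoff
    video_edge = np.fft.irfft(spec, n=64)

    # film_edge: smooth ramp, no overshoot -> reads as "soft".
    # video_edge: abrupt transition with ringing on both sides ->
    # reads as "hard-edged".
    print(film_edge[28:37].round(2))
    print(video_edge[28:37].round(2))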

Overly hyped video is ANOTHER problem, but not all video need be hyped
(good equipment can be relatively flat across the entire bandwidth).
And by the time video is prepared for distribution, additional
processing is often added that can actually, purposefully ROLL OFF the
frequency response!

I don't judge film or video to be superior, but silly claims of 3000
lines of resolution for film need to be CAREFULLY QUALIFIED: those are
laboratory figures, accurate only for the first generation. Detail at
3000 lines will be EXTREMELY NOISY by the time a print of that
negative is made. By the time the projection print is made, the actual
detail of film might be slightly better than studio HDTV (or theatre
distribution), but it will be much, much noisier. The print will get
severely mangled, while the HDTV signal will likely be fully corrected
for errors for the entire life of the movie.

(When making prints, the lens systems also add to the rolloff.) Again,
this is a reason why a HYBRID digital/film process is advantageous.

John
#16 - January 25th 04, 08:48 AM
Ernie Wright

Jeff Rife wrote:
> The only reason uneducated people say that film has less resolution
> than HD cameras is

You're a smart guy, Jeff, but this innuendo about my lack of education
isn't your finest moment.

I've helped write software used by virtually every major Hollywood
animation and effects studio. When I say that DPs often go to 70mm to
be sure the digital artists can pull clean plates, I'm not basing that
on something I read on a website.

If the effective resolution of 35mm is twice as high as the resolution
used by digital effects and 3D animation artists, why would they need to
use 70mm? How do Pixar and PDI get away with doing entire movies at
1.5K? Why did no one seem to notice the supposedly inferior resolution
of Star Wars Ep 2, or O Brother, Where Art Thou?, which after being shot on film was
digitized in its entirety for color shifting and other effects?

Nearly every major release in the past few years has passed through a
2K digital pipeline. If, as you claim, 35mm has twice that resolution,
the difference should be as obvious as HD and SD. Why isn't it?
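
For scale, the linear (per-axis) resolution ratios at stake, as a quick
illustrative computation (the 2K and NTSC frame sizes are assumed):

    ratios = {
        "claimed 35mm vs 2K pipeline": 4000 / 2048,  # ~1.95x
        "HD vs SD, width":             1920 / 720,   # ~2.67x
        "HD vs SD, height":            1080 / 480,   # 2.25x
    }
    for label, r in ratios.items():
        print(f"{label}: {r:.2f}x")
    # If the first ratio survived to the screen, it should be nearly
    # as visible as the HD-vs-SD difference.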

The fact is, 95% of digital work is invisible to the audience: set
extension, background replacement, color correction, wire removal,
reflection and shadow removal, digital extras, even subtle changes to
the facial expressions of actors in close-ups. It's pervasive, it's
all 2K or less, and you can't tell that it wasn't done in-camera. The
only reason that's possible is that the real world resolution of 35mm
movie film is *not* a factor of two better than HD, or anywhere close to
that.

- Ernie http://mywebpages.comcast.net/erniew

#17 - January 25th 04, 06:21 PM
Jeff Rife

Ernie Wright wrote in alt.tv.tech.hdtv:
>> The only reason uneducated people say that film has less resolution
>> than HD cameras is
>
> You're a smart guy, Jeff, but this innuendo about my lack of education
> isn't your finest moment.

I wasn't aiming this at you...sorry. I was talking about all the people
who think "The Tonight Show" is higher resolution than "CSI" (and other
filmed shows) because the HD cameras are set for as much depth of field
as possible and the very bright lighting makes everything look "sharper".
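
The depth-of-field point can be made concrete with the standard
thin-lens DoF formulas; the sensor sizes, focal lengths, and circles of
confusion below are illustrative assumptions, not measured values:

    def depth_of_field(focal_mm, f_number, dist_mm, coc_mm):
        """Near/far limits of acceptable focus (standard DoF formulas)."""
        H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm  # hyperfocal
        near = dist_mm * (H - focal_mm) / (H + dist_mm - 2 * focal_mm)
        far = (dist_mm * (H - focal_mm) / (H - dist_mm)
               if dist_mm < H else float("inf"))
        return near, far

    subject = 3000.0  # subject 3 m away, both cameras at f/4
    # 2/3" HD video sensor: short lens, small circle of confusion.
    print(depth_of_field(10, 4, subject, 0.011))   # ~1.3 m to infinity
    # 35mm film: ~2.5x longer lens for the same field of view.
    print(depth_of_field(25, 4, subject, 0.025))   # ~2.0 m to ~5.7 m
    # The small-sensor video camera holds far more of the scene in
    # focus, which reads as "sharper".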

> If the effective resolution of 35mm is twice as high as the resolution
> used by digital effects and 3D animation artists, why would they need to
> use 70mm?

Because digital effects are "stark" (like HD cameras). They use sharp
lines between different colors or objects. Film (and reality) shows that
there aren't any sharp lines between such things. For example, a person
standing in front of an object doesn't really have a line that separates
them from that object. Film shows this.

A higher-resolution film allows more options about where you slice off
that "blending" between objects so that it can more closely match the
sharp, unnatural lines of CG effects.

> How do Pixar and PDI get away with doing entire movies at
> 1.5K?

Because most of their films don't concern "real" things. People are
just fine with pseudo-objects (humans, toys, fish) at those
resolutions. I can still get a lot more resolution from even an
average 35mm still camera than from a frozen frame of a Pixar movie.
That doesn't mean the Pixar movies aren't great...I love 'em all.

The second reason they "get away with it" is that the final print or
DVD *is* lower quality. People don't get to see the originals. Even an
HD showing might not matter, as last night's "A Bug's Life" shows...ABC
only does 1280x720p, which is around the resolution it was rendered at,
or at least not enough of a difference to matter.

> Why did no one seem to notice the supposedly inferior resolution
> of Star Wars Ep 2, or O Brother, Where Art Thou?, which after being
> shot on film was digitized in its entirety for color shifting and
> other effects?

Actually, there were quite a few comments about Ep1's live action being
obviously lower quality than other films, based on the HD showings. I'm
not sure about Ep2. Of course, Lucas *likes* that "everything in focus,
computer-like" look, and he's not a good enough director to use film to
its full potential.

But, still, the major point is that nobody can notice much because the
final delivery medium is never as high a quality as the source.

> Nearly every major release in the past few years has passed through a
> 2K digital pipeline. If, as you claim, 35mm has twice that resolution,
> the difference should be as obvious as HD and SD. Why isn't it?

Again, because there isn't usually a good enough presentation.

> The fact is, 95% of digital work is invisible to the audience: set
> extension, background replacement, color correction, wire removal,
> reflection and shadow removal, digital extras, even subtle changes to
> the facial expressions of actors in close-ups. It's pervasive, it's
> all 2K or less, and you can't tell that it wasn't done in-camera.

The average Joe can't tell it wasn't done in-camera. I see a lot more of
it than the average guy, but then I know what to look for. I've found
that I have to just "watch the movie" instead of noticing these things.
But if I sit down and watch just for the effects, I see most of them.

Having done slide to digital and back years ago, I can tell you that it
was a revelation when we got the 4K scanner. There was so much detail
that the 2K scanner missed, it was unbelievable.

--
Jeff Rife
For address harvesters: http://www.nabs.net/Cartoons/RhymesW...****erBowl.jpg
#18 - January 25th 04, 09:50 PM
Ernie Wright

Jeff Rife wrote:
> I wasn't aiming this at you...sorry.

OK, thanks.

> I was talking about all the people who think "The Tonight Show" is
> higher resolution than "CSI" (and other filmed shows)

Right, the difference isn't resolution.

> Because digital effects are "stark" (like HD cameras).

This is the same Leno/CSI mistake. It's a difference in signal-to-noise,
an artistic choice in the case of CG, not a limitation of the medium. CG
can be made to look exactly like film. It is, all the time. All it takes
is a good color match and the introduction of artifacts like motion blur,
grain, and flutter. It doesn't require higher resolution.
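
A minimal sketch of that kind of film-look pass (NumPy, with made-up
parameters), adding motion blur, grain, and a little flutter to a
synthetic CG frame:

    import numpy as np

    rng = np.random.default_rng(0)
    frame = np.zeros((240, 320))
    frame[80:160, 120:200] = 1.0    # stand-in for a crisp CG element

    # Motion blur: average each pixel with its horizontal neighbors.
    taps = 7
    blurred = sum(np.roll(frame, k, axis=1) for k in range(taps)) / taps

    # Film grain: signal-dependent Gaussian noise.
    grain = rng.normal(0.0, 1.0, frame.shape) * 0.03 * (0.25 + blurred)
    filmic = np.clip(blurred + grain, 0.0, 1.0)

    # Gate flutter: a tiny random vertical offset per frame.
    filmic = np.roll(filmic, rng.integers(-1, 2), axis=0)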

Most digital work isn't of the fantasy/cartoon type. I guarantee you're
not seeing it. I wrote some of the software, I know some of the guys who
use it, and I don't see it. They have to tell me what they did.

> People don't get to see the originals.

The people I know do. That's what I'm basing my assessment on.

> Having done slide to digital and back years ago, I can tell you that
> it was a revelation when we got the 4K scanner. There was so much
> detail that the 2K scanner missed, it was unbelievable.

This likely says more about the quality of the scanners than about the
inherent resolution of the film, particularly if this was years ago.
Often the cheapest way to compensate for poor fidelity is oversampling.

And slides != movie film. They don't have to travel through the camera
at 2 feet per second, or match across thousands of frames.

It isn't terribly meaningful to talk about the discrete resolution of
analog media, but this idea that movie film is an order of magnitude
better than video is a holdover from an earlier time. We've arrived at
a point where they're roughly comparable. We don't need 4K until we do
digital IMAX, which in analog is 70mm film going through the camera
horizontally.

Kodak has stopped making traditional film cameras for Europe and North
America. Astrophotography is now 99% digital. We're in about the same
place with video that we were in with audio in 1985. The retirement of
the older analog medium isn't in the distant future. It's already
happening.

- Ernie http://mywebpages.comcast.net/erniew

#19 - January 25th 04, 11:10 PM
Jeff Rife

Ernie Wright wrote in alt.tv.tech.hdtv:
> Most digital work isn't of the fantasy/cartoon type. I guarantee you're
> not seeing it.

After learning what to look for, it's not that hard to spot. Edge
enhancement is the number one giveaway. The digital artists don't seem
to be able to do *anything* without at least some EE, and no pure film
chain introduces EE.
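
The edge enhancement being described is essentially an unsharp mask; a
toy version (illustrative only) shows the overshoot halos that give it
away:

    import numpy as np

    signal = np.zeros(32); signal[16:] = 1.0       # a clean edge

    # Unsharp mask: add back (signal - blurred), scaled by `amount`.
    kernel = np.ones(5) / 5.0
    blurred = np.convolve(signal, kernel, mode="same")
    amount = 0.8
    enhanced = signal + amount * (signal - blurred)

    # Values now overshoot below 0.0 and above 1.0 next to the edge --
    # the bright/dark halos that betray digital processing.
    print(enhanced[12:21].round(2))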

Unreal movement is another thing to look for. Gravity and Newton's First
Law seem to be things they need refresher courses on.

It's also true that they are getting somewhat better, but only a bit.
The difference in "non-CGI look" between the cave troll in FotR and
Gollum in RotK was large, but still not enough to keep it from taking
me out of the movie *if* I paid attention to the video and not the
story.

Once the various clues hint that something was CG, it's easy to start
spotting all the other telltales.

> It isn't terribly meaningful to talk about the discrete resolution of
> analog media, but this idea that movie film is an order of magnitude
> better than video is a holdover from an earlier time.

Well, 35mm *is* nearly an order of magnitude better than most NTSC.

> We've arrived at a point where they're roughly comparable.

To the average end viewer, sure.

> We don't need 4K until we do digital IMAX, which in analog is 70mm
> film going through the camera horizontally.

Until there are some *much* better digital projectors, IMAX will have to
use film. And, I don't think you'd get away with 2K digital images for
IMAX.

> Kodak has stopped making traditional film cameras for Europe and North
> America.

That's not a big deal, as Kodak wasn't a big camera company. Let me know
when they stop making *film*.

--
Jeff Rife
  Al Gore: To my left, you'll recognize Gary Gygax, inventor of
           Dungeons & Dragons.
  Gary Gygax: Greetings, it's a... [rolls dice] ...pleasure to meet you.
    -- "Futurama"
#20 - January 26th 04, 01:36 AM
Ernie Wright

Jeff Rife wrote:
>> I guarantee you're not seeing it.
>
> After learning what to look for, it's not that hard to spot. Edge
> enhancement is the number one giveaway.

That's not edge enhancement, it's bad compositing: a failure to hide the
boundary between elements. That problem was a *lot* worse when it was
all done optically. It's not a digital problem. In fact, it's mostly a
film problem, which is why, as I said, they sometimes go to 70mm for
shots that need clean plates.

An example of the stuff you're not seeing: About half the TVs in the
movie Contact were burn-ins, meaning they were blank when the live action
was filmed, and the content, including reflections of the display in the
floor or whatever, was added later. I'd bet you a donut you can't tell
which TVs were live and which were fake without listening to Zemeckis's
commentary on the DVD.

Unless you know something about the layout of the VLA, you won't know
which of the telescopes were fake, or in which shots they were made to
move faster than they can in real life, while Jodie Foster was in the
foreground. Or in which shots she wasn't even in New Mexico.

Or in which shot Jodie's eyebrow was digitally "unraised," or the movie
camera was removed from reflections in her glasses. Or what color the
Arecibo dish really is--it's not the color it is in the movie. Or which
interiors were on a soundstage and mated digitally to moving exteriors.
Or which childhood home was real and which was 3D. Or at what point in
Foster's dash back to the VLA control room the switch took place between
location shooting at the VLA and a room hundreds of miles away on a
soundstage, without blurring or cutting away from her while she ran.

It doesn't take much discernment to recognize that Gollum's not real.
That's not the stuff I'm talking about.

> Well, 35mm *is* nearly an order of magnitude better than most NTSC.

I meant order of magnitude, base 2. In other words, "a lot" rather than
"slightly" or "not at all."

> That's not a big deal, as Kodak wasn't a big camera company. Let me
> know when they stop making *film*.

U.S. film sales are already falling.

Give me a call in about 15 years.

- Ernie http://mywebpages.comcast.net/erniew

 



