A Home cinema forum. HomeCinemaBanter


Do you really like the way HDTV looks?



 
 
  #81  
Old September 16th 06, 11:25 PM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Dave Oldridge
external usenet poster
 
Posts: 139
Default Do you really like the way HDTV looks?

Martin Heffels wrote:

On Fri, 15 Sep 2006 19:22:02 GMT, Dave Oldridge
wrote:

3 dB is not a really LARGE difference in clarity.


Decibels have nothing to do with the clarity of your picture. You
(usually) use them to tell a difference in power, voltage or
sound level.


Uh, they are still a ratio. And I do believe that clarity perception is of
a similar logarithmic nature to sound, so the analogy is apt.
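To put numbers on the dB talk being traded here, the conversion between decibels and a linear power ratio works out as follows (an illustrative Python sketch added for reference, not either poster's own calculation):

```python
import math

def db_to_power_ratio(db):
    """Convert a decibel figure to a linear power ratio."""
    return 10 ** (db / 10)

def power_ratio_to_db(ratio):
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(ratio)

# 3 dB is very nearly a factor of two, which is why "3 dB better"
# and "about twice as good" get used interchangeably in this thread.
print(round(db_to_power_ratio(3), 3))   # ~1.995
print(round(power_ratio_to_db(2), 3))   # ~3.01
```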


--
Dave Oldridge+
ICQ 1800667
  #82  
Old September 16th 06, 11:27 PM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Dave Oldridge
external usenet poster
 
Posts: 139
Default Do you really like the way HDTV looks?

" wrote in
ups.com:

big snip

But most TV series, even the 4:3 ones, are shot on 35mm film and then
transferred to video, often separately for the DVD releases.

--
Dave Oldridge+
ICQ 1800667


Back when Lucy and Desi were making series, that was true, and it is
still true of dramas, but for some reason sitcoms are often shot on
video, some even in Super 16. Given the choice of a less expensive
medium, the producer will usually take it.


Still, a lot is shot on 35mm. My Buffy DVDs have a commentary somewhere.
They started with 16mm and then, after one season, when they actually had
some money, they switched to 35mm. But even 16mm is WAY better than most
NTSC broadcast TV resolution.


--
Dave Oldridge+
ICQ 1800667
  #83  
Old September 16th 06, 11:30 PM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Dave Oldridge
external usenet poster
 
Posts: 139
Default Do you really like the way HDTV looks?

Bob Miller wrote:

Dave Oldridge wrote:
Bob Miller wrote:

Dave Oldridge wrote:
Bob Miller wrote:

Dave Oldridge wrote:
"HiC" wrote in
ink.net:

Went into a local Circuit City and took a good long look at
their HDTV selections. They had several including 2 1080p sets
that I was told were set up correctly and what I was seeing was
as good as it gets. Everything HD from the cams to the screen.
Both the 1080p's were running some sort of hard drive unit, not
off a broadcast.

I've been hearing how amazing HDTV is. Well....while there's a
certain "pow" when you first see them, I get the sense it's due
to some artifically induced phenomena. The colors seem vivid,
but it seems to me in an enhanced - i.e. forced way. There seems
to be an excessive "whiteness" to the image that adds a certain
kind of sparkle/sharpness, but again it seems artificial. The
real world as viewed by eyeballs doesn't seem that "sharp" or
vivid. The demos that were showing were clearly intended to take
advantage of this, all these closeups of brightly colored
flowers, snowboarders on glaring snow etc. I don't believe a sky
exists anywhere the shade of blue they were depicting in that
demo.

I see all kinds of artifacts in the images. Yeah, okay, they're
not meant to be viewed from 6 inches away. But when I back off
to 8 - 10 feet, I still see this odd graininess, especially when
the image pans. Plus all these other odd things that happen to
the image. Overall I find it harder on my eyes than a sharp
picture on a good analog tv.

As I understand it, in a few years we're getting all digital
whether we like it or not. Is the whole HDTV thing just a bill
of goods we got sold/crammed down our throats?
When I bought an HDTV-ready TV, I bought a CRT model. CRT and
rear projection CRT are proven technologies that can reproduce
signals at these resolutions. They've been in use for some time
in the computer industry, doing just that.

The difference is not HUGE, but my SD signals are actually
received, often, at EDTV resolution from a satellite, so what I'm
actually comparing is the line-doubled 480p signal from the
satellite to the 1080i signal from the same source. My estimate
is that the picture clarity is 3 dB better on the HDTV signals,
especially the good ones.

That's about twice as good as the SDTV signals.

Might that suggest that if the EDTV signal was actually true 480p
and had been captured with a good 720p camera, it might be as
good as the 1080i signal?
Actually, you might suggest it, but it runs counter to my actual
experience. I see materials that are converted from HD cameras all
the time and, while they are 1000% better than regular SDTV
signals, they are still about 3 dB short of a 1080i or 720p
production over the 1080i path from my satellite. Even the best
DVD films are about 3 dB worse. For example, I have the entire LotR
trilogy in anamorphic widescreen. It is good, but it still has
that 3 dB clarity loss from the 1080i version broadcast by my movie
supplier.

That was a question. I was following your math and maybe
misunderstood it. You were saying "line-doubled 480p", which I
interpreted as 480i information. I was then suggesting that if
it were true 480p from a very good source, since it has twice the
information of the 480i line-doubled version, might it not be as
good as the 1080i you were comparing it to, since you said the 1080i
was only twice as good as what I took to be 480i? Wouldn't 480p then
equal your 1080i?


I think you misunderstand something. The i in 1080i implies not that
the resolution is any less, but that the raster is scanned twice to
get the full frame. What you actually see depends on the vertical
refresh rate of the mode, which I'll assume is 30fps. So you lose
some resolution along the time axis to trade for resolution in space.
The picture is still 1920x1080 pixels, but, due to the interlace,
it's a little blurry where it's moving. Usually the eye doesn't see
this and most often, it is obscured by motion blur in the original
film source. Even on live baseball it looks OK to me.
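The time-for-space trade described above can be made concrete with a little arithmetic (an illustrative sketch counting active pixels only and ignoring blanking intervals, which is an assumption):

```python
# Rough pixel-rate comparison for the formats discussed above.
WIDTH, HEIGHT = 1920, 1080

# 1080i: 60 fields/s, each field carrying half the lines.
pixels_per_sec_1080i = WIDTH * (HEIGHT // 2) * 60

# 1080p at 30 full frames/s moves the same number of pixels.
pixels_per_sec_1080p30 = WIDTH * HEIGHT * 30

# 1080p at 60 full frames/s needs double the rate.
pixels_per_sec_1080p60 = WIDTH * HEIGHT * 60

print(pixels_per_sec_1080i == pixels_per_sec_1080p30)  # True
print(pixels_per_sec_1080p60 // pixels_per_sec_1080i)  # 2
```

So interlace delivers 60 Hz refresh at the pixel rate of 30 progressive frames per second, which is exactly the trade of temporal for spatial resolution described above.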


Scanned twice to get a different half frame if everything is moving.
Scanned twice to get a full frame, which would then be called 1080p, if
you are doing a movie and have the luxury of scanning each frame twice,
or a still image where little moves. Works great for still images.
Baseball can be pretty still most of the time. But how about
basketball or other active sports, where more of the image is in
chaotic motion?

I understand that 1080i also introduces artifacts due to interlace
that would not be present with P. I like progressive and can't wait
till interlace leaves the scene altogether.

You talk of dB as to image quality. I am taking that in the colloquial
sense to mean half as good on the way down or twice as good on the way
up. Is that how you are using it?


Yep...I do feel that the perception, like that of loudness, is sort of
logarithmic, so dB, as a ratio, is a good expression of it. But it's
definitely an estimate. I have no way of making instrumented
measurements.

The trick with a CRT set like mine, with a Trinitron tube, is to sit just
far enough away from the tube so that your eyesight merges the vertical
lines of the tube.

--
Dave Oldridge+
ICQ 1800667
  #84  
Old September 16th 06, 11:32 PM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Dave Oldridge
external usenet poster
 
Posts: 139
Default Do you really like the way HDTV looks?

"Greg" wrote in
ups.com:

Dave Oldridge wrote:
...
I think you misunderstand something. The i in 1080i implies not that
the resolution is any less, but that the raster is scanned twice to
get the full frame. ...


All the same, there's some reason to think you get some sort of
quality premium for a progressive signal. I don't know why -- it
doesn't seem to be just the absence of motion blur. Owners of 480
line plasma displays seem to think they look very good showing
downconverted HD signals; some people think 720p looks at least
as good as 1080i; and Gamecube owners seem to agree that a 480p
picture looks much better than a 480i picture.


A progressive signal will have no interlace artifacts. When an
interlaced scan meets a moving object, blur is introduced that does NOT
stem from the original production. Of course it may be masked by the
original blur, and that's why interlaced scan was used for NTSC in the
first place. It's not really that noticeable except in certain
conditions.


--
Dave Oldridge+
ICQ 1800667
  #85  
Old September 17th 06, 12:24 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Michael A. Terrell
external usenet poster
 
Posts: 62
Default Do you really like the way HDTV looks?

Dave Oldridge wrote:

Still, a lot is shot on 35mm. My Buffy DVDs have a commentary somewhere.
They started with 16mm and then, after one season, when they actually had
some money, they switched to 35mm. But even 16mm is WAY better than most
NTSC broadcast TV resolution.



16 mm was the medium of choice back in the early '70s when I started
working as a broadcast engineer. That's how non-network programming was
fed to stations without 2" R-R or the new U-matic machines. 16 mm blew
away the first generation of U-matic for picture quality on a good film
chain projector and camera, while a good 2" tape machine could cost more
than everything else in the control room of a small station. By the
time I left broadcasting, the 1" R-R machines had taken over. U-matic
was a lot better by then, but you could still tell the difference on a
studio monitor. The biggest problem in video quality with standard NTSC
video was the cheap TV sets and lower-grade CRTs. It was amazing when
you brought a fairly decent TV set into the station and compared it to
a $3,000 inline monitor, and even more so against the master monitor,
which was a $7,000 traditional tri-gun CRT. Even fed off air by the
Tektronix demodulator, that monitor still blew everything else away.


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
  #86  
Old September 17th 06, 12:26 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Martin Heffels
external usenet poster
 
Posts: 12
Default Do you really like the way HDTV looks?

On Sat, 16 Sep 2006 21:25:40 GMT, Dave Oldridge
wrote:

Uh, they are still a ratio. And I do believe that clarity perception is of
a similar logarithmic nature to sound, so the analogy is apt.


Nah, optical resolution is measured in lines per inch, not dB. As far as
I know that would be linear and not logarithmic (exposure is logarithmic,
though).

-m-
--
  #87  
Old September 17th 06, 12:33 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Martin Heffels
external usenet poster
 
Posts: 12
Default Do you really like the way HDTV looks?

On Sat, 16 Sep 2006 21:32:14 GMT, Dave Oldridge
wrote:

Of course it may be masked by the
original blur, and that's why interlaced scan was used for NTSC in the
first place. It's not really that noticeable except in certain
conditions.


Interlaced scan was used because in the beginning it was not possible for
the phosphors on the tube to "keep" the picture all the time for the whole
500 or so lines.

-m-
--
  #88  
Old September 17th 06, 06:31 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
[email protected]
external usenet poster
 
Posts: 51
Default Do you really like the way HDTV looks?


Martin Heffels wrote:
On Sat, 16 Sep 2006 21:32:14 GMT, Dave Oldridge
wrote:

Of course it may be masked by the
original blur, and that's why interlaced scan was used for NTSC in the
first place. It's not really that noticeable except in certain
conditions.


Interlaced scan was used because in the beginning it was not possible for
the phosphors on the tube to "keep" the picture all the time for the whole
500 or so lines.

-m-
--


Absolutely false. Interlace scan was used to reduce the bandwidth
required for transmission.

  #89  
Old September 17th 06, 08:09 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Michael A. Terrell
external usenet poster
 
Posts: 62
Default Do you really like the way HDTV looks?

" wrote:

Martin Heffels wrote:
On Sat, 16 Sep 2006 21:32:14 GMT, Dave Oldridge
wrote:

Of course it may be masked by the
original blur, and that's why interlaced scan was used for NTSC in the
first place. It's not really that noticeable except in certain
conditions.


Interlaced scan was used because in the beginning it was not possible for
the phosphors on the tube to "keep" the picture all the time for the whole
500 or so lines.

-m-
--


Absolutely false. Interlace scan was used to reduce the bandwidth
required for transmission.



Would you care to show everyone the math for that? You are still
sending the same amount of data per frame in the same time.
Interlacing does nothing to compress or alter the signal. Do you even
understand how complex a single frame of NTSC video really is? The only
difference between "one field per frame" and "two fields per frame" is a
little change in the frame timing.
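For reference, the bandwidth question the posters are arguing over can be put in rough numbers (an illustrative sketch using nominal NTSC line counts; the usual argument is that interlace keeps a 60 Hz refresh while halving the line rate relative to progressive scan at the same refresh):

```python
LINES = 525        # nominal NTSC lines per frame
REFRESH_HZ = 60    # fields (or progressive frames) per second

# Interlaced: each 60 Hz field carries half the lines, so a full
# frame is assembled from two fields, 30 times per second.
interlaced_line_rate = (LINES / 2) * REFRESH_HZ   # 15,750 lines/s

# Progressive at the same 60 Hz refresh scans every line each pass.
progressive_line_rate = LINES * REFRESH_HZ        # 31,500 lines/s

# Same refresh rate, half the line rate -> roughly half the
# video bandwidth for the interlaced signal.
print(progressive_line_rate / interlaced_line_rate)  # 2.0
```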


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
  #90  
Old September 17th 06, 08:23 AM posted to rec.video.desktop,alt.tv.tech.hdtv,sci.engr.television.advanced
Martin Heffels
external usenet poster
 
Posts: 12
Default Do you really like the way HDTV looks?

On 16 Sep 2006 21:31:31 -0700, " wrote:

Absolutely false. Interlace scan was used to reduce the bandwidth
required for transmission.


That was the added benefit.
--
 








Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
Copyright ©2004-2021 HomeCinemaBanter.
The comments are property of their posters.