HomeCinemaBanter - a home cinema forum



when will a REAL HDTV projection unit come out?



 
 
  #11  
Old November 5th 03, 02:41 AM
Mudd Bug


"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

"Matthew L. Martin" wrote in message
s.com...

Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.



This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew


http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.



--
http://www.mlmartin.com/bbq/

Thermodynamics For Dummies: You can't win.
You can't break even.
You can't get out of the game.



  #12  
Old November 5th 03, 02:41 AM
Mudd Bug


"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

"Matthew L. Martin" wrote in message
s.com...

Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.



This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew


Interlace Factor
The reduction in vertical definition during vertical image movement due to
interlaced (rather than progressive) scans. Typically this is assumed to be
30%, and is in addition to the Kell Factor (another 30% reduction), making
an overall reduction of 50%. Note that, when scanning film frame-per-frame
(ie 24 or 25fps - not 3:2 pull-down to 60fps), or a succession of electronic
frames each representing a single snapshot in time, there is no vertical
movement between fields and the Interlace Factor has no effect.
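
To put numbers on the two factors quoted above, here is a minimal Python
sketch of the arithmetic the Quantel glossary describes. The 0.7 values are
the glossary's nominal figures; real displays and scenes vary, so treat the
output as illustrative only.

    # Illustrative only: nominal factors from the quoted Quantel glossary.
    KELL_FACTOR = 0.7       # scanning cannot resolve detail between lines
    INTERLACE_FACTOR = 0.7  # further ~30% loss during vertical motion

    def effective_vertical_resolution(lines, interlaced, moving):
        """Approximate perceived vertical resolution, in lines."""
        res = lines * KELL_FACTOR
        if interlaced and moving:
            res *= INTERLACE_FACTOR  # 0.7 * 0.7 = ~0.5 of the line count
        return round(res)

    for label, lines, interlaced in [("1080i", 1080, True),
                                     ("720p", 720, False),
                                     ("540p", 540, False)]:
        print(label,
              effective_vertical_resolution(lines, interlaced, moving=False),
              effective_vertical_resolution(lines, interlaced, moving=True))
    # 1080i: ~756 lines static, ~529 during vertical motion
    # 720p:  ~504 lines either way
    # 540p:  ~378 lines either way

On these nominal figures, 1080i during vertical motion (~529 lines) lands
close to 720p (~504 lines), which is roughly the comparison the posters are
arguing over; note that on this arithmetic 1080i never falls to 540p's level.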


--
http://www.mlmartin.com/bbq/

Thermodynamics For Dummies: You can't win.
You can't break even.
You can't get out of the game.



  #13  
Old November 6th 03, 03:48 AM
Matthew L. Martin

Mudd Bug wrote:
"Matthew L. Martin" wrote in message
s.com...

Matthew Vaughan wrote:


"Matthew L. Martin" wrote in message
news.com...


Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.


This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew



http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.


Considering that most movement in TV images is horizontal ...

Matthew

--
http://www.mlmartin.com/bbq/

Thermodynamics For Dummies: You can't win.
You can't break even.
You can't get out of the game.

  #14  
Old November 6th 03, 02:05 PM
Mudd Bug


"Matthew L. Martin" wrote in message
s.com...
Mudd Bug wrote:
"Matthew L. Martin" wrote in message
s.com...

Matthew Vaughan wrote:


"Matthew L. Martin" wrote in message
news.com...


Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.


This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew






http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.


Considering that most movement in TV images is horizontal ...

Matthew


Maybe. But my only point is that progressive scan is better than interlace.
480p looks much better than 480i, so much so that I would take fewer scan
lines progressive (720) over more scan lines interlaced (1080). Both are a
real improvement over the old standard, but progressive looks better.



--
http://www.mlmartin.com/bbq/

Thermodynamics For Dummies: You can't win.
You can't break even.
You can't get out of the game.



  #15  
Old November 7th 03, 07:18 AM
Matthew Vaughan

"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?


I am not talking about the Kell factor. That affects all video images,
progressive scan and interlaced alike. Since it affects all of them, there's
no point in factoring it in. What I am talking about is the additional
reduction in resolution specific to interlace.


  #16  
Old November 8th 03, 09:20 AM
Matthew Vaughan

"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier"
than 540p, is still twice the resolution; only alternating lines are drawn
for every frame, but there are spaces between the lines.


Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)

Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.


I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.


No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.


  #17  
Old November 12th 03, 10:57 PM
dave


"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier"
than 540p, is still twice the resolution; only alternating lines are drawn
for every frame, but there are spaces between the lines.


Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): In the past, it meant that all the odd lines of an image were created
in one field, followed by all the even lines in the next field. The
phosphor dots (or bars) would glow just long enough so that the image would
appear continuous. For example, if you had 100 lines of resolution, lines
1,3,5,7,9,11,...,99 would be scanned for the first field and
2,4,6,8,...,100 would be scanned for the second field. The 2 fields would
be 1 frame. Because the image data could change slightly between fields, if
you were to "freeze frame" at any one point and there was a lot of movement
in the scene, you would see slight differences in every other line (as if
the image was behind blinds). Interlacing has nothing to do with
displayable screen resolution, so how could an interlaced image somehow
reduce itself by 30%? When it is interlaced, the other lines are simply not
swept - it does not mean they are not there or that the resolution is cut
in half - the alternate lines are still glowing from the last frame. In my
100 line resolution screen example, it is true that only 50 lines are being
displayed in any one field, but the resolution is still 100 lines (100
horizontal lines). Manufacturers tend to overscan pixels to make up for a
physical screen resolution deficit and to sell outdated technology based on
confusing the consumer (like saying 1080i is the same "resolution" as
540p - absolutely not true! - they both scan 540 lines in 1/60th of a
second but differ in resolution by a factor of 2). In short, unless the
IEEE and SMPTE have changed their definition of "interlaced" to somehow
include "resolution", you are completely and utterly misinformed.
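
Taking dave's 100-line example literally, here is a hypothetical Python
sketch of the split he describes: odd-numbered lines go to field 1, even
lines to field 2, and weaving the two fields back together recovers all 100
lines. The names and data here are illustrative, not from any standard.

    # dave's example: an interlaced frame is two fields, one holding the
    # odd-numbered scan lines and one holding the even-numbered lines.
    LINES = 100
    frame = ["line %d" % n for n in range(1, LINES + 1)]

    field1 = frame[0::2]  # lines 1,3,5,...,99
    field2 = frame[1::2]  # lines 2,4,6,...,100

    def weave(f1, f2):
        """Interleave two fields back into one full frame."""
        out = []
        for odd, even in zip(f1, f2):
            out.extend([odd, even])
        return out

    assert len(field1) == len(field2) == 50  # each field is half the lines
    assert weave(field1, field2) == frame    # together they carry all 100

The catch the rest of the thread turns on is temporal: if the two fields are
captured 1/60th of a second apart, weaving them only reconstructs the full
line count for parts of the scene that did not move between fields.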



Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.


I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.


No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.




  #18  
Old November 13th 03, 03:35 AM
Mudd Bug


"dave" wrote in message
news:[email protected]_s02...

"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier"
than 540p, is still twice the resolution; only alternating lines are drawn
for every frame, but there are spaces between the lines.


Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): In the past, it meant that all the odd lines of an image were created
in one field, followed by all the even lines in the next field. The
phosphor dots (or bars) would glow just long enough so that the image would
appear continuous. For example, if you had 100 lines of resolution, lines
1,3,5,7,9,11,...,99 would be scanned for the first field and
2,4,6,8,...,100 would be scanned for the second field. The 2 fields would
be 1 frame. Because the image data could change slightly between fields, if
you were to "freeze frame" at any one point and there was a lot of movement
in the scene, you would see slight differences in every other line (as if
the image was behind blinds).


Does this mean that field 1 of a frame is captured 1/60 of a second before
field 2 of the same frame?

Or are we talking about field 1 being interlaced with field 2 of the
previous frame, and that the two fields for any single frame are created
from one image captured every 1/30 of a second?


Interlacing has nothing to do with displayable screen resolution, so how
could an interlaced image somehow reduce itself by 30%? When it is
interlaced, the other lines are simply not swept - it does not mean they
are not there or that the resolution is cut in half - the alternate lines
are still glowing from the last frame. In my 100 line resolution screen
example, it is true that only 50 lines are being displayed in any one
field, but the resolution is still 100 lines (100 horizontal lines).
Manufacturers tend to overscan pixels to make up for a physical screen
resolution deficit and to sell outdated technology based on confusing the
consumer (like saying 1080i is the same "resolution" as 540p - absolutely
not true! - they both scan 540 lines in 1/60th of a second but differ in
resolution by a factor of 2). In short, unless the IEEE and SMPTE have
changed their definition of "interlaced" to somehow include "resolution",
you are completely and utterly misinformed.



Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.


I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.


No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.






  #19  
Old November 19th 03, 01:14 AM
Eric Hoffman

"Mudd Bug" wrote in message news:[email protected]
"dave" wrote in message
news:[email protected]_s02...

"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier"
than 540p, is still twice the resolution; only alternating lines are drawn
for every frame, but there are spaces between the lines.

Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): In the past, it meant that all the odd lines of an image were created
in one field, followed by all the even lines in the next field. The
phosphor dots (or bars) would glow just long enough so that the image would
appear continuous. For example, if you had 100 lines of resolution, lines
1,3,5,7,9,11,...,99 would be scanned for the first field and
2,4,6,8,...,100 would be scanned for the second field. The 2 fields would
be 1 frame. Because the image data could change slightly between fields, if
you were to "freeze frame" at any one point and there was a lot of movement
in the scene, you would see slight differences in every other line (as if
the image was behind blinds).


Does this mean that field 1 of a frame is captured 1/60 of a second before
field 2 of the same frame?

Or are we talking about field 1 being interlaced with field 2 of the
previous frame, and that the two fields for any single frame are created
from one image captured every 1/30 of a second?


Interlacing has nothing to do with displayable screen resolution, so how
could an interlaced image somehow reduce itself by 30%? When it is
interlaced, the other lines are simply not swept - it does not mean they
are not there or that the resolution is cut in half - the alternate lines
are still glowing from the last frame. In my 100 line resolution screen
example, it is true that only 50 lines are being displayed in any one
field, but the resolution is still 100 lines (100 horizontal lines).
Manufacturers tend to overscan pixels to make up for a physical screen
resolution deficit and to sell outdated technology based on confusing the
consumer (like saying 1080i is the same "resolution" as 540p - absolutely
not true! - they both scan 540 lines in 1/60th of a second but differ in
resolution by a factor of 2). In short, unless the IEEE and SMPTE have
changed their definition of "interlaced" to somehow include "resolution",
you are completely and utterly misinformed.



Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.

I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.

No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.


I believe this depends on the system used to capture the video and is
independent of how it is displayed. My DV camera, for example, captures
in interlaced mode, 720x480 at 60 fields per second. Field 2 will be
1/60th of a second after Field 1, which is why any motion looks so
crappy on a non-interlaced display like my computer monitor. (Using
some form of deinterlacing that does more than just combine the two
fields together helps some, but at the expense of overall image
quality.)
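
As a hypothetical illustration of the difference Eric is pointing at, here
are the two simplest deinterlacing strategies sketched in Python: "weave"
just combines the two fields (full detail, but combing artifacts on motion),
while "bob" keeps one field and interpolates the missing lines (no combing,
half the vertical detail). The names and the averaging scheme are generic
illustrations, not the algorithm of any particular product.

    # A "field" here is a list of scan lines; each line is a list of pixels.

    def weave(field1, field2):
        # Combine both fields line by line: full vertical detail, but any
        # motion between the fields shows up as "combing".
        out = []
        for odd, even in zip(field1, field2):
            out.extend([odd, even])
        return out

    def bob(field):
        # Keep one field and fill each missing line by averaging its
        # neighbours: no combing, but only half the vertical detail.
        out = []
        for i, line in enumerate(field):
            nxt = field[i + 1] if i + 1 < len(field) else line
            out.append(line)
            out.append([(a + b) // 2 for a, b in zip(line, nxt)])
        return out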

Film, on the other hand, is shot at 24fps and is non-interlaced in
nature. When converted to 30fps (60 fields/second), field 1 and field
2 of a frame come from the same film frame and thus will be at the same
point in time.
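
Since the 24fps-to-60-fields conversion Eric mentions is the standard 3:2
pulldown, here is a minimal Python sketch of that cadence, offered as an
illustration only: each film frame is held for three fields, then two, so
most field pairs do share one film frame (and a single moment in time),
though one video frame in five straddles two film frames.

    def pulldown_fields(film_frames):
        """Map 24fps film frames onto 60 fields/s in a 3:2 cadence."""
        fields = []
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields

    # Four film frames (1/6 s of film) become ten fields (1/6 s of video).
    print(pulldown_fields(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
    # Paired into 30fps video frames: AA, AB, BC, CC, DD - frames AB and
    # BC mix fields from two different film frames.
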
  #20  
Old November 19th 03, 01:17 AM
Eric Hoffman

Sorry about the double post, but Google seems to have mangled my first
reply.

"Mudd Bug" wrote in message news:[email protected]
"dave" wrote in message
news:[email protected]_s02...

"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540

lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while

"muddier"
than 540p is still twice the resolution only alternating lines are

drawn
for
every frame, but there are spaces between the lines.

Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertically resolution, but at best it can be nearly like

1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan

when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): In the past, it meant that
the all the odd lines of an image were created in one field, followed by

all
the even lines in the next field.
The phosphor dots (or bars) would glow just long enough so that the image
would appear continuous.
For example, if you had 100 lines of resolution, lines 1,3,5,7,9,11,...,99
would be scanned for the first field and 2,4,6,8,...,100 would be scanned
for the second field. The 2 fields would be 1 frame.
Because the image data could change between fields slightly, if you were

to
"freeze frame" at any one point and there was a lot of movement in the
scene, you would see slight differences in every other line (as if the

image
was behind blinds).


Does this mean that field 1 of a frame is captured 1/60 of a second before
field 2 of the same frame?

Or are talking about field 1 being interlaced with the field 2 of the
previous frame and that the two fields
for any single frame are created from one image captured every 1/30 o a
second?


I believe this depends on the system used to capture the video and is
independent of how it is displayed. My DV camera for example captures
in interlaced mode, 720x480 at 60 fields per second. Field 2 will be
1/60th of a second after Field 1, and is why any motion looks so
crappy on a non-interlaced display like my computer monitor. (Using
some form of deinterlacing that does more then just combine the two
fields together helps some, but at the expence of over all image
quality.)

Film on the other hand is done at 24fps and is non-interlaced in
nature. When converted to 30fps (60 fields/second), frame 1 and frame
2 come from the same film-frame and thus will be at the same point in
time.


Interlacing has nothing to do with displayable screen
resolution, so how could an interlaced image somehow reduce itself by 30%?
When it is interlaced, the other lines are simply not swept - it does not
mean they are not there or that the resolution is cut in half - the
alternate lines are still glowing from the last frame.
In my 100 line resolution screen example, it is true that only 50 lines

are
being displayed at any one field, but the resolution is still 100 lines

(100
horizontal lines). Manufacturers tend to overscan pixels to make up for a
physical screen resolution deficit and to sell outdated technology based

on
confusing the consumer (like saying 1080i is the same "resolution" as

540p -
absolutely not true! - they both scan 540 lines in 1/60th of a second but
are differing in resolution by a factor of 2).
In short, unless IEEE and SMPTE has changed their definition of

"interlaced"
to somehow include "resolution", you are completely and utterly

misinformed.



Most people cannot
really discern a 1080i from a 1080p picture except when they are

displaying
frame by frame.

I'm not sure where you would get that idea, or how you would have tested

it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but

from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.

No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what

you
are seeing is 720i, scaled to 1080. This is not the same as 720p

displayed
in its native format.




 



