HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   when will a REAL HDTV projection unit come out? (http://www.homecinemabanter.com/showthread.php?t=4391)

dave November 1st 03 07:29 AM

when will a REAL HDTV projection unit come out?
 
Hi All,

Does anyone know of a company that plans to manufacture (at any price) a
TRUE HDTV projection system?
I have looked at RCA, Sony, Hitachi, Panasonic, JVC and other websites only
to find that EVERY system has a vertical resolution
of only 720 lines (they merely downgrade the 1080i signal). RCA briefly
produced a CRT style set that could display 1080 lines for real, but since
then, not a single manufacturer is producing a set (of any type including
plasma) that has a display resolution of 1920 x 1080 pixels.

My PC monitor can display this resolution, why the hell can't we get REAL
1080i High Def on a consumer set?

Dave



November 1st 03 09:53 PM

You should have said 1080p instead of 1080i. But yeah, I would be curious how much a CRT
HDTV that can do 1920x1080 would cost. If anyone knows which models, if any, can do that
right now, I'd like to know, just out of curiosity. I'm not going to buy one,
because I'm a young adult and don't have the need or the money. 1920x1080 is full HDTV
resolution. I know the HDTV rear projection my parents have can only do 1080i, which is a less
expensive format that still qualifies as HDTV. It's funny that people sometimes say that 1080i is
the highest HDTV resolution, but 1080p at 1920x1080 is. UPN even said Star Trek Enterprise
would be in the highest HDTV format, which was some format based on 1080i. I realized that
the compression to do HDTV isn't really even here yet. The H.264 codec is really needed to get
HDTV resolution (full or not) without using too much bandwidth. On another side note, I think some
PC games should be developed with HDTV in mind. If console games can be built with
HDTV in mind, and consoles cost less than PCs, and HDTV is very expensive, then people who are
willing to spend money on a PC are MORE likely to have the money to spend on an HDTV. This
does not consider whether a game would be well suited for a large screen with
high resolution but maybe lower sharpness or dpi.



Matthew Vaughan November 2nd 03 12:29 AM


wrote in message
m...

On another side note, I think some PC games should be developed with HDTV in
mind. If console games can be built with HDTV in mind, and consoles cost less
than PCs, and HDTV is very expensive, then people who are willing to spend
money on a PC are MORE likely to have the money to spend on an HDTV. This
does not consider whether a game would be well suited for a large screen with
high resolution but maybe lower sharpness or dpi.


Most PC games for the last 5 years or more have supported high resolutions
(not necessarily a particular HD format or 16:9, but high resolution).
Nowadays it's standard for PC games to support various resolutions from
640x480 to around 1600x1200 (800x600, 1024x768, 1280x1024, etc.), and good
PC graphics cards can really drive them at these resolutions with decent
performance (and often with full-screen anti-aliasing, anisotropic
filtering, etc. for a really beautiful picture). Plus, don't forget that
these are all progressive-scan resolutions, not the 1/2 resolution you'd get
with interlace, and that the color is at full resolution, not the 1/2 or less
resolution in each direction of TV or even HDTV. 1024x768 is something of a
standard base resolution these days (no reason to go any lower unless you
have a really old graphics card). PC games also tend to include significantly
higher-resolution textures than console games (or they would look blurry at
the high resolutions they are generally played at). Modern 3D graphics cards
generally have 2-4x as much memory as an entire console (typical
these days is 64-256MB on the graphics card alone), very high fill rates, and
high color quality.

In general, PC games are much MORE suited to high resolutions than console
games, and are far more likely to have such support built in. I'm sure in
the future some PC games might start explicitly supporting 16:9 HDTV modes,
but most people don't have their PC hooked up to a TV, and wouldn't want to
play using mouse and keyboard when sitting on the sofa! (And if you're going
to degrade the image by displaying it on a TV, and use controls that are
inferior for many game types, why not just play it on a console in the first
place?)



Curmudgeon November 2nd 03 02:48 AM

1080i is the highest broadcast resolution under the ATSC standards. You
could have 1080p at 30 or 24 fps, but that is not higher resolution... it IS
a lower frame rate and more susceptible to flicker/artifacts. There is NO
1080p/60fps standard for broadcast television as adopted by the FCC... and
you will not likely live long enough to ever see such a standard... it eats up
WAY too much bandwidth.

For spatial resolution, 1080i is the highest. For temporal resolution, 720p
has the highest resolution (and it is at 60p).


wrote in message
m...

You should have said 1080p instead of 1080i. But yeah, I would be curious how
much a CRT HDTV that can do 1920x1080 would cost. If anyone knows which
models, if any, can do that right now, I'd like to know, just out of
curiosity. I'm not going to buy one, because I'm a young adult and don't have
the need or the money. 1920x1080 is full HDTV resolution. I know the HDTV
rear projection my parents have can only do 1080i, which is a less expensive
format that still qualifies as HDTV. It's funny that people sometimes say
that 1080i is the highest HDTV resolution, but 1080p at 1920x1080 is. UPN
even said Star Trek Enterprise would be in the highest HDTV format, which was
some format based on 1080i. I realized that the compression to do HDTV isn't
really even here yet. The H.264 codec is really needed to get HDTV resolution
(full or not) without using too much bandwidth. On another side note, I think
some PC games should be developed with HDTV in mind. If console games can be
built with HDTV in mind, and consoles cost less than PCs, and HDTV is very
expensive, then people who are willing to spend money on a PC are MORE likely
to have the money to spend on an HDTV. This does not consider whether a game
would be well suited for a large screen with high resolution but maybe lower
sharpness or dpi.





Matthew Vaughan November 2nd 03 06:49 AM

"Curmudgeon" wrote in message
...

1080i is the highest broadcast resolution under the ATSC standards. You
could have 1080p at 30 or 24 fps, but that is not higher resolution... it IS
a lower frame rate and more susceptible to flicker/artifacts.


This isn't really true. Yes, 24 or 30p is not going to show motion as well
because the lower framerate will make things more jerky (though if the
source is a movie, it is already only 24fps). However, it will look
noticeably sharper and clearer than 60i, and have significantly less
artifacts. It will also not affect flicker, since the screen is still
refreshing at 60Hz.

There is NO
1080p/60fps standard for broadcast television as adopted by the FCC...


This is a shame, and part of the problem with HDTV: the standards were both
too ambitious and not ambitious enough. They didn't really get it "just
right" on anything. Almost everything done in HDTV is significantly, and
unnecessarily, compromised in one way or another. It is possible to get
pretty good quality, but at a higher price than should be needed to achieve
that level of quality. And for the price, had they waited just a few years
for the technology to catch up, yes, we COULD have 1080 x 60p, right now.

and you will not likely live long enough to ever see such a
standard...eats up WAY too much bandwidth.


Only assuming they stick with an obsolete compression standard. It is
possible today, but not using MPEG2.

For spatial resolution 1080i is the highest. For temporal resolution,
720p has the highest resolution. (and it is at 60p)


This is a disingenuous way to put it. In fact, 1080i does not always have the
highest effective "spatial" resolution. Any area experiencing motion
(particularly vertical) displays only 1/2 the effective vertical resolution.
This gives an effective vertical resolution, much of the time, of only about
540 lines. And 1080i in real life only uses 1440 samples horizontally. So
in many circumstances, 720p DOES have higher spatial resolution (1280x720
compared to 1440x540).
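A quick back-of-the-envelope check of that comparison (using the 1440-sample horizontal figure and the halved vertical figure from the paragraph above as working assumptions, not broadcast-spec values):

```python
# Rough effective-pixel comparison under vertical motion, using the
# figures from the post above (assumptions, not spec values):
# 1080i during motion ~ 1440 x 540 effective samples, while 720p is
# always progressive at its full 1280 x 720.

def effective_pixels(width, height):
    """Total effective samples for a width x height raster."""
    return width * height

p_1080i_moving = effective_pixels(1440, 540)   # 777,600
p_720p = effective_pixels(1280, 720)           # 921,600

print(p_720p > p_1080i_moving)  # True: 720p wins under these assumptions
```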

1080i does have the highest spatial resolution for absolutely still
material, but any motion will defeat this advantage. In addition, it
presents an easily visible (and highly annoying, in my opinion) "interlace
flicker" characteristic even in still material (which is what makes it look
so "TV" or "video"-like--it looks like the screen is alive or something, not
at all the "like looking through a window" characteristic that we'd all like
to see).

Interlace just has too many tradeoffs: you can eliminate the interlace
artifacts entirely, but only by halving both the resolution and the
framerate (display 1080i with only 540 scan lines and at 30fps progressive
scan, and all interlace artifacts should disappear). (You can also eliminate
them by backing up far enough from the screen that you wouldn't be able to
tell the difference between 1440x1080 and 720x540...)
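The "halve both" fix described above can be sketched as a trivial single-field deinterlacer (a toy illustration, not any display's actual algorithm): keep only one field's scan lines per interlaced frame, giving 540 progressive lines at 30fps.

```python
# Toy "discard one field" deinterlacer for the approach described above:
# keeping only the even-indexed scan lines of a 1080-line interlaced
# frame leaves a 540-line progressive frame (one output frame per
# interlaced frame, i.e. 30fps progressive from 60i).

def discard_field(frame):
    """Keep even-indexed lines (one field); halves the line count."""
    return frame[::2]

interlaced_frame = [f"line{i}" for i in range(1080)]  # hypothetical raster
progressive_frame = discard_field(interlaced_frame)
print(len(progressive_frame))  # 540
```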



Thumper November 2nd 03 06:10 PM

On Sun, 02 Nov 2003 05:49:15 GMT, "Matthew Vaughan"
wrote:


1080i does have the highest spatial resolution for absolutely still
material, but any motion will defeat this advantage. In addition, it
presents an easily visible (and highly annoying, in my opinion) "interlace
flicker" characteristic even in still material (which is what makes it look
so "TV" or "video"-like--it looks like the screen is alive or something, not
at all the "like looking through a window" characteristic that we'd all like
to see).

This is very subjective. I have never experienced any flicker.
Thumper




To reply drop XYZ in address

John S. Dyson November 2nd 03 11:36 PM

In article ,
Thumper writes:

1080i does have the highest spatial resolution for absolutely still
material, but any motion will defeat this advantage. In addition, it
presents an easily visible (and highly annoying, in my opinion) "interlace
flicker" characteristic even in still material (which is what makes it look
so "TV" or "video"-like--it looks like the screen is alive or something, not
at all the "like looking through a window" characteristic that we'd all like
to see).

This is very subjective. I have never experienced any flicker.
Thumper

I am VERY sensitive to flicker, and the flicker from properly
encoded 1080i should be nil. All too often, neophytes make the
mistake of assuming that interlace filtering must always be constant.
Interlace filtering NEED NOT be a sledgehammer.

Actually, interlace filtering should be a dynamic scheme, where
twitter is reduced but the vertical resolution isn't severely
impacted. Such DYNAMIC filtering is commonly done by the dynamic
combs in NTSC (and PAL) TV sets, and is done in other applications
as well. Even a good MPEG encoder will do various kinds of filtering
to avoid artifacts.

If there is 'flicker' on a 1080i presentation, it means that the
equipment isn't being used correctly, the equipment might be older
with less technology, or other 'excuses' might apply. Given
good 'processing', the stairstepping effects coming from the
natural aliasing of the sampled system should be approximately
as distracting as the interlace twitter.

People also forget that the 720p systems ALSO need vertical
filtering to avoid ugly aliasing effects. The amount of artifact
free resolution isn't as high as the 720p designation might
superficially imply.

John

Matthew Vaughan November 3rd 03 01:32 AM

"Thumper" wrote in message
...

presents an easily visible (and highly annoying, in my opinion) "interlace
flicker" characteristic even in still material (which is what makes it look
so "TV" or "video"-like--it looks like the screen is alive or something, not
at all the "like looking through a window" characteristic that we'd all like
to see).

This is very subjective. I have never experienced any flicker.


You may have a low visual sensitivity to motion and detail. I believe that I
have very high sensitivity to visual motion, because low computer monitor
refresh rates, as well as interlace, bother me a lot more than they
seem to bother most people. Or you may watch TV from far enough away that
you would not be able to tell the difference if it were progressive scan
with 1/2 the resolution (that would be the only way for me to stop noticing
it). Or you may have an LCD or plasma or DLP that de-interlaces everything
before display anyway (that doesn't remove all interlace artifacts, but it
substantially reduces the flicker in still scenes).



Matthew Vaughan November 3rd 03 01:51 AM

"John S. Dyson" wrote in message
...

If there is 'flicker' on a 1080i presentation, it means that the
equipment isn't being used correctly, the equipment might be older
with less technology, or other 'excuses' might apply. Given
good 'processing', the stairstepping effects coming from the
natural aliasing of the sampled system should be approximately
as distracting as the interlace twitter.


I agree with your other points, except to say that the display itself causes
a form of small-scale flicker that can't be removed (unless the phosphors
have a longer decay time, in which case you'd probably get increased
smearing).



Matthew L. Martin November 5th 03 01:15 AM

Matthew Vaughan wrote:

"Matthew L. Martin" wrote in message
s.com...

Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.



This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.


I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew

--
http://www.mlmartin.com/bbq/

Thermodynamics For Dummies: You can't win.
You can't break even.
You can't get out of the game.


Mudd Bug November 5th 03 02:41 AM


"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

"Matthew L. Martin" wrote in message
s.com...

Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.



This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew


http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.
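Plugging the glossary's rule-of-thumb numbers into a 1080-line raster (a rough illustration of the figures quoted above, not a measurement):

```python
# Effective vertical definition of 1080 scan lines per the Quantel
# glossary figures quoted above (rules of thumb, not measurements).
KELL_FACTOR = 0.70      # a scanned image resolves ~70% of its line count

lines = 1080
static_lines = lines * KELL_FACTOR   # still image: ~756 effective lines
moving_lines = lines * 0.50          # vertical motion: <=540 effective lines

print(round(static_lines), round(moving_lines))  # 756 540
```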







Mudd Bug November 5th 03 02:41 AM


"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

"Matthew L. Martin" wrote in message
s.com...

Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.



This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew


Interlace Factor
The reduction in vertical definition during vertical image movement due to
interlaced (rather than progressive) scans. Typically this is assumed to be
30%, and is in addition to the Kell Factor (another 30% reduction), making
an overall reduction of 50%. Note that, when scanning film frame-per-frame
(ie 24 or 25fps - not 3:2 pull-down to 60fps), or a succession of electronic
frames each representing a single snapshot in time, there is no vertical
movement between fields and the Interlace Factor has no effect.
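Chaining the two roughly 30% reductions quoted above reproduces the glossary's "overall reduction of 50%" figure, give or take rounding; film scanned frame-per-frame skips the second factor:

```python
# Compounding the Kell Factor (~30% loss) with the Interlace Factor
# (another ~30% loss) from the glossary excerpt above.
kell_loss = 0.30
interlace_loss = 0.30

remaining = (1 - kell_loss) * (1 - interlace_loss)   # 0.49 of the line count
print(f"{1 - remaining:.0%} overall reduction")      # prints "51% overall reduction"

# Film scanned frame-per-frame has no inter-field motion, so only the
# Kell Factor applies and ~70% of the line count remains.
film_remaining = 1 - kell_loss   # 0.70
```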






Matthew L. Martin November 6th 03 03:48 AM

Mudd Bug wrote:
"Matthew L. Martin" wrote in message
s.com...

Matthew Vaughan wrote:


"Matthew L. Martin" wrote in message
news.com...


Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.


This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew



http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.


Considering that most movement in TV images is horizontal ...

Matthew



Mudd Bug November 6th 03 02:05 PM


"Matthew L. Martin" wrote in message
s.com...
Mudd Bug wrote:
"Matthew L. Martin" wrote in message
s.com...

Matthew Vaughan wrote:


"Matthew L. Martin" wrote in message
news.com...


Please do some research. 1080i is not equivalent to 540p. 1080i has
considerably more vertical resolution than 540p even in fast action
sequences.


This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?

Matthew






http://www.quantel.com/domisphere/in...256CCC004F2470

Kell Factor
The vertical definition of a scanned image is only around 70% (the Kell
Factor) of the line count due to a scan's inability to show detail occurring
between the lines. Note that, for interlaced scans, vertical definition is
further reduced by the Interlace Factor to 50% or less overall during most
vertical image movement.


Considering that most movement in TV images is horizontal ...

Matthew


Maybe. But my only point is that progressive scan is better than interlace.
480p looks much better than 480i, so much so that I would take fewer scan
lines progressive (720) over more scan lines interlaced (1080). Both are a
real improvement over the old standard, but progressive looks better.







Matthew Vaughan November 7th 03 07:18 AM

"Matthew L. Martin" wrote in message
s.com...
Matthew Vaughan wrote:

This is not really true. With motion (particularly vertical motion), the
effective vertical resolution of an interlaced image is cut nearly in half.

I have never seen a claim that the Kell factor drops as low as .50 in
any scene. Do you have a source that supports that claim?


I am not talking about the Kell factor. That affects all video images,
progressive scan and interlace alike. Since it affects all of them, there's
no point to factoring it in. What I am talking about is the additional
reduction in resolution specific to interlace.



Matthew Vaughan November 8th 03 09:20 AM

"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier" than
540p, is still twice the resolution; only alternating lines are drawn for
every frame, but there are spaces between the lines.


Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)

Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.


I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance, it would be difficult to tell the difference, but from
a similar distance or not much farther, it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.


No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.



dave November 12th 03 10:57 PM


"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier" than
540p, is still twice the resolution; only alternating lines are drawn for
every frame, but there are spaces between the lines.

Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): in the past, it meant that all the odd lines of an image were created in
one field, followed by all the even lines in the next field. The phosphor
dots (or bars) would glow just long enough that the image would appear
continuous. For example, if you had 100 lines of resolution, lines
1,3,5,7,9,11,...,99 would be scanned for the first field and 2,4,6,8,...,100
would be scanned for the second field. The 2 fields would be 1 frame.
Because the image data could change slightly between fields, if you were to
"freeze frame" at any one point and there was a lot of movement in the
scene, you would see slight differences in every other line (as if the image
were behind blinds). Interlacing has nothing to do with displayable screen
resolution, so how could an interlaced image somehow reduce itself by 30%?
When it is interlaced, the other lines are simply not swept - it does not
mean they are not there or that the resolution is cut in half - the
alternate lines are still glowing from the last frame.
In my 100-line-resolution screen example, it is true that only 50 lines are
being displayed in any one field, but the resolution is still 100 lines (100
horizontal lines). Manufacturers tend to overscan pixels to make up for a
physical screen resolution deficit and to sell outdated technology by
confusing the consumer (like saying 1080i is the same "resolution" as 540p -
absolutely not true! - they both scan 540 lines in 1/60th of a second but
differ in resolution by a factor of 2).
In short, unless the IEEE and SMPTE have changed their definition of
"interlaced" to somehow include "resolution", you are completely and utterly
misinformed.
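Dave's 100-line example can be written out directly (a toy model of his description, nothing more):

```python
# Toy model of the 100-line interlace example above: odd-numbered lines
# are swept in field 1, even-numbered lines in field 2, and the two
# fields together make up one complete frame.
lines = list(range(1, 101))                 # lines 1..100

field1 = [n for n in lines if n % 2 == 1]   # 1, 3, 5, ..., 99
field2 = [n for n in lines if n % 2 == 0]   # 2, 4, 6, ..., 100

assert len(field1) == len(field2) == 50     # 50 lines swept per field
assert sorted(field1 + field2) == lines     # together: the full 100 lines
```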





Mudd Bug November 13th 03 03:35 AM


"dave" wrote in message
news:[email protected]_s02...

"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier" than
540p, is still twice the resolution; only alternating lines are drawn for
every frame, but there are spaces between the lines.

Even I would never have made that claim! At WORST 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)

Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): in the past, it meant that all the odd lines of an image were created in
one field, followed by all the even lines in the next field. The phosphor
dots (or bars) would glow just long enough that the image would appear
continuous. For example, if you had 100 lines of resolution, lines
1,3,5,7,9,11,...,99 would be scanned for the first field and 2,4,6,8,...,100
would be scanned for the second field. The 2 fields would be 1 frame.
Because the image data could change slightly between fields, if you were to
"freeze frame" at any one point and there was a lot of movement in the
scene, you would see slight differences in every other line (as if the image
were behind blinds).


Does this mean that field 1 of a frame is captured 1/60 of a second before
field 2 of the same frame?

Or are we talking about field 1 being interlaced with field 2 of the
previous frame, with the two fields for any single frame created from one
image captured every 1/30 of a second?









Eric Hoffman November 19th 03 01:17 AM

Sorry about the double post, but Google seems to have mangled my first
reply.

"Mudd Bug" wrote in message news:[email protected]
"dave" wrote in message
news:[email protected]_s02...

"Matthew Vaughan" wrote in message
...
"dave" wrote in message
news:[email protected]_s54...

By the way, 540p is NOT a higher resolution than 1080i, just that 540 lines
are drawn at once per frame, but ONLY 540 lines. 1080i, while "muddier"
than 540p, is still twice the resolution; only alternating lines are drawn
for every frame, but there are spaces between the lines.

Even I would never have made that claim! At WORST, 1080i may effectively
approach 1/2 vertical resolution, but at best it can be nearly like 1080p.
(Recently I've seen a reference that, on average, interlace reduces
effective vertical resolution by about 30% compared to progressive scan when
any vertical motion is present.)


Maybe the definition of the word "interlaced" has somehow changed (I doubt
it): In the past, it meant that all the odd lines of an image were created
in one field, followed by all the even lines in the next field.
The phosphor dots (or bars) would glow just long enough so that the image
would appear continuous.
For example, if you had 100 lines of resolution, lines 1, 3, 5, 7, 9, 11, ..., 99
would be scanned for the first field and 2, 4, 6, 8, ..., 100 would be scanned
for the second field. The two fields would be one frame.
Because the image data could change slightly between fields, if you were to
"freeze frame" at any one point and there was a lot of movement in the
scene, you would see slight differences in every other line (as if the image
was behind blinds).
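The odd/even field split described above is easy to sketch in code. This is
just an illustrative toy (the 100-line display and function name are made up
for the example, not from any standard):

```python
# Toy sketch of interlaced scanning on a 100-line display:
# one frame is delivered as two fields, odd lines then even lines.

def field_lines(total_lines, field):
    """Return the line numbers swept during the given field (1 or 2)."""
    start = 1 if field == 1 else 2
    return list(range(start, total_lines + 1, 2))

odd = field_lines(100, 1)    # lines 1, 3, 5, ..., 99
even = field_lines(100, 2)   # lines 2, 4, 6, ..., 100

# Each field sweeps only half the lines...
assert len(odd) == len(even) == 50
# ...but together the two fields cover all 100 lines of the frame,
# so the frame's vertical resolution is still 100 lines.
assert sorted(odd + even) == list(range(1, 101))
```

This is the distinction being argued over: 50 lines per field, but 100 lines
per frame.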


Does this mean that field 1 of a frame is captured 1/60 of a second before
field 2 of the same frame?

Or are we talking about field 1 being interlaced with field 2 of the
previous frame, so that the two fields for any single frame are created from
one image captured every 1/30 of a second?


I believe this depends on the system used to capture the video and is
independent of how it is displayed. My DV camera, for example, captures
in interlaced mode, 720x480 at 60 fields per second. Field 2 will be
1/60th of a second after field 1, which is why any motion looks so
crappy on a non-interlaced display like my computer monitor. (Using
some form of deinterlacing that does more than just combine the two
fields together helps some, but at the expense of overall image
quality.)
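The "just combine the two fields" approach mentioned here is usually called
a weave deinterlacer. A minimal sketch, using made-up two-line fields as
stand-ins for scanlines:

```python
# Naive "weave" deinterlacing: interleave the two fields back into one
# frame. With motion between fields this produces the comb artifacts
# described above, because the fields were captured 1/60 s apart.

def weave(field1, field2):
    """Interleave field1 (odd scanlines) and field2 (even scanlines)."""
    frame = []
    for odd_line, even_line in zip(field1, field2):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# Hypothetical fields for illustration:
f1 = ["line1", "line3"]
f2 = ["line2", "line4"]
assert weave(f1, f2) == ["line1", "line2", "line3", "line4"]
```

Fancier deinterlacers interpolate or motion-compensate instead of weaving,
which hides the combing but, as noted, can soften the image.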

Film, on the other hand, is shot at 24 fps and is non-interlaced in
nature. When converted to 30 fps (60 fields/second), field 1 and field 2
of a given frame come from the same film frame and thus will be at the same
point in time.
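The standard 24 fps to 60 fields/s conversion is 2:3 pulldown (telecine):
film frames alternately contribute 2 and 3 fields. A rough sketch (the
function and the letter-labeled frames are just for illustration):

```python
# 2:3 pulldown sketch: map 24 film frames/s onto 60 video fields/s.
# Film frames alternately yield 2 fields and 3 fields, so every
# 4 film frames become 10 fields (24 * 10/4 = 60).

def pulldown_fields(film_frames):
    """Expand a list of film frames into a 2:3 field sequence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeat)
    return fields

fields = pulldown_fields(["A", "B", "C", "D"])
# 4 film frames -> 10 fields: AA BBB CC DDD
assert fields == ["A", "A", "B", "B", "B", "C", "C", "D", "D", "D"]
assert len(fields) == 10
```

Every field of a given video frame traces back to a single film frame, which
is why film-sourced interlaced video deinterlaces so cleanly compared to
true 60-field camera video.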


Interlacing has nothing to do with displayable screen resolution, so how
could an interlaced image somehow reduce itself by 30%? When it is
interlaced, the other lines are simply not swept - it does not mean they are
not there or that the resolution is cut in half - the alternate lines are
still glowing from the last frame.
In my 100-line resolution screen example, it is true that only 50 lines are
being displayed in any one field, but the resolution is still 100 lines
(100 horizontal lines). Manufacturers tend to overscan pixels to make up for a
physical screen resolution deficit and to sell outdated technology by
confusing the consumer (like saying 1080i is the same "resolution" as 540p -
absolutely not true! They both scan 540 lines in 1/60th of a second, but
they differ in resolution by a factor of 2).
In short, unless the IEEE and SMPTE have changed their definition of
"interlaced" to somehow include "resolution", you are completely and utterly
misinformed.



Most people cannot really discern a 1080i from a 1080p picture except when
they are displaying frame by frame.

I'm not sure where you would get that idea, or how you would have tested it.
I seriously doubt most people have ever seen 1080p. With some program
material, that may be more true than with others. Certainly from a
sufficient distance it would be difficult to tell the difference, but from
a similar distance or not much farther it's also difficult to tell the
difference between 1080i and 480p, so that's not saying much.

In my opinion, comparing 720p with 1080i, the 1080i picture looks
significantly better while viewing HDNet on a CRT HD monitor.

No doubt it would, since the monitor is 1080i, so you are not actually
seeing 720p. The monitor throws away all the advantages of 720p since it
takes that nice progressive-scan signal and then interlaces it, so what you
are seeing is 720i, scaled to 1080. This is not the same as 720p displayed
in its native format.







Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com