HomeCinemaBanter: a home cinema forum



high definition does not equal better graphics (rant)



 
 
  #1  
Old August 18th 07, 08:47 PM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
AirRaid

I read time and time again forum posters and articles from the gaming and
non-gaming press about high definition. It seems to me most of them
can't or don't understand that high definition / HD resolution does not
mean better graphics in games. Better graphics and HD resolutions
are two entirely separate things. With Xbox 360 and PS3, what you
have are two consoles with 99% or more of their games running in one (or
more) of the HD resolutions: 720p, 1080i, 1080p. This gives 3 to 6
times the number of pixels on screen compared to 480i/480p. On top of
that, the actual graphics complexity, detail, lighting, textures,
shaders, special effects, etc. are all somewhat better than on the
previous generation of consoles, but it doesn't feel like a
generational leap, especially since most current-gen games run at
30fps while many of the best previous-gen games ran at 60fps in SD
with less detail. Yet those games look more appealing, because a
smooth framerate makes a great deal of difference. The problem with
the X360/PS3 is that so much of their horsepower goes into running games
at 3 to 6 times the resolution that there is not much extra performance
left over to improve the actual graphics. There is some, but like I said,
not enough to seem like a generational leap over the last-gen PS2/GCN/
Xbox. The powerful multi-core Xenon and Cell CPUs are mainly used to
run more advanced physics and A.I. calculations and to keep track of more
players and objects on and off screen in the game world. Those CPUs can
help the graphics processors with graphics, but only at the "front
end" of the rendering process (the initial calculations and setup),
not the shading, rendering and display of graphics, which are the sole
responsibility of the graphics processors (Xenos and RSX respectively).
What I am saying, in a nutshell, is that neither Xenos nor RSX is a
massive leap beyond the graphics processors in the last-gen consoles,
the way the graphics processors in the last-gen consoles *were* a
massive leap over the PS1 / Nintendo 64 generation.
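
The 3-to-6x pixel figure is easy to verify; here is a minimal sketch of
the arithmetic in Python (it assumes 640x480 and 720x480 as the two
common SD frame sizes, since the thread uses both):

  # Pixel counts behind the "3 to 6 times" figure. Both 640x480 and
  # 720x480 are common SD frame sizes, so the exact ratio depends on
  # which one you compare against.
  frames = {
      "480p (640x480)":      640 * 480,    # 307,200
      "480p (720x480)":      720 * 480,    # 345,600
      "720p (1280x720)":     1280 * 720,   # 921,600
      "1080i/p (1920x1080)": 1920 * 1080,  # 2,073,600
  }
  sd = frames["480p (720x480)"]
  for name, px in frames.items():
      print(f"{name}: {px:,} px ({px / sd:.2f}x vs 720x480)")

Against 720x480 that works out to roughly 2.7x for 720p and 6x for
1080p, which is where the "3 to 6 times" range comes from.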

The other half of my point is that graphics can improve massively even
within standard definition / 640x480 resolution. Look at all the
prerendered CGI television programs, network logos, commercials and
high-budget movies (or sequences in movies) that you've watched on a
standard definition screen, even played back on old, super-low-definition
VHS; their graphics are FAR superior to the best X360/PS3/PC games
running at the highest HDTV resolution, or higher in the case of PCs.
An example would be the old game Quake 3: you can run that game at
incredibly high resolutions, well beyond 1080p, yet it's still the same
old crude, low-polygon graphics. You can say the same about more recent
PC games like Doom 3 and Half-Life 2, as well as the newest and upcoming
games. Running the same game at a higher resolution doesn't improve the
actual graphics. At the same time, you have newer titles that have both
HD resolution and better graphics, but the improved graphics are not
because the resolution is HD. HD only shows more of what's already
there. As another example, take a modern game like Half-Life 2 or
Quake 4 and run it at 640x480; it still has better graphics than
Quake 3 running at 4000x2000 or whatever resolution.

My final point: now that both PCs and consoles can run games at HD
resolutions, I hope resolution doesn't increase again for another 20
years or more. I hope the hardware / platform providers let developers
create much better-looking games that run at smoother framerates. The
move to HD resolution really hurt that progress with the current
consoles (I'm not talking about the Wii).

  #2  
Old August 18th 07, 09:10 PM posted to alt.games.video.sony-playstation2,alt.games.video.sony-playstation3,alt.games.video.xbox,alt.tv.tech.hdtv,comp.sys.ibm.pc.games.action
NobodyImportant

AirRaid wrote:
[snip]



I agree with most of what you said. Graphics are different from resolution.
However, IMO they both contribute equally to the overall look of a game.
Throw Half-Life 2 in your PC drive and fire it up. Put every detail up to
the max (textures and shading, anti-aliasing and anisotropic filtering,
everything up on quality) but run it at 640x480. Then compare that to
2048x1536 with nothing extra turned on. I would be interested to compare
screenshots of those two setups, but alas I am too lazy; my point is, I
venture to say that the OVERALL look of the game wouldn't differ a great deal.
Regardless, you need both. I agree 100% that this pursuit of constantly
driving up the resolution is the wrong direction, but by the same token I
feel that staying put for two decades just to concentrate on the rest is
just as bad.
  #3  
Old August 18th 07, 10:43 PM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
Jordan

On Aug 18, 11:47 am, AirRaid wrote:

HD only shows more of what's already there. As another example, take a
modern game like Half-Life 2 or Quake 4 and run it at 640x480; it still
has better graphics than Quake 3 running at 4000x2000 or whatever
resolution.


This is something I've been saying for years, which most people still
don't get.

It all depends on the native resolution the game's art is created at. If
you have a texture that is created at 640x480, it's ALWAYS going to
be 640x480 regardless of what resolution you display it at. Choosing
to display an Atari 2600 game at 1080p is not going to miraculously
make it an HD image. You can't add data that's not there.
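
That point can be made concrete with a toy example (a minimal Python
sketch, purely illustrative and not from the post itself): nearest-neighbor
upscaling multiplies the pixel count but adds no information.

  # Toy demonstration: nearest-neighbor upscaling a 2x2 "texture" to
  # 4x4. Four times the pixels, but the set of distinct values (the
  # actual information) is unchanged.
  texture = [[10, 20],
             [30, 40]]

  def upscale_nearest(img, factor):
      # Each output pixel just copies the nearest source pixel.
      return [[img[y // factor][x // factor]
               for x in range(len(img[0]) * factor)]
              for y in range(len(img) * factor)]

  big = upscale_nearest(texture, 2)
  for row in big:
      print(row)  # [10, 10, 20, 20], [10, 10, 20, 20], [30, 30, 40, 40], ...
  print(sorted({v for row in big for v in row}))  # still just [10, 20, 30, 40]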

Fortunately for folks without an HDTV, they still get the benefit of
higher resolution graphics. Look at a flick like King Kong. I can't
begin to imagine what resolution they rendered the movie at, but it's
going to look great regardless of what TV set you show it on. Yes, if
you have an HDTV and HD media it's going to look better, because you're
many steps closer to the native resolution, but it's not going to look
"bad" at any lower res.

- Jordan

  #4  
Old August 19th 07, 04:27 AM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
RMZ

I understand where you're going with this and I don't disagree
entirely, but I want to make some counterpoints; see below.

On Aug 18, 1:47 pm, AirRaid wrote:
I read time and time again forum posters and articles from the gaming and
non-gaming press about high definition. It seems to me most of them
can't or don't understand that high definition / HD resolution does not
mean better graphics in games. Better graphics and HD resolutions
are two entirely separate things.


If by "better graphics" you mean 3D rendering capabilities, then yes.

The exception, of course, is that the resolution required to produce a
quality image is relative to screen size and display capabilities. For
example, on a Nintendo DS, PSP, video iPod, etc. you can get away
with lower resolutions without degrading the image quality. When you
play a standard-definition game system like the PS1, PS2 or Gamecube
on a non-digital, analog television, these systems look better
than they do on most HDTV sets, because the analog sets can't produce a
high enough quality image to reveal all the flaws. That sounds strange, I
know, but that's exactly what happens; it's also why non-HD television
shows look especially bad on an HDTV.

If you take a game system like the PS1 or PS2, where most of the games
are designed around non-HD specs, and play those games on an HD-capable
TV set, you notice more of the flaws for the same reason. It's like
putting a magnifying glass up to a newspaper comic strip: you can see
the ugly details, and the problem becomes more pronounced the larger the
screen gets. There's also quite a bit more color depth to HD. If you own
an HDTV, then having a game console that can output HD matters a whole
lot; otherwise the games won't look so great.

If you have a 42"-60" (or whatever size) widescreen HDTV, you're going
to need a higher resolution image to give you the same perceived quality
as you would get from a non-HD image on an analog TV. But the image you
end up with is much higher resolution, so the added detail is there, and
since the display is HD it's going to shine through.

The display size and viewing distance determine how much perceived
quality difference your eyes can detect between analog TV and digital
HD, but there is a huge difference with the right size screen and the
correct viewing distance.
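
As a rough illustration of that size/distance tradeoff, here is a minimal
Python sketch; the 42" 16:9 panel, the 8 ft (96") viewing distance and the
1-arcminute acuity figure are assumed example numbers, not figures from
this thread. A pixel that subtends less than about one arcminute is at
the limit of normal vision.

  import math

  # How big one pixel looks (in arcminutes) from a given distance.
  def pixel_arcminutes(diag_in, horiz_pixels, distance_in):
      width_in = diag_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
      pitch_in = width_in / horiz_pixels           # width of one pixel
      return math.degrees(2 * math.atan(pitch_in / (2 * distance_in))) * 60

  for name, px in [("480p", 640), ("720p", 1280), ("1080p", 1920)]:
      print(f"{name}: {pixel_arcminutes(42, px, 96):.2f} arcmin per pixel at 8 ft")

On those assumptions, 480p pixels (about 2 arcminutes) are plainly
resolvable at couch distance on a 42" screen, while 1080p pixels (about
0.7 arcminutes) fall below the acuity threshold.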



With Xbox 360 and PS3, what you have are two consoles with 99% or more
of their games running in one (or more) of the HD resolutions: 720p,
1080i, 1080p. This gives 3 to 6 times the number of pixels on screen
compared to 480i/480p. On top of that, the actual graphics complexity,
detail, lighting, textures, shaders, special effects, etc. are all
somewhat better


The lighting, shaders, special effects, etc. are at least twice as
complex. In fact, with regard to shaders this is quite easy to measure
from the Xbox 1 to the Xbox 360, as Microsoft implements something called
Pixel Shader in Direct3D (the API used to program the system). The
video hardware in the 360 is about 5x more capable than the previous
generation at producing real-time lighting effects, bump mapping, etc.
You can look at the specs of the hardware from ATI and measure the
distance; it's drastic.

than on the previous generation of consoles, but it doesn't feel like a
generational leap, especially since most current-gen games run at
30fps while many of the best previous-gen games ran at 60fps in SD
with less detail. Yet those games look more appealing, because a
smooth framerate makes a great deal of difference.


Where did you find that most games run at 30fps? This isn't true.
Perhaps on the PS3.

The problem with the X360/PS3 is that so much of their horsepower goes
into running games at 3 to 6 times the resolution that there is not much
extra performance left over to improve the actual graphics. There is
some, but like I said, not enough to seem like a generational leap over
the last-gen PS2/GCN/Xbox.


Oh, it's very, very pronounced. You are absolutely wrong. Go play
Burnout: Revenge on the PS2 and then try the Xbox 360 version on an
HDTV. One of my family members who loves Burnout: Revenge on PS2 came
over, and his jaw dropped when he saw the 360 version of the game. He
asked me, "does it look that amazing because you have an HDTV, or would
it look that good on my set?" I couldn't answer him, because I've
never tried the 360 on a non-HD television. At this point in your
post you've angered me, because... well, you're spewing bull**** that
is not true. My six-year-old can see the difference.

To anyone reading this bs, I can challenge you with two games:
Burnout: Revenge and the movie tie-in Transformers: The Game. I own a
PS2 and a 360, and I own both of these games on both platforms; there
is a world of realism in the 360 versions not obtainable in the PS2
versions. It comes from a combination of improved lighting effects,
improved resolution and much improved character models, and the
performance of the games matches. There is no noticeably better
framerate; in fact, if anything, Burnout: Revenge gets a higher
framerate on the 360 at 720p.

Anyone can prove this. In fact, I would say go find a Gamestop or
some other retailer that has multiple systems hooked up and is willing
to let you try out games. If they have used copies of Burnout: Revenge
for the PS2 and 360, they will let you put this to the test, and you'll
see the original poster here is way off the mark.

The powerful multi-core Xenon and Cell CPUs are mainly used to run more
advanced physics and A.I. calculations and to keep track of more players
and objects on and off screen in the game world. Those CPUs can help the
graphics processors with graphics, but only at the "front end" of the
rendering process (the initial calculations and setup), not the shading,
rendering and display of graphics, which are the sole responsibility of
the graphics processors (Xenos and RSX respectively).
What I am saying, in a nutshell, is that neither Xenos nor RSX is a
massive leap beyond the graphics processors in the last-gen consoles,
the way the graphics processors in the last-gen consoles *were* a
massive leap over the PS1 / Nintendo 64 generation.


I agree with this for the most part, except where it sounds like you are
implying the GPUs in these new systems just aren't that big of a leap;
that's where you're wrong. You're wrong not only by the specs on paper,
but again, you can look at cross-platform games like Burnout and
immediately see a very, very clear difference, a difference where you
never want to touch the PS2 version again.

Aside from the processor leaps, the GPU in the Xbox 360 is also a
massive leap over the PS2 and original Xbox; see the Pixel Shader 3.0
spec for Direct3D devices. You're way off the mark. I could post the
spec differences, but suffice it to say the leap in graphics hardware
from the original Xbox (which was actually a bit better than the PS2,
but not by much) to the Xbox 360 is very significant.

Now, the GPU in the PS3 is a different story. It's a tad less capable
than the one found in the 360, although not enough to make a
pronounced difference in the hands of the right developer.


The other half of my point is that graphics can improve massively even
within standard definition / 640x480 resolution. Look at all the
prerendered CGI television programs, network logos, commercials and
high-budget movies (or sequences in movies) that you've watched on a
standard definition screen, even played back on old, super-low-definition
VHS; their graphics are FAR superior to the best X360/PS3/PC games
running at the highest HDTV resolution, or higher in the case of PCs.



Well, this is an interesting idea, and there is a good point there. You
want less horsepower devoted to resolution and more shifted to better
quality visuals. But the visuals on the PS3 and 360 are a generation
ahead of the PS2 in terms of what's possible with character models,
lighting, etc. This is where we disagree, but we have games that exist
on both the PS2 and 360 that prove it; you should try some of them
before holding so tightly to this opinion. Again, two in recent memory
(there are a few) would be Burnout: Revenge and Transformers: The Game.
After you've played the "next-gen" versions of these titles on the 360,
you will be very disappointed in the PS2's graphics in these titles. You
can see a very distinct generation gap in quality outside of resolution.
To me this fact invalidates your point, but it is a passionate point.

The premise that they could take the extra horsepower devoted to HD
resolutions and instead give us better quality low-resolution visuals
is true, but it won't happen. You said 99% of the games for PS3 and
360 are HD; no, 100% of them are, because both Microsoft and Sony are
committed to providing HD quality this generation, and they are forcing
all games on their consoles (first party and third party) to support HD
and to be designed with widescreen HD in mind.


An example would be the old game Quake 3: you can run that game at
incredibly high resolutions, well beyond 1080p, yet it's still the same
old crude, low-polygon graphics. You can say the same about more recent
PC games like Doom 3 and Half-Life 2, as well as the newest and upcoming
games. Running the same game at a higher resolution doesn't improve the
actual graphics.



True to some extent. If you have a 42" LCD display for your PC monitor
and you run Doom 3 at 640x480 at 60fps, then fire up Quake 2 at
1920x1080 at 60fps on the same setup, you would see so much pixelation
in Doom 3 that many would probably argue Quake 2 had the better graphics
of the two. The pixels at 640x480 would be pronounced at that large a
screen size.

So again, the importance of resolution is always linked to the size of
the display, and in terms of HD, to its capability for detail as well,
since even a smaller, say 32", HDTV will show all the flaws of an
upconverted analog 480i or 480p signal compared to an analog 32"
non-HDTV.




  #5  
Old August 19th 07, 06:26 PM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
jgarbuz

Just two points:

First, higher resolution doesn't mean much if your eyesight is weak. Forget
about screens; if you look at anything in real life, the clarity will depend
a lot on how sharp your eyesight is. To improve it, you need glasses or
laser surgery.
Second, the larger the screen, the higher the resolution you will need just
to maintain the same sharpness. On a small TV or monitor, the pixels are
closer together, so the picture looks sharper. If you take the same
resolution to a large-screen TV, it will obviously be less clear unless you
have more pixels, i.e. higher resolution. If you have a 49" screen or
larger, you will certainly need higher definition (more pixels per square
inch) than DVD provides, or else it will be blurrier than on a 32" or
smaller screen. Larger screens drive the need for higher definition video
standards.
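
Putting rough numbers on that point (a minimal Python sketch; the specific
screen sizes are illustrative assumptions): the same 1920x1080 frame gets
less dense, and therefore softer up close, as the panel grows.

  import math

  # Linear pixel density (pixels per inch) of a fixed 1920x1080 frame
  # on different 16:9 screen sizes.
  def ppi(horiz, vert, diag_in):
      return math.hypot(horiz, vert) / diag_in

  for size in (32, 49, 60):
      print(f'{size}" 1080p panel: {ppi(1920, 1080, size):.0f} ppi')

A 49" panel carries the same 1080p frame at roughly two thirds the pixel
density of a 32" panel, which is exactly why bigger screens push toward
higher-definition standards.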



"AirRaid" wrote in message
ups.com...
[snip]




--
Posted via a free Usenet account from http://www.teranews.com

  #6  
Old August 19th 07, 08:47 PM posted to alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
Eric[_4_]


"AirRaid" wrote in message
ups.com...
[snip]


No argument here. I've had an HDTV for several years now. I didn't get it
just for consoles, but of course I used them (XB, PS2, GC) with it. I
posted about a year and a half ago about how I later ended up getting an
EDTV specifically for those consoles. The EDTV, being the best physical
match for the previous generation, looked better (to my eyes) than the
HDTV. The HDTV was simply physically too good. Sure, it can do witchcraft
like upscaling, but in the end the data for those additional lines is
either there or it isn't. My eyes never saw much difference in 720p over
480p with the XB games that supported it; they didn't have the texturing
to really take advantage of the resolution. The few games that could do
1080i on the XB got away with it because the textures were very bland.
The PS2 had its pseudo-1080i gimmick with GT4. The GC, of course, was
just 480p. (The GC does look great at 480p, though.)

Like watching SD on an HDTV, my biggest complaint with the previous
generation was that games that were photoreal (Forza, GT4, etc.) on an
SDTV lost their "photoreality" on an HDTV. Like you said, all the flaws
become visible. In fact, I think they were deliberately using "flaws" to
create artifacting on a 480i SDTV, which you don't get on an HDTV because
it is too good. The EDTV, with only 480 lines but capable of progressive
video, was the perfect match for the previous generation for my eyes.

My expectation for the current generation was simply for the XB360/PS3 to
deliver the same perception of "photorealism" that the previous generation
had at 480i on an SDTV, but to be capable of presenting it on an HDTV. I
have only played the Forza demo for the XB360, but it seems close to
meeting my expectation. This is a major improvement in "graphics", but I
think we are at a point now where exponential leaps in technology no
longer translate into exponential leaps in "graphics". Perception is also
coming into play now. Like you said, there is still a tremendous amount of
room for improvement, but the hill to Hollywood-CGI-like "graphics" is
going to be a long, slow slope...


  #7  
Old August 19th 07, 09:58 PM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
Ds Bukkake

AirRaid wrote:
high definition / HD resolution does not
mean better graphics in games.


DUDE !

I WOULD LOVE TO PLAY PONG IN 1080P !!!!
  #8  
Old August 20th 07, 12:38 AM posted to alt.games.video.sony-playstation2,alt.games.video.sony-playstation3,alt.games.video.xbox,alt.tv.tech.hdtv,comp.sys.ibm.pc.games.action
AirRaid

On Aug 18, 2:10 pm, NobodyImportant wrote:
[snip]




I also agree with most of the things you're saying. Increasing the
resolution by 3x to 6x definitely improves the visual clarity of
whatever is already there. If the game already has outstanding
graphics, animation, lighting, shaders, textures, image quality, etc.,
all of that will be enhanced a great deal by running it at one of the HD
resolutions. As for what the HD resolutions actually do provide, they do
make a large difference in resolution; even the low end of HD, 720p,
provides a large difference. Now that this current generation of
consoles has made the shift to HD resolutions, I very much hope that
the next-gen consoles in 5-6 years use the same resolutions but
massively improve what is possible in the actual in-game graphics
themselves. Next-gen consoles probably won't be using much of their
horsepower to reach even 1080p, unlike this gen, where even 720p is a
large drain on CPU, GPU and RAM resources. My hope is that next-gen
games in 5-6 years rival, in realtime, the kind of prerendered CGI
graphics used for last-gen game intros and cut-scenes. We're NOT
talking raytracing or anything radically different, just a great deal
more performance for graphics. And graphics are only one aspect of
games, certainly not the most important, but they still make games stand
out and draw people in deeper so they can experience what really
counts: an immersive, fun, challenging gameplay experience.

  #9  
Old August 20th 07, 01:01 AM posted to alt.games.video.sony-playstation3,comp.sys.ibm.pc.games.action,alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
Conor

In article . com,
AirRaid felt he had to say
I read time and time again forum posters and articles from the gaming and
non-gaming press about high definition. It seems to me most of them
can't or don't understand that high definition / HD resolution does not
mean better graphics in games. Better graphics and HD resolutions
are two entirely separate things. With Xbox 360 and PS3, what you
have are two consoles with 99% or more of their games running in one (or
more) of the HD resolutions: 720p, 1080i, 1080p. This gives 3 to 6
times the number of pixels on screen compared to 480i/480p. On top of
that, the actual graphics complexity, detail, lighting, textures,
shaders, special effects, etc. are all somewhat better than on the
previous generation of consoles, but it doesn't feel like a
generational leap,


You must be ****ing blind. When was the last time you played a PS2
game? My kids still use a PS2, and the difference compared to the 360
is massive.

--
Conor

I'm not prejudiced. I hate everyone equally.
  #10  
Old August 20th 07, 01:20 AM posted to alt.tv.tech.hdtv,alt.games.video.xbox,alt.games.video.sony-playstation2
Eric[_5_]


"Conor" wrote in message
...
[snip]


You must be ****ing blind. When was the last time you played a PS2
game? My kids still use a PS2, and the difference compared to the 360
is massive.


He isn't blind at all. Everything he said is straight on. Try this
comparison: PS2's Gran Turismo 4 (which does have good texturing,
lighting, etc.) on a standard definition 480i TV versus an XB360 game
with not-so-good texturing, lighting, etc. on an HDTV...

GT4 on the SDTV at 480i (my eyes find interlacing to be beneficial for
some games that seem to take advantage of artifacting flaws) gives a more
immersive perception of photorealism...





 







