HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   UK digital tv (http://www.homecinemabanter.com/forumdisplay.php?f=5)
-   -   Best tv for pc resolution (http://www.homecinemabanter.com/showthread.php?t=53692)

Roderick Stewart September 26th 07 12:27 PM

Best tv for pc resolution
 
In article , Colin Stamp wrote:
I thought I was agreeing that it was nonsense to state that a 15-pin VGA
analogue video output would have difficulty handling HD video. In reality I
would expect it to have no difficulty at all.


You're agreeing, yet you haven't tried it any more than he has.


I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it
looking at quite small text. I can't even be sure without checking the cables
whether it is fed from the digital or analogue output from the computer,
because I have tried both and cannot see any difference at all.

1920x1080 is practically the same number of pixels as 1600x1200, so pretty
much the same frequency range would be required to handle it as an analogue
signal.


It's the horizontal resolution that really matters, not the vertical,
so it's 20% higher.


That's close enough not to matter. Typically the frequency response of a video
amplifier and 2 metres of cable (or, for that matter, 20 metres of cable) will
exceed the expected frequency range of the video signal by a very generous
margin indeed, in percentage terms perhaps several hundred, so a mere 20%
difference in the upper frequency limits of two signals is of no consequence.
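Roderick's "same frequency range" argument is easy to sanity-check with arithmetic. The sketch below compares rough pixel clocks for the two modes; the 60 Hz refresh, the ~30% blanking overhead, and the rule of thumb that the top video frequency is about half the pixel clock are textbook approximations, not figures from the thread:

```python
# Rough bandwidth comparison for the two modes under discussion.
# Assumptions (not from the thread): 60 Hz refresh, ~30% blanking
# overhead, top video frequency ~ half the pixel clock.

def pixel_clock_mhz(h_pixels, v_pixels, refresh_hz=60, blanking=1.3):
    """Approximate pixel clock in MHz for a given display mode."""
    return h_pixels * v_pixels * refresh_hz * blanking / 1e6

for h, v in [(1600, 1200), (1920, 1080)]:
    clk = pixel_clock_mhz(h, v)
    print(f"{h}x{v}: pixel clock ~{clk:.0f} MHz, "
          f"top video frequency ~{clk / 2:.0f} MHz")

ratio = pixel_clock_mhz(1920, 1080) / pixel_clock_mhz(1600, 1200)
print(f"1920x1080 needs ~{(ratio - 1) * 100:.0f}% more bandwidth "
      f"at the same refresh rate")
```

At the same refresh rate the two modes differ by only about 8% in total pixel rate (the 20% figure applies to pixels per line, but 1600x1200 has more lines per frame), which if anything strengthens the "close enough not to matter" point.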

The fact that a 15-pin analogue output shows not the slightest hint
of problems handling 1600x1200 leads me to expect it would handle the other
easily.


Expect, but not know.


My expectations are based on decades of practical experience using, and
occasionally designing and building, video equipment of various types, and
connecting it via cables of various lengths. In this context, if I say I would
"expect" two very similar video signals to behave in a very similar manner when
fed through two metres of cable, then I might be guilty of understatement to
avoid appearing arrogant, but it's actually pretty close to a certainty. I may
not have seen this exact situation, but I have seen plenty of equivalent ones.

Performance does not look in the least "hit-and-miss". Analogue video
amplifiers can be made which exceed the required performance by a generous
margin.


It's hit-and-miss in that some systems will work and some won't. If
you have one that works then that's wonderful - for you and you alone.
But then again, we don't know that it works even for you.


Well, I know that it works for me, because I'm looking at it. Design of video
amplifiers and impedance matching of cables is not hit-and-miss at all. There
are ways of doing these things properly, and they are done routinely by
broadcasters who convey their signals through hundreds of miles of cables, some
of the circuits changing by the day, or by the hour, depending on the needs of
particular programmes. If a video amplifier works with one monitor but not
another, or if it handles one video signal but not another in which the upper
frequency limit is only 20% different, then it's badly designed.

I've even seen one 1380x720 setup that's pixel-perfect until the PC
starts to warm up, then starts to smear after half an hour or so.


If a video amplifier gives a performance that deteriorates as it warms up, then
it's *very* badly designed. I'd suspect a fault in the monitor first.

Flat-panel displays are *much* less forgiving than CRTs in this
respect. The output *has* to be pixel-perfect or it shows up really
badly.


True, but I don't think this can have anything to do with the video amplifier
or the cable.

Rod.


Roderick Stewart September 26th 07 12:27 PM

Best tv for pc resolution
 
In article , Colin Stamp wrote:
All it needs is a properly designed video output stage
and the correct impedance cable. If your video output stage cannot
manage 2 metres of cable, there is something wrong with it, or it's the
wrong cable.

But you're thinking back in the analogue realm, when a bit of timing
skew didn't really matter one way or the other because it'd just get
lost in the CRT displays where the number of phosphor dots exceeded
the resolution of the incoming signal. With a flat-panel, if the
signal moves by more than half a pixel's width, it'll turn up really
visibly in the pixel next-door.

Apologies if the above doesn't make sense. I've had a few...


It makes perfect sense, and yes, when considering a VGA analogue signal
being fed through a cable, I am thinking in the analogue realm. However, I
can't think of anything in a properly designed video amplifier that could
change enough to cause pixels to "move", and have never seen this. As long
as I choose the pixel size of the PC output video signal to match the
physical pixel size of the display, there never seems to be a problem.

Rod.


Colin Stamp September 26th 07 09:17 PM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 11:17:05 +0100, "Paul D.Smith"
wrote:

I'm afraid you've been led up the garden path by Dr Hfuhruhurr.
The PC display on my TV is perfect. The text is nearly unreadable
because I sit too far away from it.


Well now, that wasn't exactly made clear now, was it?

Doc


Got to agree with the good doctor here. The fact that you want to read the
text from across the room was far from obvious.


How obvious do you need it? Here's the paragraph you both just
misread...

Incidentally, if you want to be able to read the text at 1080p,
you'll need to sit really close, or get a really big TV. Our 32 inch
TV is only 720p and, from a "normal" TV viewing distance, the text is
borderline unreadable.


Still seems pretty clear to me. I suppose I must just be strange...


If you want to be able to read text from a distance, I'd suggest either
increasing the font size or investing in a data projector so you can get a
"6 foot screen".


Again you've read something into it that was never there. It was an
observation, not a complaint. I don't want to be able to read the text
from a distance, I've just noticed that it's difficult and I thought
the OP might benefit from the knowledge before selecting his own TV.

Cheers,

Colin.

Colin Stamp September 26th 07 09:17 PM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 11:27:23 +0100, Roderick Stewart
wrote:

In article , Colin Stamp wrote:
All it needs is a properly designed video output stage
and the correct impedance cable. If your video output stage cannot
manage 2 metres of cable, there is something wrong with it, or it's the
wrong cable.

But you're thinking back in the analogue realm, when a bit of timing
skew didn't really matter one way or the other because it'd just get
lost in the CRT displays where the number of phosphor dots exceeded
the resolution of the incoming signal. With a flat-panel, if the
signal moves by more than half a pixel's width, it'll turn up really
visibly in the pixel next-door.

Apologies if the above doesn't make sense. I've had a few...


It makes perfect sense, and yes, when considering a VGA analogue signal
being fed through a cable, I am thinking in the analogue realm. However, I
can't think of anything in a properly designed video amplifier that could
change enough to cause pixels to "move",


If the bandwidth is limited, the rise and fall times will both extend,
"moving" both edges of the pixel to the right on the screen. The
effect would obviously be very slight, but then you've only got half a
1920 pixel's width to play with.
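Colin's edge-smearing point can be sketched numerically. A first-order channel's 10-90% rise time is roughly 0.35/bandwidth, and 148.5 MHz is the standard pixel clock for 1920x1080 at 60 Hz; the rest is a back-of-envelope illustration, not a measurement:

```python
# How much channel bandwidth keeps a bandwidth-limited edge within
# half a pixel at 1080p? Uses the first-order rise-time rule of thumb
# t_rise ~ 0.35 / bandwidth (a textbook approximation).

pixel_clock_hz = 148.5e6                 # standard 1920x1080 @ 60 Hz clock
pixel_period_ns = 1e9 / pixel_clock_hz   # ~6.7 ns per pixel
half_pixel_ns = pixel_period_ns / 2      # the "half a pixel" budget

for bw_mhz in (100, 200, 400):
    rise_ns = 0.35 / (bw_mhz * 1e6) * 1e9
    verdict = ("edge stays within half a pixel" if rise_ns < half_pixel_ns
               else "edge smears into the next pixel")
    print(f"{bw_mhz} MHz channel: rise ~{rise_ns:.2f} ns "
          f"(budget {half_pixel_ns:.2f} ns) -> {verdict}")
```

On these assumptions a channel needs comfortably over 100 MHz of bandwidth before edges stop encroaching on the neighbouring pixel, which is consistent with both positions in the thread: well within reach of a properly designed output stage, but not guaranteed by a cheap one.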

and have never seen this. As long
as I choose the pixel size of the PC output video signal to match the
physical pixel size of the display, there never seems to be a problem.


It's great that *you* haven't had problems up to 1600 horizontal (Is
that on CRT or LCD, by the way?). It doesn't guarantee that others
won't have problems at 1920 horizontal though.

Cheers,

Colin.

Colin Stamp September 26th 07 09:17 PM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 08:45:38 +0100, Andrew wrote:

On Tue, 25 Sep 2007 20:19:09 +0100, Colin Stamp
wrote:

You're agreeing, yet you haven't tried it any more than he has.


Thousands of people use 1080P via VGA on XBox 360's.


OK, I'll take your word for it that they're all pixel-perfect, so
that's exactly *one* VGA source that has been made to work at 1080p.

Now, what about all the thousands of other VGA sources?

Cheers,

Colin.

Colin Stamp September 26th 07 09:17 PM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 11:27:22 +0100, Roderick Stewart
wrote:

In article , Colin Stamp wrote:
I thought I was agreeing that it was nonsense to state that a 15-pin VGA
analogue video output would have difficulty handling HD video. In reality I
would expect it to have no difficulty at all.


You're agreeing, yet you haven't tried it any more than he has.


I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it
looking at quite small text. I can't even be sure without checking the cables
whether it is fed from the digital or analogue output from the computer,
because I have tried both and cannot see any difference at all.


We've been through this before. You only have one system, you haven't
said whether it's LCD or CRT, and 1600 is significantly less than
1920.


1920x1080 is practically the same number of pixels as 1600x1200, so pretty
much the same frequency range would be required to handle it as an analogue
signal.


It's the horizontal resolution that really matters, not the vertical,
so it's 20% higher.


That's close enough not to matter. Typically the frequency response of a video
amplifier and 2 metres of cable (or, for that matter, 20 metres of cable) will
exceed the expected frequency range of the video signal by a very generous
margin indeed, in percentage terms perhaps several hundred, so a mere 20%
difference in the upper frequency limits of two signals is of no consequence.


Yep, analogue VGA was no doubt designed to have a very generous
bandwidth - for 640 pixels horizontally. No doubt it's been improved a
lot since then, but there's still a limit, and given the myriad of
different graphics cards and displays, it's an extremely woolly limit.
The grey area certainly extends into the gap between 1920 and 1600,
and probably below.


The fact that a 15-pin analogue output shows not the slightest hint
of problems handling 1600x1200 leads me to expect it would handle the other
easily.


Expect, but not know.


My expectations are based on decades of practical experience using, and
occasionally designing and building, video equipment of various types, and
connecting it via cables of various lengths. In this context, if I say I would
"expect" two very similar video signals to behave in a very similar manner when
fed through two metres of cable, then I might be guilty of understatement to
avoid appearing arrogant, but it's actually pretty close to a certainty. I may
not have seen this exact situation, but I have seen plenty of equivalent ones.


But you still haven't seen *every* combination of the thousands of
video cards on the market plugged into the thousands of 1080 TVs on
the market via VGA interfaces. In fact, you don't appear to have seen
any at all.


Performance does not look in the least "hit-and-miss". Analogue video
amplifiers can be made which exceed the required performance by a generous
margin.


It's hit-and-miss in that some systems will work and some won't. If
you have one that works then that's wonderful - for you and you alone.
But then again, we don't know that it works even for you.


Well, I know that it works for me, because I'm looking at it. Design of video
amplifiers and impedance matching of cables is not hit-and-miss at all. There
are ways of doing these things properly, and they are done routinely by
broadcasters who convey their signals through hundreds of miles of cables, some
of the circuits changing by the day, or by the hour, depending on the needs of
particular programmes. If a video amplifier works with one monitor but not
another, or if it handles one video signal but not another in which the upper
frequency limit is only 20% different, then it's badly designed.


And you don't think these "badly designed" ones are out there in
significant numbers then? The vast bulk of video cards haven't been
designed by broadcasters.


I've even seen one 1380x720 setup that's pixel-perfect until the PC
starts to warm up, then starts to smear after half an hour or so.


If a video amplifier gives a performance that deteriorates as it warms up, then
it's *very* badly designed. I'd suspect a fault in the monitor first.

Nope. It's not the TV. The effect is independent of the TV
temperature, but dependent on the PC temperature. I guess the video
card is a bit crap. It was cheap, after all. It's disappointing that it
doesn't work perfectly at 1380 horizontal, but not a huge surprise.
Incidentally, the same PC worked apparently perfectly at 1600X1200 on
a CRT monitor for some time before being switched to TV duty.


Flat-panel displays are *much* less forgiving than CRTs in this
respect. The output *has* to be pixel-perfect or it shows up really
badly.


True, but I don't think this can have anything to do with the video amplifier
or the cable.


Of course it does. The interface can be a bit dodgy and it'll just
marginally reduce the sharpness on a CRT screen - nobody will be any
the wiser. Use that same interface to drive an LCD display at the same
resolution, and the deficiency will be really obvious.

Cheers,

Colin

Colin Stamp September 26th 07 10:04 PM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 20:36:31 +0100, Mike Henry
wrote:

In , Colin Stamp
wrote:

On Wed, 26 Sep 2007 11:27:22 +0100, Roderick Stewart
wrote:

My expectations are based on decades of practical experience using, and
occasionally designing and building, video equipment of various types, and
connecting it via cables of various lengths. In this context, if I say I would
"expect" two very similar video signals to behave in a very similar manner when
fed through two metres of cable, then I might be guilty of understatement to
avoid appearing arrogant, but it's actually pretty close to a certainty. I may
not have seen this exact situation, but I have seen plenty of equivalent ones.


But you still haven't seen *every* combination of the thousands of
video cards on the market plugged into the thousands of 1080 TVs on
the market via VGA interfaces. In fact, you don't appear to have seen
any at all.


Oh dear. You must be new here. I'd take anything Roderick says above
anything you say at any time.


Well it sure beats listening to the arguments and forming an opinion,
I suppose.

Cheers,

Colin.

Jane T September 26th 07 11:03 PM

Best tv for pc resolution
 

The bottom line is, the more detail you want on your screen, the
bigger the screen will need to be, or the closer you will need to sit.
It sounds really obvious, I know, but I bet it's often not thought of
until after the small, high-resolution screen gets installed ten feet
away from the sofa, and the PC boots for the first time...


OK, for your 32" TV, what is the optimum distance to read small text? At what
distance is it OK, and at what distance is it unacceptable?

An example to explain what I mean is, you might say: the optimum distance
is 5 foot; at 4 foot and 6 foot I can read the text OK; at >=7 foot I
can't read the text; and at <=3 foot, although I can read the text, I am sat
too close to the TV.



Colin Stamp September 27th 07 12:28 AM

Best tv for pc resolution
 
On Wed, 26 Sep 2007 22:03:56 +0100, "Jane T" wrote:


The bottom line is, the more detail you want on your screen, the
bigger the screen will need to be, or the closer you will need to sit.
It sounds really obvious, I know, but I bet it's often not thought of
until after the small, high-resolution screen gets installed ten feet
away from the sofa, and the PC boots for the first time...


OK, for your 32" TV, what is the optimum distance to read small text? At what
distance is it OK, and at what distance is it unacceptable?

An example to explain what I mean is, you might say: the optimum distance
is 5 foot; at 4 foot and 6 foot I can read the text OK; at >=7 foot I
can't read the text; and at <=3 foot, although I can read the text, I am sat
too close to the TV.

Well, it's all subjective, of course, but I'd say there's no point
sitting closer than 1M - the individual sub-pixels are discernible at
that point.

Everything is comfortably readable at 2M.

Where I normally sit, at 3M it's definitely borderline. I find myself
guessing a bit, which works fine for things like menus and icon
titles, but not so well for text I haven't read before. Also, I need
good contrast to have a chance, so I have to lean forward for
greyed-out menu options, or text on busy backgrounds.

At 4M, any reading is mostly guesswork.

The resolution is 1360x768 and I use the normal size (96 DPI) fonts,
which makes the standard text for things like folder listings, menus
etc. around 4.5mm high. My eyes are unmodified and 40 years old.
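Colin's distances line up with a standard optometry rule of thumb: a 20/20 Snellen letter subtends about 5 arcminutes. The 4.5mm text height is the figure from the post; the 5-arcminute threshold is an assumption about typical eyesight, not something Colin measured:

```python
import math

# Angular size of 4.5mm-high text at Colin's four viewing distances,
# compared against the ~5 arcminute Snellen 20/20 rule of thumb.

TEXT_MM = 4.5  # text height reported in the post

for dist_m in (1, 2, 3, 4):
    arcmin = math.degrees(math.atan(TEXT_MM / (dist_m * 1000))) * 60
    verdict = "readable" if arcmin >= 5 else "below 20/20 threshold"
    print(f"{dist_m} m: text subtends {arcmin:.1f} arcmin ({verdict})")
```

The numbers track the post's subjective report closely: comfortably above threshold at 1-2 m, right on the ~5 arcminute borderline at 3 m, and below it at 4 m, where reading becomes guesswork.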

Cheers,

Colin.

Roderick Stewart September 27th 07 01:50 AM

Best tv for pc resolution
 
In article , Colin Stamp wrote:
I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it
looking at quite small text. I can't even be sure without checking the cables
whether it is fed from the digital or analogue output from the computer,
because I have tried both and cannot see any difference at all.


We've been through this before. You only have one system, you haven't
said whether it's LCD or CRT, and 1600 is significantly less than
1920.


You're right. Checking back I see that I didn't actually specify the type of
monitor in those terms, though I did say that my *monitor* was 1600 x 1200, which
would not be the case if it was a CRT, because the displayed resolution would
depend on the signal fed to it. Evidently that didn't make it as clear as I assumed
it would. Sorry for the misunderstanding. It is actually an LCD type with a
physical pixel structure of 1600 x 1200, so of course best results are obtained by
presenting it with a video signal with that pixel size. It does not make the
slightest difference that I can see whether I use the digital or analogue signal,
and I have looked very carefully because I was interested to see if there was one.

However, I would dispute "significantly". Yes there will be a difference in the
frequency response required of the video amplifier, but it is not significant. I
haven't worked out exactly what it would be, but you suggested 20% and that sounds
about right. Hardly enough to make a difference unless the video amplifier had a
fairly steep HF filter somewhere between the upper limits of the two video signals,
which would be highly unusual as the filter itself would do more damage than
anything else.

I also wonder how you think you can say that I have only one system. We are not
acquainted outside this newsgroup, so how could you possibly know? The best
experiment I've been able to try was on my main system, as described, because both
the graphic card and the monitor can handle digital and analogue signals, so it
really is an "all other things being equal" situation, but it is not the only
system I have, and certainly not the only one I have seen. I actually have three
working computers at the moment, but must have built dozens over the years, for
myself and other people. I am familiar with the behaviour of CRT and LCD monitors,
video amplifiers, and various ways of connecting them together, and have used them
for a variety of purposes.

If a video amplifier works with one monitor but not
another, or if it handles one video signal but not another in which the upper
frequency limit is only 20% different, then it's badly designed.


And you don't think these "badly designed" ones are out there in
significant numbers then? The vast bulk of video cards haven't been
designed by broadcasters.


I'm sure there's a lot of rubbish out there, but it doesn't tell us anything
fundamental about the relative merits of digital versus analogue connections
between computers and monitors through 2 metres of cable. If something doesn't work
very well because it's badly designed, then that's the reason, not the fact that
it's analogue, or whatever.

If a video amplifier gives a performance that deteriorates as it warms up, then
it's *very* badly designed. I'd suspect a fault in the monitor first.

Nope. It's not the TV. The effect is independent of the TV
temperature, but dependent on the PC temperature. I guess the video
card is a bit crap. It was cheap, after all. It's disappointing that it
doesn't work perfectly at 1380 horizontal, but not a huge surprise.


Well, there's your answer - not a very good video card.

Incidentally, the same PC worked apparently perfectly at 1600X1200 on
a CRT monitor for some time before being switched to TV duty.


I'd suggest that agrees with the above.

True, but I don't think this can have anything to do with the video amplifier
or the cable.


Of course it does. The interface can be a bit dodgy and it'll just
marginally reduce the sharpness on a CRT screen - nobody will be any
the wiser. Use that same interface to drive an LCD display at the same
resolution, and the deficiency will be really obvious.


Assuming both monitors - LCD and CRT - are capable of displaying the full bandwidth
of a given video signal, can you explain the mechanism which will make a reduction
in HF of the signals fed to them more apparent on one display than the other? I
can't think how this could happen, and have never seen the effect.

Rod.



All times are GMT +1.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com