HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   UK digital tv (http://www.homecinemabanter.com/forumdisplay.php?f=5)
-   -   Best tv for pc resolution (http://www.homecinemabanter.com/showthread.php?t=53692)

Roderick Stewart September 27th 07 01:50 AM

Best tv for pc resolution
 
In article , Colin Stamp wrote:
It makes perfect sense, and yes, when considering a VGA analogue signal
being fed through a cable, I am thinking in the analogue realm. However, I
can't think of anything in a properly designed video amplifier that could
change enough to cause pixels to "move",


If the bandwidth is limited, the rise and fall times will both extend,
"moving" both edges of the pixel to the right on the screen. The
effect would obviously be very slight, but then you've only got half of a
pixel's width to play with at 1920.


Won't the rise time of horizontal sync be affected in exactly the same way?

Rod.


Roderick Stewart September 27th 07 01:50 AM

Best tv for pc resolution
 
In article , Colin Stamp
wrote:
Well, it's all subjective, of course, but I'd say there's no point
sitting closer than 1 m - the individual sub-pixels are discernible at
that point.

Everything is comfortably readable at 2 m.


I find the most comfortable place to sit to watch TV is on the sofa.

Rod.


Colin Stamp September 27th 07 07:52 PM

Best tv for pc resolution
 
On Thu, 27 Sep 2007 00:50:26 +0100, Roderick Stewart
wrote:

In article , Colin Stamp wrote:
It makes perfect sense, and yes, when considering a VGA analogue signal
being fed through a cable, I am thinking in the analogue realm. However, I
can't think of anything in a properly designed video amplifier that could
change enough to cause pixels to "move",


If the bandwidth is limited, the rise and fall times will both extend,
"moving" both edges of the pixel to the right on the screen. The
effect would obviously be very slight, but then you've only got half of a
pixel's width to play with at 1920.


Won't the rise time of horizontal sync be affected in exactly the same way?

Not necessarily. It's obviously generated in a different way to the
video signals so it may be subject to different distortions before it
gets to the line driver. Then it has to share its ground pin with the
other sync signal whereas the video signals each have their own
ground, so it's quite possible that the wiring and/or driver will be
different too. Then we can move to the receiving end of the link...
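
For scale, here's a back-of-envelope sketch of the margin in question
(Python; the 148.5 MHz pixel clock is the standard 1080p60 figure, and
rise time = 0.35/bandwidth is the usual first-order rule for a
single-pole system, so treat the numbers as rough):

# Rough timing margin for 1080p over an analogue VGA link.
# Assumes the standard 148.5 MHz pixel clock for 1080p60 and the
# first-order rule t_rise ~= 0.35 / bandwidth for a single-pole system.

PIXEL_CLOCK_HZ = 148.5e6
half_pixel_ns = 0.5e9 / PIXEL_CLOCK_HZ   # half a pixel period, in ns

for bw_mhz in (400, 200, 100):
    rise_ns = 0.35 / (bw_mhz * 1e6) * 1e9
    verdict = "inside" if rise_ns < half_pixel_ns else "eats"
    print(f"{bw_mhz:3d} MHz bandwidth: {rise_ns:4.2f} ns rise vs "
          f"{half_pixel_ns:4.2f} ns half-pixel ({verdict} the margin)")

So a link that's comfortable at lower resolutions starts eating the
whole half-pixel allowance once the chain rolls off around 100 MHz.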

Cheers,

Colin.

Colin Stamp September 27th 07 07:52 PM

Best tv for pc resolution
 
On Thu, 27 Sep 2007 00:50:25 +0100, Roderick Stewart
wrote:

In article , Colin Stamp wrote:
I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it*
looking at quite small text. I can't even be sure without checking the cables*
whether it is fed from the digital or analogue output from the computer,*
because I have tried both and cannot see any difference at all.


We've been through this before. You only have one system, you haven't
said whether it's LCD or CRT, and 1600 is significantly less than
1920.


You're right. Checking back I see that I didn't actually specify the type of
monitor in those terms, though I did say that my *monitor* was 1600 x 1200, which
would not be the case if it was a CRT, because the displayed resolution would
depend on the signal fed to it. Evidently that didn't make it as clear as I assumed
it would. Sorry for the misunderstanding.


It was indeed a bit too obscure for me. Apologies for missing it.

It is actually an LCD type


Well, that's that cleared up then.


However, I would dispute "significantly". Yes there will be a difference in the
frequency response required of the video amplifier, but it is not significant. I
haven't worked out exactly what it would be, but you suggested 20% and that sounds
about right. Hardly enough to make a difference unless the video amplifier had a
fairly steep HF filter somewhere between the upper limits of the two video signals,
which would be highly unusual as the filter itself would do more damage than
anything else.


I haven't worked out the bandwidth difference either. It's 20% more
pixels per line. You'll have to agree that there is an upper limit to
the horizontal resolution that the interface can handle reliably. We
can also safely assume that the limit varies wildly from one
implementation to another.


I also wonder how you think you can say that I have only one system. We are not
acquainted outside this newsgroup, so how could you possibly know?


Ah, it was an assumption on my part, based on you not having mentioned
any others. I was only referring to systems which are (supposedly)
relevant to this discussion. I certainly didn't mean to belittle your
overall computing power.

The best
experiment I've been able to try was on my main system, as described, because both
the graphics card and the monitor can handle digital and analogue signals, so it
really is an "all other things being equal" situation, but it is not the only
system I have, and certainly not the only one I have seen. I actually have three
working computers at the moment, but must have built dozens over the years, for
myself and other people. I am familiar with the behaviour of CRT and LCD monitors,
video amplifiers, and various ways of connecting them together, and have used them
for a variety of purposes.


It doesn't need to be an all-other-things-equal job. If it works on
VGA then that's one point for the "VGA always works" camp, but it's
still not 1920 pixels per line.




If a video amplifier works with one monitor but not
another, or if it handles one video signal but not another in which the upper
frequency limit is only 20% different, then it's badly designed.


And you don't think these "badly designed" ones are out there in
significant numbers then? The vast bulk of video cards haven't been
designed by broadcasters.


I'm sure there's a lot of rubbish out there, but it doesn't tell us anything
fundamental about the relative merits of digital versus analogue connections
between computers and monitors through 2 metres of cable. If something doesn't work
very well because it's badly designed, then that's the reason, not the fact that
it's analogue, or whatever.


This is the crux of our disagreement, and it seems to be a bogus one. I'm
not saying, and never have said, anything about the relative merits of
analogue versus digital video connections in general.

I'm talking about one particular interface, which became the de facto
standard for PC monitors many years ago, and has been independently
developed by numerous manufacturers over the years. In the beginning,
there were only CRT displays, so the interface was designed without a
pixel-sync. Each entire line of pixels has to fall exactly into the
correct slots (using accurate timing alone) for a flat-panel to
display it properly. This was never an issue with CRTs.
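
To put numbers on how blind the panel is (a toy calculation in Python;
the 1080p60 figures are the standard CEA timing, while the
"non-standard" source totals are hypothetical):

# Why a flat panel has to guess the pixel clock on analogue VGA.
# It only sees H-sync, so it multiplies that up with a PLL using an
# assumed number of clocks per line. The 1080p60 figures are the
# standard CEA timing; the non-standard source totals are hypothetical.

H_SYNC_HZ = 67_500        # 1080p60 line rate
ASSUMED_TOTAL = 2200      # clocks per line the panel's PLL assumes
H_ACTIVE = 1920           # visible pixels per line

print(f"Regenerated clock: {H_SYNC_HZ * ASSUMED_TOTAL / 1e6:.1f} MHz")

for source_total in (2200, 2112, 2080):
    # Samples the panel takes while the source sends its 1920 pixels.
    samples = ASSUMED_TOTAL * H_ACTIVE / source_total
    print(f"source uses {source_total} clocks/line: "
          f"{samples - H_ACTIVE:+6.1f} px sampling error across the line")

A mismatch of even a handful of clocks scrambles the grid completely,
which is why every VGA-fed panel has to auto-detect the mode and phase.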

All this has resulted in a very untidy situation where the VGA
interface can work in some situations but not others. Hence my
original comment that it doesn't really work properly. It's not
because it's analogue, it's because nobody expected it to be used for
displays with discrete pixels.

Like you say, there's a lot of rubbish out there, but it's real-world
rubbish and we have to take that into account when selecting displays
and signal sources that must be inter-operable.

If I were to buy a 1080 TV and a PC which had to work together, I
would make sure the PC had a DVI output and the TV had a spare input
to plug it into. Wouldn't you do the same?


If a video amplifier gives a performance that deteriorates as it warms up, then
it's *very* badly designed. I'd suspect a fault in the monitor first.

Nope. It's not the TV. The effect is independent of the TV
temperature, but dependent on the PC temperature. I guess the video
card is a bit crap. It was cheap, after all. It's disappointing that it
doesn't work perfectly at 1380 horizontal, but not a huge surprise.


Well, there's your answer - not a very good video card.


Indeed. Not very good and probably not very atypical.


Incidentally, the same PC worked apparently perfectly at 1600x1200 on
a CRT monitor for some time before being switched to TV duty.


I'd suggest that agrees with the above.


I'd suggest that it supports my assertion that a slightly ropey video
signal can show up more on a display with discrete pixels.


True, but I don't think this can have anything to do with the video amplifier
or the cable.


Of course it does. The interface can be a bit dodgy and it'll just
marginally reduce the sharpness on a CRT screen - nobody will be any
the wiser. Use that same interface to drive an LCD display at the same
resolution, and the deficiency will be really obvious.


Assuming both monitors - LCD and CRT - are capable of displaying the full bandwidth
of a given video signal, can you explain the mechanism which will make a reduction
in HF of the signals fed to them more apparent on one display than the other? I
can't think how this could happen, and have never seen the effect.


Well, if you want to concentrate only on bandwidth, then reducing the
HF of the signal to an LCD will, after a certain point, cause
individual pixels to interact with their neighbours. This is very
noticeable on displays which are expected to be razor sharp, with
perfectly rectangular pixels. On a CRT it'll produce a softening
effect which many people might not notice, or might happily tolerate.
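
If you want to see the mechanism in action, here's a crude simulation
(Python with numpy; a single-pole low-pass stands in for the whole
bandwidth-limited chain, which is obviously a big simplification):

import numpy as np

# Crude model: a bandwidth-limited analogue link feeding an LCD.
# A single-pole low-pass stands in for the whole chain; the "display"
# then samples once per pixel, at the centre of each pixel period.
# Illustrative only - real links have more complicated responses.

OVERSAMPLE = 16                          # sub-samples per pixel period
pixels = np.array([0, 1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
signal = np.repeat(pixels, OVERSAMPLE)   # ideal rectangular pixel train

def lowpass(x, cutoff_rel):
    """Single-pole IIR low-pass; cutoff_rel = cutoff / pixel rate."""
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_rel / OVERSAMPLE)
    out, acc = np.empty_like(x), 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        out[i] = acc
    return out

for cutoff in (4.0, 1.0, 0.5):           # cutoff as multiple of pixel rate
    filtered = lowpass(signal, cutoff)
    # Sample at the centre of each pixel, as a perfectly-aligned LCD would.
    sampled = filtered[OVERSAMPLE // 2::OVERSAMPLE]
    print(f"cutoff {cutoff:3.1f}x pixel rate:", np.round(sampled, 2))

With the cutoff well above the pixel rate, the sampled values land on
0 or 1 and the panel reconstructs the pattern exactly; at half the
pixel rate each sample is contaminated by its neighbours. The same
filtered signal on a CRT just looks a bit soft.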

But, of course, there are many more distortions which might affect the
signal other than bandwidth limitation. Now that I think about it further,
the system that goes dodgy when the PC warms up is probably down to a
cheap crystal oscillator on the graphics card drifting with
temperature. Something that a pixel sync would completely sort out,
and something which would only show up on a CRT as a minute change in
picture size.
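
Rough numbers for that theory (a quick Python sketch; the 100 ppm
drift is a guess at what a cheap uncompensated crystal might do over
temperature, and the 40 cm CRT width is just illustrative):

# How the same clock drift shows up on each display type.
# The 100 ppm drift and 40 cm CRT width are illustrative guesses,
# not measured values.

DRIFT_PPM = 100
H_ACTIVE = 1380          # the mode that goes dodgy on my system

# CRT: the whole raster scales by the same fraction, so the picture
# just grows very slightly.
print(f"CRT width change: {400 * DRIFT_PPM * 1e-6:.2f} mm")

# Fixed-pixel display sampling on a clock regenerated from H-sync:
# the error accumulates along each line, so the right-hand edge ends
# up sampled off-centre.
print(f"Sampling offset at right edge: {H_ACTIVE * DRIFT_PPM * 1e-6:.2f} px")

A seventh of a pixel isn't much on its own, but added to a phase
setting that's already marginal it's enough to push pixels into the
wrong slots - while on the CRT it's a few hundredths of a millimetre.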

Cheers,

Colin.

Colin Stamp September 27th 07 07:52 PM

Best tv for pc resolution
 
On Thu, 27 Sep 2007 00:50:27 +0100, Roderick Stewart
wrote:

In article , Colin Stamp
wrote:
Well, it's all subjective, of course, but I'd say there's no point
sitting closer than 1 m - the individual sub-pixels are discernible at
that point.

Everything is comfortably readable at 2 m.


I find the most comfortable place to sit to watch TV is on the sofa.

Ah, that's where I've been going wrong then. I suspected there must be
a more comfortable place than the top of the sideboard.

Cheers,

Colin.

ThePunisher September 30th 07 09:39 PM

Best tv for pc resolution
 
Colin Stamp wrote:
On Mon, 24 Sep 2007 22:29:57 GMT, "ThePunisher"
wrote:


What a load of ********.


Indeed you are.

Cheers,

Colin.


So all those years computer monitors have been using 15 pin connectors to do
better than 1080p, they haven't been doing a good job of it?

--
ThePunisher



ThePunisher September 30th 07 09:40 PM

Best tv for pc resolution
 
Andrew wrote:
On Tue, 25 Sep 2007 20:19:09 +0100, Colin Stamp
wrote:

You're agreeing, yet you haven't tried it any more than he has.


Thousands of people use 1080p via VGA on Xbox 360s.


And some people use 1080p over component, Colin Stamp-collector has no idea
what he's talking about.

--
ThePunisher



Colin Stamp September 30th 07 10:41 PM

Best tv for pc resolution
 
On Sun, 30 Sep 2007 19:39:00 GMT, "ThePunisher"
wrote:

Colin Stamp wrote:
On Mon, 24 Sep 2007 22:29:57 GMT, "ThePunisher"
wrote:


What a load of ********.


Indeed you are.

Cheers,

Colin.


So all those years computer monitors have been using 15 pin connectors to do
better than 1080p, they haven't been doing a good job of it?


Well, thank you for finally voicing your concerns. Your earlier grunt
didn't give me much to go on, I'm afraid.

Presumably you have direct experience of these, just as I have direct
experience that the interface becomes increasingly unreliable as the
horizontal resolution rises, starting well below 1920 pixels.

My experience of this phenomenon is confined to flat-panel displays
though, which are the only ones relevant today. I suggest you read the
rest of the thread to educate yourself as to why this is important.
I'll give you a clue - there's no pixel-sync on an analogue VGA
interface.

Cheers,

Colin.

Colin Stamp September 30th 07 10:49 PM

Best tv for pc resolution
 
On Sun, 30 Sep 2007 19:40:07 GMT, "ThePunisher"
wrote:

Andrew wrote:
On Tue, 25 Sep 2007 20:19:09 +0100, Colin Stamp
wrote:

You're agreeing, yet you haven't tried it any more than he has.


Thousands of people use 1080p via VGA on Xbox 360s.


And some people use 1080p over component


And these are *all* pixel-perfect (as a PC display needs to be), are
they?


, Colin Stamp-collector

How old are you? The last time I heard that, I was still in primary
school.

has no idea what he's talking about.


Ooh, I feel so punished.

Cheers,

Colin.

[email protected] October 1st 07 10:48 AM

Best tv for pc resolution
 
On 30 Sep, 21:41, Colin Stamp wrote:
On Sun, 30 Sep 2007 19:39:00 GMT, "ThePunisher"
wrote:


So all those years computer monitors have been using 15 pin connectors to do
better than 1080p, they haven't been doing a good job of it?


Well, thank you for finally voicing your concerns. Your earlier grunt
didn't give me much to go on, I'm afraid.

Presumably you have direct experience of these, just as I have direct
experience that the interface becomes increasingly unreliable as the
horizontal resolution rises, starting well below 1920 pixels.

My experience of this phenomenon is confined to flat-panel displays
though, which are the only ones relevant today. I suggest you read the
rest of the thread to educate yourself as to why this is important.
I'll give you a clue - there's no pixel-sync on an analogue VGA
interface.


1280x1024x75fps=98M pixels/sec
1920x1080x50fps=104M pixels/sec

The former is what I'm running right now. VGA connection. Samsung
SyncMaster 192V. Pixel perfect.

I _have_ seen VGA-to-LCD connections which weren't pixel perfect. I agree it's
possible (easy?) to get it wrong. However, these Samsungs have some
kind of auto-calibration which matches the pixels of the input signal
to those on the display. It seems to work consistently across the
various PCs with various Samsung panels throughout our office. The
exception is for the people who insist on running at 1024x768!

btw, you seem convinced that it's the horizontal pixel count that
matters - if you think for one second about the analogue signal,
you'll realise it's the pixel clock that defines the bandwidth
required. This is roughly proportional to the pixel count per second
(not exactly, because blanking and sync vary with display mode in a
not-quite-proportional way on "old" modes).
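
To make that concrete (Python; the blanking totals are the published
VESA/CEA timings as I remember them, so treat them as approximate):

# Active pixel rate vs. the pixel clock that actually sets the
# analogue bandwidth. Totals include blanking; the figures are the
# published VESA/CEA timings as I remember them - approximate.

modes = [
    # name,           h_act, v_act, fps, h_tot, v_tot
    ("1280x1024@75",  1280,  1024,  75,  1688,  1066),
    ("1920x1080@50",  1920,  1080,  50,  2640,  1125),
]

for name, ha, va, fps, ht, vt in modes:
    active = ha * va * fps / 1e6          # Mpixels/s of picture
    clock = ht * vt * fps / 1e6           # MHz the link must carry
    print(f"{name}: {active:5.1f} Mpix/s active, {clock:6.1f} MHz pixel clock")

So the 1080p50 mode asks for about 10% more bandwidth than the SXGA
mode even though the active pixel rates differ by only about 6% - the
blanking is what breaks the proportionality.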

FWIW many of the visible differences between VGA, YPbPr, RGB, and DVI/
HDMI on displays are due to the different ways these displays
interpret these different formats. They assume different gain, set-up,
gamma, and overscan depending on the input. This is much more of a
headache on TVs than simple analogue/digital issues.

Cheers,
David.


