Best tv for pc resolution
I want to purchase a new LCD TV which will be connected to a PC as well
as my Sky+. I want a 1080p TV, but I also want to get the maximum
resolution from the connected PC. The TVs I have looked at so far allow
a maximum resolution from the PC input of 1024x768. Is this the best I
can expect? TIA
Best tv for pc resolution
On Sep 24, 4:45 pm, "Adrian A" wrote:
> > I want to purchase a new LCD TV which will be connected to a PC as
> > well as my Sky+. I want a 1080p TV, but I also want to get the
> > maximum resolution from the connected PC. The TVs I have looked at
> > so far allow a maximum resolution from the PC input of 1024x768. Is
> > this the best I can expect? TIA
>
> No, the best you can expect is 1920x1080. Keep looking.

Ok thanks. In that case can anyone recommend a good model?
Best tv for pc resolution
On Sep 24, 4:45 pm, "Adrian A" wrote:

> > > I want to purchase a new LCD TV which will be connected to a PC as
> > > well as my Sky+. I want a 1080p TV, but I also want to get the
> > > maximum resolution from the connected PC. The TVs I have looked at
> > > so far allow a maximum resolution from the PC input of 1024x768.
> > > Is this the best I can expect? TIA
> >
> > No, the best you can expect is 1920x1080. Keep looking.
>
> Ok thanks. In that case can anyone recommend a good model?

Sony Bravia W or X series. I've got a 40" X series; it has a VGA input
and the PC resolution is superb (the panel is capable of 1080p but I'm
not sure what gets sent by the PC). I use the VGA input to view digital
photos from a laptop and for that it's great.

Cheers
Chas
Best tv for pc resolution
Colin Stamp wrote:
...snip...

> I think it's the PC that's your problem, not the TV. The 15-pin
> analogue VGA connector that's standard on PCs isn't really capable of
> doing 1080p resolution properly.

What a load of ********.

--
ThePunisher
Best tv for pc resolution
On 24 Sep, 20:39, Colin Stamp wrote:
...snip...

> I think it's the PC that's your problem, not the TV. The 15-pin
> analogue VGA connector that's standard on PCs isn't really capable of
> doing 1080p resolution properly. You can get PC graphics cards with
> DVI or even HDMI outputs, so you won't need to use the TV's ropey "PC"
> input. Then you should be able to drive the TV to its full resolution
> even if it quotes a lower value for PCs. Make sure you get a card that
> supports the TV's native resolution and a TV with enough inputs, so
> you can still plug in your DVD player, decoder etc. as well as the PC.
>
> Incidentally, if you want to be able to read the text at 1080p, you'll
> need to sit really close, or get a really big TV. Our 32 inch TV is
> only 720p and, from a "normal" TV viewing distance, the text is
> borderline unreadable.

Control Panel, Display, Settings, Advanced, DPI setting. Change to
suit. (Some poorly written programs won't understand, and some web
pages will stupidly override it, but most things work very well. I'm
using 120 DPI right now.)

Cheers,
David.
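Whether the text is readable really comes down to how large each pixel
looks from where you sit. A rough back-of-envelope sketch in Python
(the 1366x768 panel resolution and the viewing distances here are
illustrative assumptions, not figures anyone in the thread gave):

```python
import math

def pixel_arc_minutes(diagonal_in, h_px, distance_m):
    """Angle one pixel subtends, in arc-minutes, assuming a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from diagonal
    pitch_m = (width_in * 0.0254) / h_px              # pixel pitch in metres
    return math.degrees(math.atan(pitch_m / distance_m)) * 60

# A 32" 1366x768 panel from a ~3 m sofa vs a 52" 1920-wide panel close up.
# Normal visual acuity resolves roughly 1 arc-minute, so when pixels sit
# well below that, small fonts occupy too little angle to read.
print(round(pixel_arc_minutes(32, 1366, 3.0), 2))   # ≈ 0.59
print(round(pixel_arc_minutes(52, 1920, 1.0), 2))   # ≈ 2.06
```

On those assumed numbers, the 32" set's pixels fall well under the
acuity limit from the sofa, which fits the "borderline unreadable"
observation, while sitting close to a big panel leaves plenty of margin.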
Best tv for pc resolution
Colin Stamp wrote:
...snip...

> I think it's the PC that's your problem, not the TV. The 15-pin
> analogue VGA connector that's standard on PCs isn't really capable of
> doing 1080p resolution properly.

No. It's crappy tellies that can't do a decent res via their VGA
inputs... I've run 1600x1200 over VGA analogue with no probs...

> Incidentally, if you want to be able to read the text at 1080p, you'll
> need to sit really close, or get a really big TV. Our 32 inch TV is
> only 720p and, from a "normal" TV viewing distance, the text is
> borderline unreadable.

Then there is something wrong with your telly or the way it's connected
up... ;-) I'm driving a 52" 1080 Sharp LCD TV from my PC at 1920x1080
using a DVI to HDMI cable and it's pixel perfect. No problem reading
text at all, even small stuff.

Get a PC graphics card that does genuine HD 1920x1080 via a DVI
connector. Use a DVI to HDMI cable and go in to the telly via HDMI (NOT
PC/VGA). Make sure the telly is not resizing/overscanning; most do
(stupidly) even with a 'native' 1920x1080 signal.

Guy
Best tv for pc resolution
On 24 Sep, 20:39, Colin Stamp wrote:
...snip...

> I think it's the PC that's your problem, not the TV. The 15-pin
> analogue VGA connector that's standard on PCs isn't really capable of
> doing 1080p resolution properly.

Unfortunately this is totally untrue.

> You can get PC graphics cards with DVI or even HDMI outputs, so you
> won't need to use the TV's ropey "PC" input. Then you should be able
> to drive the TV to its full resolution even if it quotes a lower value
> for PCs.

No you won't.

> Make sure you get a card that supports the TV's native resolution and
> a TV with enough inputs, so you can still plug in your DVD player,
> decoder etc. as well as the PC.

Not necessary if you use 'powerstrip'.

> Incidentally, if you want to be able to read the text at 1080p, you'll
> need to sit really close, or get a really big TV. Our 32 inch TV is
> only 720p and, from a "normal" TV viewing distance, the text is
> borderline unreadable.

Nothing to do with how close you sit. Text is unreadable on your 32"
telly because it's only 720.

Doc
Best tv for pc resolution
Dr Hfuhruhurr wrote:

> > I think it's the PC that's your problem, not the TV. The 15-pin
> > analogue VGA connector that's standard on PCs isn't really capable
> > of doing 1080p resolution properly.
>
> Unfortunately this is totally untrue.

Agreed. My PC monitor is 1600x1200 and has both the standard 15-pin VGA
analogue input and a digital input. I've been perfectly happy with it
connected via 15-pin analogue, but the graphics card on my most recent
computer (built this year) has both types of output, giving me the
opportunity to try both, so just out of curiosity, I did. I can't see
any difference at all.

Rod.
Best tv for pc resolution
...snip...

> > Incidentally, if you want to be able to read the text at 1080p,
> > you'll need to sit really close, or get a really big TV. Our 32 inch
> > TV is only 720p and, from a "normal" TV viewing distance, the text
> > is borderline unreadable.
>
> Nothing to do with how close you sit. Text is unreadable on your 32"
> telly because it's only 720.

It's also probably running interlaced, which won't help. There are some
sites on the web that suggest you can run half the resolution but
non-interlaced into most TVs. Not tried it yet, but they claim the
picture is more readable.

Paul DS.
Best tv for pc resolution
...snip...

> > Incidentally, if you want to be able to read the text at 1080p,
> > you'll need to sit really close, or get a really big TV. Our 32 inch
> > TV is only 720p and, from a "normal" TV viewing distance, the text
> > is borderline unreadable.
>
> Nothing to do with how close you sit. Text is unreadable on your 32"
> telly because it's only 720.

Whoops - must also read closer. Your telly might support 720p but I'll
bet the computer is outputting 720i. Powerstrip can help if that is the
case, and the text should become far more readable. 720p is not far off
768p, which is what many older monitors use, and the text is perfectly
good on those.

Paul DS
Best tv for pc resolution
On Tue, 25 Sep 2007 12:58:11 +0100, Bigguy wrote:
> > I think it's the PC that's your problem, not the TV. The 15-pin
> > analogue VGA connector that's standard on PCs isn't really capable
> > of doing 1080p resolution properly.
>
> No. It's crappy tellies that can't do a decent res via their VGA
> inputs... I've run 1600x1200 over VGA analogue with no probs...

Well bully for you. 1600 is rather less than 1920 though, isn't it?
Presumably, if you manage to get your car to go at 120 mph, you will
assume that it, and all other cars, are actually capable of at least
144 mph. I notice you're not using a VGA cable for your 1080p telly, by
the way...

> > Incidentally, if you want to be able to read the text at 1080p,
> > you'll need to sit really close, or get a really big TV. Our 32 inch
> > TV is only 720p and, from a "normal" TV viewing distance, the text
> > is borderline unreadable.
>
> Then there is something wrong with your telly or the way it's
> connected up... ;-) I'm driving a 52" 1080 Sharp LCD TV from my PC at
> 1920x1080 using a DVI to HDMI cable and it's pixel perfect. No problem
> reading text at all, even small stuff.

Like I said, you'll need a really big TV.

> Use a DVI to HDMI cable and go in to the telly via HDMI (NOT PC/VGA).
> Make sure the telly is not resizing/overscanning; most do (stupidly)
> even with a 'native' 1920x1080 signal.

At last, we agree on something.

Cheers,
Colin.
Best tv for pc resolution
On Mon, 24 Sep 2007 22:29:57 GMT, "ThePunisher"
wrote:

> What a load of ********.

Indeed you are.

Cheers,
Colin.
Best tv for pc resolution
On Tue, 25 Sep 2007 12:07:18 -0000, Dr Hfuhruhurr
wrote:

> > I think it's the PC that's your problem, not the TV. The 15-pin
> > analogue VGA connector that's standard on PCs isn't really capable
> > of doing 1080p resolution properly.
>
> Unfortunately this is totally untrue.

Unfortunately, you either didn't read what I said, or you don't know
what you're talking about.

> > You can get PC graphics cards with DVI or even HDMI outputs, so you
> > won't need to use the TV's ropey "PC" input. Then you should be able
> > to drive the TV to its full resolution even if it quotes a lower
> > value for PCs.
>
> No you won't.

Why not?

> > Make sure you get a card that supports the TV's native resolution
> > and a TV with enough inputs, so you can still plug in your DVD
> > player, decoder etc. as well as the PC.
>
> Not necessary if you use 'powerstrip'.

> > Incidentally, if you want to be able to read the text at 1080p,
> > you'll need to sit really close, or get a really big TV. Our 32 inch
> > TV is only 720p and, from a "normal" TV viewing distance, the text
> > is borderline unreadable.
>
> Nothing to do with how close you sit.

Oh yes it is. I feel a little better qualified than you to comment on
this, since I live in the house but, to my knowledge, you've never even
visited.

> Text is unreadable on your 32" telly because it's only 720.

Superb! So you reckon that, if my 32" TV was 1080 rather than 720, the
text would be larger! I guess the laws of physics must be a little
different round your way.

Cheers,
Colin.
Best tv for pc resolution
On Tue, 25 Sep 2007 02:19:48 -0700, David wrote:

> Control Panel, Display, Settings, Advanced, DPI setting. Change to
> suit. (Some poorly written programs won't understand, and some web
> pages will stupidly override it, but most things work very well. I'm
> using 120 DPI right now.)

I've tried bigger DPI settings, but actually "borderline unreadable" is
fine for me. I don't normally try to read text from my TV viewing
position. Others might though, hence my comment to the OP.

Cheers,
Colin.
Best tv for pc resolution
On Tue, 25 Sep 2007 14:22:46 +0100, Roderick Stewart
wrote:

> > > I think it's the PC that's your problem, not the TV. The 15-pin
> > > analogue VGA connector that's standard on PCs isn't really capable
> > > of doing 1080p resolution properly.
> >
> > Unfortunately this is totally untrue.
>
> Agreed. My PC monitor is 1600x1200 and has both the standard 15-pin
> VGA analogue input and a digital input. I've been perfectly happy with
> it connected via 15-pin analogue, but the graphics card on my most
> recent computer (built this year) has both types of output, giving me
> the opportunity to try both, so just out of curiosity, I did. I can't
> see any difference at all.

What's agreed? That neither of you has tried 1920x1080 over a VGA
connection? Actually, it's probably possible to get it to work, but
it'll be hit-and-miss at best. Any TV manufacturer who claims to
support 1920x1080 on their VGA input is opening themselves up to loads
of complaints.

Cheers,
Colin
Best tv for pc resolution
On Tue, 25 Sep 2007 14:33:38 +0100, "Paul D.Smith"
wrote:

...snip...

> > > Incidentally, if you want to be able to read the text at 1080p,
> > > you'll need to sit really close, or get a really big TV. Our 32
> > > inch TV is only 720p and, from a "normal" TV viewing distance, the
> > > text is borderline unreadable.
> >
> > Nothing to do with how close you sit. Text is unreadable on your 32"
> > telly because it's only 720.
>
> It's also probably running interlaced, which won't help. There are
> some sites on the web that suggest you can run half the resolution but
> non-interlaced into most TVs. Not tried it yet, but they claim the
> picture is more readable.

I'm afraid you've been led up the garden path by Dr Hfuhruhurr. The PC
display on my TV is perfect. The text is nearly unreadable because I
sit too far away from it.

Cheers,
Colin.
Best tv for pc resolution
Colin Stamp wrote:
> > Text is unreadable on your 32" telly because it's only 720.
>
> Superb! So you reckon that, if my 32" TV was 1080 rather than 720, the
> text would be larger! I guess the laws of physics must be a little
> different round your way.

No, there would be more lines to display the font, making it clearer.

--
Immunity is better than inoculation.
Peter
Best tv for pc resolution
On Tue, 25 Sep 2007 14:22:46 +0100, Roderick Stewart
wrote:

> but the graphics card on my most recent computer (built this year) has
> both types of output, giving me the opportunity to try both, so just
> out of curiosity, I did. I can't see any difference at all.

Good heavens. I have two 20" displays at work. My last machine had
DVI+VGA on its graphics card, and the difference in the available
bandwidth was pretty obvious. I tried swapping the screens in case it
was an issue with one screen's input, but the problem stayed on one
side, indicating an issue with the VGA. Characters on the VGA-connected
screen were always slightly fuzzy - enough to make me do all text
editing (I write a lot of code) on the DVI-connected screen. My new
machine has quad DVI, and it made a big difference to usability to have
both screens on a decent connection.

VGA connection for 1600x1200 and above _is_ feasible, but I personally
wouldn't do it. The limitations of the technology really start to
become apparent.
Best tv for pc resolution
In article , Colin Stamp wrote:
> > Agreed. My PC monitor is 1600x1200 and has both the standard 15-pin
> > VGA analogue input and a digital input. I've been perfectly happy
> > with it connected via 15-pin analogue, but the graphics card on my
> > most recent computer (built this year) has both types of output,
> > giving me the opportunity to try both, so just out of curiosity, I
> > did. I can't see any difference at all.
>
> What's agreed? That neither of you has tried 1920x1080 over a VGA
> connection?

I thought I was agreeing that it was nonsense to state that a 15-pin
VGA analogue video output would have difficulty handling HD video. In
reality I would expect it to have no difficulty at all.

> Actually, it's probably possible to get it to work, but it'll be
> hit-and-miss at best.

1920x1080 is practically the same number of pixels as 1600x1200, so
pretty much the same frequency range would be required to handle it as
an analogue signal. The fact that a 15-pin analogue output shows not
the slightest hint of problems handling 1600x1200 leads me to expect it
would handle the other easily. Performance does not look in the least
"hit-and-miss". Analogue video amplifiers can be made which exceed the
required performance by a generous margin.

Rod.
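For what it's worth, the published mode timings bear this out. A quick
sketch using the standard VESA DMT and CEA-861 figures (quoted from the
standards; an individual card's mode list may of course differ):

```python
# Pixel clock = total pixels per line x total lines x refresh rate,
# where "total" includes the horizontal and vertical blanking intervals.
modes = {
    "1600x1200@60 (VESA DMT)": (2160, 1250, 60),  # h_total, v_total, Hz
    "1920x1080@60 (CEA-861)":  (2200, 1125, 60),
}
for name, (h_total, v_total, hz) in modes.items():
    print(f"{name}: {h_total * v_total * hz / 1e6:.1f} MHz pixel clock")
# 1600x1200@60 (VESA DMT): 162.0 MHz pixel clock
# 1920x1080@60 (CEA-861):  148.5 MHz pixel clock
```

So with standard blanking, 1600x1200@60 actually needs a slightly
higher analogue bandwidth than 1920x1080@60, which supports the point
that an amplifier comfortable with one should be comfortable with the
other.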
Best tv for pc resolution
Bob Moore wrote:

> VGA connection for 1600x1200 and above _is_ feasible, but I personally
> wouldn't do it. The limitations of the technology really start to
> become apparent.

What limitations? We're only considering a video amplifier that has to
drive a signal along a couple of metres of cable. I saw analogue
1250/50 and 1125/60 video from tube cameras fed to CRT monitors way
back in the 1980s, and the French were broadcasting 819/50 video
several decades before that, in the days of valve amplifiers. Broadcast
video signals are routinely fed along hundreds of metres of cables, and
in that situation the analogue ones can manage greater cable lengths
before the signal suffers. All it needs is a properly designed video
output stage and the correct impedance cable. If your video output
stage cannot manage 2 metres of cable, there is something wrong with
it, or it's the wrong cable.

Rod.
Best tv for pc resolution
On Tue, 25 Sep 2007 19:53:02 +0100, Roderick Stewart
wrote:

> > What's agreed? That neither of you has tried 1920x1080 over a VGA
> > connection?
>
> I thought I was agreeing that it was nonsense to state that a 15-pin
> VGA analogue video output would have difficulty handling HD video. In
> reality I would expect it to have no difficulty at all.

You're agreeing, yet you haven't tried it any more than he has.

> 1920x1080 is practically the same number of pixels as 1600x1200, so
> pretty much the same frequency range would be required to handle it as
> an analogue signal.

It's the horizontal resolution that really matters, not the vertical,
so it's 20% higher.

> The fact that a 15-pin analogue output shows not the slightest hint of
> problems handling 1600x1200 leads me to expect it would handle the
> other easily.

Expect, but not know.

> Performance does not look in the least "hit-and-miss". Analogue video
> amplifiers can be made which exceed the required performance by a
> generous margin.

It's hit-and-miss in that some systems will work and some won't. If you
have one that works then that's wonderful - for you and you alone. But
then again, we don't know that it works even for you. I've even seen
one 1380x720 setup that's pixel-perfect until the PC starts to warm up,
then starts to smear after half an hour or so. Flat-panel displays are
*much* less forgiving than CRTs in this respect. The output *has* to be
pixel-perfect or it shows up really badly.

Cheers,
Colin.
Best tv for pc resolution
On Tue, 25 Sep 2007 19:53:03 +0100, Roderick Stewart
wrote:

> > VGA connection for 1600x1200 and above _is_ feasible, but I
> > personally wouldn't do it. The limitations of the technology really
> > start to become apparent.
>
> What limitations? We're only considering a video amplifier that has to
> drive a signal along a couple of metres of cable. I saw analogue
> 1250/50 and 1125/60 video from tube cameras fed to CRT monitors way
> back in the 1980s, and the French were broadcasting 819/50 video
> several decades before that, in the days of valve amplifiers.
> Broadcast video signals are routinely fed along hundreds of metres of
> cables, and in that situation the analogue ones can manage greater
> cable lengths before the signal suffers. All it needs is a properly
> designed video output stage and the correct impedance cable. If your
> video output stage cannot manage 2 metres of cable, there is something
> wrong with it, or it's the wrong cable.

But you're thinking back in the analogue realm, when a bit of timing
skew didn't really matter one way or the other, because it'd just get
lost in the CRT displays where the number of phosphor dots exceeded the
resolution of the incoming signal. With a flat panel, if the signal
moves by more than half a pixel's width, it'll turn up really visibly
in the pixel next door.

Apologies if the above doesn't make sense. I've had a few...

Cheers,
Colin.
Best tv for pc resolution
On Tue, 25 Sep 2007 20:19:09 +0100, Colin Stamp
wrote:

> You're agreeing, yet you haven't tried it any more than he has.

Thousands of people use 1080p via VGA on Xbox 360s.

--
Andrew, contact via http://interpleb.googlepages.com
Help make Usenet a better place:
English is read downwards, please don't top post.
Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
Best tv for pc resolution
On 25 Sep, 18:04, Colin Stamp wrote:
...snip...

> > It's also probably running interlaced, which won't help. There are
> > some sites on the web that suggest you can run half the resolution
> > but non-interlaced into most TVs. Not tried it yet, but they claim
> > the picture is more readable.
>
> I'm afraid you've been led up the garden path by Dr Hfuhruhurr. The PC
> display on my TV is perfect. The text is nearly unreadable because I
> sit too far away from it.

Well now, that wasn't exactly made clear now, was it?

Doc
Best tv for pc resolution
> > I'm afraid you've been led up the garden path by Dr Hfuhruhurr. The
> > PC display on my TV is perfect. The text is nearly unreadable
> > because I sit too far away from it.
>
> Well now, that wasn't exactly made clear now, was it?
>
> Doc

Got to agree with the good doctor here. The fact that you want to read
the text from across the room was far from obvious. If you want to be
able to read text from a distance, I'd suggest either increasing the
font size or investing in a data projector so you can get a "6 foot
screen".

Paul DS.
Best tv for pc resolution
In article , Colin Stamp wrote:
> > I thought I was agreeing that it was nonsense to state that a 15-pin
> > VGA analogue video output would have difficulty handling HD video.
> > In reality I would expect it to have no difficulty at all.
>
> You're agreeing, yet you haven't tried it any more than he has.

I'm looking at a 1600x1200 screen right now. I'm sitting quite close to
it looking at quite small text. I can't even be sure without checking
the cables whether it is fed from the digital or analogue output from
the computer, because I have tried both and cannot see any difference
at all.

> > 1920x1080 is practically the same number of pixels as 1600x1200, so
> > pretty much the same frequency range would be required to handle it
> > as an analogue signal.
>
> It's the horizontal resolution that really matters, not the vertical,
> so it's 20% higher.

That's close enough not to matter. Typically the frequency response of
a video amplifier and 2 metres of cable (or, for that matter, 20 metres
of cable) will exceed the expected frequency range of the video signal
by a very generous margin indeed, in percentage terms perhaps several
hundred, so a mere 20% difference in the upper frequency limits of two
signals is of no consequence.

> > The fact that a 15-pin analogue output shows not the slightest hint
> > of problems handling 1600x1200 leads me to expect it would handle
> > the other easily.
>
> Expect, but not know.

My expectations are based on decades of practical experience using, and
occasionally designing and building, video equipment of various types,
and connecting it via cables of various lengths. In this context, if I
say I would "expect" two very similar video signals to behave in a very
similar manner when fed through two metres of cable, then I might be
guilty of understatement to avoid appearing arrogant, but it's actually
pretty close to a certainty. I may not have seen this exact situation,
but I have seen plenty of equivalent ones.

> > Performance does not look in the least "hit-and-miss". Analogue
> > video amplifiers can be made which exceed the required performance
> > by a generous margin.
>
> It's hit-and-miss in that some systems will work and some won't. If
> you have one that works then that's wonderful - for you and you alone.
> But then again, we don't know that it works even for you.

Well, I know that it works for me, because I'm looking at it. Design of
video amplifiers and impedance matching of cables is not hit-and-miss
at all. There are ways of doing these things properly, and they are
done routinely by broadcasters who convey their signals through
hundreds of miles of cables, some of the circuits changing by the day,
or by the hour, depending on the needs of particular programmes. If a
video amplifier works with one monitor but not another, or if it
handles one video signal but not another in which the upper frequency
limit is only 20% different, then it's badly designed.

> I've even seen one 1380x720 setup that's pixel-perfect until the PC
> starts to warm up, then starts to smear after half an hour or so.

If a video amplifier gives a performance that deteriorates as it warms
up, then it's *very* badly designed. I'd suspect a fault in the monitor
first.

> Flat-panel displays are *much* less forgiving than CRTs in this
> respect. The output *has* to be pixel-perfect or it shows up really
> badly.

True, but I don't think this can have anything to do with the video
amplifier or the cable.

Rod.
Best tv for pc resolution
In article , Colin Stamp wrote:
> > All it needs is a properly designed video output stage and the
> > correct impedance cable. If your video output stage cannot manage 2
> > metres of cable, there is something wrong with it, or it's the wrong
> > cable.
>
> But you're thinking back in the analogue realm, when a bit of timing
> skew didn't really matter one way or the other, because it'd just get
> lost in the CRT displays where the number of phosphor dots exceeded
> the resolution of the incoming signal. With a flat panel, if the
> signal moves by more than half a pixel's width, it'll turn up really
> visibly in the pixel next door.
>
> Apologies if the above doesn't make sense. I've had a few...

It makes perfect sense, and yes, when considering a VGA analogue signal
being fed through a cable, I am thinking in the analogue realm.
However, I can't think of anything in a properly designed video
amplifier that could change enough to cause pixels to "move", and I
have never seen this. As long as I choose the pixel size of the PC
output video signal to match the physical pixel size of the display,
there never seems to be a problem.

Rod.
Best tv for pc resolution
On Wed, 26 Sep 2007 11:17:05 +0100, "Paul D.Smith"
wrote:

> > > I'm afraid you've been led up the garden path by Dr Hfuhruhurr.
> > > The PC display on my TV is perfect. The text is nearly unreadable
> > > because I sit too far away from it.
> >
> > Well now, that wasn't exactly made clear now, was it?
>
> Got to agree with the good doctor here. The fact that you want to read
> the text from across the room was far from obvious.

How obvious do you need it? Here's the paragraph you both just
misread...

> Incidentally, if you want to be able to read the text at 1080p, you'll
> need to sit really close, or get a really big TV. Our 32 inch TV is
> only 720p and, from a "normal" TV viewing distance, the text is
> borderline unreadable.

Still seems pretty clear to me. I suppose I must just be strange...

> If you want to be able to read text from a distance, I'd suggest
> either increasing the font size or investing in a data projector so
> you can get a "6 foot screen".

Again you've read something into it that was never there. It was an
observation, not a complaint. I don't want to be able to read the text
from a distance; I've just noticed that it's difficult, and I thought
the OP might benefit from the knowledge before selecting his own TV.

Cheers,
Colin.
Best tv for pc resolution
On Wed, 26 Sep 2007 11:27:23 +0100, Roderick Stewart
wrote:

> It makes perfect sense, and yes, when considering a VGA analogue
> signal being fed through a cable, I am thinking in the analogue realm.
> However, I can't think of anything in a properly designed video
> amplifier that could change enough to cause pixels to "move",

If the bandwidth is limited, the rise and fall times will both extend,
"moving" both edges of the pixel to the right on the screen. The effect
would obviously be very slight, but then you've only got half a 1920
pixel's width to play with.

> and I have never seen this. As long as I choose the pixel size of the
> PC output video signal to match the physical pixel size of the
> display, there never seems to be a problem.

It's great that *you* haven't had problems up to 1600 horizontal (is
that on CRT or LCD, by the way?). It doesn't guarantee that others
won't have problems at 1920 horizontal though.

Cheers,
Colin.
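That half-pixel budget is tighter than it sounds. A quick sketch of the
numbers, assuming the standard 148.5 MHz pixel clock for 1920x1080 at
60 Hz:

```python
clock_hz = 148.5e6                  # standard 1080p60 pixel clock (CEA-861)
pixel_ns = 1e9 / clock_hz           # time one pixel spends on the wire
half_pixel_ns = pixel_ns / 2        # error budget before bleeding into the
                                    # neighbouring pixel on a flat panel
print(f"{pixel_ns:.2f} ns per pixel, {half_pixel_ns:.2f} ns half-pixel budget")
# 6.73 ns per pixel, 3.37 ns half-pixel budget
```

So an analogue chain only has to stretch edges by a few nanoseconds
before a pixel visibly lands in the one next door, which may be why a
marginal VGA link that looks fine at 1600 wide can fall over at 1920.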
Best tv for pc resolution
On Wed, 26 Sep 2007 08:45:38 +0100, Andrew wrote:
On Tue, 25 Sep 2007 20:19:09 +0100, Colin Stamp wrote: You're agreeing, yet you haven't tried it any more than he has. Thousands of people use 1080p via VGA on Xbox 360s. OK, I'll take your word for it that they're all pixel-perfect, so that's exactly *one* VGA source that has been made to work at 1080p. Now, what about all the thousands of other VGA sources? Cheers, Colin.
Best tv for pc resolution
On Wed, 26 Sep 2007 11:27:22 +0100, Roderick Stewart
wrote: In article , Colin Stamp wrote: I thought I was agreeing that it was nonsense to state that a 15-pin VGA analogue video output would have difficulty handling HD video. In reality I would expect it to have no difficulty at all. You're agreeing, yet you haven't tried it any more than he has. I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it looking at quite small text. I can't even be sure without checking the cables whether it is fed from the digital or analogue output from the computer, because I have tried both and cannot see any difference at all. We've been through this before. You only have one system, you haven't said whether it's LCD or CRT, and 1600 is significantly less than 1920. 1920x1080 is practically the same number of pixels as 1600x1200, so pretty much the same frequency range would be required to handle it as an analogue signal. It's the horizontal resolution that really matters, not the vertical, so it's 20% higher. That's close enough not to matter. Typically the frequency response of a video amplifier and 2 metres of cable (or, for that matter, 20 metres of cable) will exceed the expected frequency range of the video signal by a very generous margin indeed, in percentage terms perhaps several hundred, so a mere 20% difference in the upper frequency limits of two signals is of no consequence. Yep, analogue VGA was no doubt designed to have a very generous bandwidth - for 640 pixels horizontally. No doubt it's been improved a lot since then, but there's still a limit, and given the myriad of different graphics cards and displays, it's an extremely woolly limit. The grey area certainly extends into the gap between 1920 and 1600, and probably below. The fact that a 15-pin analogue output shows not the slightest hint of problems handling 1600x1200 leads me to expect it would handle the other easily. Expect, but not know.
My expectations are based on decades of practical experience using, and occasionally designing and building, video equipment of various types, and connecting it via cables of various lengths. In this context, if I say I would "expect" two very similar video signals to behave in a very similar manner when fed through two metres of cable, then I might be guilty of understatement to avoid appearing arrogant, but it's actually pretty close to a certainty. I may not have seen this exact situation, but I have seen plenty of equivalent ones. But you still haven't seen *every* combination of the thousands of video cards on the market plugged into the thousands of 1080 TVs on the market via VGA interfaces. In fact, you don't appear to have seen any at all. Performance does not look in the least "hit-and-miss". Analogue video amplifiers can be made which exceed the required performance by a generous margin. It's hit-and-miss in that some systems will work and some won't. If you have one that works then that's wonderful - for you and you alone. But then again, we don't know that it works even for you. Well, I know that it works for me, because I'm looking at it. Design of video amplifiers and impedance matching of cables is not hit-and-miss at all. There are ways of doing these things properly, and they are done routinely by broadcasters who convey their signals through hundreds of miles of cables, some of the circuits changing by the day, or by the hour, depending on the needs of particular programmes. If a video amplifier works with one monitor but not another, or if it handles one video signal but not another in which the upper frequency limit is only 20% different, then it's badly designed. And you don't think these "badly designed" ones are out there in significant numbers then? The vast bulk of video cards haven't been designed by broadcasters.
I've even seen one 1380X720 setup that's pixel-perfect until the PC starts to warm up, then starts to smear after half an hour or so. If a video amplifier gives a performance that deteriorates as it warms up, then it's *very* badly designed. I'd suspect a fault in the monitor first. Nope. It's not the TV. The effect is independent of the TV temperature, but dependent on the PC temperature. I guess the video card is a bit crap. It was cheap, after all. It's disappointing that it doesn't work perfectly at 1380 horizontal, but not a huge surprise. Incidentally, the same PC worked apparently perfectly at 1600X1200 on a CRT monitor for some time before being switched to TV duty. Flat-panel displays are *much* less forgiving than CRTs in this respect. The output *has* to be pixel-perfect or it shows up really badly. True, but I don't think this can have anything to do with the video amplifier or the cable. Of course it does. The interface can be a bit dodgy and it'll just marginally reduce the sharpness on a CRT screen - nobody will be any the wiser. Use that same interface to drive an LCD display at the same resolution, and the deficiency will be really obvious. Cheers, Colin
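For what it's worth, the 1600x1200-versus-1920x1080 bandwidth comparison being argued over can be checked against the published standard timings (VESA DMT for 1600x1200@60, CEA-861 for 1080p60). A quick sketch, taking the worst case of alternating on/off pixels:

```python
# Blanking intervals mean the pixel clock is higher than
# active_pixels * refresh alone would suggest, so the comparison
# uses the full standard totals.

MODES = {
    # name: (h_total, v_total, refresh_hz) from the published timing tables
    "1600x1200@60 (VESA DMT)": (2160, 1250, 60),
    "1920x1080@60 (CEA-861)":  (2200, 1125, 60),
}

results = {}
for name, (h_total, v_total, refresh) in MODES.items():
    pixel_clock = h_total * v_total * refresh
    # Worst case is alternating on/off pixels: one full cycle per
    # 2 pixels, so the fundamental sits at half the pixel clock.
    bandwidth = pixel_clock / 2
    results[name] = (pixel_clock, bandwidth)
    print(f"{name}: pixel clock {pixel_clock/1e6:.1f} MHz, "
          f"fundamental up to ~{bandwidth/1e6:.1f} MHz")
```

Counter-intuitively, once blanking is included the 1600x1200 mode actually has the slightly higher pixel clock (162 MHz against 148.5 MHz), which rather supports the point that the two modes place very similar demands on an analogue amplifier.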
Best tv for pc resolution
On Wed, 26 Sep 2007 20:36:31 +0100, Mike Henry
wrote: In , Colin Stamp wrote: On Wed, 26 Sep 2007 11:27:22 +0100, Roderick Stewart wrote: My expectations are based on decades of practical experience using, and occasionally designing and building, video equipment of various types, and connecting it via cables of various lengths. In this context, if I say I would "expect" two very similar video signals to behave in a very similar manner when fed through two metres of cable, then I might be guilty of understatement to avoid appearing arrogant, but it's actually pretty close to a certainty. I may not have seen this exact situation, but I have seen plenty of equivalent ones. But you still haven't seen *every* combination of the thousands of video cards on the market plugged into the thousands of 1080 TVs on the market via VGA interfaces. In fact, you don't appear to have seen any at all. Oh dear. You must be new here. I'd take anything Roderick says above anything you say at any time. Well it sure beats listening to the arguments and forming an opinion, I suppose. Cheers, Colin.
Best tv for pc resolution
The bottom line is, the more detail you want on your screen, the bigger the screen will need to be, or the closer you will need to sit. It sounds really obvious, I know, but I bet it's often not thought of until after the small, high-resolution screen gets installed ten feet away from the sofa, and the PC boots for the first time. Ok, your 32" TV: what is the optimum distance to read small text? And at what distance is it ok, and at what distance is it unacceptable? An example to explain what I mean is, you might say: the optimum distance is 5 foot, at 4 foot and 6 foot I can read the text ok, at 7 foot or more I can't read the text, and at 3 foot or less, although I can read the text, I am sat too close to the tv.
Best tv for pc resolution
On Wed, 26 Sep 2007 22:03:56 +0100, "Jane T" wrote:
The bottom line is, the more detail you want on your screen, the bigger the screen will need to be, or the closer you will need to sit. It sounds really obvious, I know, but I bet it's often not thought of until after the small, high-resolution screen gets installed ten feet away from the sofa, and the PC boots for the first time. Ok, your 32" TV: what is the optimum distance to read small text? And at what distance is it ok, and at what distance is it unacceptable? An example to explain what I mean is, you might say: the optimum distance is 5 foot, at 4 foot and 6 foot I can read the text ok, at 7 foot or more I can't read the text, and at 3 foot or less, although I can read the text, I am sat too close to the tv. Well, it's all subjective, of course, but I'd say there's no point sitting closer than 1M - the individual sub-pixels are discernible at that point. Everything is comfortably readable at 2M. Where I normally sit, at 3M it's definitely borderline. I find myself guessing a bit, which works fine for things like menus and icon titles, but not so well for text I haven't read before. Also, I need good contrast to have a chance, so I have to lean forward for greyed-out menu options, or text on busy backgrounds. At 4M, any reading is mostly guesswork. The resolution is 1360X768 and I use the normal size (96 DPI) fonts, which makes the standard text for things like folder listings, menus etc. around 4.5mm high. My eyes are unmodified and 40 years old. Cheers, Colin.
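Those distance figures line up fairly well with a simple acuity estimate. A sketch, taking the 32-inch 1360x768 figures from the post and assuming the textbook 1-arcminute limit of normal vision (an assumption, not anything measured here):

```python
import math

# At what distance does one pixel of a 32" 1360x768 panel subtend
# 1 arcminute, the usual figure for the limit of normal acuity?

DIAGONAL_IN = 32
H_PX, V_PX = 1360, 768

aspect = H_PX / V_PX
width_in = DIAGONAL_IN * aspect / math.hypot(aspect, 1)
pitch_mm = width_in * 25.4 / H_PX            # pixel pitch, ~0.5 mm

one_arcmin_rad = math.radians(1 / 60)
d_mm = pitch_mm / math.tan(one_arcmin_rad)   # beyond this, pixels blur together
print(f"pixel pitch ~{pitch_mm:.2f} mm; "
      f"pixels resolvable out to ~{d_mm / 1000:.1f} m")
```

That comes out at roughly 1.8 m, which sits nicely between the "sub-pixels visible at 1M" and "comfortably readable at 2M" observations, with text only a few pixels high fading to guesswork by 3-4 m.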
Best tv for pc resolution
In article , Colin Stamp wrote:
I'm looking at a 1600x1200 screen right now. I'm sitting quite close to it looking at quite small text. I can't even be sure without checking the cables whether it is fed from the digital or analogue output from the computer, because I have tried both and cannot see any difference at all. We've been through this before. You only have one system, you haven't said whether it's LCD or CRT, and 1600 is significantly less than 1920. You're right. Checking back I see that I didn't actually specify the type of monitor in those terms, though I did say that my *monitor* was 1600 x 1200, which would not be the case if it was a CRT, because the displayed resolution would depend on the signal fed to it. Evidently that didn't make it as clear as I assumed it would. Sorry for the misunderstanding. It is actually an LCD type with a physical pixel structure of 1600 x 1200, so of course best results are obtained by presenting it with a video signal with that pixel size. It does not make the slightest difference that I can see whether I use the digital or analogue signal, and I have looked very carefully because I was interested to see if there was one. However, I would dispute "significantly". Yes, there will be a difference in the frequency response required of the video amplifier, but it is not significant. I haven't worked out exactly what it would be, but you suggested 20% and that sounds about right. Hardly enough to make a difference unless the video amplifier had a fairly steep HF filter somewhere between the upper limits of the two video signals, which would be highly unusual as the filter itself would do more damage than anything else. I also wonder how you think you can say that I have only one system. We are not acquainted outside this newsgroup, so how could you possibly know?
The best experiment I've been able to try was on my main system, as described, because both the graphic card and the monitor can handle digital and analogue signals, so it really is an "all other things being equal" situation, but it is not the only system I have, and certainly not the only one I have seen. I actually have three working computers at the moment, but must have built dozens over the years, for myself and other people. I am familiar with the behaviour of CRT and LCD monitors, video amplifiers, and various ways of connecting them together, and have used them for a variety of purposes. If a video amplifier works with one monitor but not another, or if it handles one video signal but not another in which the upper frequency limit is only 20% different, then it's badly designed. And you don't think these "badly designed" ones are out there in significant numbers then? The vast bulk of video cards haven't been designed by broadcasters. I'm sure there's a lot of rubbish out there, but it doesn't tell us anything fundamental about the relative merits of digital versus analogue connections between computers and monitors through 2 metres of cable. If something doesn't work very well because it's badly designed, then that's the reason, not the fact that it's analogue, or whatever. If a video amplifier gives a performance that deteriorates as it warms up, then it's *very* badly designed. I'd suspect a fault in the monitor first. Nope. It's not the TV. The effect is independent of the TV temperature, but dependent on the PC temperature. I guess the video card is a bit crap. It was cheap, after all. It's disappointing that it doesn't work perfectly at 1380 horizontal, but not a huge surprise. Well, there's your answer - not a very good video card. Incidentally, the same PC worked apparently perfectly at 1600X1200 on a CRT monitor for some time before being switched to TV duty. I'd suggest that agrees with the above.
True, but I don't think this can have anything to do with the video amplifier or the cable. Of course it does. The interface can be a bit dodgy and it'll just marginally reduce the sharpness on a CRT screen - nobody will be any the wiser. Use that same interface to drive an LCD display at the same resolution, and the deficiency will be really obvious. Assuming both monitors - LCD and CRT - are capable of displaying the full bandwidth of a given video signal, can you explain the mechanism which will make a reduction in HF of the signals fed to them more apparent on one display than the other? I can't think how this could happen, and have never seen the effect. Rod.
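One candidate mechanism for Rod's question is resampling: an LCD driven over VGA re-samples the smeared analogue waveform at fixed pixel positions, so edge smear becomes discrete wrong pixel values rather than the soft spot a CRT would paint. A toy model (all numbers illustrative; a first-order low-pass stands in for whatever bandwidth limit the amplifier or cable imposes):

```python
# A single white pixel on black, sent through a first-order low-pass,
# then sampled once per pixel period as an LCD's ADC would do.

OVERSAMPLE = 16                      # analogue sub-steps per pixel
pixels = [0, 0, 1, 0, 0]             # one bright pixel
analogue = [p for p in pixels for _ in range(OVERSAMPLE)]

# Single-pole low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1]).
# a is chosen so the rise time is a sizeable fraction of a pixel.
a = 0.25
filtered, y = [], 0.0
for x in analogue:
    y += a * (x - y)
    filtered.append(y)

# "LCD view": sample once at the centre of each pixel period.
lcd = [filtered[i * OVERSAMPLE + OVERSAMPLE // 2] for i in range(len(pixels))]
print("LCD samples:", [round(v, 3) for v in lcd])
```

On a CRT the same smeared waveform just softens the edge of the spot slightly; here the bright pixel comes out dimmed and its right-hand neighbour lights up at a distinct grey level, which is exactly the "really obvious" deficiency described above.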
HomeCinemaBanter.com