Questions on how to connect computer to TV
On Apr 28, 10:29 am, wrote:
>>> Agent_C wrote: Go over to ecost.com and get yourself a refurb
>>> Westinghouse with a VGA input. Anything you do with HDMI and a
>>> computer is potentially *very* problematic.
>> really? so its BEST to use the vga connector on a TV for this?
> Potentially less _troublesome_.

Video performance of DVI is the same as HDMI, but DVI does not support
Digital Rights Management, so it is no trouble at all (though there is no
audio in DVI). Still, I'm having bad feelings about getting a BluRay drive
for the PC.

I was experimenting a little this week comparing VGA to DVI into a Samsung
DLP set. The DVI is a 1:1 pixel relationship, so the Windows desktop shows
text that is absolutely clear with no 'rounding' errors. The VGA is almost
as good, but shows some 'fat' spots across the screen on vertical lines
only, because the computer and monitor are not at the same pixel rate.
Doing a 'zoom' in the TV blurs it all, so it is to be avoided.

GG
Questions on how to connect computer to TV
I see adapters galore to take PC VGA outputs to TV RCA inputs. But I also
see a warning that the video card involved must have a TV Output function
built into the VGA port. Does anyone know if my RADEON 9250 AGP card has
that? I have lost the manual.

I also have a RADEON 7000 series AGP video card. Same question.

Thanks
-GECKO
Questions on how to connect computer to TV
On Mon, 28 Apr 2008 13:02:26 -0700 (PDT) G-squared wrote:
| The VGA is almost as good but shows some 'fat' spots across the screen
| of only vertical lines because the computer and monitor are not at the
| same pixel rate. Doing a 'zoom' in the TV blurs it all so it is to be
| avoided.

An LCD monitor intended for computer use generally has a clock rate
adjustment that lets you align its sampling with the pixel clock of the
video graphics card. Then there is a phase adjustment to make sure the
point of sampling lands in the middle of a pixel instead of near the
boundary between pixels (which can make the picture fuzzier, noisier, or
both). The better monitors then remember this setting separately for each
different line rate and/or line count they detect. TVs with VGA inputs
don't seem to do this as well.

If you can use the HDMI input with a DVI to HDMI cable from a DVI-capable
video graphics card, that eliminates the pixel synchronization issues.

--
| WARNING: Due to extreme spam, I no longer see any articles originating from
| Google Groups. If you want your postings to be seen by more readers
| you will need to find a different place to post on Usenet.
| Phil Howard KA9WGN (email for humans: first name in lower case at ipal.net)
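The periodic 'fat' spots are a beat effect between the card's pixel clock and the TV's sampling clock: the sampling point slowly drifts across pixel boundaries and realigns. A rough, purely illustrative Python sketch of the idea (not from any driver or TV firmware; the numbers are made up):

```python
# Illustrative sketch: why a slightly-off sampling clock produces
# periodic blurred bands on single-pixel vertical lines over VGA.
# The TV samples the analog signal at its own clock; if it differs
# from the card's pixel clock by a small fraction, the sampling point
# drifts across source-pixel boundaries and realigns periodically.

def beat_period(ratio_error):
    """Columns between successive blurred bands for a fractional clock
    mismatch, e.g. 0.001 (0.1% off) -> a band roughly every 1000 columns."""
    return round(1.0 / abs(ratio_error))

def blurred_columns(width, ratio_error, tolerance=0.25):
    """Columns whose sample lands near a source-pixel boundary (fractional
    position within `tolerance` of 0.5) and therefore averages two source
    pixels -- the visible 'fat' spots."""
    blurred = []
    for col in range(width):
        pos = col * (1.0 + ratio_error)  # where this sample hits the source
        frac = pos % 1.0
        if abs(frac - 0.5) < tolerance:
            blurred.append(col)
    return blurred

if __name__ == "__main__":
    # A 0.1% clock mismatch across a 1280-pixel-wide line:
    print("beat period:", beat_period(0.001), "columns")
    print("first blurred columns:", blurred_columns(1280, 0.001)[:5])
```

With the clocks exactly matched (`ratio_error=0`), no column ever samples near a boundary, which is why DVI's 1:1 pixel relationship shows no such bands.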
Questions on how to connect computer to TV
On Mon, 28 Apr 2008 20:43:17 GMT gecko wrote:
| I see where adapters exist galore to take PC VGA outputs to TV RCA
| inputs. But I also see a warning that the video card involved must
| have TV Output function built into VGA port. Anyone know if my RADEON
| 9250 AGP card has that? I have lost the manual.

That kind of "TV output" is most likely RS-170 timing (i.e. NTSC-like, but
RGB component instead of having a subcarrier for chroma). If you want to
get full 1366x768 or 1920x1080 resolution, the "TV output" may not be what
you really want.
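The resolution objection above comes down to simple arithmetic: an RS-170/NTSC frame has 525 scan lines, only about 480 of them visible, so any taller desktop mode must be downscaled to fit TV-out. A quick back-of-the-envelope sketch (the 480-line figure is the usual approximation):

```python
# Why NTSC-style "TV output" cannot carry a full HD desktop:
# NTSC has 525 scan lines per frame, of which roughly 480 are visible.
NTSC_VISIBLE_LINES = 480

def fits_tv_out(width, height):
    """True if a desktop mode fits NTSC TV-out without vertical scaling."""
    return height <= NTSC_VISIBLE_LINES

def vertical_downscale(height):
    """How much vertical detail must be discarded to fit NTSC."""
    return height / NTSC_VISIBLE_LINES

if __name__ == "__main__":
    for w, h in [(640, 480), (1366, 768), (1920, 1080)]:
        print("%dx%d fits: %s (downscale %.2fx)"
              % (w, h, fits_tv_out(w, h), vertical_downscale(h)))
```

A 1920x1080 desktop squeezed onto TV-out loses 2.25x of its vertical detail, which is why the "TV output" warning matters for HD use.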
Questions on how to connect computer to TV
On Mon, 28 Apr 2008 20:43:17 +0000, gecko wrote:
> I see where adapters exist galore to take PC VGA outputs to TV RCA
> inputs. But I also see a warning that the video card involved must
> have TV Output function built into VGA port. Anyone know if my RADEON
> 9250 AGP card has that? I have lost the manual.

Download it, or just look at the card. It's not hard to see what
connections it has. Don't use the VGA connector unless your TV has a VGA
input. The Visiontek Xtasy 9250 has VGA, DVI, and analog outputs (either
composite or S-video).

> I also have a RADEON 7000 series AGP video card. Same question.

Same answer as before. Look at it. My guess would be that both cards have
analog out.

--
Want the ultimate in free OTA SD/HDTV Recorder? http://mythtv.org
My Tivo Experience http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help http://wesnewell.no-ip.com/cpu.php
Questions on how to connect computer to TV
"G-squared" wrote in message
On Apr 28, 10:29 am, wrote:
>>>> Agent_C wrote: Go over to ecost.com and get yourself a refurb
>>>> Westinghouse with a VGA input. Anything you do with HDMI and a
>>>> computer is potentially *very* problematic.
>>> really? so its BEST to use the vga connector on a TV for this?
>> Potentially less _troublesome_.

> Video performance of DVI is the same as HDMI

Agreed. DVI's digital facility can be thought of as HDMI on a different
connector. DVI also has an analog facility, but it is there mostly for
backwards compatibility.

> but does not support Digital Rights Management

HDCP is a technology that is widely used for digital rights management.
Both DVI-HDCP and HDMI-HDCP are supported.

> so is not trouble ( but no audio in DVI )

DVI's digital interface supports both sound and video on digital streams.

> at all but I'm having bad feelings about getting a BluRay drive for
> the PC.

Been there, done that. The video cards that the BluRay PC software
supports implement HDCP over DVI or HDMI.

> I was experimenting a little this week comparing VGA to DVI into a
> Samsung DLP set. The DVI is a 1:1 pixel relationship so the Windows
> desktop shows text that is absolutely clear with no 'rounding' errors.

DVI or HDMI are the preferred means for driving an LCD, DLP, or plasma
display.

> The VGA is almost as good but shows some 'fat' spots across the screen
> of only vertical lines because the computer and monitor are not at the
> same pixel rate.

You can give yourself that problem any number of ways.

> Doing a 'zoom' in the TV blurs it all so it is to be avoided.

Feeding analog into a display that is basically digital is to be avoided
wherever possible.
Questions on how to connect computer to TV
On Apr 28, 2:18 pm, "Arny Krueger" wrote:
[snip]
>> so is not trouble ( but no audio in DVI )
> DVI's digital interface supports both sound and video on digital
> streams.

Didn't know about audio in DVI.

>> at all but I'm having bad feelings about getting a BluRay drive for
>> the PC.
> Been there, done that. The video cards that the BluRay PC software
> supports implement HDCP over DVI or HDMI.

My 'problem' is the 4 1/2 year old Samsung DLP. I fear it knows little to
nothing about HDCP, so that means getting a BluRay drive, then a new
computer, and THEN a new TV too.. Well, it all works well now and I don't
'rock the boat' just for fun. The ATI DVD player really does an
outstanding job with SD DVD, so I may just skip it for now.

>> I was experimenting a little this week comparing VGA to DVI into a
>> Samsung DLP set. The DVI is a 1:1 pixel relationship so the Windows
>> desktop shows text that is absolutely clear with no 'rounding' errors.
> DVI or HDMI are the preferred means for driving an LCD, DLP, or plasma
> display.
>> The VGA is almost as good but shows some 'fat' spots across the
>> screen of only vertical lines because the computer and monitor are
>> not at the same pixel rate.
> You can give yourself that problem any number of ways.
>> Doing a 'zoom' in the TV blurs it all so it is to be avoided.
> Feeding analog into a display that is basically digital is to be
> avoided wherever possible.

Agreed. The reason I was fiddling with it was a donated 50" Sony LCD RPTV
at work. When I connected one of my PCs to it (I had it at work to give it
its annual dust blow-out with the compressor), the video was unusually
poor. I could not find a setting on the ATI 9600XT to go native to the
Sony, and the Sony does not report back to the PC like the Samsung does.
Basically I was curious to see if I could get the Samsung to look as bad
as the Sony. I can, but it's an abnormal running situation.

GG
Questions on how to connect computer to TV
"G-squared" wrote in message
On Apr 28, 2:18 pm, "Arny Krueger" wrote:
[snip]
>>> so is not trouble ( but no audio in DVI )
>> DVI's digital interface supports both sound and video on digital
>> streams.
> Didn't know about audio in DVI

That's because I was wrong about that. You have to add sound to DVI when
you convert it to full-function HDMI. My bad.

>>> at all but I'm having bad feelings about getting a BluRay drive for
>>> the PC.
>> Been there, done that. The video cards that the BluRay PC software
>> supports implement HDCP over DVI or HDMI.
> My 'problem' is the 4 1/2 year old Samsung DLP. I fear it knows little
> to nothing about HDCP so getting a BluRay and then a new computer and
> THEN a new TV too..

There is, or at least was a month ago, an HDCP-compliant HDMI receiver
that puts out RGBHV analog.

> Well, it all works well now and I don't 'rock the boat' just for fun.
> The ATI DVD player really does an outstanding job with SD DVD so I may
> just skip it for now.

If you upgrade your HTPC to Blu-ray, it will no doubt take a Blu-ray drive
and a video card, plus the software, which will probably come with the
drive. AFAIK, all of the HDCP-compliant video cards are PCI-E. So if your
HTPC is AGP, it's time for a new motherboard.

> The reason I was fiddling with it was a donated 50" Sony LCD RPTV at
> work and when I connected one of my PC's to it (had it at work to give
> it its annual dust blow out with the compressor), the video was
> unusually poor. I could not find a setting on the ATI 9600XT to go
> native to the Sony and the Sony does not report back to the PC like
> the Samsung does.

I found the same problem with a 5200 series Nvidia card, and new drivers
added a ton of new modes in the middle. ATI may be competitive - a new
driver might help.
Questions on how to connect computer to TV
Agent_C wrote:
> On Mon, 28 Apr 2008 12:29:11 -0500, wrote:
>> really? so its BEST to use the vga connector on a TV for this?
> As a practical matter yes, or DVI. You don't need anything too tricked
> out to do HD on a computer. For example, I have a budget Dell machine
> connected to a Sony 46" XBR via the VGA input. I get full HD
> resolution (1920 x 1080) via the onboard Intel graphics.
> A_C

DVI is not the best because you don't get to see the entire booting of the
computer. This is not a problem unless there is a problem. With DVI, at
least under Windows, the image doesn't start showing on the TV until the
driver is started, and this doesn't seem to occur until Windows has pretty
much started up.
Questions on how to connect computer to TV
HomeCinemaBanter.com