HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   How can a TV know that an image is coming from a computer, not a consumer set-top box? (http://www.homecinemabanter.com/showthread.php?t=55341)

John Rumm December 15th 07 04:05 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
wrote:

use HDCP. HDCP is only expected for protected content, to ensure you
cannot use a "monitor" that is really something like a recorder, or tap
the HDMI cable wires (the signal is encrypted under HDCP).


Unless you install AnyDVD on it first, and then the whole sorry mess
ceases to matter. ;-)

--
Cheers,

John.

/=================================================================\
| Internode Ltd - http://www.internode.co.uk                      |
|-----------------------------------------------------------------|
| John Rumm - john(at)internode(dot)co(dot)uk                     |
\=================================================================/

ray[_2_] December 15th 07 07:55 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Fri, 14 Dec 2007 04:51:41 -0800, D wrote:

Hello!
According to the Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is a Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


It probably can't tell that. But it can 'tell' if it's asked to do
something outside its range of capabilities.
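
That squares with how the negotiation actually works: the source reads the display's EDID block over the DDC lines, and the display simply accepts or ignores the timings it is then given; nothing in the signal itself says "this came from a PC". As a rough illustration (the vendor bytes below are made up), the EDID base block is 128 bytes whose sum must be 0 modulo 256, which is how a source-side driver sanity-checks what it read back:

```python
# The TV advertises capabilities via EDID: a 128-byte base block the
# source reads over the DDC (I2C) wires. The block's bytes must sum to
# 0 mod 256; the last byte is a checksum chosen to make that true.
# The payload bytes here are purely illustrative, not a real vendor ID.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def make_edid_block(payload: bytes) -> bytes:
    """Pad header+payload to 127 bytes, then append the checksum byte
    so the full 128-byte block sums to 0 modulo 256."""
    body = (EDID_HEADER + payload).ljust(127, b"\x00")
    checksum = (-sum(body)) % 256
    return body + bytes([checksum])

def edid_checksum_ok(block: bytes) -> bool:
    """The sanity check a source driver applies to a block it read."""
    return len(block) == 128 and sum(block) % 256 == 0

block = make_edid_block(b"\x4c\x2d")  # hypothetical manufacturer bytes
print(edid_checksum_ok(block))  # True
```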


Glenn Millar December 15th 07 11:03 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message
...
Hello!
According to the Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is a Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima

Wouldn't rule out technical reasons, but probably they just don't want
to answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.

Give it a try on the HDMI input; it might just work. On the other hand,
if you have a 1080p screen, buy a good card with an HDMI output capable
of 1080p.

Regards
Glenn.

steveo December 15th 07 11:07 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

wrote in message
...
In alt.tv.tech.hdtv Woody wrote:
|
| "T Shadow" wrote in message
| ...
| "D" wrote in message
|
...
| Hello!
| According to the Samsung LE-32r71b HDTV manual the TV cannot receive an
| image from a computer through its HDMI input, but through its d-sub
| only. Is it really true? How can the TV know that an image is coming
| from a computer, not a consumer set-top box? My video card is a Gigabyte
| HD 2600Pro. I would like to use a DVI-HDMI cable.
| Regards,
| Dima
|
| Wouldn't rule out technical reasons but probably they just don't want
| to answer questions about it. Puts the onus on you.
|
|
| Likely because HDMI has authentication handshaking built in to its
| protocol and the PC may not be savvy to such things.

HDMI and DVI are essentially the same thing, but with different connectors
and no standard for audio over DVI. Presumably you can even do HDCP over
DVI if it doesn't need the sound as part of its authentication checks.


Most cable STBs use DVI, and they most certainly have HDCP enabled.

steveo


GeorgeB December 15th 07 11:36 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sat, 15 Dec 2007 22:03:14 +0000, Glenn Millar
wrote:

In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.

Regards
Glenn.


The VGA 15-pin D-style connector handles higher resolutions pretty well;
I use 1600x1200 via that connection and detect no problems with it.

What I'd like, without spending money, of course, is 2400x1600 ... or
maybe 3200x1200 dual monitor.

But that costs money. With 6+ megapixel cameras, can the monitors be
far behind?

Scott Alfter December 15th 07 11:48 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.


There's no reason you can't use the same modeline over DVI or HDMI that
you would use with VGA. If anything, in my experience it's been much
easier to get LCDs working on a digital connection than on an analog
one. LCDs sold for computer use usually have a button that lets them
sync up to a VGA signal, but LCD TVs rarely have this option. To get a
1:1 correspondence between pixels in the framebuffer and pixels on the
screen, you may have to do extensive tinkering with modelines...and you
might never come up with a working one.

My TV has a native resolution of 1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.
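
The "generate a modeline and plug it into xorg.conf" step can be sanity-checked with a little arithmetic: the vertical refresh is the pixel clock divided by the product of the total (visible plus blanking) horizontal and vertical counts. A minimal sketch, assuming the CVT timings that `cvt 1280 768 60` produces for a panel like this:

```python
# Sanity-check a modeline before putting it in xorg.conf.
# Timings below are the CVT numbers for 1280x768 at ~60 Hz, i.e.
# Modeline "1280x768_60.00" 79.50 1280 1344 1472 1664 768 771 778 798
# where 79.50 is the pixel clock in MHz and 1664/798 are the *total*
# horizontal pixels and vertical lines including blanking.

def modeline_refresh(pclk_mhz: float, htotal: int, vtotal: int) -> float:
    """Vertical refresh rate in Hz: pixel clock / (htotal * vtotal)."""
    return pclk_mhz * 1e6 / (htotal * vtotal)

refresh = modeline_refresh(79.50, htotal=1664, vtotal=798)
print(f"{refresh:.2f} Hz")  # just under 60 Hz, as CVT intends
```

If the number comes out wildly off 60 Hz, the modeline was copied or generated wrong and the TV will likely reject it.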

_/_
/ v \ Scott Alfter (remove the obvious to send mail)
(IIGS( http://alfter.us/ Top-posting!
\_^_/ rm -rf /bin/laden What's the most annoying thing on Usenet?


G-squared December 15th 07 11:48 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 15, 2:03 pm, Glenn Millar wrote:
snip
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.

Give it a try on the HDMI input; it might just work. On the other hand,
if you have a 1080p screen, buy a good card with an HDMI output capable
of 1080p.

Regards
Glenn.


How is 1080 less than 768?

GG

Glenn Millar December 16th 07 12:01 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.


There's no reason you can't use the same modeline over DVI or HDMI that
you would use with VGA. If anything, in my experience it's been much
easier to get LCDs working on a digital connection than on an analog
one. LCDs sold for computer use usually have a button that lets them
sync up to a VGA signal, but LCD TVs rarely have this option. To get a
1:1 correspondence between pixels in the framebuffer and pixels on the
screen, you may have to do extensive tinkering with modelines...and you
might never come up with a working one.

My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI,
it Just Works.



Your experience may well be correct with other LCD or plasma TVs, but my
reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone has a PowerStrip profile that
works correctly with a Samsung, I'd like a copy.

Regards
Glenn.

Måns Rullgård December 16th 07 01:00 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Glenn Millar writes:

Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get the panel's full 768-line resolution at 60 Hz progressive,
whereas 720p or 1080i actually delivers less effective resolution.

There's no reason you can't use the same modeline over DVI or HDMI that
you would use with VGA. If anything, in my experience it's been much
easier to get LCDs working on a digital connection than on an analog
one. LCDs sold for computer use usually have a button that lets them
sync up to a VGA signal, but LCD TVs rarely have this option. To get a
1:1 correspondence between pixels in the framebuffer and pixels on the
screen, you may have to do extensive tinkering with modelines...and you
might never come up with a working one.

My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI,
it Just Works.


Your experience may well be correct with other LCD or plasma TVs, but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone has a PowerStrip profile that
works correctly with a Samsung, I'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.
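
Under X.org, forcing one of those unadvertised modes amounts to declaring it explicitly so the driver doesn't fall back to what the EDID claims. A sketch of what that might look like in xorg.conf; the section identifiers are hypothetical, and the timings are the standard CVT numbers for 1368x768 at 60 Hz (1366 rounded up to a multiple of 8, as `cvt 1368 768 60` does), so adjust for your own panel and driver:

```
Section "Monitor"
    Identifier "SamsungTV"          # hypothetical name, referenced below
    # CVT timings for 1368x768 at ~60 Hz (from `cvt 1368 768 60`)
    Modeline "1368x768_60" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
    Option "PreferredMode" "1368x768_60"
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor "SamsungTV"
    DefaultDepth 24
    SubSection "Display"
        Modes "1368x768_60"
    EndSubSection
EndSection
```

If the TV really does accept forced modes the way this one does, the driver will use the declared modeline even though the EDID never listed it.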

--
Måns Rullgård


[email protected] December 16th 07 02:37 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
| ...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| | ...
| | "D" wrote in message
| |
| ...
| | Hello!
| | According to the Samsung LE-32r71b HDTV manual the TV cannot receive an
| | image from a computer through its HDMI input, but through its d-sub
| | only. Is it really true? How can the TV know that an image is coming
| | from a computer, not a consumer set-top box? My video card is a Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons but probably they just don't want
| | to answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with different connectors
| and no standard for audio over DVI. Presumably you can even do HDCP over
| DVI if it doesn't need the sound as part of its authentication checks.
|
| Most cable STBs use DVI, and they most certainly have HDCP enabled.

DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|



Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com