HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   How can a TV know that an image is coming from a computer, not a consumer set-top box? (http://www.homecinemabanter.com/showthread.php?t=55341)

[email protected] December 19th 07 03:30 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv D wrote:

| Most of ATI Radeon HD cards output HD audio through DVI. See
| http://www.giga-byte.co.uk/Products/...ProductID=2589
| or http://ati.amd.com/products/Radeonhd2400/specs.html

According to the description at the first URL, and suggested similarly by
the second, an adapter is required for audio, and it does HDMI. I do not
know what kind of adapter is referred to. It could be a sidekick PCI card
that has an HDMI output. It could be something that plugs into the DVI
port and gives HDMI out. In the latter case, there would certainly have to
be audio going through the DVI port. This could be done in a proprietary
way, such as sending the audio as data over the 2nd screen data lines, or
over the analog lines. The card could detect whether the adapter is
present and switch from DVI standards compliance to a proprietary protocol.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|
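[Editor's note: for context on how a source can tell what is at the other end of the cable at all, in practice a DVI/HDMI source reads the display's EDID, and an HDMI-capable sink identifies itself with a Vendor-Specific Data Block carrying the IEEE OUI 00-0C-03 in its CEA-861 extension block. The sketch below is illustrative only; the EDID bytes are hand-built examples, not real dumps, and this is not ATI's actual driver logic.]

```python
# Illustrative sketch: a source decides whether the attached display is an
# HDMI sink (and so may accept audio) by scanning the EDID's CEA-861
# extension block for a Vendor-Specific Data Block with the HDMI OUI.
HDMI_OUI = (0x03, 0x0C, 0x00)  # IEEE OUI 00-0C-03, stored LSB-first in EDID

def has_hdmi_vsdb(cea_block: bytes) -> bool:
    """Scan a CEA-861 extension's data block collection for the HDMI VSDB."""
    if len(cea_block) < 4 or cea_block[0] != 0x02:  # tag 0x02 = CEA extension
        return False
    dtd_start = cea_block[2]       # offset where detailed timings begin
    i = 4                          # data blocks start at byte 4
    while i < dtd_start:
        header = cea_block[i]
        tag, length = header >> 5, header & 0x1F
        if tag == 0x03 and length >= 5:         # tag 3 = vendor-specific
            if tuple(cea_block[i + 1:i + 4]) == HDMI_OUI:
                return True
        i += 1 + length
    return False

# Hand-built 128-byte example blocks (not real EDID dumps):
hdmi_sink = bytes([0x02, 0x03, 0x0A, 0x00,             # CEA ext, DTDs at 10
                   0x65, 0x03, 0x0C, 0x00, 0x10, 0x00,  # HDMI VSDB, len 5
                   0x00] + [0] * 117)
dvi_sink = bytes([0x02, 0x03, 0x04, 0x00] + [0] * 124)

print(has_hdmi_vsdb(hdmi_sink))  # True  -> treat as HDMI, audio possible
print(has_hdmi_vsdb(dvi_sink))   # False -> treat as plain DVI, video only
```

A plain DVI monitor has no such block, so a compliant source falls back to video-only output on the same pins.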

[email protected] December 19th 07 03:32 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv T Shadow wrote:

| I'd assume "HDMI ready" means something else is needed and not presently
| supported. DVI is not HDMI and has no pins for sound.

The description from the page at a URL he gave elsewhere says an adapter
is needed. That opens the possibility of, once the adapter is detected
by the video card, changing some of the pinout usage from standard to a
proprietary usage that the adapter is made for, such as audio over either
the 2nd video port lines or the analog lines.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

[email protected] December 19th 07 04:05 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv J. Clarke wrote:

| FWIW, HDMI doesn't have separate wires for audio, so it has to be
| multiplexed into the datastream. That being the case there's no
| reason it can't be carried over DVI.

Or the card could re-purpose some of the other wires, specifically for the
adapter that the documentation says is needed for audio.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|
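[Editor's note: J. Clarke's point, that HDMI audio rides inside the same TMDS data stream as packets sent during blanking intervals rather than on separate wires, can be illustrated with rough arithmetic. This is orders-of-magnitude only, not exact HDMI data-island packet accounting.]

```python
# Rough arithmetic: HDMI sends audio as packets during the blanking
# intervals of the same three TMDS channels that carry video, which is
# why the connector has no separate audio wires.

pixel_clock = 148_500_000          # 1080p60 CEA-861 timing: 2200 x 1125 x 60
active_clocks = 1920 * 1080 * 60   # pixel periods carrying visible video
blanking_clocks = pixel_clock - active_clocks

audio_samples = 8 * 192_000        # worst case: 8 channels at 192 kHz

print(blanking_clocks)                  # 24084000 spare periods per second
print(blanking_clocks / audio_samples)  # ~15.7 spare periods per sample
```

Even in the worst audio case, blanking leaves roughly fifteen spare pixel periods per audio sample, so capacity is not the obstacle; the obstacle on DVI is simply that no standard defines such packets there.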

D[_2_] December 19th 07 11:44 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 19, 10:20 am, "T Shadow" wrote:
| wrote in message ...
| | In alt.tv.tech.hdtv T Shadow wrote:
| | | I'd assume "HDMI ready" means something else is needed and not presently
| | | supported. DVI is not HDMI and has no pins for sound.
| |
| | The description from the page at a URL he gave elsewhere says an adapter
| | is needed. That opens the possibility of, once the adapter is detected
| | by the video card, changing some of the pinout usage from standard to a
| | proprietary usage that the adapter is made for, such as audio over either
| | the 2nd video port lines or the analog lines.
|
| OK, I found reference to it in the Wikipedia DVI page. I assume only the
| dongle can read the audio, so it couldn't be used with any other device.
| IOW, who makes a device that reads audio from a DVI port?

Now, at the end you understood! Congratulations!

D[_2_] December 19th 07 11:49 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 19, 10:21 am, "T Shadow" wrote:
| "J. Clarke" wrote in message ...
| | FWIW, HDMI doesn't have separate wires for audio, so it has to be
| | multiplexed into the datastream. That being the case there's no
| | reason it can't be carried over DVI.
|
| Ok, wasn't thinking digital enough. Would it be DVI? Could another DVI
| device read it? Has any been manufactured?

Yes, ATI at least manufactures the adapter, but ATI does not sell it in
Moscow, Russia; it sells only the cards.

D[_2_] December 19th 07 12:40 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 18, 9:59 pm, "T Shadow" wrote:
| "steveo" wrote in message ...
| | "T Shadow" wrote in message ...
| | | I'd assume "HDMI ready" means something else is needed and not presently
| | | supported. DVI is not HDMI and has no pins for sound.
| |
| | DVI can carry HDCP encrypted content, but not all implementations of DVI
| | have the decoders.
| |
| | steveo
|
| True, but it doesn't carry audio.

I asked not about the DVI standard, but about the custom implementation of
the DVI output on Radeon HD cards.

D[_2_] December 19th 07 01:24 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 19, 5:30 am, wrote:
| In alt.tv.tech.hdtv D wrote:
| | Most of ATI Radeon HD cards output HD audio through DVI. See
| | http://www.giga-byte.co.uk/Products/...ew.aspx?Produc...
| | or http://ati.amd.com/products/Radeonhd2400/specs.html
|
| According to the description at the first URL, and suggested similarly by
| the second, an adapter is required for audio, and it does HDMI. I do not
| know what kind of adapter is referred to. It could be a sidekick PCI card
| that has an HDMI output. It could be something that plugs into the DVI
| port and gives HDMI out. In the latter case, there would certainly have to
| be audio going through the DVI port. This could be done in a proprietary
| way, such as sending the audio as data over the 2nd screen data lines, or
| over the analog lines. The card could detect whether the adapter is
| present and switch from DVI standards compliance to a proprietary protocol.

Yes, Phil, you are right: the ATI DVI-HDMI adapter is something that
plugs into the DVI port and gives HDMI out. There is audio going
through the DVI port. This is done in a proprietary way.
Regards,
Dima

D[_2_] December 19th 07 01:28 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 18, 12:34 pm, "T Shadow" wrote:
| "D" wrote in message ...
| | On Dec 17, 1:38 am, "T Shadow" wrote:
| | | "D" wrote in message ...
| | | | On Dec 16, 1:03 am, Glenn Millar wrote:
| | | | | Woody wrote:
| | | | | | "T Shadow" wrote in message ...
| | | | | | | "D" wrote in message ...
| | | | | | | | Hello!
| | | | | | | | According to the Samsung LE-32r71b HDTV manual the TV cannot
| | | | | | | | receive an image from a computer through its HDMI input, but
| | | | | | | | through its D-sub only. Is it really true? How can the TV
| | | | | | | | know that an image is coming from a computer, not a consumer
| | | | | | | | set-top box? My video card is a Gigabyte HD 2600Pro. I would
| | | | | | | | like to use a DVI-HDMI cable.
| | | | | | | | Regards,
| | | | | | | | Dima
| | | | | | |
| | | | | | | Wouldn't rule out technical reasons but probably they just
| | | | | | | don't want to answer questions about it. Puts the onus on you.
| | | | | |
| | | | | | Likely because HDMI has authentication handshaking built in to
| | | | | | its protocol and the PC may not be savvy to such things.
| | | | |
| | | | | In reality, you wouldn't want to use the HDMI connection to
| | | | | connect a PC. The best results on my 50" Samsung are via the VGA
| | | | | connector. That way I get full [email protected] progressive whereas
| | | | | 720p or 1080i is actually a lesser resolution.
| | | | |
| | | | | Give it a try on the HDMI input. It might just work out. On the
| | | | | other hand, if you have a 1080p screen, buy a good card with an
| | | | | HDMI output capable of 1080p.
| | | | |
| | | | | Regards
| | | | | Glenn.
| | | |
| | | | Hello!
| | | | I have bought a Gembird DVI-HDMI cable. Samsung LE-32r71b does show
| | | | video through its HDMI input from a computer DVI output, but of
| | | | much lower quality than through the D-sub input: there are black
| | | | borders around the image, and the image is much less sharp.
| | | | Regards,
| | | | Dima
| | |
| | | I was under the impression HDCP was only used for protected content
| | | but according to the link below it may be needed to get full
| | | resolution or even a picture. Does your video card support HDCP? An
| | | article I read last year indicated none did. That probably has
| | | changed.
| | |
| | | http://en.wikipedia.org/wiki/Hdcp
| |
| | Yes, my video card supports HDCP:
| | http://www.giga-byte.co.uk/Products/...ew.aspx?Produc...
|
| I'd assume "HDMI ready" means something else is needed and not presently
| supported. DVI is not HDMI and has no pins for sound.

The DVI-I socket has more contacts than needed just for video. The spare
contacts could be used for digital audio, for example.
Regards,
Dima

D[_2_] December 19th 07 01:34 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 18, 7:15 pm, "steveo" wrote:
| "D" wrote in message ...
| | Most of ATI Radeon HD cards output HD audio through DVI. See
| | http://www.giga-byte.co.uk/Products/...ew.aspx?Produc...
| | or http://ati.amd.com/products/Radeonhd2400/specs.html
|
| Nowhere on that page does it say that there is audio over the DVI. It
| does say that you can get "HDMI and 5.1 surround audio (by optional
| adapter)" but that would be achieved by combining the output from the DVI
| port and one of the audio ports through the aforementioned adapter.
|
| steveo

Dear steveo,
To rule out your "would", just read the complementary information on those
sites.
Regards,
Dima

JW[_3_] December 19th 07 03:05 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
The only current implementations of the extra 4 pins that I have seen are
to support analog interfaces using a dongle, such as VGA or
S-video/composite.
"D" wrote in message
...
On Dec 18, 12:34 pm, "T Shadow" wrote:
"D" wrote in message

...
On Dec 17, 1:38 am, "T Shadow" wrote:
"D" wrote in message


...





On Dec 16, 1:03 am, Glenn Millar
wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message


...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot
receive

an
image from a computer through its HDMI input, but through its

d-sub
only. Is it really true? How can the TV know that an image is

coming
from a computer, not a comsumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just
don't

want
to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to
its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to
connect

a
PC. The best results on my 50" samsung is via the VGA connector.

That
way I get full [email protected] progressive whereas 720p or 1080i is
in
actually a lesser resolution.


Give it a try on the HDMI input. It just work out. On the other
hand

if
you have a 1080p screen, buy good card with a HDMI output capable
of
1080p.


Regards
Glenn.- Hide quoted text -


- Show quoted text -
Hello!
I have bought Gembird DVI-HDMI cable. Samsung LE-32r71b does show
video through HDMI input from a computer DVI output, but of much
lower
quality than through D-sub input: there ara black borders around
the
image, the image is much less sharp.
Regards,
Dima


I was under the impression HDCP was only used for protected content
but
according to the link below it may be needed to get full resolution
or

even
a picture. Does your video card support HDCP? An article I read last

year
indicated none did. That probably has changed.


http://en.wikipedia.org/wiki/Hdcp-Hide quoted text -


- Show quoted text -
Yes, my video card supports HDCP:


http://www.giga-byte.co.uk/Products/...ew.aspx?Produc...

I'd assume "HDMI ready" means something else is needed and not presently
supported. DVI is not HDMI and has no pins for sound..- Hide quoted
text -

- Show quoted text -

The DVI-I socket has more contacts than needed just for video. Over
contacts could be used for digital audio, for example.
Regards,
Dima
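
[Editor's note: the analog contacts JW and Dima are discussing are defined in the DVI 1.0 specification; a passive DVI-to-VGA dongle simply rewires them to a VGA connector. A small sketch tabulating that mapping (Python used only to list it):]

```python
# Analog contacts on a DVI-I connector per the DVI 1.0 spec. Note there is
# no audio here: the "spare" capacity is analog video, not sound, which is
# why a standard passive dongle cannot extract audio from a DVI port.
DVI_I_ANALOG = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (return for red, green, blue)",
}
# Analog vertical sync travels on pin 8 of the main pin grid, not a C contact.

signal_contacts = [c for c in DVI_I_ANALOG if c != "C5"]
print(len(signal_contacts))  # 4 -- the "extra 4 pins" JW mentions
```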




All times are GMT +1. The time now is 06:59 AM.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com