HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   How can a TV know that an image is coming from a computer, not a consumer set-top box? (http://www.homecinemabanter.com/showthread.php?t=55341)

D[_2_] December 14th 07 01:51 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima

Dr Hfuhruhurr December 14th 07 02:08 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On 14 Dec, 12:51, D wrote:
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


Some possible reasons below:

http://www.behardware.com/articles/6...nightmare.html

http://www.drmblog.com/index.php?/ar...__BAD_DRM.html

Doc

Flasherly December 14th 07 03:54 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 14, 7:51 am, D wrote:
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


DVI carries the image as discrete digital signalling, whereas a VGA
picture is sampled from the DB15's pin-out voltages (potentially leading
to crosstalk between pixels). Two of the three DVI variants, DVI-D
(digital) and DVI-I (integrated digital and analog), are within HDMI
standards. Seems like a question of which HDMI version (version
1.3b is the latest) Samsung employs. If it is within industry
specifications, I'd obtain the cable you're contemplating to test the
video board's DVI output and verify Samsung's statement that its HDMI
does not accept a computer source. I'd also first identify the DVI
output of the video board against the cable pinout forms illustrated here:

http://en.wikipedia.org/wiki/Image:D...ctor_Types.svg

If your output is visually a pin-match for the DVI-A (analog-only)
illustration, then I wouldn't buy the cable.

[email protected] December 14th 07 03:56 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Dr Hfuhruhurr wrote:
| On 14 Dec, 12:51, D wrote:
| Hello!
| According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| image from a computer through its HDMI input, but through its d-sub
| only. Is it really true? How can the TV know that an image is coming
| from a computer, not a consumer set-top box? My video card is Gigabyte
| HD 2600Pro. I would like to use a DVI-HDMI cable.
| Regards,
| Dima
|
| Some possible reasons below
|
| http://www.behardware.com/articles/6...nightmare.html
|
| http://www.drmblog.com/index.php?/ar...__BAD_DRM.html

NON-encrypted video over the DVI/HDMI is supposed to be displayed OK.

When DRM-restricted content is being played, the player is supposed to
engage HDCP, which includes encrypting the digital data over the HDMI wires
so you can't tap into it (otherwise that would be a massively huge hole
in the whole works). In the computer, much of that work is in the video
card (the HDCP part of it) and much of it is in the player software (the
DRM part of it, and making sure the video card driver turns on HDCP before
sending it any restricted video).

For ordinary computer desktop video, HDCP should not be engaged, and
thus the video over the DVI/HDMI wires should not be encrypted. Monitors
and TVs should display that. Those that do not are defective. When you
play a restricted video, then things change and HDCP kicks in, unless
you are operating the playback in a reduced mode acceptable to the video
source (like 480i/576i in a small window).

But it is rather well known that firmware programmers make lots of defects.
I've not only seen such in products, I've even dealt with such programmers
in a tech support role (though none of these were developing TV monitor or
DVD player firmware). A lot of them can't program their way out of a soap
bubble. Just last week I had to explain how the POSIX standard read() and
write() functions work to one who had supposedly been programming embedded
systems for years. So I would not be surprised at all if mistakes are made
in such programming in the case of TVs that fail to handle NON-encrypted
video over HDMI (which would break a lot of things, as there are also some
DVD players and set top boxes that have HDMI without HDCP).

For example, LCD does not flicker even at low frame rates. So if the video
is literally being transmitted at 24 frames per second, which would totally
suck on a CRT if it tried to display that directly without upconversion, an
LCD display should have no problem with it. Yet, LCDs are made which will
refuse to display if the frame rate is below 50 fps. I can understand a
limit on the upper end (might exceed the speed the circuits are able to
handle). But on the lower end at 50 fps? LCD should be fine down to 20
fps displaying it directly, and even lower.
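As a worked example of why the low-frame-rate refusal is silly: equipment that can't show 24 fps directly falls back on 3:2 pulldown, repeating source frames alternately 3 and 2 times to fill 60 Hz. A minimal sketch of that cadence (illustrative only, not any particular product's firmware):

```python
def pulldown_32(num_source_frames):
    """Map 24 fps source frames onto 60 Hz output using a 3:2 cadence:
    frames are alternately shown 3 times and 2 times."""
    output = []
    for i in range(num_source_frames):
        repeats = 3 if i % 2 == 0 else 2
        output.extend([i] * repeats)
    return output

# One second of film: 24 source frames become 60 output frames.
out = pulldown_32(24)
print(len(out))    # 60
print(out[:10])    # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

A display that simply accepted 24 fps, as an LCD panel could, would make this juddery repetition unnecessary.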

OTOH, this could also just be a similar defect on the part of documentation
writers.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

Bob[_5_] December 14th 07 05:35 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
I think you can use the "input" button on your TV remote to choose HDMI
input.

"D" wrote in message
...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima



Wes Newell December 14th 07 07:46 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Fri, 14 Dec 2007 04:51:41 -0800, D wrote:

Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable. Regards,
Dima


Don't know if it's true or not, but my Olevia 427V has no problem with it.
Why don't you just try it and see?

Cross posting removed.

--
Want the ultimate in free OTA SD/HDTV Recorder? http://mythtv.org
My Tivo Experience http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help http://wesnewell.no-ip.com/cpu.php

[email protected] December 14th 07 08:49 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv D wrote:

| According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| image from a computer through its HDMI input, but through its d-sub
| only. Is it really true? How can the TV know that an image is coming
| from a computer, not a consumer set-top box? My video card is Gigabyte
| HD 2600Pro. I would like to use a DVI-HDMI cable.

A colleague at work has verified that his Sharp Aquos 37" TV works fine
with his video card DVI output connected to the TV HDMI input via a DVI
to HDMI cable. His computer is running Linux, not Windows. So in this
case, the TV is accepting NON-encrypted digital video correctly as it
should. Hopefully, for non-protected content, Windows will properly NOT
use HDCP. It's only for protected content that it is expected to use
HDCP to ensure you cannot use a monitor that is really something like a
recorder, or let you tap the HDMI cable wires (it's encrypted in HDCP).

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

Woody[_2_] December 15th 07 09:47 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

"T Shadow" wrote in message
...
"D" wrote in message
...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


Wouldn't rule out technical reasons but probably they just don't want to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


--
Woody

harrogate three at ntlworld dot com



D[_2_] December 15th 07 11:04 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 14, 4:08 pm, Dr Hfuhruhurr wrote:
On 14 Dec, 12:51, D wrote:

Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


Some possible reasons below

http://www.behardware.com/articles/6...ic-card-and-mo...

http://www.drmblog.com/index.php?/ar...V_+_HDMI_+_HDC...

Doc

Thanks Doc for replying!
I have bought a Gembird DVI-HDMI cable. Samsung is WRONG: the Samsung
LE-32r71b does show video through its HDMI input from a computer's DVI
output (not a blank screen, as Samsung claims), but at much lower
quality than through the D-sub input. There is also no audio through HDMI,
although the Gigabyte HD 2600Pro outputs audio through its DVI output.
Regards,
Dima

[email protected] December 15th 07 03:48 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Woody wrote:
|
| "T Shadow" wrote in message
| ...
| "D" wrote in message
| ...
| Hello!
| According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| image from a computer through its HDMI input, but through its d-sub
| only. Is it really true? How can the TV know that an image is coming
| from a computer, not a consumer set-top box? My video card is Gigabyte
| HD 2600Pro. I would like to use a DVI-HDMI cable.
| Regards,
| Dima
|
| Wouldn't rule out technical reasons but probably they just don't want to
| answer questions about it. Puts the onus on you.
|
|
| Likely because HDMI has authentication handshaking built in to its
| protocol and the PC may not be savvy to such things.

HDMI and DVI are essentially the same thing, but with different connectors
and no standard for audio over DVI. Presumably you can even do HDCP over
DVI if it doesn't need the sound as part of its authentication checks.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

John Rumm December 15th 07 04:05 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
wrote:

use HDCP. It's only for protected content that it is expected to use
HDCP to ensure you cannot use a monitor that is really something like a
recorder, or let you tap the HDMI cable wires (it's encrypted in HDCP).


Unless you install AnyDVD on it first, and then the whole sorry mess
ceases to matter. ;-)

--
Cheers,

John.

/=================================================================\
| Internode Ltd - http://www.internode.co.uk                      |
|-----------------------------------------------------------------|
| John Rumm - john(at)internode(dot)co(dot)uk                     |
\=================================================================/

ray[_2_] December 15th 07 07:55 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Fri, 14 Dec 2007 04:51:41 -0800, D wrote:

Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


It probably can't tell that. But it can 'tell' if it's asked to do
something outside its range of capabilities.


Glenn Millar December 15th 07 11:03 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message
...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima

Wouldn't rule out technical reasons but probably they just don't want to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get full [email protected] progressive, whereas 720p or 1080i is
actually a lesser resolution.

Give it a try on the HDMI input. It may just work out. On the other hand, if
you have a 1080p screen, buy a good card with an HDMI output capable of 1080p.

Regards
Glenn.

steveo December 15th 07 11:07 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

wrote in message
...
In alt.tv.tech.hdtv Woody wrote:
|
| "T Shadow" wrote in message
| ...
| "D" wrote in message
|
...
| Hello!
| According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| image from a computer through its HDMI input, but through its d-sub
| only. Is it really true? How can the TV know that an image is coming
| from a computer, not a consumer set-top box? My video card is Gigabyte
| HD 2600Pro. I would like to use a DVI-HDMI cable.
| Regards,
| Dima
|
| Wouldn't rule out technical reasons but probably they just don't want to
| answer questions about it. Puts the onus on you.
|
|
| Likely because HDMI has authentication handshaking built in to its
| protocol and the PC may not be savvy to such things.

HDMI and DVI are essentially the same thing, but with different connectors
and no standard for audio over DVI. Presumably you can even do HDCP over
DVI if it doesn't need the sound as part of its authentication checks.


Most cable STBs use DVI, and they most certainly have HDCP enabled.

steveo


GeorgeB December 15th 07 11:36 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sat, 15 Dec 2007 22:03:14 +0000, Glenn Millar
wrote:

In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung is via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.

Regards
Glenn.


The VGA 15 pin D style handles higher resolutions pretty well; I use
1600x1200 via that connection and it has no problems I detect.

What I'd like, without spending money, of course, is 2400x1600 ... or
maybe 3200x1200 dual monitor.

But that costs money. With 6+ megapixel cameras, can the monitors be
far behind?

Scott Alfter December 15th 07 11:48 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung is via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.


There's no reason you can't use the same modeline over DVI or HDMI that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog connection.
LCDs sold for computer use have a button on them that usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline.

My TV has a native resolution of 1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.
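For anyone who hasn't done the arithmetic behind a modeline: the refresh rate is just the pixel clock divided by the total (active plus blanking) pixels per frame. A small sketch, using standard CVT-style numbers for 1280x768 as an assumed example (not Scott's actual modeline):

```python
def modeline_refresh(modeline):
    """Compute the refresh rate implied by an X11 modeline string.

    A modeline is: 'Modeline', a name, the pixel clock in MHz, four
    horizontal timings (active, sync start, sync end, total), then
    four vertical timings. Refresh = pixel_clock / (htotal * vtotal).
    """
    fields = modeline.split()
    clock_hz = float(fields[2]) * 1e6   # pixel clock, MHz -> Hz
    htotal = int(fields[6])             # 4th horizontal number
    vtotal = int(fields[10])            # 4th vertical number
    return clock_hz / (htotal * vtotal)

# Assumed CVT timing for 1280x768 at ~60 Hz
m = 'Modeline "1280x768_60.00" 79.5 1280 1344 1472 1664 768 771 778 798 -hsync +vsync'
print(round(modeline_refresh(m), 2))  # 59.87
```

Running numbers like these through the formula is a quick sanity check before putting a modeline into xorg.conf.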

_/_
/ v \ Scott Alfter (remove the obvious to send mail)
(IIGS( http://alfter.us/ Top-posting!
\_^_/ rm -rf /bin/laden What's the most annoying thing on Usenet?


G-squared December 15th 07 11:48 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 15, 2:03 pm, Glenn Millar wrote:
snip
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.

Give it a try on the HDMI input. It may just work out. On the other hand if
you have a 1080p screen, buy a good card with an HDMI output capable of 1080p.

Regards
Glenn.


How is 1080 less than 768?

GG

Glenn Millar December 16th 07 12:01 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung is via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.


There's no reason you can't use the same modeline over DVI or HDMI that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog connection.
LCDs sold for computer use have a button on them that usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline.

My TV has a native resolution of 1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.

_/_
/ v \ Scott Alfter (remove the obvious to send mail)
(IIGS( http://alfter.us/ Top-posting!
\_^_/ rm -rf /bin/laden What's the most annoying thing on Usenet?


Your experience may well be correct with other LCD or plasma TVs, but my
reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone has a profile for PowerStrip that
works correctly with a Samsung, I'd like a copy.

Regards
Glenn.

Måns Rullgård December 16th 07 01:00 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
Glenn Millar writes:

Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect
a PC. The best results on my 50" samsung is via the VGA
connector. That way I get full [email protected] progressive whereas
720p or 1080i is actually a lesser resolution.

There's no reason you can't use the same modeline over DVI or HDMI
that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog
connection. LCDs sold for computer use have a button on them that
usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline. My TV has a native resolution of
1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Your experience may well be correct with other LCD or Plasma TV's but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone has a profile for PowerStrip
that works correctly with a Samsung, I'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.
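The modes a TV "claims" over HDMI come from its EDID block, and decoding that is less mysterious than it sounds. As an illustration, here's a sketch that pulls the resolution and pixel clock out of an 18-byte EDID detailed timing descriptor; the bytes below are a constructed 1080p example, not a dump from the LE26R41BD:

```python
def parse_dtd(dtd):
    """Decode resolution and pixel clock from an 18-byte EDID
    Detailed Timing Descriptor (EDID 1.3 byte layout)."""
    # Bytes 0-1: pixel clock in 10 kHz units, little-endian
    clock_mhz = (dtd[0] | (dtd[1] << 8)) / 100.0
    # Byte 2: horizontal active (low 8 bits); byte 4 high nibble: upper 4 bits
    hactive = dtd[2] | ((dtd[4] & 0xF0) << 4)
    # Byte 5: vertical active (low 8 bits); byte 7 high nibble: upper 4 bits
    vactive = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return hactive, vactive, clock_mhz

# Constructed example: 1920x1080 with a 148.5 MHz pixel clock
dtd = [0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40] + [0] * 10
print(parse_dtd(dtd))  # (1920, 1080, 148.5)
```

Which is why forcing a mode the EDID doesn't advertise can still work: the advertisement and the panel's actual capabilities are two different things.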

--
Måns Rullgård


[email protected] December 16th 07 02:37 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
| ...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| | ...
| | "D" wrote in message
| |
| ...
| | Hello!
| | According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| | image from a computer through its HDMI input, but through its d-sub
| | only. Is it really true? How can the TV know that an image is coming
| | from a computer, not a consumer set-top box? My video card is Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons but probably they just don't want to
| | answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with different connectors
| and no standard for audio over DVI. Presumably you can even do HDCP over
| DVI if it doesn't need the sound as part of its authentication checks.
|
| Most cable STBs use DVI, and they most certainly have HDCP enabled.

DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

[email protected] December 16th 07 02:41 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
|In reality, you wouldn't want to use the HDMI connection to connect a
|PC. The best results on my 50" samsung is via the VGA connector. That
|way I get full [email protected] progressive whereas 720p or 1080i is
|actually a lesser resolution.
|
| There's no reason you can't use the same modeline over DVI or HDMI that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog connection.
| LCDs sold for computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline.
|
| My TV has a native resolution of 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.

Don't forget that some poor saps are stuck with Windows and don't know how
to get into the registry.

Now if I could only find a TV _or_ monitor that would do video at 23.976 Hz
frame rate, in LCD, in the size and resolution of interest.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

[email protected] December 16th 07 02:42 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Måns Rullgård wrote:
| Glenn Millar writes:
|
| Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
| In reality, you wouldn't want to use the HDMI connection to connect
| a PC. The best results on my 50" samsung is via the VGA
| connector. That way I get full [email protected] progressive whereas
| 720p or 1080i is actually a lesser resolution.
| There's no reason you can't use the same modeline over DVI or HDMI
| that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog
| connection. LCDs sold for computer use have a button on them that
| usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline. My TV has a native resolution of
| 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.
|
| Your experience may well be correct with other LCD or Plasma TV's but
| my reply was in relation to the Samsung screens. They don't like being
| connected to a PC via HDMI. If someone has a profile for PowerStrip
| that works correctly with a Samsung, I'd like a copy.
|
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

D[_2_] December 16th 07 09:06 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 14, 4:08 pm, Dr Hfuhruhurr wrote:
On 14 Dec, 12:51, D wrote:

Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


Some possible reasons below

http://www.behardware.com/articles/6...ic-card-and-mo...

http://www.drmblog.com/index.php?/ar...V_+_HDMI_+_HDC...

Doc

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32r71b does show
video through its HDMI input from a computer's DVI output, but at much lower
quality than through the D-sub input: there are black borders around the
image, and the image is much less sharp.
Regards,
Dima

Nigel Barker December 16th 07 09:48 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sat, 15 Dec 2007 16:48:01 -0600, (Scott Alfter) wrote:

My TV has a native resolution of 1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Are you sure about that resolution? I have seen 1024x768 and 1366x768, but never 1280 pixels as the
horizontal resolution. It likely only accepts 1280 over DVI/HDMI, but the native resolution is
probably 1366, as you would see if you connected via VGA.
--

Cheers

Nigel Barker
Live from the sunny Cote d'Azur
MCE MVP

Wes Newell December 16th 07 11:09 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sun, 16 Dec 2007 09:48:40 +0100, Nigel Barker wrote:

On Sat, 15 Dec 2007 16:48:01 -0600,
(Scott Alfter) wrote:

My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it
Just Works.


Are you sure about that resolution? I have seen 1024x768, 1366x768 but
never 1280 pixels as the horizontal resolution. It likely only supports
1280 over DVI/HDMI but the native resolution is actually 1366 if you
connected via VGA.


Mine's set to 1280x720. That feeds the TV an ATSC standard via either DVI
or VGA. I let the TV upscale it to 1366x768 in both cases. Works just like
it's supposed to.



--
Want the ultimate in free OTA SD/HDTV Recorder?
http://mythtv.org
My Tivo Experience http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help http://wesnewell.no-ip.com/cpu.php

Måns Rullgård December 16th 07 12:27 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
writes:

In alt.tv.tech.hdtv Måns Rullgård wrote:
| Glenn Millar writes:
|
| Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
| In reality, you wouldn't want to use the HDMI connection to connect
| a PC. The best results on my 50" samsung is via the VGA
| connector. That way I get full [email protected] progressive whereas
| 720p or 1080i is actually a lesser resolution.
| There's no reason you can't use the same modeline over DVI or
| HDMI that you would use with VGA. On the contrary, in my
| experience it's been much easier to get LCDs working on a
| digital connection than on an analog connection. LCDs sold for
| computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option.
| To get a 1:1 correspondence between pixels in the framebuffer
| and pixels on the screen, you then have to do extensive
| tinkering with modelines...and you might never come up with a
| working modeline. My TV has a native resolution of 1280x768. I
| generated a modeline for that resolution at 60 Hz and plugged it
| into xorg.conf, and over DVI, it Just Works.
|
| Your experience may well be correct with other LCD or Plasma TV's but
| my reply was in relation to the Samsung screens. They don't like being
| connected to a PC via HDMI. If someone has a profile for PowerStrip
| that works correctly with a Samsung, I'd like a copy.
|
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?


I haven't tried, so I don't know.

--
Måns Rullgård
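
For anyone wanting to reproduce the modeline approach described above on Linux: generate CVT timings for the panel's resolution with the `cvt` tool and add the result to xorg.conf. A sketch for the 1280x768 @ 60 Hz case mentioned earlier; the timing numbers below are CVT output and should be regenerated for your own display:

```
# $ cvt 1280 768 60
# 1280x768 59.87 Hz (CVT) hsync: 47.78 kHz; pclk: 79.50 MHz

Section "Monitor"
    Identifier "TV"
    Modeline "1280x768_60.00"  79.50  1280 1344 1472 1664  768 771 778 798 -hsync +vsync
EndSection

# Then reference the mode in the matching Screen section:
Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    SubSection "Display"
        Modes  "1280x768_60.00"
    EndSubSection
EndSection
```

If the TV's EDID rejects the mode, some drivers provide options to relax EDID-based mode validation; check your driver's documentation.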


Bigguy[_3_] December 16th 07 02:18 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
wrote:
In alt.tv.tech.hdtv D wrote:


A colleague at work has verified that his 37" Sharp Aquos TV works fine
with his video card's DVI output connected to the TV's HDMI input via a
DVI-to-HDMI cable.


Ditto here with a 52" Sharp Aquos - 1920x1080 from the PC's DVI out.

Guy

D[_2_] December 16th 07 04:14 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 1:03 am, Glenn Millar wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message
...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a comsumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just don't want
to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung is via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is in
actually a lesser resolution.

Give it a try on the HDMI input. It just work out. On the other hand if
you have a 1080p screen, buy good card with a HDMI output capable of 1080p.

Regards
Glenn.

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32R71B does show
video through its HDMI input from a computer's DVI output, but at much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp.
Regards,
Dima

D[_2_] December 16th 07 04:38 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 3:00 am, Måns Rullgård wrote:
Glenn Millar writes:
Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect
a PC. The best results on my 50" samsung is via the VGA
connector. That way I get full [email protected] progressive whereas
720p or 1080i is in actually a lesser resolution.
There's no reason you can't use the same modeline over DVI or HDMI
that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog
connection. LCDs sold for computer use have a button on them that
usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline. My TV has a native resolution of
1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Your experience may well be correct with other LCD or Plasma TV's but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone get a profile for PowerStrip
that works correctly with a samsung i'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.

--
Måns Rullgård

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32R71B does show
video through its HDMI input from a computer's DVI output, but at much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp. I do not change the
output resolution (1360x768) when switching from D-sub to DVI.
Regards,
Dima

J. Clarke December 16th 07 11:36 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
John Rumm wrote:
wrote:

use HDCP. It's only for protected content that it is expected to
use
HDCP to ensure you cannot use a monitor that is really something
like a recorder, or let you tap the HDMI cable wires (it's
encrypted
in HDCP).


Unless you install AnyDVD on it first, and then the whole sorry mess
ceases to matter. ;-)


Does it in fact allow the display of HD content off Blu-ray discs at
1920x1080 resolution on a non-HDCP-compliant monitor? I was under the
impression that down-conversion was implemented at the firmware level
or below.

--
--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)



D[_2_] December 17th 07 07:11 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 1:09 pm, Wes Newell wrote:
On Sun, 16 Dec 2007 09:48:40 +0100, Nigel Barker wrote:
On Sat, 15 Dec 2007 16:48:01 -0600,
(Scott Alfter) wrote:


My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it
Just Works.


Are you sure about that resolution? I have seen 1024x768, 1366x768 but
never 1280 pixels as the horizontal resolution. It likely only supports
1280 over DVI/HDMI but the native resolution is actually 1366 if you
connected via VGA.


Mines set to 1280x720. That feeds the TV an ATSC standard via either DVI
or VGA. I let the TV upscale it to 1366x768 in both cases. Works just like
it's supposed to.

--
Want the ultimate in free OTA SD/HDTV Recorder? http://mythtv.org
My Tivo Experience http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help http://wesnewell.no-ip.com/cpu.php

The upscaling makes the image less sharp, doesn't it?
Regards,
Dima
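
The sharpness question above comes down to the scale factor: mapping 1280x720 onto a 1366x768 panel is a non-integer ratio in both axes, so every output pixel is interpolated from several source pixels. A quick check (Python; the resolutions are the ones mentioned in the post):

```python
# Scaling 1280x720 (ATSC 720p) onto a 1366x768 panel:
panel_w, panel_h = 1366, 768
src_w, src_h = 1280, 720

sx = panel_w / src_w   # horizontal scale factor, ~1.067
sy = panel_h / src_h   # vertical scale factor, ~1.067

print(f"scale: {sx:.4f} x {sy:.4f}")
# Non-integer factors mean every output pixel is interpolated from
# several source pixels, which softens fine detail such as text.
non_integer = (panel_w % src_w != 0) or (panel_h % src_h != 0)
print("interpolation required:", non_integer)
```

This is why a 1:1 pixel mapping at the panel's native resolution, as discussed earlier in the thread, gives the sharpest desktop image.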

D[_2_] December 17th 07 07:13 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 4:41 am, wrote:
In alt.tv.tech.hdtv Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
|In reality, you wouldn't want to use the HDMI connection to connect a
|PC. The best results on my 50" samsung is via the VGA connector. That
|way I get full [email protected] progressive whereas 720p or 1080i is in
|actually a lesser resolution.
|
| There's no reason you can't use the same modeline over DVI or HDMI that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog connection.
| LCDs sold for computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline.
|
| My TV has a native resolution of 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.

Don't forget that some poor saps are stuck with Windows and don't know how
to get into the registry.

Now if I could only find a TV _or_ monitor that would do video at 23.976 Hz
frame rate, in LCD, in the size and resolution of interest.

--
|---------------------------------------/-----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/--------------------------------------|


Why do you need the 23.976 Hz frame rate in LCD?
Regards,
Dima
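
Some context for the 23.976 Hz question above: NTSC-derived film content runs at 24000/1001 frames per second, and showing it on a ~59.94 Hz display requires 3:2 pulldown, which repeats frames unevenly and causes judder. A display that refreshes at the film rate shows every frame for an equal time. The arithmetic, sketched in Python:

```python
from fractions import Fraction

film = Fraction(24000, 1001)       # NTSC film rate, ~23.976 fps
display = Fraction(60000, 1001)    # ~59.94 Hz display refresh

print(float(film))                 # ~23.976

# Refreshes per film frame: exactly 5/2, so frames must alternate
# between 3 and 2 refreshes each (3:2 pulldown) to keep sync.
ratio = display / film
print(ratio)                       # 5/2

# The uneven 3-2-3-2 repetition is the source of pulldown judder;
# a display refreshing at 23.976 Hz shows each frame once, evenly.
```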

D[_2_] December 17th 07 07:21 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 3:00 am, Måns Rullgård wrote:
Glenn Millar writes:
Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect
a PC. The best results on my 50" samsung is via the VGA
connector. That way I get full [email protected] progressive whereas
720p or 1080i is in actually a lesser resolution.
There's no reason you can't use the same modeline over DVI or HDMI
that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog
connection. LCDs sold for computer use have a button on them that
usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline. My TV has a native resolution of
1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Your experience may well be correct with other LCD or Plasma TV's but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone get a profile for PowerStrip
that works correctly with a samsung i'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.

--
Måns Rullgård

Thanks, Måns Rullgård, for your reply!
My Samsung TV (LE32R71B, panel resolution 1366x768) accepts 1360x768
resolution over HDMI too, but the image is much less sharp than over
D-sub (after applying the TV's automatic calibration), and there are
black borders around the image. Do you see the same borders and the
same loss of sharpness? If queried, it too claims to support only the
usual HDTV modes (720x480/576, 1280x720, 1920x1080i).
Regards,
Dima
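
On the "if queried" point: the modes a TV claims to support come from its EDID block, which can be inspected directly on Linux. The tools below are standard (xrandr from X11; get-edid/parse-edid from the read-edid package), though the exact output depends on the TV:

```
# List the modes the display advertises (and, with --verbose, the raw EDID):
xrandr --verbose

# Or dump and decode the EDID block over DDC:
sudo get-edid | parse-edid
```

If the EDID lists only the standard HDTV modes, forcing another mode (as described above) bypasses what the TV advertises, not necessarily what the panel can actually display.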

D[_2_] December 17th 07 07:22 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 4:42 am, wrote:
In alt.tv.tech.hdtv Måns Rullgård wrote:
| Glenn Millar writes:
|| Scott Alfter wrote:

| In article ,
| Glenn Millar wrote:
| In reality, you wouldn't want to use the HDMI connection to connect
| a PC. The best results on my 50" samsung is via the VGA
| connector. That way I get full [email protected] progressive whereas
| 720p or 1080i is in actually a lesser resolution.
| There's no reason you can't use the same modeline over DVI or HDMI
| that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog
| connection. LCDs sold for computer use have a button on them that
| usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline. My TV has a native resolution of
| 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.
|
| Your experience may well be correct with other LCD or Plasma TV's but
| my reply was in relation to the Samsung screens. They don't like being
| connected to a PC via HDMI. If someone get a profile for PowerStrip
| that works correctly with a samsung i'd like a copy.
|
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?

--
|---------------------------------------/-----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/--------------------------------------|

Thanks, Phil Howard, for your reply!
Why do you need frame rates below 50 Hz, such as 24 Hz?
Regards,
Dima

steveo December 18th 07 06:32 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

wrote in message
...
In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
| ...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| | ...
| | "D" wrote in message
| |
|
...
| | Hello!
| | According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| | image from a computer through its HDMI input, but through its d-sub
| | only. Is it really true? How can the TV know that an image is
coming
| | from a computer, not a comsumer set-top box? My video card is
| | Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons but probably they just don't
want
| | to
| | answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with different
connection
| and no standard for audio over DVI. Presumably you can even do HDCP
over
| DVI if it doesn't need the sound are part of its authentication checks.
|
| Most cable STB use DVI and they most certainly have HDCP enabled.

DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?


Yes, DVI can carry HDCP-encrypted video, and yes, it is a common
interface on cable STBs. They do not pass audio over DVI. The two
different HD STBs I have had from Cox have had both coaxial and optical
audio connections, along with stereo RCA, of course.

I keep seeing people comment that they expect it to. Is there some
specification for DVI to carry audio that some computers and monitors
use? I'm pretty sure no monitors marketed as TVs accept audio over DVI,
and everything I have ever read on DVI says it does not carry audio.

steveo


D[_2_] December 18th 07 09:59 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 17, 1:38 am, "T Shadow" wrote:
"D" wrote in message

...





On Dec 16, 1:03 am, Glenn Millar wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message


...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a comsumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just don't want
to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung is via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is in
actually a lesser resolution.


Give it a try on the HDMI input. It just work out. On the other hand if
you have a 1080p screen, buy good card with a HDMI output capable of

1080p.

Regards
Glenn.

Hello!
I have bought Gembird DVI-HDMI cable. Samsung LE-32r71b does show
video through HDMI input from a computer DVI output, but of much
lower
quality than through D-sub input: there ara black borders around the
image, the image is much less sharp.
Regards,
Dima


I was under the impression HDCP was only used for protected content but
according to the link below it may be needed to get full resolution or even
a picture. Does your video card support HDCP? An article I read last year
indicated none did. That probably has changed.

http://en.wikipedia.org/wiki/Hdcp

Yes, my video card supports HDCP:
http://www.giga-byte.co.uk/Products/...ProductID=2589
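
For the curious, the HDCP 1.x handshake discussed in this thread is a Blom-style key agreement: each device holds 40 secret keys derived from the licensor's symmetric master matrix, plus a public KSV with exactly 20 of 40 bits set; each side sums its own secret keys selected by the other side's KSV bits, and both arrive at the same shared key. The toy below (Python, small sizes, made-up modulus) only illustrates why the two sums agree; it is not the real HDCP parameters or cipher.

```python
import random

MOD = 2**16   # toy modulus (real HDCP 1.x adds 56-bit keys mod 2**56)
N = 8         # toy vector size (real HDCP: 40 keys, KSV with 20 bits set)

random.seed(1)
# Licensing authority's secret symmetric matrix S (S[i][j] == S[j][i]).
S = [[0] * N for _ in range(N)]
for i in range(N):
    for j in range(i, N):
        S[i][j] = S[j][i] = random.randrange(MOD)

def make_device():
    """A device gets a public KSV (bit vector) and private keys S @ ksv."""
    ksv = [random.randrange(2) for _ in range(N)]
    keys = [sum(S[i][j] * ksv[j] for j in range(N)) % MOD for i in range(N)]
    return ksv, keys

ksv_tx, keys_tx = make_device()   # e.g. the video card
ksv_rx, keys_rx = make_device()   # e.g. the TV

# Each side sums its own secret keys selected by the *other's* public KSV:
km_tx = sum(k for k, b in zip(keys_tx, ksv_rx) if b) % MOD
km_rx = sum(k for k, b in zip(keys_rx, ksv_tx) if b) % MOD

# Both equal ksv_tx . S . ksv_rx because S is symmetric.
print(km_tx == km_rx)
```

The exchanged KSVs are public; only devices with valid key vectors (issued under license) can derive the shared key, which is then used to encrypt the link.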

D[_2_] December 18th 07 10:06 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 18, 8:32 am, "steveo" wrote:
wrote in message

...





In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| ...
| | "D" wrote in message
| |
|
...
| | Hello!
| | According to Samsung LE-32r71b HDTV manual the TV cannot receive an
| | image from a computer through its HDMI input, but through its d-sub
| | only. Is it really true? How can the TV know that an image is
coming
| | from a computer, not a comsumer set-top box? My video card is
| | Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons but probably they just don't
want
| | to
| | answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with different
connection
| and no standard for audio over DVI. Presumably you can even do HDCP
over
| DVI if it doesn't need the sound are part of its authentication checks.
|
| Most cable STB use DVI and they most certainly have HDCP enabled.


DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?


Yes, DVI can carry HDCP encrypted video and yes it is a common interface
from cable STB. They do not pass audio over DVI. The two different HD STBs
I have had from Cox have had both coaxial and optical audio connections,
along with stereo RCA of course.

I keep see people making comments that they expect it to. Is there some
specification for DVI to carry audio that some computers and monitors use?
I'm pretty sure no monitors marketed as TVs accept audio over DVI and
everything I have ever read on DVI says that it does not carry audio.

steveo

Most ATI Radeon HD cards output HD audio through DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html

J. Clarke December 18th 07 02:56 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
D wrote:
On Dec 18, 8:32 am, "steveo" wrote:
wrote in message

...





In alt.tv.tech.hdtv steveo wrote:

wrote in message
...
In alt.tv.tech.hdtv Woody wrote:

"T Shadow" wrote in message
...
"D" wrote in message


...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot
receive an image from a computer through its HDMI input, but
through its d-sub only. Is it really true? How can the TV
know
that an image is coming from a computer, not a comsumer
set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima

Wouldn't rule out technical reasons but probably they just
don't want to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to
its protocol and the PC may not be savvy to such things.

HDMI and DVI are essentially the same thing, but with different
connection and no standard for audio over DVI. Presumably you
can even do HDCP over DVI if it doesn't need the sound are part
of its authentication checks.

Most cable STB use DVI and they most certainly have HDCP enabled.


DVI? Really? So when you hook it to your TV with a DVI-to-HDMI
cable, do you hear anything?


Yes, DVI can carry HDCP encrypted video and yes it is a common
interface from cable STB. They do not pass audio over DVI. The
two
different HD STBs I have had from Cox have had both coaxial and
optical audio connections, along with stereo RCA of course.

I keep see people making comments that they expect it to. Is there
some specification for DVI to carry audio that some computers and
monitors use? I'm pretty sure no monitors marketed as TVs accept
audio over DVI and everything I have ever read on DVI says that it
does not carry audio.

steveo

Most of ATI Redeon HD cards output HD audio trough DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html


FWIW, HDMI doesn't have separate wires for audio, so it has to be
multiplexed into the datastream. That being the case, there's no reason
it can't be carried over DVI.

--
--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
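
The multiplexing point above can be sketched: HDMI carries audio as packets inserted into the blanking intervals between video lines ("data islands"), so the same three TMDS pairs carry both signals, and the same wires exist in a DVI connector. A deliberately simplified illustration (Python; real HDMI adds packet headers, BCH error correction, and TERC4 coding, none of which is modelled here):

```python
# Toy model: slot audio packets into the horizontal blanking gap that
# follows each video line, then recover both streams from one "wire".

def mux(video_lines, audio_packets):
    stream, audio = [], list(audio_packets)
    for line in video_lines:
        stream.append(("video", line))
        if audio:                      # blanking interval: insert audio
            stream.append(("audio", audio.pop(0)))
    return stream

def demux(stream):
    video = [p for kind, p in stream if kind == "video"]
    audio = [p for kind, p in stream if kind == "audio"]
    return video, audio

v = [f"line{i}" for i in range(4)]
a = ["L/R samples 0", "L/R samples 1"]
stream = mux(v, a)
print(demux(stream) == (v, a))   # True: both recovered from one stream
```

Whether a given DVI source or sink actually implements this packetization is another matter, which is the practical disagreement in the thread.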



steveo December 18th 07 05:15 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

"D" wrote in message
...
On Dec 18, 8:32 am, "steveo" wrote:
wrote in message

...





In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| ...
| | "D" wrote in message
| |
|
...
| | Hello!
| | According to Samsung LE-32r71b HDTV manual the TV cannot receive
an
| | image from a computer through its HDMI input, but through its
d-sub
| | only. Is it really true? How can the TV know that an image is
coming
| | from a computer, not a comsumer set-top box? My video card is
| | Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons but probably they just don't
want
| | to
| | answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with different
connection
| and no standard for audio over DVI. Presumably you can even do HDCP
over
| DVI if it doesn't need the sound are part of its authentication
checks.
|
| Most cable STB use DVI and they most certainly have HDCP enabled.


DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?


Yes, DVI can carry HDCP encrypted video and yes it is a common interface
from cable STB. They do not pass audio over DVI. The two different HD
STBs
I have had from Cox have had both coaxial and optical audio connections,
along with stereo RCA of course.

I keep see people making comments that they expect it to. Is there some
specification for DVI to carry audio that some computers and monitors
use?
I'm pretty sure no monitors marketed as TVs accept audio over DVI and
everything I have ever read on DVI says that it does not carry audio.

steveo

Most of ATI Redeon HD cards output HD audio trough DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html


Nowhere on that page does it say that there is audio over the DVI. It
does say that you can get "HDMI and 5.1 surround audio (by optional
adapter)", but that would be achieved by combining the output from the
DVI port and one of the audio ports through the aforementioned adapter.

steveo


steveo December 18th 07 05:17 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

"T Shadow" wrote in message
...
"D" wrote in message
...
On Dec 17, 1:38 am, "T Shadow" wrote:
"D" wrote in message


...





On Dec 16, 1:03 am, Glenn Millar wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message


...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot
receive

an
image from a computer through its HDMI input, but through its

d-sub
only. Is it really true? How can the TV know that an image is

coming
from a computer, not a comsumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just don't

want
to
answer questions about it. Puts the onus on you.

Likely because HDMI has authentication handshaking built in to
its
protocol and the PC may not be savvy to such things.

In reality, you wouldn't want to use the HDMI connection to connect

a
PC. The best results on my 50" samsung is via the VGA connector.

That
way I get full [email protected] progressive whereas 720p or 1080i is
in
actually a lesser resolution.

Give it a try on the HDMI input. It just work out. On the other
hand

if
you have a 1080p screen, buy good card with a HDMI output capable
of
1080p.

Regards
Glenn.
Hello!
I have bought Gembird DVI-HDMI cable. Samsung LE-32r71b does show
video through HDMI input from a computer DVI output, but of much
lower
quality than through D-sub input: there ara black borders around the
image, the image is much less sharp.
Regards,
Dima

I was under the impression HDCP was only used for protected content but
according to the link below it may be needed to get full resolution or

even
a picture. Does your video card support HDCP? An article I read last

year
indicated none did. That probably has changed.

http://en.wikipedia.org/wiki/Hdcp

Yes, my video card supports HDCP:

http://www.giga-byte.co.uk/Products/...ProductID=2589

I'd assume "HDMI ready" means something else is needed and not presently
supported. DVI is not HDMI and has no pins for sound..


DVI can carry HDCP-encrypted content, but not all implementations of
DVI include the decryption hardware.

steveo



All times are GMT +1. The time now is 06:59 AM.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com