HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   How can a TV know that an image is coming from a computer, not a consumer set-top box? (http://www.homecinemabanter.com/showthread.php?t=55341)

D[_2_] December 17th 07 07:11 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 1:09 pm, Wes Newell wrote:
On Sun, 16 Dec 2007 09:48:40 +0100, Nigel Barker wrote:
On Sat, 15 Dec 2007 16:48:01 -0600, (Scott Alfter) wrote:


My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it
Just Works.


Are you sure about that resolution? I have seen 1024x768 and 1366x768, but
never 1280 pixels as the horizontal resolution. It likely only accepts
1280 over DVI/HDMI, but the native resolution is actually 1366 if you
connect via VGA.


Mine's set to 1280x720. That feeds the TV an ATSC standard via either DVI
or VGA. I let the TV upscale it to 1366x768 in both cases. Works just like
it's supposed to.

--
Want the ultimate in free OTA SD/HDTV Recorder? http://mythtv.org
My Tivo Experience: http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared: http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help: http://wesnewell.no-ip.com/cpu.php

The upscaling makes the image less sharp, doesn't it?
Regards,
Dima
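For anyone wanting to reproduce the setup described above, modelines can be written into xorg.conf by hand. The fragment below is a sketch, not taken from either poster's config: the 1280x768 line uses the standard VESA CVT timing for 60 Hz, the 1280x720 line uses the CEA-861 720p60 timing, and the Identifier name is made up. A real TV should be checked against its own EDID, and `cvt 1280 768 60` or `gtf 1280 768 60` can generate timings for other resolutions.

```
# Sketch of the xorg.conf approach described above.
Section "Monitor"
    Identifier "TV"
    # VESA CVT timing for 1280x768 at 60 Hz: 79.50 MHz pixel clock
    Modeline "1280x768_60" 79.50 1280 1344 1472 1664  768 771 778 798 -hsync +vsync
    # CEA-861 720p60 timing: 74.25 MHz pixel clock
    Modeline "1280x720_60" 74.25 1280 1390 1430 1650  720 725 730 750 +hsync +vsync
EndSection
```

The mode is then selected by name in the Modes list of the Screen section's Display subsection.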

D[_2_] December 17th 07 07:13 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 4:41 am, wrote:
In alt.tv.tech.hdtv Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
|In reality, you wouldn't want to use the HDMI connection to connect a
|PC. The best results on my 50" samsung are via the VGA connector. That
|way I get full [email protected] progressive whereas 720p or 1080i is
|actually a lesser resolution.
|
| There's no reason you can't use the same modeline over DVI or HDMI that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog connection.
| LCDs sold for computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline.
|
| My TV has a native resolution of 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.

Don't forget that some poor saps are stuck with Windows and don't know how
to get into the registry.

Now if I could only find a TV _or_ monitor that would do video at 23.976 Hz
frame rate, in LCD, in the size and resolution of interest.

--
|---------------------------------------/-----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/--------------------------------------|


Why do you need the 23.976 Hz frame rate in LCD?
Regards,
Dima
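Context for the question: 23.976 Hz is the NTSC film rate, 24000/1001, and a display that refreshes at exactly that rate can show film-sourced video without 3:2 pulldown judder. Below is a hypothetical sketch of what such a modeline would look like, derived from the standard CEA-861 1080p/24 timing by dividing the pixel clock by 1.001; whether a given LCD will actually lock to this non-standard clock is exactly the open question Phil raises, and many sets of this era will not.

```
# Hypothetical 1920x1080 @ 23.976 Hz (24000/1001) modeline.
# Base: CEA-861 1080p/24 timing (74.25 MHz, 2750x1125 total).
# 74.25 MHz / 1.001 = 74.176 MHz; 74.176e6 / (2750 * 1125) = 23.976 Hz
Modeline "1920x1080_23.976" 74.176 1920 2558 2602 2750  1080 1084 1089 1125 +hsync +vsync
```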

D[_2_] December 17th 07 07:21 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 3:00 am, Måns Rullgård wrote:
Glenn Millar writes:
[earlier quoted text snipped]


Your experience may well be correct with other LCD or plasma TVs, but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone has a profile for PowerStrip
that works correctly with a Samsung, I'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.

--
Måns Rullgård

Thanks Måns Rullgård for your reply!
My Samsung TV (LE32R71B, panel resolution 1366x768) accepts a 1360x768
resolution over HDMI too, but the image is much less sharp than over
D-sub (after applying the TV's automatic calibration), and there are
black borders around the image. Do you have the same borders and less
sharp image? If queried, it too claims to support only the usual HDTV
modes (720x480/576, 1280x720, 1920x1080i).
Regards,
Dima
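"If queried" here means reading the TV's EDID, a small data block the display returns over the DDC wires of the VGA, DVI, or HDMI connector; the graphics driver uses it to build the list of modes it offers. Below is a minimal sketch of decoding one 18-byte Detailed Timing Descriptor, the EDID record that carries a display's preferred mode. The field layout follows the EDID 1.3 specification; the sample bytes are hand-constructed for illustration (they encode the standard CEA 720p60 timing), not read from a real Samsung.

```python
# Decode one 18-byte EDID Detailed Timing Descriptor (DTD) -- the record
# a display uses to advertise a supported video mode (EDID 1.3 layout).

def decode_dtd(d):
    pixel_clock_hz = (d[0] | d[1] << 8) * 10_000    # stored in 10 kHz units, little-endian
    h_active = d[2] | (d[4] >> 4) << 8              # low byte + high nibble
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] >> 4) << 8
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    refresh_hz = pixel_clock_hz / (h_total * v_total)
    return h_active, v_active, refresh_hz

# Hand-built sample encoding 720p60: clock 74.25 MHz = 7425 * 10 kHz (0x1D01),
# h_active 1280 / h_blank 370, v_active 720 / v_blank 30 (totals 1650 x 750).
sample = bytes([0x01, 0x1D, 0x00, 0x72, 0x51, 0xD0, 0x1E, 0x20,
                0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

w, h, hz = decode_dtd(sample)
print(f"{w}x{h} @ {hz:.2f} Hz")   # -> 1280x720 @ 60.00 Hz
```

On Linux the raw bytes can typically be found in /sys/class/drm/*/edid or dumped with tools such as read-edid/parse-edid.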

D[_2_] December 17th 07 07:22 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 4:42 am, wrote:
In alt.tv.tech.hdtv Måns Rullgård wrote:
[earlier quoted text snipped]
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?

--
|---------------------------------------/-----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/--------------------------------------|

Thanks Phil Howard for your reply!
Why do you need frame rates below 50 Hz, like 24 Hz?
Regards,
Dima

steveo December 18th 07 06:32 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

wrote in message
...
In alt.tv.tech.hdtv steveo wrote:
|
| wrote in message
| ...
| In alt.tv.tech.hdtv Woody wrote:
| |
| | "T Shadow" wrote in message
| | ...
| | "D" wrote in message
| |
|
...
| | Hello!
| | According to the Samsung LE-32r71b HDTV manual, the TV cannot receive an
| | image from a computer through its HDMI input, but through its d-sub
| | only. Is it really true? How can the TV know that an image is coming
| | from a computer, not a consumer set-top box? My video card is a Gigabyte
| | HD 2600Pro. I would like to use a DVI-HDMI cable.
| | Regards,
| | Dima
| |
| | Wouldn't rule out technical reasons, but probably they just don't want
| | to answer questions about it. Puts the onus on you.
| |
| |
| | Likely because HDMI has authentication handshaking built in to its
| | protocol and the PC may not be savvy to such things.
|
| HDMI and DVI are essentially the same thing, but with a different
| connector and no standard for audio over DVI. Presumably you can even
| do HDCP over DVI if it doesn't need the sound as part of its
| authentication checks.
|
| Most cable STB use DVI and they most certainly have HDCP enabled.

DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?


Yes, DVI can carry HDCP-encrypted video, and yes, it is a common interface
from cable STB. They do not pass audio over DVI. The two different HD STBs
I have had from Cox have had both coaxial and optical audio connections,
along with stereo RCA of course.

I keep seeing people make comments that they expect it to. Is there some
specification for DVI to carry audio that some computers and monitors use?
I'm pretty sure no monitors marketed as TVs accept audio over DVI and
everything I have ever read on DVI says that it does not carry audio.

steveo


D[_2_] December 18th 07 09:59 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 17, 1:38 am, "T Shadow" wrote:
"D" wrote in message

...





On Dec 16, 1:03 am, Glenn Millar wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message


...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a comsumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just don't want
to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" samsung are via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.


Give it a try on the HDMI input. It may just work out. On the other hand,
if you have a 1080p screen, buy a good card with an HDMI output capable
of 1080p.

Regards
Glenn.

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32r71b does show
video through its HDMI input from a computer's DVI output, but of much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp.
Regards,
Dima


I was under the impression HDCP was only used for protected content but
according to the link below it may be needed to get full resolution or even
a picture. Does your video card support HDCP? An article I read last year
indicated none did. That probably has changed.

http://en.wikipedia.org/wiki/Hdcp

Yes, my video card supports HDCP:
http://www.giga-byte.co.uk/Products/...ProductID=2589

D[_2_] December 18th 07 10:06 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 18, 8:32 am, "steveo" wrote:
[earlier quoted text snipped]

DVI? Really? So when you hook it to your TV with a DVI-to-HDMI cable,
do you hear anything?


Yes, DVI can carry HDCP encrypted video and yes it is a common interface
from cable STB. They do not pass audio over DVI. The two different HD STBs
I have had from Cox have had both coaxial and optical audio connections,
along with stereo RCA of course.

I keep seeing people make comments that they expect it to. Is there some
specification for DVI to carry audio that some computers and monitors use?
I'm pretty sure no monitors marketed as TVs accept audio over DVI and
everything I have ever read on DVI says that it does not carry audio.

steveo

Most ATI Radeon HD cards output HD audio through DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html

J. Clarke December 18th 07 02:56 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
D wrote:
[earlier quoted text snipped]
Most ATI Radeon HD cards output HD audio through DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html


FWIW, HDMI doesn't have separate wires for audio, so it has to be
multiplexed into the datastream. That being the case, there's no
reason it can't be carried over DVI.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
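The multiplexing John mentions works by sending audio in "data island" packets during the blanking intervals of the video signal, so no extra conductors are needed. The back-of-envelope sketch below, using the standard CEA-861 720p60 timing and deliberately ignoring HDMI's packet-framing overhead, suggests why even a demanding audio stream fits in the blanking:

```python
# Rough arithmetic behind "audio is multiplexed into the datastream":
# HDMI sends audio in data-island packets during video blanking.
# CEA-861 720p60 timing: 1650x750 total pixels, 1280x720 active.
# Packet framing overhead is ignored, so this is only an upper-bound sketch.

h_total, v_total = 1650, 750
h_active, v_active = 1280, 720
fps = 60

blank_pixels_per_frame = h_total * v_total - h_active * v_active   # 315,900
blank_pixels_per_sec = blank_pixels_per_frame * fps                # 18,954,000

# A demanding audio stream: 8 channels * 192 kHz * 24-bit
audio_bits_per_sec = 8 * 192_000 * 24                              # 36,864,000

# Under 2 bits per blanking clock would suffice; HDMI's TERC4 coding
# carries several bits per clock, so there is ample headroom.
print(f"bits needed per blanking clock: "
      f"{audio_bits_per_sec / blank_pixels_per_sec:.2f}")
```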



steveo December 18th 07 05:15 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

"D" wrote in message
...
[earlier quoted text snipped]
Most ATI Radeon HD cards output HD audio through DVI. See
http://www.giga-byte.co.uk/Products/...ProductID=2589
or http://ati.amd.com/products/Radeonhd2400/specs.html


Nowhere on that page does it say that there is audio over DVI. It does
say that you can get "HDMI and 5.1 surround audio (by optional adapter)" but
that would be achieved by combining the output from the DVI port and one of
the audio ports through the aforementioned adapter.

steveo


steveo December 18th 07 05:17 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 

"T Shadow" wrote in message
...
"D" wrote in message
...
[earlier quoted text snipped]

Yes, my video card supports HDCP:

http://www.giga-byte.co.uk/Products/...ProductID=2589

I'd assume "HDMI ready" means something else is needed and not presently
supported. DVI is not HDMI and has no pins for sound.


DVI can carry HDCP-encrypted content, but not all implementations of DVI
have the decoders.

steveo



All times are GMT +1. The time now is 06:59 AM.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com