HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   High definition TV (http://www.homecinemabanter.com/forumdisplay.php?f=6)
-   -   How can a TV know that an image is coming from a computer, not a consumer set-top box? (http://www.homecinemabanter.com/showthread.php?t=55341)

[email protected] December 16th 07 02:41 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
|In reality, you wouldn't want to use the HDMI connection to connect a
|PC. The best results on my 50" Samsung are via the VGA connector. That
|way I get full [email protected] progressive whereas 720p or 1080i is
|actually a lesser resolution.
|
| There's no reason you can't use the same modeline over DVI or HDMI that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog connection.
| LCDs sold for computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline.
|
| My TV has a native resolution of 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.

Don't forget that some poor saps are stuck with Windows and don't know how
to get into the registry.

Now if I could only find a TV _or_ monitor that would do video at 23.976 Hz
frame rate, in LCD, in the size and resolution of interest.

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|
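As an aside, a quick sanity check on any generated modeline like the one quoted above is to divide the pixel clock by the total pixels per frame. A minimal sketch in Python; the timing numbers come from the standard CVT formula for 1280x768 at 60 Hz (what `cvt 1280 768 60` emits) and are illustrative, not necessarily the poster's actual modeline:

```python
# Sanity-check a candidate modeline's refresh rate before adding it to
# xorg.conf. Timings below are the CVT values for 1280x768 at 60 Hz.
def modeline_refresh(pclk_mhz, htotal, vtotal):
    """Refresh rate in Hz: pixel clock divided by total pixels per frame."""
    return pclk_mhz * 1e6 / (htotal * vtotal)

# Modeline "1280x768_60.00" 79.50 1280 1344 1472 1664 768 771 778 798
refresh = modeline_refresh(79.50, 1664, 798)
print(f"{refresh:.2f} Hz")  # ~59.87 Hz
```

CVT-derived "60 Hz" modes typically land slightly under 60 Hz, which most panels accept without complaint.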

[email protected] December 16th 07 02:42 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
In alt.tv.tech.hdtv Måns Rullgård wrote:
| Glenn Millar writes:
|
| Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
| In reality, you wouldn't want to use the HDMI connection to connect
| a PC. The best results on my 50" Samsung are via the VGA
| connector. That way I get full [email protected] progressive whereas
| 720p or 1080i is actually a lesser resolution.
| There's no reason you can't use the same modeline over DVI or HDMI
| that you
| would use with VGA. On the contrary, in my experience it's been much easier
| to get LCDs working on a digital connection than on an analog
| connection. LCDs sold for computer use have a button on them that
| usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
| correspondence between pixels in the framebuffer and pixels on the screen,
| you then have to do extensive tinkering with modelines...and you might never
| come up with a working modeline. My TV has a native resolution of
| 1280x768. I generated a modeline for that
| resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
| Works.
|
| Your experience may well be correct with other LCD or Plasma TVs but
| my reply was in relation to the Samsung screens. They don't like being
| connected to a PC via HDMI. If someone gets a profile for PowerStrip
| that works correctly with a Samsung I'd like a copy.
|
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?

--
|---------------------------------------/----------------------------------|
| Phil Howard KA9WGN (ka9wgn.ham.org) / Do not send to the address below |
| first name lower case at ipal.net / |
|------------------------------------/-------------------------------------|

D[_2_] December 16th 07 09:06 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 14, 4:08 pm, Dr Hfuhruhurr wrote:
On 14 Dec, 12:51, D wrote:

Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima


Some possible reasons below

http://www.behardware.com/articles/6...ic-card-and-mo...

http://www.drmblog.com/index.php?/ar...V_+_HDMI_+_HDC...

Doc

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32r71b does show
video through the HDMI input from the computer's DVI output, but at much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp.
Regards,
Dima
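One plausible explanation for the symptoms above: many TVs treat an HDMI source as video and apply overscan-style scaling rather than mapping it 1:1, which produces both the borders and the softness. A minimal sketch of the arithmetic (the 95% factor is an illustrative assumption, not a measured value for this set):

```python
# If the TV scales an input down to a "safe area" instead of mapping it
# 1:1 to the panel, the result is a smaller, interpolated image with
# black borders. The 0.95 scale factor here is an illustrative guess.
def bordered_size(panel_w, panel_h, scale=0.95):
    w, h = round(panel_w * scale), round(panel_h * scale)
    # image size plus border thickness per side
    return w, h, (panel_w - w) // 2, (panel_h - h) // 2

w, h, bx, by = bordered_size(1366, 768)
print(f"image {w}x{h}, borders {bx}px left/right, {by}px top/bottom")
```

If the TV offers a "Just Scan", "Screen Fit", or PC-mode setting for the HDMI input, enabling it often restores 1:1 pixel mapping.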

Nigel Barker December 16th 07 09:48 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sat, 15 Dec 2007 16:48:01 -0600, (Scott Alfter) wrote:

My TV has a native resolution of 1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Are you sure about that resolution? I have seen 1024x768 and 1366x768, but never 1280 pixels as the
horizontal resolution. It likely only supports 1280 over DVI/HDMI, but the native resolution is
actually 1366 if you connect via VGA.
--

Cheers

Nigel Barker
Live from the sunny Cote d'Azur
MCE MVP

Wes Newell December 16th 07 11:09 AM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Sun, 16 Dec 2007 09:48:40 +0100, Nigel Barker wrote:

On Sat, 15 Dec 2007 16:48:01 -0600,
(Scott Alfter) wrote:

My TV has a native resolution of 1280x768. I generated a modeline for
that resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it
Just Works.


Are you sure about that resolution? I have seen 1024x768 and 1366x768,
but never 1280 pixels as the horizontal resolution. It likely only
supports 1280 over DVI/HDMI, but the native resolution is actually 1366
if you connect via VGA.


Mine's set to 1280x720. That feeds the TV an ATSC-standard resolution via
either DVI or VGA. I let the TV upscale it to 1366x768 in both cases.
Works just like it's supposed to.
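The trade-off in this approach can be made concrete: 1280x720 into a 1366x768 panel gives non-integer scale factors, so every source pixel is interpolated across panel pixels. That is fine for video, but it is why desktop text looks softer than in a true 1:1 native-resolution mode. A small sketch:

```python
# Scale factors when the TV upscales a 1280x720 signal to a 1366x768
# panel. Non-integer factors mean interpolation on every pixel.
def scale_factors(src, dst):
    return dst[0] / src[0], dst[1] / src[1]

sx, sy = scale_factors((1280, 720), (1366, 768))
print(f"horizontal x{sx:.4f}, vertical x{sy:.4f}")  # ~1.0672, ~1.0667
```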



--
Want the ultimate in free OTA SD/HDTV Recorder?
http://mythtv.org
My Tivo Experience http://wesnewell.no-ip.com/tivo.htm
Tivo HD/S3 compared http://wesnewell.no-ip.com/mythtivo.htm
AMD cpu help http://wesnewell.no-ip.com/cpu.php

Måns Rullgård December 16th 07 12:27 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
writes:

In alt.tv.tech.hdtv Måns Rullgård wrote:
| Glenn Millar writes:
|
| Scott Alfter wrote:
| In article ,
| Glenn Millar wrote:
| In reality, you wouldn't want to use the HDMI connection to connect
| a PC. The best results on my 50" Samsung are via the VGA
| connector. That way I get full [email protected] progressive whereas
| 720p or 1080i is actually a lesser resolution.
| There's no reason you can't use the same modeline over DVI or
| HDMI that you would use with VGA. On the contrary, in my
| experience it's been much easier to get LCDs working on a
| digital connection than on an analog connection. LCDs sold for
| computer use have a button on them that usually allows them to
| sync up to a VGA signal, but LCD TVs rarely have this option.
| To get a 1:1 correspondence between pixels in the framebuffer
| and pixels on the screen, you then have to do extensive
| tinkering with modelines...and you might never come up with a
| working modeline. My TV has a native resolution of 1280x768. I
| generated a modeline for that resolution at 60 Hz and plugged it
| into xorg.conf, and over DVI, it Just Works.
|
| Your experience may well be correct with other LCD or Plasma TVs but
| my reply was in relation to the Samsung screens. They don't like being
| connected to a PC via HDMI. If someone gets a profile for PowerStrip
| that works correctly with a Samsung I'd like a copy.
|
| My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
| any reasonable input over HDMI. If queried, it claims to only support
| the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
| another mode is forced it works just fine. For the VGA input, all the
| usual adjustments are possible through the onscreen menu.

Any chance it "works just fine" on frame rates below 50 Hz, like maybe at
24 Hz?


I haven't tried, so I don't know.

--
Måns Rullgård


Bigguy[_3_] December 16th 07 02:18 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
wrote:
In alt.tv.tech.hdtv D wrote:


A colleague at work has verified that his Sharp Aquos 37" TV works fine
with his video card's DVI output connected to the TV's HDMI input via a
DVI-to-HDMI cable.


Ditto here with 52" Sharp Aquos - 1920 x 1080 from PC's DVI out.

Guy

D[_2_] December 16th 07 04:14 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 1:03 am, Glenn Millar wrote:
Woody wrote:
"T Shadow" wrote in message
...
"D" wrote in message
...
Hello!
According to Samsung LE-32r71b HDTV manual the TV cannot receive an
image from a computer through its HDMI input, but through its d-sub
only. Is it really true? How can the TV know that an image is coming
from a computer, not a consumer set-top box? My video card is
Gigabyte
HD 2600Pro. I would like to use a DVI-HDMI cable.
Regards,
Dima
Wouldn't rule out technical reasons but probably they just don't want
to
answer questions about it. Puts the onus on you.


Likely because HDMI has authentication handshaking built in to its
protocol and the PC may not be savvy to such things.


In reality, you wouldn't want to use the HDMI connection to connect a
PC. The best results on my 50" Samsung are via the VGA connector. That
way I get full [email protected] progressive whereas 720p or 1080i is
actually a lesser resolution.

Give it a try on the HDMI input. It might just work out. On the other hand, if
you have a 1080p screen, buy a good card with an HDMI output capable of 1080p.

Regards
Glenn.

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32r71b does show
video through the HDMI input from the computer's DVI output, but at much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp.
Regards,
Dima

D[_2_] December 16th 07 04:38 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
On Dec 16, 3:00 am, Måns Rullgård wrote:
Glenn Millar writes:
Scott Alfter wrote:
In article ,
Glenn Millar wrote:
In reality, you wouldn't want to use the HDMI connection to connect
a PC. The best results on my 50" Samsung are via the VGA
connector. That way I get full [email protected] progressive whereas
720p or 1080i is actually a lesser resolution.
There's no reason you can't use the same modeline over DVI or HDMI
that you
would use with VGA. On the contrary, in my experience it's been much easier
to get LCDs working on a digital connection than on an analog
connection. LCDs sold for computer use have a button on them that
usually allows them to
sync up to a VGA signal, but LCD TVs rarely have this option. To get a 1:1
correspondence between pixels in the framebuffer and pixels on the screen,
you then have to do extensive tinkering with modelines...and you might never
come up with a working modeline. My TV has a native resolution of
1280x768. I generated a modeline for that
resolution at 60 Hz and plugged it into xorg.conf, and over DVI, it Just
Works.


Your experience may well be correct with other LCD or Plasma TVs but
my reply was in relation to the Samsung screens. They don't like being
connected to a PC via HDMI. If someone gets a profile for PowerStrip
that works correctly with a Samsung I'd like a copy.


My Samsung TV (LE26R41BD, panel resolution 1366x768) happily accepts
any reasonable input over HDMI. If queried, it claims to only support
the usual HDTV modes (720x480/576, 1280x720, 1920x1080i), but if
another mode is forced it works just fine. For the VGA input, all the
usual adjustments are possible through the onscreen menu.

--
Måns Rullgård

Hello!
I have bought a Gembird DVI-HDMI cable. The Samsung LE-32r71b does show
video through the HDMI input from the computer's DVI output, but at much
lower quality than through the D-sub input: there are black borders
around the image, and the image is much less sharp. I did not change the
1360x768 output resolution when switching from D-sub to DVI.
Regards,
Dima

J. Clarke December 16th 07 11:36 PM

How can a TV know that an image is coming from a computer, not a consumer set-top box?
 
John Rumm wrote:
wrote:

use HDCP. It's only for protected content that it is expected to use
HDCP to ensure you cannot use a monitor that is really something like a
recorder, or let you tap the HDMI cable wires (it's encrypted in HDCP).


Unless you install AnyDVD on it first, and then the whole sorry mess
ceases to matter. ;-)


Does it in fact allow the display of HD content off Blu-ray discs at
1920x1080 resolution on a non-HDCP-compliant monitor? I was under the
impression that down-conversion was implemented at the firmware level
or below.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)




All times are GMT +1. The time now is 06:59 AM.

Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com