On Wed, 24 Nov 2004 23:19:13 -0000, "Stephen Neal" wrote:

"Champ" wrote in message ... On Wed, 24 Nov 2004 15:30:11 -0000, "Tiny Tim" wrote:

[snip]

Hauppauge PVR-350 - except I've just realised this doesn't support MCE (or vice versa). Hardware support for MCE is a pain I could do without right now.

Have a look at www.shspvr.com - they have MCE compatible drivers for most of the Win PVR range. I got my PVR USB2 working, and others report that the 250 and 350 are working fine in MCE with the drivers from that site.

Ooh, ta. Noted.

The bit I'm vague about at the moment is all the stuff around connecting it to the TV. I just want this box to sit with the other home cinema kit, and not use a separate monitor (what's the longest a monitor cable can be, anyway?). My TV is a 3-year-old Sony Trinitron KV-32FX60D. According to the book of words, it accepts "Normal audio/video and RGB" on Scart 1, and "Normal audio/video and s-video" on Scart 2 and 3. So, what feed will I actually take out of the PC, and what cable will I use? Will I need one of those modded cables with a couple of capacitors in it? Is the fact that the TV runs at 100Hz a factor too?

You can feed your TV via a TV output from a video card - which will be s-video or composite, and scaled. This is the easiest solution (especially if the Shuttle has on-board TV out support).

It does, according to the Shuttle site.

The quality won't be as good as the RGB feed from a set top box - but might be perfectly acceptable to you.

Well, I'm aiming to get things working, and then go chasing quality, I think.

It is helpful if you have a second PC when doing this though - as you can then use an application called VNC to remotely control your "TV PC", so that you can still use it when the video output is unusable!

Yeah, I know about this; I'd probably use Terminal Services (which is part of WinXP and Win2000 nowadays) to do this.

Thanks very much for the advice.

--
This sig under construction
In ,
Stephen Neal wrote:

"Tony Houghton" wrote in message ... I don't suppose you know whether MCE can handle displaying interlaced DVB on an interlaced output?

I'm not sure - I'm pretty certain I'm getting 50Hz motion (not 25Hz flicker vision) - the tickers on the ITV News Channel and Sky, as well as rolling credits, look fluid rather than jerky, and there is no double imaging. This could of course be 50Hz interlaced DVB up-converted to 50Hz progressive in the card and then down-converted to 50Hz interlaced again for output. I will fiddle with the nVidia decoder settings to see what happens if I turn all the de-interlacing off (or can force weave or film mode - which should mean all lines remain unprocessed, I think).

I'm a bit confused, I thought you were using an ATI card with RGB. With S-Video, both ATI and NVidia have hardware scaling, so I suppose that would deinterlace and reinterlace as well. If you've got a card with an onboard MPEG decoder that works with RGB and interlacing, that's ideal. I don't think Linux supports MPEG decoding on ATI or NVidia cards, but it has access to some acceleration functions for video.

Currently I use a DXR3 in Linux. Either it syncs the interlacing or it deinterlaces and reinterlaces; I just know it's the best thing I've tried so far. I can play DivX on it too by getting the video player to apply an MPEG1 conversion filter. Can you do that sort of thing on Windows? I.e. as far as Linux is concerned I've got some things that output MPEG and a device that accepts MPEG, so I can just feed one to the other as I please, whereas in Windows you seem to be stuck with using the DXR3's own DVD player and showing the output from the Hauppauge Nova-T in a window - never the twain shall meet.

The DXR3 does have a tendency to lose A/V sync though, so I've bought a Matrox Marvel from eBay for 99p (+ VAT). I don't know when I'll get it, because I'm trying to arrange for a friend who lives near the seller to pick it up, to avoid the rip-off £10 + VAT postage and having to post a cheque ;-). The reason I'm trying that is I've recently heard Matrox are the only cards that can provide interrupts to sync interlacing, and they have very good Linux support. They can do this syncing in Windows too, but I think it may need a specific application. TV out on Linux is tricky with them though, and I may have to find a patch to get the interlacing trick working. And it's a G200, and Linux needs a G400 or better for TV out, at least with the driver I have in mind. Except apparently if you upgrade the card's BIOS, it's compatible with a G400!

What are film, weave and progressive mode? All I knew about was interlaced and non-interlaced.

--
The address in the Reply-To is genuine and should not be edited. See http://www.realh.co.uk/contact.html for more reliable contact addresses.
"Tony Houghton" wrote in message . uk... In , Stephen Neal wrote: "Tony Houghton" wrote in message . uk... I don't suppose you know whether MCE can handle displaying interlaced DVB on an interlaced output? I'm not sure - I'm pretty certain Im getting 50Hz motion (not 25Hz flicker vision) - the tickers on the ITV News Channel and Sky, as well as rolling credits - look fluid rather than jerku, and there is no double imaging. This could of-course be 50Hz interlaced DVB up converted to 50Hz progressive in the card and then downconverted to 50Hz interlaced again for output. I will fiddle with the nVidia decoder settings to see what happens if I turn all the de-interlacing off (or can force weave or film mode - which should mean all lines remain un-processed I think) I'm a bit confused, I thought you were using an ATI card with RGB. Yes - I am. What I don't know is how the ATI generates the interlaced RGB output - it might not be as simple as it first seems. (I'm coming at this from a broadcast engineering point of view - where some high-end post production gear has to de-interlace and re-interlace to do high-end picture processing - like sub-sample level re-sizing and moving of live video) My guess is that the following is happening : 1. 50Hz interlaced MPEG2 video is received by the DVB card and squirted into Windows Media Center. 2. Windows Media Center squirts this into the installed MPEG2 decoder - in my case nVidia's DVD stuff. 3. The nVidia MPEG2 decoder decodes the MPEG2 data, de-interlaces it (as configured in the properties) and in association with my Radeon card it produces a 50Hz progressive feed, which is rendered to the ATI video card's frame store at 50Hz. 4. The ATI video output stage then clocks this out to the video DACs - but as it is configured for interlaced display it does it by scanning out every other line in each consecutive 50Hz field. 
In other words there is no re-interlacing filter and half the lines from each 50Hz frame aren't used - just a straight odd lines in field one, even lines in field two kind of thing. It may be that it actually does a decent de-interlace and renders a 25Hz progressive feed - but I'm pretty certain I'm getting some 50Hz motion. Fiddling with the nVidia MPEG2 properties I am certainly able to get some horrid effects - but it doesn't allow basic Force Weave, Force Bob as I understand it (and AIUI a Force Weave - if things are all going correctly - should either give me perfect fluid motion or horrid incorrect field dominance judder - if each consecutive Windows display line is correctly going into alternate fields) With S-Video, both ATI and NVidia have hardware scaling, so I suppose that would deinterlace and reinterlace as well. Yep - as far as the PC is concerned it is running normally, feeding a progressive VGA display, so all the MPEG2 decoding is as if it is for a progressive display (though some TV outs require the VGA out to run at the same frame rate as the TV out field rate. nVidia don't seem to - ATI do.) The TV out gubbins handles the conversion to interlaced output - and usually includes vertical filtering to reduce interline "twitter" (where vertical detail flickers at half the field rate) - as well as re-scaling and possibly field/frame rate conversion. If you've got a card with onboard MPEG decoder that works with RGB and interlacing that's ideal. Yep - I have - the XCard (which is like the DXR 3 I think) - but it isn't supported by Windows Media Center. Whilst the Radeon has on board MPEG decoding assistance, and RGB output, it still squirts the video around Windows using DirectX kind of stuff I think... Why is nothing ever simple? I don't think Linux supports MPEG decoding on ATI or NVidia cards, but it has access to some acceleration functions for video. Currently I use a DXR3 in Linux. 
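Stephen's step 4 - straight odd lines in field one, even lines in field two, with no re-interlacing filter - can be sketched in a few lines of Python. This is a hypothetical illustration of the scan-out behaviour he describes, not ATI's actual output logic:

```python
# Sketch of an interlaced output stage reading a progressive frame
# store: each 50Hz field carries only alternate lines of whatever
# frame is in the store at that tick. Illustration only.

def scan_out_field(frame, field_index):
    """Return the lines sent to the DACs for one 50Hz field.

    frame is a list of scanlines (the frame store at that tick);
    field parity alternates, so odd fields carry lines 0, 2, 4...
    and even fields carry lines 1, 3, 5...
    """
    parity = field_index % 2          # alternates every field
    return frame[parity::2]           # every other line, offset by parity

# Two consecutive 50Hz ticks with different frame-store contents:
frame_t0 = ["A0", "A1", "A2", "A3"]   # frame rendered at tick 0
frame_t1 = ["B0", "B1", "B2", "B3"]   # frame rendered at tick 1

print(scan_out_field(frame_t0, 0))    # ['A0', 'A2'] - field one
print(scan_out_field(frame_t1, 1))    # ['B1', 'B3'] - field two
```

Half the lines of each rendered 50Hz frame are simply never sent to the display, which is exactly the "no re-interlacing filter" behaviour described above.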
Either it syncs the interlacing or it deinterlaces and reinterlaces, I just know it's the best thing I've tried so far.

Isn't the DXR3 an earlier incarnation of the XCard? In other words it is a hardware MPEG2 decoder - it just outputs composite, S-video (and I think RGB with some persuasion) - and doesn't go anywhere near any de- or re-interlacing - it just outputs the MPEG2 interlaced video as normal TV resolution interlaced video. (In pretty much the same way as a standard DVD player or TV set top box would - not a progressive format in sight.) The XCard also has the ability to hardware de-interlace and re-scale to VGA and HDTV resolutions - not sure if the Creative card does this - but this is unrelated to the video output, which is effectively clean and has no nasty de-/re-interlacing processes.

I can play DivX on it too by getting the video player to apply an MPEG1 conversion filter. Can you do that sort of thing on Windows?

The XCard supports some flavours of MPEG4/DivX in hardware - and also supports PC conversion in realtime in software if it doesn't like the flavour for hardware decoding. This worked OK - but because it is a bit niche the software isn't great. There was some DVB support - and the quality was fantastic - but the EPG integration, support for a decent user interface etc. was a bit lacking.

I.e. as far as Linux is concerned I've got some things that output MPEG and a device that accepts MPEG, so I can just feed one to the other as I please, whereas in Windows you seem to be stuck with using the DXR3's own DVD player and showing the output from the Hauppauge Nova-T in a window - never the twain shall meet.

Well the Nova-T was compatible with the XCard - and the quality was, as I said, great. However even using TVEdia - a 3rd party application that worked nicely - the EPG and stuff was still not up to scratch. I wasn't really using it that much as a result. I suspect MCE 2005 will be used a lot more.
The DXR3 does have a tendency to lose A/V sync though, so I've bought a Matrox Marvel from eBay for 99p (+ VAT). I don't know when I'll get it, because I'm trying to arrange for a friend who lives near the seller to pick it up, to avoid the rip-off £10 + VAT postage and having to post a cheque ;-). The reason I'm trying that is I've recently heard Matrox are the only cards that can provide interrupts to sync interlacing, and they have very good Linux support. They can do this syncing in Windows too, but I think it may need a specific application. TV out on Linux is tricky with them though, and I may have to find a patch to get the interlacing trick working. And it's a G200, and Linux needs a G400 or better for TV out, at least with the driver I have in mind. Except apparently if you upgrade the card's BIOS, it's compatible with a G400!

What are film, weave and progressive mode? All I knew about was interlaced and non-interlaced.

Progressive is where every line in a frame is displayed during every screen refresh; it is the opposite of interlaced, where only alternate lines from a frame are displayed in each screen refresh. This doesn't mean that you have to wait for two screen refresh periods before changing the content of the frame though - just that you won't see all the lines from all the frames if you do!

Weave is where you show both fields from an interlaced frame simultaneously in a progressive frame (I think). It is great if there is no motion between the fields (as is the case for 25Hz film or 25Hz progressive video) - but if there is motion between fields (as is the case with 50Hz interlaced video camera sourced material) you get the horrid "combing" effect where you see the two fields split.

Steve
"-= a q u a b u b b l e =-" wrote in message ... Tiny Tim wrote: - 250GB SATA HDD (haven't chosen model - any suggestions?) Any reason for going SATA? At this point in time, there is no benefit to SATA, especially if all you're doing is HTPC: the HDD activity is not too intensive. Makes running cable a lot easier which can be an important consideration in a small box or if in my case you have six 250GB hard drives. My Silverstone TJ-06 was a real mess with IDE cables for six drives. Sata really cleaned that up plus when you hit the next upgrade cycle you won't have to convert to sata. I actually lucked out and made a little bit of profit selling my IDE drives when I converted to sata. - 2*256MB PC3200 DDR400 ram (any reason not to simply get budget Crucial/generic memory?) I'd be tempted to go up to 1GB here. No reason not to get budget RAM: high performance is not the issue with XP MCE, but the more memory the better. With a HTPC, you don't want pauses as the OS swaps out to disk. Agreed and you can do that in two steps buy one 512mb now and one later - Hauppauge PVR-150-MCE TV card (to record from Sky satellite set top box - no UHF TV required) Is this available in the UK yet? Also consider the Black Gold DVB-T or Nebula cards. - nVidia FX5200 128MB DVI video card (no brand chosen but anything cheapish so long as it keeps MCE 2005 happy) I would avoid being cheap in the video card department. Check around in the HTPC forums and see which cards have the best TV out. Not every Tv-out chip is the same. If you can find a store with a goood return policy then you cab try it before committing to it. I'd go for an ATI 9600 non-pro 128MB. It has no fan and you can use the VGA-out to feed your RGB-enabled SCART on your telly. Alternatively, consider passively cooled versions of the 9600Pro or 9800Pro. Although another poster recommended the Matrox card, which does have excellent TV-out, MCE2005 requires a VMR9 capable card (i.e. DirectX9), which the Matrox isn't. |
In ,
Stephen Neal wrote: "Tony Houghton" wrote in message I'm a bit confused, I thought you were using an ATI card with RGB. Yes - I am. What I don't know is how the ATI generates the interlaced RGB output - it might not be as simple as it first seems. (I'm coming at this from a broadcast engineering point of view - where some high-end post production gear has to de-interlace and re-interlace to do high-end picture processing - like sub-sample level re-sizing and moving of live video) My guess is that the following is happening : 1. 50Hz interlaced MPEG2 video is received by the DVB card and squirted into Windows Media Center. 2. Windows Media Center squirts this into the installed MPEG2 decoder - in my case nVidia's DVD stuff. Oh right, the NVidia bit is just a software MPEG decoder which is compatible with ATI cards too? 3. The nVidia MPEG2 decoder decodes the MPEG2 data, de-interlaces it (as configured in the properties) and in association with my Radeon card it produces a 50Hz progressive feed, which is rendered to the ATI video card's frame store at 50Hz. Is that why deinterlacing filters take a lot of processing power, because they're producing a 50Hz progressive stream and have to do some fancy interpolation because they're effectively doubling the amount of data? 4. The ATI video output stage then clocks this out to the video DACs - but as it is configured for interlaced display it does it by scanning out every other line in each consecutive 50Hz field. In other words there is no re-interlacing filter and half the lines from each 50Hz frame aren't used - just a straight odd lines in field one, even lines in field two kind of thing. I've just been thinking about why the interlacing should go wrong if there's no synchronisation or processing filter. I guess the interlaced display must generate vertical blank interrupts at 50Hz, so the player software has no way of knowing whether the output is currently odd or even fields. 
With 25Hz interrupts all it would have to do is paste the input fields together into a 25Hz progressive feed, synchronised to the 25Hz interrupts, then the output circuitry would automatically output the fields in the correct order. I suppose it would have to add a 1/25s delay to the sound too, but that's not a problem.

It may be that it actually does a decent de-interlace and renders a 25Hz progressive feed - but I'm pretty certain I'm getting some 50Hz motion. Fiddling with the nVidia MPEG2 properties I am certainly able to get some horrid effects - but it doesn't allow basic Force Weave or Force Bob as I understand it (and AIUI a Force Weave - if things are all going correctly - should either give me perfect fluid motion or horrid incorrect field dominance judder, if each consecutive Windows display line is correctly going into alternate fields).

There's another term I missed. Bob?

Isn't the DXR3 an earlier incarnation of the XCard? In other words it is a hardware MPEG2 decoder - it just outputs composite, S-video (and I think RGB with some persuasion) - and doesn't go anywhere near any de- or re-interlacing - it just outputs the MPEG2 interlaced video as normal TV resolution interlaced video. (In pretty much the same way as a standard DVD player or TV set top box would - not a progressive format in sight.)

I don't know what its relationship is with the XCard. It's got a VGA socket which I haven't used. I think it's just so that you can pass your normal video card's output through it (with signal degradation - anyone remember the Voodoo 1 & 2 passthroughs?) so that you can view the video in a window with hardware overlay.

--
The address in the Reply-To is genuine and should not be edited. See http://www.realh.co.uk/contact.html for more reliable contact addresses.
Tony Houghton wrote:
In , Stephen Neal wrote: "Tony Houghton" wrote in message

[snip]

My guess is that the following is happening:

1. 50Hz interlaced MPEG2 video is received by the DVB card and squirted into Windows Media Center.
2. Windows Media Center squirts this into the installed MPEG2 decoder - in my case nVidia's DVD stuff.

Oh right, the NVidia bit is just a software MPEG decoder which is compatible with ATI cards too?

Yep - confusingly nVidia seem to be one of the best suppliers of MCE 2005 compatible MPEG2 decoders. (Believe it or not, MCE 2005 doesn't include MPEG2 codecs as standard. No wonder it isn't on sale to the general public via anything but OEM routes.)

3. The nVidia MPEG2 decoder decodes the MPEG2 data, de-interlaces it (as configured in the properties) and, in association with my Radeon card, produces a 50Hz progressive feed, which is rendered to the ATI video card's frame store at 50Hz.

Is that why deinterlacing filters take a lot of processing power, because they're producing a 50Hz progressive stream and have to do some fancy interpolation because they're effectively doubling the amount of data?

Yep - and the really good ones (as implemented in dScaler) do very processor intensive stuff to produce the best interpolation. (WinDVD 6 Platinum includes something called Trimension - which I think is based on the Philips Natural Motion 100Hz TV algorithm - which actually interpolates extra information on FILM sourced stuff, interpolating frames that weren't there. Converting 25fps to 50fps with full fluid "interpolated" motion - making film DVDs look like video. Very unnerving it is too!)

4. The ATI video output stage then clocks this out to the video DACs - but as it is configured for interlaced display it does it by scanning out every other line in each consecutive 50Hz field. In other words there is no re-interlacing filter and half the lines from each 50Hz frame aren't used - just a straight odd lines in field one, even lines in field two kind of thing.
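A crude stand-in for that kind of motion interpolation can be sketched in Python. Real systems such as the Trimension-style processing mentioned above do motion-compensated interpolation; this sketch just averages neighbouring frames, which is the simplest possible way to manufacture in-between frames:

```python
# Naive 25fps -> 50fps conversion by inserting a blended frame between
# each pair of originals. A real interpolator estimates motion vectors;
# averaging is only an illustration of the frame-doubling idea.

def double_frame_rate(frames):
    """frames: list of frames, each a flat list of pixel values 0-255."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)                                     # original frame
        out.append([(p + q) // 2 for p, q in zip(a, b)])  # midpoint frame
    out.append(frames[-1])                                # keep last frame
    return out

clip = [[0, 0], [100, 200], [200, 100]]   # three tiny 25fps "frames"
print(double_frame_rate(clip))
# the inserted frames sit halfway between their neighbours
```

Even this toy version shows why it's processor intensive: every output second now needs twice as many frames, each built from arithmetic over whole frames of pixels.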
I've just been thinking about why the interlacing should go wrong if there's no synchronisation or processing filter. I guess the interlaced display must generate vertical blank interrupts at 50Hz, so the player software has no way of knowing whether the output is currently odd or even fields. With 25Hz interrupts all it would have to do is paste the input fields together into a 25Hz progressive feed, synchronised to the 25Hz interrupts, then the output circuitry would automatically output the fields in the correct order. I suppose it would have to add a 1/25s delay to the sound too, but that's not a problem.

I guess so. From memory the field dominance in NTSC and PAL is different as well. You are right - I think - that if all the 50Hz interlaced video is effectively treated as 25Hz progressive (with no scaling) all the way through the PC until it is output interlaced, the interlaced video will survive as interlaced video at 50Hz. (The reverse is also true, and is used when using interlaced 50Hz broadcast kit to process 25Hz progressive video stuff.)

It may be that it actually does a decent de-interlace and renders a 25Hz progressive feed - but I'm pretty certain I'm getting some 50Hz motion. Fiddling with the nVidia MPEG2 properties I am certainly able to get some horrid effects - but it doesn't allow basic Force Weave or Force Bob as I understand it (and AIUI a Force Weave - if things are all going correctly - should either give me perfect fluid motion or horrid incorrect field dominance judder, if each consecutive Windows display line is correctly going into alternate fields).

There's another term I missed. Bob?

From memory, Weave is where the two fields are merged, so a 25Hz progressive frame is created by just adding the odd and even lines of both 50Hz fields; Bob is where one field is ditched and the 25Hz frame is created by replicating just one of the two 50Hz fields.
Weave works well for static stuff - and maximises the vertical resolution (and also works for moving stuff with no intra-frame movement between fields, or slow movement). Bob works where there is fast movement between fields - but the vertical resolution drops to half that of the frame resolution.

Isn't the DXR3 an earlier incarnation of the XCard? In other words it is a hardware MPEG2 decoder - it just outputs composite, S-video (and I think RGB with some persuasion) - and doesn't go anywhere near any de- or re-interlacing - it just outputs the MPEG2 interlaced video as normal TV resolution interlaced video. (In pretty much the same way as a standard DVD player or TV set top box would - not a progressive format in sight.)

I don't know what its relationship is with the XCard. It's got a VGA socket which I haven't used. I think it's just so that you can pass your normal video card's output through it (with signal degradation - anyone remember the Voodoo 1 & 2 passthroughs?) so that you can view the video in a window with hardware overlay.

Yep - the DXR3 is I think a close relative of the Sigma Hollywood Plus card (itself a predecessor to the XCard). There is no issue of de- and re-interlacing on the standard video outputs, as it is pumping out standard interlaced video directly from the MPEG2 hardware decoder. Only the VGA stuff is de-interlaced.

Steve
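Steve's definitions of weave and bob map directly onto simple list operations. A minimal sketch, with fields represented as lists of scanlines (an illustration of the two modes, not any decoder's actual code):

```python
# Weave: interleave the two fields back into one full-height frame.
# Bob: take a single field and line-double it up to frame height.

def weave(top_field, bottom_field):
    """Merge odd and even lines of a field pair into one frame."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)   # odd line, from field one
        frame.append(b)   # even line, from field two
    return frame

def bob(field):
    """Ditch the other field and replicate this one's lines."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)   # replicated: half the vertical resolution
    return frame

top = ["L0", "L2"]         # field one: odd lines of the frame
bottom = ["L1", "L3"]      # field two: even lines of the frame
print(weave(top, bottom))  # full resolution; combs if fields differ
print(bob(top))            # no combing, but vertical detail halved
```

The trade-off in the text falls straight out of the code: weave keeps every line (so motion between fields shows as combing), while bob throws half of them away (so motion is clean but resolution halves).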
On Wed, 24 Nov 2004 19:35:07 +0000 (UTC), Tony Houghton
wrote:

Ah, so that's why the graphics card requirements are so high (but why no R9500/9600)?

ATI claims to support MCE 2005 with the 9550 & 9600 cards: http://www.ati.com/buy/promotions/mcesolutions/

--
Nigel Barker
Live from the sunny Cote d'Azur
On Wed, 24 Nov 2004 21:12:43 GMT, "-= a q u a b u b b l e =-"
wrote:

Sorry yes, I misread your original post about this... if you want the very best input quality, have you considered the Sweetspot card, which has RGB input? http://www.pluggedin.tv/sweetspot/ It's not on the MCE2005 HCL yet, but I'm not aware of any problems with it.

Apart from the problem that it will not work with MCE, which requires a card capable of DX9, preferably in hardware. The Sweetspot looks like a great solution for high quality video; it just is not suitable for MCE.

--
Nigel Barker
Live from the sunny Cote d'Azur
On Wed, 24 Nov 2004 22:55:27 -0700, "Leadfoot" wrote:
I would avoid being cheap in the video card department. Check around in the HTPC forums and see which cards have the best TV out. Not every TV-out chip is the same. If you can find a store with a good return policy then you can try it before committing to it.

I don't believe that there are any cards on the MCE HCL that have good TV-out. Average or adequate would be a better description.

--
Nigel Barker
Live from the sunny Cote d'Azur
Nigel Barker wrote:
On Wed, 24 Nov 2004 22:55:27 -0700, "Leadfoot" wrote:

I would avoid being cheap in the video card department. Check around in the HTPC forums and see which cards have the best TV out. Not every TV-out chip is the same. If you can find a store with a good return policy then you can try it before committing to it.

I don't believe that there are any cards on the MCE HCL that have good TV-out. Average or adequate would be a better description.

Yep - this seems to be my experience. I am quite surprised that such expensive and high quality video cards have such low quality TV outputs - especially when it is possible to get a decent quality output using a VGA-RGB SCART cable, a suitable video card and Powerstrip. (I guess this is why Powerstrip is so popular in the US, where it can be used to drive HDTVs via a VGA-Component converter - or in some cases the video cards themselves can switch to component output.)

I am amazed at the difference in quality that I have got from Windows MCE since I switched from S-video TV-out to a VGA-SCART RGB solution and Powerstrip!

Steve
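The reason a Powerstrip-driven VGA-SCART RGB feed can look so much better than a TV-out chip is that the card is programmed with genuine PAL timings, so no rescaling or re-interlacing happens at all. As a back-of-envelope illustration (nominal figures, not a tested Powerstrip modeline - the 944 clocks-per-line value is an assumed example):

```python
# Rough check of PAL interlaced timing: 625 total lines per frame
# (including blanking), 50 fields/s, i.e. 25 full frames/s.

total_lines = 625            # lines per frame, active + blanking
field_rate = 50.0            # fields per second
frame_rate = field_rate / 2  # interlaced: two fields per frame

line_rate = total_lines * frame_rate        # lines scanned per second
print(f"line rate: {line_rate:.0f} Hz")     # 15625 Hz, the PAL line rate

# Assuming ~944 pixel clocks per line (active pixels plus horizontal
# blanking - an illustrative figure), the pixel clock works out as:
clocks_per_line = 944
pixel_clock = line_rate * clocks_per_line
print(f"pixel clock: {pixel_clock / 1e6:.2f} MHz")   # 14.75 MHz
```

Getting the card's DACs to run at exactly these rates is what tools like Powerstrip do; the TV then receives a native interlaced 50Hz RGB signal rather than a rescaled approximation of one.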
Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.
HomeCinemaBanter.com