#141
"Marky P" wrote in message ...
> On Wed, 15 Oct 2008 00:24:46 +0100, "Bill Wright" wrote:
>> Cucumbers?
>> Bill
> Don't need cucumbers with me around ;-)

Ohh! I'm speechless!

Bill
#142
In article ,
Java Jive wrote:
> On Mon, 13 Oct 2008 07:23:12 +0100, Mark Carver wrote:
>> Java Jive wrote:
>>> Interlace itself is a 2:1 compression system, that partly relies on
>>> persistence of vision to be effective.
>> Er, some bedtime reading for you :-)
>> "THE MYTH OF PERSISTENCE OF VISION REVISITED"
>> http://www.uca.edu/org/ccsmi/ccsmi/c...0Revisited.htm
>> I'll start the bidding at a ratio of about 5:1. 50Mb/s as a minimum
>> for SD before artefacts are noticeable.
> I presume that's still not lossless though, so not really what's
> needed as a benchmark. I've just tried 10 FLAC files at random, and
> found an average of 2.7 to 1. Ghost imaging of disks seems to achieve
> something around 2-2.5 to 1, so I would guess that 2-3 to 1 is about
> the best you can get from lossless compression.

Not really relevant. The result achieved will depend a lot on the content and on the algorithm used. You are taking music audio; generally, there's a lot less static data in music than in video. The previously mentioned still green picture could trivially be compressed from 3.45M data points to 1 data point plus 3 bytes to represent the repetition: a perfectly lossless compression ratio of 99.99999%.

Lossy compression is not entirely evil. When the right lossy compression algorithm is used right, it does allow huge savings in bandwidth compared to lossless algorithms. JPEG and MPEG were not designed by idiots; they give very good results when used right. Assuming you have a fixed bandwidth that you need to respect (or, for JPEG, a fixed file size), you can either reduce the resolution and compress using a lossless algorithm, or keep the original resolution and compress using a lossy algorithm, adjusting the quality factor. (Something similar can be tried at home with a still image using GIF vs JPEG, aiming at different file sizes. Same thing with MP3: hardly anyone can hear compression artefacts in a 320 kb/s MP3 vs a FLAC.)

Unfortunately, broadcasters abuse compression in search of profits. Unfortunately, most of the population is too apathetic to put any significant pressure on broadcasters to clean up their act. Unfortunately, over-compression abuse has given lossy compression a bad name.

> Nevertheless, I note that even your figure of 5 to 1 is approximately
> 12.5 x our current best broadcast rate!
> http://www.bbc.co.uk/rd/projects/dirac/

Interesting link, particularly like the Open Source aspects.
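The solid-green-frame arithmetic above is easy to sanity-check. Here is a minimal Python sketch of run-length encoding applied to a single-colour frame; the 720x576 frame size and the 7-byte run record (a 3-byte colour plus a 32-bit count) are illustrative assumptions, not figures from the thread:

```python
from itertools import groupby

# Hypothetical single-colour frame: every pixel is the same RGB triple.
WIDTH, HEIGHT = 720, 576
frame = [(0, 255, 0)] * (WIDTH * HEIGHT)        # 414,720 "green" pixels

# Run-length encode into (pixel value, run length) pairs.
runs = [(value, sum(1 for _ in group)) for value, group in groupby(frame)]

raw_bytes = len(frame) * 3        # 3 bytes per raw RGB pixel
rle_bytes = len(runs) * (3 + 4)   # colour triple + 32-bit count per run

print(len(runs))                  # 1 -- a single run covers the frame
print(raw_bytes, rle_bytes)       # 1244160 vs 7: lossless, huge ratio
```

Real lossless codecs do far better than naive RLE on real pictures, but the principle is the same: it is redundancy, not resolution, that lossless compression removes.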
#144
Java Jive wrote:
> It's relevant because I'm trying to get a measure of how much picture
> information is actually lost. I could just take the original bitrate
> and compare it with what we receive; that gives figures between about
> 97% and 99% lost, depending on which source figure you use,
> downconverted HD or recorded SD. But the trouble with that is it's not
> really a measure of what is thrown away, because some of that
> reduction could have been achieved through lossless compression. So to
> get a measure of what has actually been irretrievably lost through
> lossy compression, I think you have to compare what we actually
> receive with what we would receive if lossless compression were used.
> However, since lossless compression is not in use, there's a
> difficulty in establishing the latter, hence the guesstimates.

Well, IMHO DVDs are fine. I rarely see compression artefacts. And they are... what? 6MBit?

Andy
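The 97-99% figures being argued over are straightforward bitrate arithmetic. A sketch, taking ~270 Mbit/s for uncompressed SD (the SD-SDI rate quoted later in the thread) against an assumed ~3 Mbit/s broadcast allocation; the 2.5:1 lossless ratio is the thread's own guesstimate:

```python
# "Percentage of the original data rate discarded", using round numbers.
uncompressed_sd = 270e6   # bits/s, SD-SDI serial rate
broadcast_sd = 3e6        # bits/s, assumed typical DVB-T SD allocation

reduction = 1 - broadcast_sd / uncompressed_sd
print(f"{reduction:.1%}")   # 98.9% -- a reduction, not necessarily a "loss"

# Measured against a hypothetical 2.5:1 lossless baseline instead,
# the truly irretrievable share is smaller:
lossless_baseline = uncompressed_sd / 2.5
lost = 1 - broadcast_sd / lossless_baseline
print(f"{lost:.1%}")        # 97.2%
```

This is exactly the distinction the next posts fight over: the first number measures bandwidth saved, and only the second attempts to measure information irretrievably thrown away.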
#145
"Andy Champ" wrote in message . uk...
> Java Jive wrote:
> Well, IMHO DVDs are fine. I rarely see compression artefacts. And they
> are... what? 6MBit?

Up to 10 Mbit/s. Keep in mind that they are also encoded in advance, to get the most out of the encoder. TV stations will be encoding in real time, which will never give results as good.

--
Gareth.
that fly...... is your magic wand....
#146
On Fri, 17 Oct 2008 19:49:43 +0100, "The dog from that film you saw" wrote:
> "Andy Champ" wrote in message . uk...
>> Java Jive wrote:
>> Well, IMHO DVDs are fine. I rarely see compression artefacts. And
>> they are... what? 6MBit?
> up to 10mbits. keep in mind that they are encoded in advance too to
> get the most out of the encoder. tv stations will be encoding them in
> realtime - which will never give results as good.

And I've seen compression artifacts in DVDs as well, for example shoals of fish in 'Blue Planet'.
#147
In article ,
Java Jive wrote:
> On 16 Oct 2008 15:21:06 GMT, (Yannick Tremblay) wrote:
>> In article , Java Jive wrote:
>>> I presume that's still not lossless though, so not really what's
>>> needed as a benchmark.
>> Not really relevant. The result achieved will depend a lot on the
>> content and on the algorithm used.
> It's relevant because I'm trying to get a measure of how much picture
> information is actually lost.

I mostly agree with you, and I understand what you would like to do, but the way you are trying to do it is not valid.

> I could just take the original bitrate and compare it with what we
> receive, that gives figures between about 99% to 97% lost, depending

Not "lost": a 97% to 99% reduction in data bandwidth. There is no simple equation to convert a reduction in data bandwidth into "picture information lost". A lossless algorithm may be able to achieve 60% compression at 0% information lost. A lossy algorithm is much more difficult to figure out. First of all, we'd need to define "lost information": what is the "loss" value of a rounding error?

For a film, it's not the mathematics that matter but the quality perceived by the viewer. So we could try double-blind tests (easy to do with music): take a number of good recordings, rip them, and compress them using FLAC and AAC at maximum bit rate. Take a test group and make them listen to both versions, asking them to rate the two, playing AAC and FLAC randomly. Also add reference trials where both samples are FLAC or both are AAC. Correlate the results and see whether any difference is statistically significant. At 320 kb/s, you are unlikely to find any. So you might get results like:

Uncompressed audio: compression ratio 0%, perceived quality 100%
FLAC: compression ratio 40%, perceived quality 100%
320 kb/s AAC: compression ratio 70%, perceived quality 100%

i.e. although there are real differences between the reconstructed sound and the original recording, these differences are not perceived by listeners. At that point, by using a lossy compression algorithm, you are essentially getting better compression at no cost (no cost that is humanly noticeable). The problems start when you reduce the bandwidth so far that there are noticeable differences; at that point you get into a trade-off of bandwidth versus perceivable quality. IMO, DVB-T Freeview has got it wrong: they sacrificed far too much quality in order to save bandwidth. The worst of it is that the saved bandwidth is being "wasted" on pointless stuff like +1 channels (just record the stupid programme), shopping channels, etc.

> on which source figure you use, downconverted HD or recorded SD. But
> the trouble with that is it's not really a measure of what is thrown
> away, because some of that reduction could have been achieved through
> lossless compression. So to get a measure of what has actually been
> irretrievably lost through lossy compression, I think you have to
> compare what we actually receive with what we would receive if
> lossless compression were used. However, since lossless compression is
> not in use, there's a difficulty in establishing the latter, hence the
> guesstimates.

Yes, I understand what you are trying to do. It's just that your statement about 97% lost is incorrect and misleading. For things like broadcasting, lossy compression is most probably the best solution. I think it is much more important to educate the public about compression artefacts, showing them the difference between a good image (say BBC1 on their best days) and a bad one (ITV4 footy), and explaining how, by increasing the bandwidth by say 30%, you could get rid of say 90% of the visible compression artefacts, rather than to paint lossy compression as the great evil.

>> Unfortunately, broadcasters abuse compression in search of profits.
>> Unfortunately, most of the population is too apathetic to put any
>> significant pressure on broadcasters to clean up their act.
>> Unfortunately, over-compression abuse has given lossy compression a
>> bad name.
> Too bloody right!
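The double-blind comparison described above is essentially an ABX test, and scoring one needs only a binomial tail probability. A minimal sketch; the trial counts are invented for illustration, and `abx_p_value` is a made-up name, not a library function:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided p-value: the chance of getting at least `correct`
    answers right out of `trials` if the listener is guessing 50/50
    on every trial."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A listener comparing 320 kb/s AAC against FLAC over 16 trials.
print(round(abx_p_value(9, 16), 3))    # 0.402 -- consistent with guessing
print(round(abx_p_value(15, 16), 4))   # 0.0003 -- clearly distinguishable
```

A low p-value means the listener is reliably telling the formats apart; anything near 0.5 is what pure guessing produces, i.e. no perceivable difference at that bitrate.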
#150
On Mon, 20 Oct 2008 19:37:34 +0100, Stuart Clark wrote:
> At the top end, an HD-SDI signal is about 3Gbps for 1080p, dropping
> down to 270Mbps for SD.

I wouldn't attempt to broadcast HD on DTT. If they gave us lossless SD, most of the motivation for HD would be removed.
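The 270 Mbit/s SD figure quoted above falls straight out of the standard sampling parameters: ITU-R BT.601 4:2:2 sampling gives a 27 MHz combined word rate, carried as 10-bit words on the serial interface. A quick check in Python:

```python
# SD-SDI data rate reconstructed from BT.601 4:2:2 sampling parameters.
luma_rate = 13.5e6                        # Y samples per second
chroma_rate = 6.75e6                      # Cb (and likewise Cr) samples/s
word_rate = luma_rate + 2 * chroma_rate   # 27 MHz combined word rate
bits_per_word = 10                        # 10-bit words on the serial link

sdi_rate = word_rate * bits_per_word
print(sdi_rate / 1e6)                     # 270.0 Mbit/s
```

The ~3 Gbit/s figure for 1080p is the same idea scaled up to HD sampling rates (3G-SDI).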