A Home cinema forum. HomeCinemaBanter


What's the point of Freeview?



 
 
  #141  
Old October 15th 08, 02:13 PM posted to uk.tech.digital-tv
Bill Wright
external usenet poster
 
Posts: 6,542
Default Whats the point of Freeview?


"Marky P" wrote in message
...
On Wed, 15 Oct 2008 00:24:46 +0100, "Bill Wright"
wrote:
Cucumbers?

Bill

Don't need cucumbers with me around ;-)


Ohh! I'm speechless!

Bill


  #142  
Old October 16th 08, 05:21 PM posted to uk.tech.digital-tv
Yannick Tremblay
external usenet poster
 
Posts: 5
Default Whats the point of Freeview?

In article ,
Java Jive wrote:
On Mon, 13 Oct 2008 07:23:12 +0100, Mark Carver
wrote:

Java Jive wrote:

Interlace itself is a 2:1 compression system, that partly relies on
persistence of vision to be effective.

Er, some bedtime reading for you :-)
"THE MYTH OF PERSISTENCE OF VISION REVISITED"
http://www.uca.edu/org/ccsmi/ccsmi/c...0Revisited.htm

I'll start the bidding at a ratio of about 5:1. 50Mb/s as a minimum for SD
before artefacts are noticeable.


I presume that's still not lossless though, so not really what's
needed as a benchmark. I've just tried 10 FLAC files at random and
found an average of 2.7 to 1; Ghost imaging of disks seems to achieve
something around 2-2.5 to 1, so I would guess that 2-3 to 1 is about
the best you can get from lossless compression.


Not really relevant. The result achieved will depend a lot on the
content and on the algorithm used.

You are testing with music audio. Generally, there's a lot less
static data in music than in video.

The previously mentioned still green picture could trivially be
compressed from 3.45M data points to 1 data point plus 3 bytes to
represent the repetition: a perfectly lossless reduction of
99.99999%.
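The still-green-picture argument can be sketched in a few lines of run-length encoding (a toy illustration of lossless redundancy removal, not what any broadcast codec actually does; the 3.45M figure is simply the data-point count quoted above):

```python
# Run-length encoding of a uniform frame: one (value, count) pair
# replaces millions of identical samples, losslessly.

def rle_encode(samples):
    """Collapse runs of equal values into [value, run_length] pairs."""
    runs = []
    for s in samples:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1
        else:
            runs.append([s, 1])
    return runs

def rle_decode(runs):
    """Expand [value, run_length] pairs back into the original samples."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# A "still green picture": 3,450,000 identical data points.
frame = [0x00FF00] * 3_450_000
encoded = rle_encode(frame)

assert rle_decode(encoded) == frame   # perfectly lossless round trip
print(len(encoded))                   # 1 run describes the whole frame
```

One value-plus-count pair reconstructs the frame exactly, which is why lossless ratios on highly redundant content can be enormous while those on busy content cannot.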

Lossy compression is not entirely evil. When the right lossy
compression algorithm is used right, it does allow huge savings in
bandwidth compared to lossless algorithms.

JPEG and MPEG have not been designed by idiots. They do give very
good results when used well. Assuming that you have a fixed
bandwidth that you need to respect (or for JPEG, a fixed file size),
you can either reduce resolution and compress using a lossless
algorithm, or keep the original resolution and compress using a lossy
algorithm, adjusting the quality factor. (Something similar can be
tried at home with a still image using GIF vs JPEG, aiming at
different file sizes.) (Same thing with MP3: hardly anyone can hear
compression artefacts in a 320 kbit/s MP3 vs a FLAC.)

Unfortunately, broadcasters abuse compression in search of profits.

Unfortunately, most of the population is too apathetic to put any
significant pressure on broadcasters to clean up their act.

Unfortunately, over-compression abuse has given lossy compression a
bad name.

Nevertheless, I note that even your figure of 5 to 1 (50Mb/s) is
approximately 12.5 x our current best broadcast rate!

http://www.bbc.co.uk/rd/projects/dirac/


Interesting link, particularly like the Open Source aspects.



  #143  
Old October 16th 08, 06:34 PM posted to uk.tech.digital-tv
Java Jive
external usenet poster
 
Posts: 760
Default Whats the point of Freeview?

On 16 Oct 2008 15:21:06 GMT, (Yannick Tremblay)
wrote:

In article ,
Java Jive wrote:

I presume that's still not lossless though, so not really what's
needed as a benchmark.


Not really relevant. The result achieved will depend a lot on the
content and on the algorithm used.


It's relevant because I'm trying to get a measure of how much picture
information is actually lost.

I could just take the original bitrate and compare it with what we
receive; that gives figures between about 97% and 99% lost, depending
on which source figure you use, downconverted HD or recorded SD. But
the trouble with that is it's not really a measure of what is thrown
away, because some of that reduction could have been achieved through
lossless compression.

So to get a measure of what has actually been irretrievably lost
through lossy compression, I think you have to compare what we
actually receive with what we would receive if lossless compression
were used. However, since lossless compression is not in use, there's
a difficulty in establishing the latter, hence the guesstimates.
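That guesstimate can at least be made explicit. In the sketch below all three numbers are illustrative assumptions drawn from this thread, not measurements: a ~270Mb/s uncompressed SD rate, a ~4Mb/s received Freeview stream, and a guessed 2.5:1 achievable lossless ratio:

```python
# Guesstimate of what lossy compression irretrievably throws away,
# per the argument above. All numbers are illustrative assumptions.

source_rate_mbps   = 270.0   # assumed uncompressed SD (SDI-style) rate
received_rate_mbps = 4.0     # assumed typical Freeview SD stream
lossless_ratio     = 2.5     # guessed achievable lossless ratio (2-3:1)

# Naive figure: compare what we receive with the raw source rate.
naive_lost = 1 - received_rate_mbps / source_rate_mbps

# Fairer benchmark: the rate a lossless encoder would need.
lossless_rate_mbps = source_rate_mbps / lossless_ratio
truly_lost = 1 - received_rate_mbps / lossless_rate_mbps

print(f"naive reduction:        {naive_lost:.1%}")   # 98.5%
print(f"vs lossless benchmark:  {truly_lost:.1%}")   # 96.3%
```

The second figure is the one the post is after: the shortfall against a hypothetical lossless benchmark, not against the raw bitrate.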

Unfortunately, broadcasters abuse compression in search of profits.

Unfortunately, most of the population is too apathetic to put any
significant pressure on broadcasters to clean up their act.

Unfortunately, over-compression abuse has given lossy compression a
bad name.


Too bloody right!
  #144  
Old October 17th 08, 08:46 PM posted to uk.tech.digital-tv
Andy Champ
external usenet poster
 
Posts: 145
Default Whats the point of Freeview?

Java Jive wrote:
It's relevant because I'm trying to get a measure of how much picture
information is actually lost.

I could just take the original bitrate and compare it with what we
receive; that gives figures between about 97% and 99% lost, depending
on which source figure you use, downconverted HD or recorded SD. But
the trouble with that is it's not really a measure of what is thrown
away, because some of that reduction could have been achieved through
lossless compression.

So to get a measure of what has actually been irretrievably lost
through lossy compression, I think you have to compare what we
actually receive with what we would receive if lossless compression
were used. However, since lossless compression is not in use, there's
a difficulty in establishing the latter, hence the guesstimates.


Well, IMHO DVDs are fine. I rarely see compression artefacts. And they
are... what? 6Mbit/s?

Andy
  #145  
Old October 17th 08, 08:49 PM posted to uk.tech.digital-tv
The dog from that film you saw
external usenet poster
 
Posts: 587
Default Whats the point of Freeview?


"Andy Champ" wrote in message
. uk...
Java Jive wrote:




Well, IMHO DVDs are fine. I rarely see compression artefacts. And they
are... what? 6Mbit/s?





Up to 10 Mbit/s. Keep in mind that they are also encoded in advance,
to get the most out of the encoder. TV stations will be encoding in
realtime, which will never give results as good.
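Those figures can be sanity-checked against disc capacity (a rough back-of-envelope calculation; 4.7GB is the nominal single-layer DVD-5 capacity in decimal gigabytes, and 9.8Mbit/s the MPEG-2 video ceiling for DVD-Video):

```python
# Rough check of DVD bitrates: how long a film fits on a single-layer
# DVD-5 at a given average video rate.

DVD5_BYTES = 4.7e9   # nominal DVD-5 capacity, decimal gigabytes

def runtime_minutes(avg_mbps):
    """Playing time of a DVD-5 at a given average bitrate, in minutes."""
    seconds = DVD5_BYTES * 8 / (avg_mbps * 1e6)
    return seconds / 60

for rate in (6, 8, 9.8):   # 9.8 Mbit/s is the DVD-Video MPEG-2 ceiling
    print(f"{rate:4} Mbit/s -> about {runtime_minutes(rate):.0f} min")
```

At an average of 6 Mbit/s a DVD-5 holds roughly 100 minutes, which squares with the "they are... what? 6Mbit?" estimate for a typical feature film.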



--
Gareth.

that fly...... is your magic wand....

  #146  
Old October 18th 08, 12:08 AM posted to uk.tech.digital-tv
Java Jive
external usenet poster
 
Posts: 760
Default Whats the point of Freeview?

And I've seen compression artifacts in DVDs as well, for example
shoals of fish in 'Blue Planet'.

On Fri, 17 Oct 2008 19:49:43 +0100, "The dog from that film you saw"
wrote:

"Andy Champ" wrote in message
. uk...
Java Jive wrote:


Well, IMHO DVDs are fine. I rarely see compression artefacts. And they
are... what? 6Mbit/s?


Up to 10 Mbit/s. Keep in mind that they are also encoded in advance,
to get the most out of the encoder. TV stations will be encoding in
realtime, which will never give results as good.

  #147  
Old October 20th 08, 04:18 PM posted to uk.tech.digital-tv
Yannick Tremblay
external usenet poster
 
Posts: 5
Default Whats the point of Freeview?

In article ,
Java Jive wrote:
On 16 Oct 2008 15:21:06 GMT, (Yannick Tremblay)
wrote:

In article ,
Java Jive wrote:

I presume that's still not lossless though, so not really what's
needed as a benchmark.


Not really relevant. The result achieved will depend a lot on the
content and on the algorithm used.


It's relevant because I'm trying to get a measure of how much picture
information is actually lost.


I mostly agree with you and I understand what you would like to do but
the way you are trying to do it is not valid.

I could just take the original bitrate and compare it with what we
receive; that gives figures between about 97% and 99% lost, depending


Not "lost": a 97% to 99% reduction in data bandwidth. There are simple
equations to convert from reduction in data bandwidth to "picture
information lost". A lossless algorithm may be able to do a 60%
compression at 0% information lost. A lossy algorithm is much more
difficult to figure out. First of all we'd need to define "lost
information". What is the "loss" value of a rounding error? For a
film, it's not the mathematics that matter but the quality perceived
by the viewer.
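The rounding-error question can be made concrete: quantising samples to a coarser grid is the simplest lossy step, and a sum-of-squares measure reports a non-zero mathematical "loss" even where the perceptual loss may be nil (a minimal sketch; the step size and the ramp signal are arbitrary choices for illustration):

```python
# Quantisation as the simplest lossy operation: round samples to a
# coarser grid, then measure the mathematical error that introduced.

def quantise(samples, step):
    """Snap each sample to the nearest multiple of `step`."""
    return [round(s / step) * step for s in samples]

def mean_squared_error(a, b):
    """Average squared difference between two equal-length signals."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

original = [0.1 * i for i in range(100)]   # a smooth ramp, 0.0 .. 9.9
coarse = quantise(original, 0.25)          # keep only multiples of 0.25

mse = mean_squared_error(original, coarse)
print(mse)   # small but non-zero: information was mathematically lost
```

The MSE is strictly positive, so by any mathematical definition information is lost; whether that loss *matters* is exactly the perceptual question the post raises.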

So we could try double-blind tests (easy to do with music):
take a number of good recordings, rip them, and compress them using
FLAC and AAC at maximum bit rate. Take a test group and make them
listen to both versions, asking them to rate the two, playing AAC and
FLAC randomly. Also add reference trials where both samples are in
FLAC or both in AAC. Correlate the results and see if they are
statistically significant. At 320 kbit/s, you are unlikely to find any
difference. So you might get results like:

Uncompressed audio:
- compression ratio 0%
- perceived quality 100%

FLAC
- compression ratio 40%
- perceived quality 100%

320Kbs AAC
- compression ratio 70%
- perceived quality 100%

i.e. although there are actual differences between the reconstructed
sound and the original recording, these differences are not perceived
by listeners. At that point, by using a lossy compression algorithm,
you are essentially getting better compression at no cost (no cost
that is humanly noticeable).
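The double-blind procedure above boils down to asking whether listeners beat chance. A sketch of the scoring, assuming each trial is a forced A/B (ABX-style) choice, using a simple one-sided binomial test:

```python
# Scoring a double-blind listening test: n forced-choice trials,
# k correct identifications. If k is no better than coin-flipping,
# the lossy and lossless versions are perceptually indistinguishable.

from math import comb

def p_value(k, n):
    """Probability of k or more correct answers out of n by pure guessing."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# 16 trials: 9 right looks like guessing, 15 right does not.
print(f"{p_value(9, 16):.3f}")    # ~0.40: consistent with chance
print(f"{p_value(15, 16):.4f}")   # ~0.0003: a genuinely audible difference
```

A result consistent with chance is what "perceived quality 100%" in the table above would look like in practice.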

The problem starts when you reduce the bandwidth to the point where
there are noticeable differences. At that point, you get into a
trade-off of bandwidth vs perceivable quality. IMO, DVB-T Freeview has
got it wrong. They sacrificed far too much quality in order to save
bandwidth. The worst of it is that the saved bandwidth is being
"wasted" on pointless stuff like +1 channels (just record the stupid
programme), shopping channels, etc.

on which source figure you use, downconverted HD or recorded SD. But
the trouble with that is it's not really a measure of what is thrown
away, because some of that reduction could have been achieved through
lossless compression.

So to get a measure of what has actually been irretrievably lost
through lossy compression, I think you have to compare what we
actually receive with what we would receive if lossless compression
were used. However, since lossless compression is not in use, there's
a difficulty in establishing the latter, hence the guesstimates.


Yes, I understand what you are trying to do. It's just that your
statement about 97% lost is incorrect and misleading. For things like
broadcasting, lossy compression is most probably the best solution. I
think it is much more important to educate the public about
compression artefacts, and to show them the difference between a good
image (say BBC1 on their best days) and a bad one (ITV4 footy), and
explain how by increasing the bandwidth by say 30% you could get rid
of say 90% of visible compression artefacts, rather than making lossy
compression the great evil.

Unfortunately, broadcasters abuse compression in search of profits.

Unfortunately, most of the population is too apathetic to put any
significant pressure on broadcasters to clean up their act.

Unfortunately, over-compression abuse has given lossy compression a
bad name.


Too bloody right!



  #148  
Old October 20th 08, 05:27 PM posted to uk.tech.digital-tv
Java Jive
external usenet poster
 
Posts: 760
Default Whats the point of Freeview?

On 20 Oct 2008 14:18:46 GMT, (Yannick Tremblay)
wrote:

In article ,
Java Jive wrote:

It's relevant because I'm trying to get a measure of how much picture
information is actually lost.


I mostly agree with you and I understand what you would like to do but
the way you are trying to do it is not valid.


It is. The only way we can judge how much has actually been
thrown away is by comparing with lossless compression.

It's irrelevant whether detail that is lost is visible to most or even
any people. The definition of HiFi, though it was never actually
standardised as such, would have been a flat frequency response
between about 15 Hz and 23,000 Hz (and suitable SNR and THD), yet
mostly only children and very young adults can hear up to the upper
limit. But because some people can hear it, you can't remove it from
the sound and call it HiFi.

Similarly, the moment you start removing picture detail "that no one
can see anyway", you introduce subjectivity. The only absolutely
sure way to guarantee that everyone really is seeing all the picture
is not to remove any of it! This principle has been de facto in HiFi
for decades; why should TV be any different?

There are simple
equations


Which are?

to convert from reduction in data bandwidth to "picture
information lost".


A lossless algorithm may be able to do a 60%
compression at 0% information lost. A lossy algorithm is much more
difficult to figure out.


Which is why you have to compare it with a lossless algorithm.

By definition, assuming all other things such as the transmission,
reception, and display are 'perfect' (hypothetically for the purposes
of argument), the best possible lossless compression determines the
minimum bandwidth required to convey a 'perfect' picture. Then,
keeping everything else 'perfect', the difference in bandwidth between
using lossy compression and lossless compression is the only absolute,
non-subjective measure (that I can think of) of what has actually been
thrown away.

For things like broadcasting,
lossy compression is most probably the best solution.


That's totally subjective, and I for one disagree. I would MUCH (a
hundred times over) rather they used lossless compression to transmit
the five terrestrial channels; any others that could be fitted in
using lossless compression would be a bonus. We would get a
*better* picture than the best analogue picture, with the added
benefit of widescreen, etc. It would also free up resources to make
better mainstream programmes. There are only 24hrs in a day, and only
one or two TVs in most houses, so what's the point of a battery of
channels that virtually no one watches?

I think it is
much more important to educate the public about compression artefacts,
and to show them the difference between a good image (say BBC1 on
their best days) and a bad one (ITV4 footy), and explain how by
increasing the bandwidth by say 30% you could get rid of say 90% of
visible compression artefacts, rather than making lossy compression
the great evil.


You obviously haven't been following the various threads concerning
the way the public are being confused and misled about DSO. Having
dealt with 'the public' in IT support for some years, I think most
members of the public would probably glaze over, wait patiently until
you've finished, and then say something like:
"Look, I just want a picture as good as my analogue picture used to
be, is that so much to ask! Don't try and blind me with science, just
do it!"
  #150  
Old October 20th 08, 09:14 PM posted to uk.tech.digital-tv
Java Jive
external usenet poster
 
Posts: 760
Default Whats the point of Freeview?

I wouldn't attempt to broadcast HD on DTT. If they gave us lossless
SD, most of the motivation for HD would be removed.

On Mon, 20 Oct 2008 19:37:34 +0100, Stuart Clark
wrote:

At the top end, an HD-SDI signal is about 3Gbps for 1080p, dropping
down to 270Mbps for SD.
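Those SDI figures fall straight out of the sampling structure (a consistency check using the nominal per-line sample and line totals, including blanking, for 625-line SD and for 1080p60; 4:2:2 chroma doubles the luma sample count, and each sample is a 10-bit word):

```python
# Nominal SDI bitrates from sampling structure: total samples per line
# x total lines x frame rate, doubled for 4:2:2 chroma, 10 bits/sample.
# Line/sample counts are the nominal totals including blanking.

def sdi_rate_bps(samples_per_line, lines, fps):
    luma = samples_per_line * lines * fps
    return luma * 2 * 10   # 4:2:2 doubles the samples; 10-bit words

sd_pal = sdi_rate_bps(864, 625, 25)      # 625-line SD
hd_3g  = sdi_rate_bps(2200, 1125, 60)    # 1080p60 over 3G-SDI

print(sd_pal / 1e6, "Mbit/s")    # 270.0
print(hd_3g / 1e9, "Gbit/s")     # 2.97
```

So 270Mbps for SD is exact, and "about 3Gbps for 1080p" matches the 2.97Gbit/s of a 3G-SDI link.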

 



