HomeCinemaBanter

HomeCinemaBanter (http://www.homecinemabanter.com/index.php)
-   UK digital tv (http://www.homecinemabanter.com/forumdisplay.php?f=5)
-   -   4k TV on Freesat or Freeview? (http://www.homecinemabanter.com/showthread.php?t=75682)

_Unknown_Freelancer_ August 9th 15 03:40 PM

4k TV on Freesat or Freeview?
 
"Paul Ratcliffe" wrote in message
...
On Thu, 6 Aug 2015 13:25:21 +0100, UnsteadyKen

wrote:

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, i.e. turning sharpness processing off does not reduce
the resolution and make the picture go all blurry, as you claimed.


It certainly did on the old tube cameras. Turning contours off made
the picture as soggy as anything and essentially unusable. I think we
used to do that during registration line-up, but it has been rather a
long time since then...


a.k.a. 'Edge' ....was also labelled the same on CCUs and OCPs.



_Unknown_Freelancer_ August 9th 15 03:55 PM

4k TV on Freesat or Freeview?
 
"R. Mark Clayton" wrote in message
...
On Wednesday, 5 August 2015 21:02:39 UTC+1, Andy Furniss wrote:
R. Mark Clayton wrote:

FWIW, UHD IS more than four times the bandwidth.

No it is four times the resolution.


Yes, but the source bitrate is 8x as you have to account for current HD
only being 25 fps or 50 fields per sec. UHD doesn't use interlacing so
50fps doubles the source bandwidth on top of the res increase. This
means for sport that the vertical res increase is (more than?) 4 times
HD. The "more than" may be debatable - but I think interlaced gets extra
filtering to prevent interline twitter.


You are still thinking about building a rasterised image with the picture
built up in [alternate] lines every [other] frame time.

More recent methods send the full frame every so often and the changes
every frame time. This works great for static images or for video where
things in the view change, but can generate artefacts when the camera pans
or zooms.


"....full frame every so often" = I-frame, the start of a GOP (Group Of
Pictures), e.g. a video stream may have a GOP of 75 frames

Point is though, yes, (at source) 4K is four times the resolution of HD, but
it produces eight times the data of HD.
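The GOP idea described above can be sketched in a few lines of Python. This is only an illustration: the 75-frame GOP length comes from the post, but the "I-frame then all P-frames" structure is a simplification (real encoders typically mix in B-frames as well):

```python
def gop_frame_types(gop_length=75):
    """Label each frame in one Group Of Pictures.

    The first frame is an I-frame (a complete picture); every
    subsequent frame is sent as changes relative to earlier frames
    (shown here as 'P' for simplicity).
    """
    return ["I" if i == 0 else "P" for i in range(gop_length)]

# One I-frame every 75 frames = one full picture every 3 seconds at 25fps.
types = gop_frame_types(75)
print(types[0], types.count("P"))  # prints: I 74
```

The point of the long GOP is exactly what the post says: a full picture only has to be sent "every so often", with cheap difference frames in between.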



_Unknown_Freelancer_ August 9th 15 04:06 PM

4k TV on Freesat or Freeview?
 
"R. Mark Clayton" wrote in message
...
On Wednesday, 5 August 2015 21:09:05 UTC+1, _Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message
o.uk...
SNIP


It's 2015 - I don't have an interlaced display any more!

De-interlace and scale is what my (and I guess most people's) TV does to
25i. I could let it do it, or I can do it in s/w myself.


How do you know this?
Do you have the source code from the manufacturer?

You don't need it.

For a CRT, interlacing relies on the persistence of the phosphor, so
alternate lines are drawn in each pass (50 fields/s for 576i in the EU, 60
in the US). The primary reason for doing this is to reduce the flicker that
would be very obvious if the whole frame were drawn every time (so 25fps in
the EU).

Later CRT TVs would remember the contents of every line and redraw the
whole screen every frame time (and SECAM sets may have had this feature for
longer). CRT monitors topped out at [email protected] around the mid
noughties.

More recent flat screen panels rely on a different method. Basically a
pixel will stay in a particular state until it is told to do something
different.

My 4k panel here was restricted to [email protected] until AMD got around to fixing
their drivers. This would affect high speed motion, but not downloaded
video nor static images.




UHD will be an even bigger rip off, with even more detail lost!


bigger rip off than what? Already the premium for 4k over full HD is quite
modest and no detail will be lost. It is plausible that you won't get all
4k of the detail due to compression etc., but you certainly won't get
less.



A bigger rip off than DTT HD.
.....which is the whole reason this thread exploded the way it did.
Ground zero was my venting steam on 3rd August in reply to MC anticipating
4K over freeview.

Freeview HD is poop.
Therefore 'Freeview 4K' makes people in TV laugh.
It will be an even bigger rip off than Freeview HD.

And note the distinct use of the word 'Freeview'.
Freesat, Sky, Virgin, and BT methods of distribution offer more bandwidth,
and therefore will not be anywhere near as bad.



_Unknown_Freelancer_ August 9th 15 04:40 PM

4k TV on Freesat or Freeview?
 
"Andy Furniss" [email protected] wrote in message
...
_Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message


De-interlace and scale is what my (and I guess most peoples) TV
does to 25i I could let it do it or I can do it in s/w my self.


How do you know this? Do you have the source code from the
manufacturer?


No, but that doesn't mean they don't.

There must be many chips sold for the purpose (I know they also do more
complicated processing as well)

I do know that my TV de-interlaces as I can test it with a computer.

Manufacturers spend a lot of time and effort over their kit before
they put it to market. Ok, so there is the occasional lemon model,
but on the whole, most kit does what it says on the tin.

Just because an OLED screen only came out of the factory in January
2015 does not mean it cannot interpret an interlaced scan as-is.
Without the source code, for all we know (when watching 1080i) it may
well actually only update all the odd lines in one pass, and then all
the even in the next.


Should be easy enough to take a pic to prove/disprove. Many TV reviews
seem to test "the deinterlacer" so I assume many TVs don't work by
simulating a CRT.

In which case, leaving source interlaced stuff as interlaced IS the
best thing to do.

Let the equipment decide what to do with it.


Oh I can and do let my TV do its thing - observing the quality of its
de-interlacing and scaling also lets me say that I can equal/beat it
with my own processing. It's not top end TV, but not budget either, it
got good reviews.

As previously, SMPTE didn't make up their standards for fun. And I'm
pretty sure the manufacturers didn't toss a coin when writing the
code to decide how it should handle different display modes.

And in 'scaling' you're again compromising your own pictures. Once
you scale, you've ruined your copy for good.


Well someone/something has got to scale it to get SD on an HD panel.

Who said anything about changing the master copy - I can choose a different
scaler, e.g. Lanczos, at display time thanks to open source geeks and OpenGL.

I could also use ffmpeg, I can deinterlace on the fly, but I think a
motion compensated de-int that runs at 0.1 fps will beat it. I have the
choice to do whatever I want.


ffmpeg is good stuff. Although this may not be good news:
http://www.theregister.co.uk/2015/08...er_steps_down/


i.e. You resample all the colour spacing, you contrive any
representation.

Again, leave it as you got it, let the screen decide what to do with
it.


Or do better....


Ok, fair enough.

Just seems like a lot of faff tbh!

Leap forward fifteen years. Screens will still have backward
compatibility, and will be capable of ridiculous resolutions....
but at that point in time the manufacturers will have better means of
making older formats work well on their screens.


True, and some of the methods they will do realtime are likely already
proposed in existing papers, people's Masters/PhDs. It's just that they
are currently far too slow.

Although I notice certain 'arty types' in sports production
actually add a 'film effect' to some items. Apparently it's 'art'.
....with a capital F, methinks!

Not filmic - that would imply deint to 25p (eww). TV/me would of
course do 50p.


No. Because it still leaves Tx at 50i. You're not going to change the
whole transmission chain for one vt package, are you?


It was you that mentioned filmic, which I assumed meant 25p - I can't
recall suggesting changing Tx!

FWIW on FreeviewHD 25p is flagged as progressive.


Well, it appeared that after I wrote that sports types apply a film effect,
you got the idea they deinterlaced it.
But you can't do that. You can't deinterlace a picture which is operating in
an interlaced environment.
Well you can, but then it all goes to cock!
Which is why I wrote what I did.

Really, all the kit does is send field 1 to field 1 AND 2 = _cheap_ film
effect




The part run samples I pointed to are source quality aren't they?


TL;DR! But, I don't think that's relevant anyway. I'm comparing source
SD to DTT 'HD'. Lab test vs. real world.


Well I've got sport recordings that are 10mbit and so it's not like they
never go that high. I must admit that Park Run at 5mbit is horrible, but
at 10 it's OK.


Dunno, ISTR Sky Sports F1 is 15Mb/s.... or something approaching that.
.....only so that their pictures were better than BBC1's when they penned the
present deal.


Whatever I try I can't get the raw 576i to look as good as a
10mbit encode of the 1080i.


But without the source HD material, how do you know what
detail/definition has been lost to compare?


I do have the HD source - that's what the 10mbit 264 was made from!

Comparing HD to HD wasn't the point, of course 10mbit is not as good as
the raw - but it's way better than the raw SD.

Generally compression occurs by smoothing edges (losing detail), and
then finding repeat patterns in a frame (which buggers up captions).


Different cameras of course (same lens), detail -

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_exports/README.txt


Please do not misunderstand this and interpret this as me saying
SD > HD. Simply not the case. It's once a picture has gone through
the terrestrial transmission chain.

I don't see why my 10mbit x264 encode should beat pro kit -
assuming of course they would give that much bitrate to similar
content.



Because it's compressed to ****e by Arqiva! Because the broadcasters
don't want to pay any more ££s!


I thought the BBC coded the main HD mux and Arqiva did com 7/8.


AFAIK it's Arqiva.

Was it late last year there was the mass renumbering of channels?
Can't remember.
They told the public it was to improve service and make more space.
********!

It was because Arqiva went out and purchased a whole truck load of new mux
boxes!

In the following months kit auction sites were flooded with 'pre-owned'
MPEG2 muxers!
.....all ex-Arqiva!



Again, why do you need to de-interlace?? Why are you intent on
de-interlacing? When SMPTE created these new standards they
added interlaced for distinct reasons. i.e. They didn't do it for
a laugh, or while they were down the bierkeller!

Reluctantly to halve the bandwidth while giving decent temporal
res.


But you won't be halving the data rate.


I meant that the reason interlaced still survives and gets standards
despite e.g. the EBU trying to get rid of it for HD is because it's half the
data rate compared to 50p.

https://www.ebu.ch/en/technical/trev...editorial.html

I know it's the other end of the spectrum, but the maths is still the
same (or "math", if you're 'murican): 1080i50 (1080 lines interlaced, 25
fps, alternate lines refreshed at 50 fields/s) = 1.5Gb/s (because you are
sending 1080/2 = 540 lines 50 times a second); 1080p50 (1080
progressive lines, refreshed at 50fps) = 3Gb/s (because you are
sending all 1080 lines 50 times a second).
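The sums quoted above can be checked with a quick sketch. This counts raw luma samples only (the 1.5/3 Gb/s figures in the post correspond to the nominal HD-SDI/3G-SDI link rates, which also carry chroma, blanking and overhead, so only the ratios are meaningful here):

```python
def pixels_per_second(width, height, rate_hz, interlaced=False):
    """Raw luma samples sent per second; interlaced sends half the
    lines in each pass."""
    lines = height // 2 if interlaced else height
    return width * lines * rate_hz

hd_i50  = pixels_per_second(1920, 1080, 50, interlaced=True)
hd_p50  = pixels_per_second(1920, 1080, 50)
uhd_p50 = pixels_per_second(3840, 2160, 50)

print(hd_p50 // hd_i50)    # 2 - going progressive doubles the data
print(uhd_p50 // hd_i50)   # 8 - 4x the pixels and 2x the temporal samples
```

This reproduces both claims in the thread: 1080p50 is twice the data of 1080i50, and 2160p50 is eight times 1080i50 (four times the resolution, but no interlacing).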


I can do sums.

So, unless you're converting 1080i50 (off air telly) to 1080p25, you
are doubling your data rate by converting to 1080p50


I am fully aware of that.
.
.....and ruining the source by deinterlacing it needlessly.


Or doing as good as/better than the realtime deint my TV does.

And it's precisely the same if you're doing this with SD off air.
Converting 576i to 576p doubles your data rate.


I know that - just trying to get raw SD at its best to compare with
10mbit HD for comparison - not as a general policy for viewing - as I
said I let my TV deint that.


BUT.... you may well notice the occasional f.cup with 'regional
news'. ENG have gone out and shot something, usually out of focus,
and recorded with more audio distortion than a Foo Fighters concert!
Content goes back to the edit, and the clown at the keyboard, who has
a non-linear (computer) edit in front of him, hasn't bothered to check
it on a proper telly. i.e. They've only watched it on their computer
screen. When it goes to air any horizontal motion shivers....
because they've got the field dominance the wrong way round!


Well they should really have just done a quick check with yadif=1 :-)


_SHOULD_.
But just like some ENG 'camera' ops don't bother to check an image is focused
before pressing record, some 'editors' don't bother to check on a proper
telly before pressing 'Export'.
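A field-dominance sanity check like the yadif=1 one suggested above could be scripted; this is a sketch only (the filename is hypothetical). yadif and setfield are real ffmpeg filters: yadif mode 1 outputs one frame per field, and setfield forces the assumed field order, so previewing with both orders quickly exposes the "shivering" motion of wrong dominance:

```python
def yadif_preview_cmd(infile, tff=True):
    """Build an ffplay command that deinterlaces at field rate
    (yadif mode 1 = one output frame per field), forcing the field
    order so both settings can be eyeballed for juddery motion."""
    order = "setfield=tff" if tff else "setfield=bff"
    return ["ffplay", "-vf", f"{order},yadif=1", infile]

# Preview with each assumed field order; the wrong one will shiver.
print(" ".join(yadif_preview_cmd("package.mxf", tff=True)))
print(" ".join(yadif_preview_cmd("package.mxf", tff=False)))
```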


I mean surely most people now see interlaced on a progressive
display = it's de-interlaced. If I put my Panny plasma into an
interlaced mode it de-interlaces (motion adaptively). It doesn't
become an interlaced display. I can de-interlace in s/w to achieve
the same effect on a dumber display (my 1920x1080 computer
monitor).


As per first point above, without the manufacturer's source code, how do
you know that just because your telly only fell off Dixons' shelf
yesterday, it still doesn't update the display in an interlaced
fashion?


Answered above for me, but it's unlikely even if there are TVs that
simulate CRTs they would do it for 576i on an HD panel.


And in encoding to 50p you double the required data rate.

True - but then weaved frames are also "extra" complicated so I
don't


No, they're not. It's just half the lines, with a flag.


I think in practice mpeg2 and h264 encoders do full weaved frames rather
than fields - but anyway it was just me thinking out loud about how to
compare x265 with x264.

It seems currently ffmpeg doesn't re-weave the output of its hevc
decoder. x265 also warns that interlace support is experimental if you
try to use it.


.....could be the case that it's 'experimental' because only ffmpeg have
included it as an option.
i.e. No-one else pushing the 4K envelope may be bothering.

As I've written previously, all ffmpeg does is include a flag in alternate
frame data.



Better HD yes - I am still not convinced it's quite as bad as SD,
though - maybe I don't watch enough TV (usually motorsport) -
perhaps park run is misleading (I obviously don't have access to
much else to compare), but to me 10mbit x264 HD wins over raw SD
for that.



A 'lab test' is not as good as a 'real world' test. Our real world
test is FTA Freeview HD. Compared to source HD, you're all being
ripped off severely!


Well yes, but I am comparing for the claim that it's worse than raw SD
so if anybody ever broadcasts park run I will record it and see real
world rather than my "lab" :-)

UHD will be an even bigger rip off, with even more detail lost!


Still no update on BT Wholesale connections WRT BT UHD.

Given that their HD offerings were 7.5mbit 1440 or "premium" 1920 at
10mbit it will be interesting to see what their UHD is.




R. Mark Clayton[_2_] August 9th 15 05:16 PM

4k TV on Freesat or Freeview?
 
On Sunday, 9 August 2015 15:06:10 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message


SNIP

UHD will be an even bigger rip off, with even more detail lost!


bigger rip off than what? Already the premium for 4k TV over full HD
is quite modest and no detail will be lost. It is plausible that you won't
get all 4k of the detail due to compression etc., but you certainly won't
get less.



A bigger rip off than DTT HD.
....which is the whole reason this thread exploded the way it did.
Ground zero was my venting steam on 3rd August in reply to MC anticipating
4K over freeview.

Freeview HD is poop.
Therefore 'Freeview 4K' makes people in TV laugh.
It will be an even bigger rip off than Freeview HD.

And note the distinct use of the word 'Freeview'.
Freesat, Sky, Virgin, and BT methods of distribution offer more bandwidth,
and therefore will not be anywhere as bad.


The OP said Freesat or Freeview.

Not sure how something that is and remains free can be a rip off?

_Unknown_Freelancer_ August 9th 15 05:27 PM

4k TV on Freesat or Freeview?
 
"R. Mark Clayton" wrote in message
...
On Sunday, 9 August 2015 15:06:10 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message


SNIP

UHD will be an even bigger rip off, with even more detail lost!

bigger rip off than what? Already the premium for 4k TV over full HD
is quite modest and no detail will be lost. It is plausible that you won't
get all 4k of the detail due to compression etc., but you certainly won't
get less.



A bigger rip off than DTT HD.
....which is the whole reason this thread exploded the way it did.
Ground zero was my venting steam on 3rd August in reply to MC
anticipating
4K over freeview.

Freeview HD is poop.
Therefore 'Freeview 4K' makes people in TV laugh.
It will be an even bigger rip off than Freeview HD.

And note the distinct use of the word 'Freeview'.
Freesat, Sky, Virgin, and BT methods of distribution offer more
bandwidth,
and therefore will not be anywhere as bad.


The OP said Freesat or Freeview.

Not sure how something that is and remains free can be a rip off?


Licence fee = not free

+Because it's sold by the broadcasters as 'better/amazing/fantastic/pin
sharp/other generic superlative'.
.....when it's not.
Thus, it is a lie, a rip off.


Watched Sunday Brunch on Freeview HD earlier. (Yes, I know, ironic! *eyes
roll* It was Sunday morning and my head was fried!)
During 'settee' interviews I was annoyed by the false halo added to
interviewees, caused by the brightly coloured wall behind them.
......'transmission chain artifacts'.



R. Mark Clayton[_2_] August 9th 15 08:30 PM

4k TV on Freesat or Freeview?
 
On Sunday, 9 August 2015 16:35:24 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message
...
On Sunday, 9 August 2015 15:06:10 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message


SNIP

UHD will be an even bigger rip off, with even more detail lost!

bigger rip off than what? Already the premium for 4k TV over full HD
is quite modest and no detail will be lost. It is plausible that you won't
get all 4k of the detail due to compression etc., but you certainly won't
get less.


A bigger rip off than DTT HD.
....which is the whole reason this thread exploded the way it did.
Ground zero was my venting steam on 3rd August in reply to MC
anticipating
4K over freeview.

Freeview HD is poop.
Therefore 'Freeview 4K' makes people in TV laugh.
It will be an even bigger rip off than Freeview HD.

And note the distinct use of the word 'Freeview'.
Freesat, Sky, Virgin, and BT methods of distribution offer more
bandwidth,
and therefore will not be anywhere as bad.


The OP said Freesat or Freeview.

Not sure how something that is and remains free can be a rip off?


Licence fee = not free


Unless you are watching in black and white, no marginal cost at all.


+Because it's sold by the broadcasters as 'better/amazing/fantastic/pin
sharp/other generic superlative'.


4k is - I am looking at it.

....when its not.


Freeview HD at 4K might not be.

Freesat at 4k might be as long as they give a whole transponder to each channel.

Thus, it is a lie, a rip off.


Watched Sunday Brunch on Freeview HD earlier. (Yes, I know, ironic! eyes
roll It was sunday morning and my head was fried! )
During 'settee' interviews was annoyed by the false halo added to
interviewees caused by the brightly coloured wall behind them.
.....'transmission chain artifacts'.


You sure your TV is set up right?


_Unknown_Freelancer_ August 9th 15 08:40 PM

4k TV on Freesat or Freeview?
 
"R. Mark Clayton" wrote in message
...
On Sunday, 9 August 2015 16:35:24 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message
...
On Sunday, 9 August 2015 15:06:10 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message

SNIP

UHD will be an even bigger rip off, with even more detail lost!

bigger rip off than what? Already the premium for 4k TV over full HD
is quite modest and no detail will be lost. It is plausible that you won't
get all 4k of the detail due to compression etc., but you certainly won't
get less.


A bigger rip off than DTT HD.
....which is the whole reason this thread exploded the way it did.
Ground zero was my venting steam on 3rd August in reply to MC
anticipating
4K over freeview.

Freeview HD is poop.
Therefore 'Freeview 4K' makes people in TV laugh.
It will be an even bigger rip off than Freeview HD.

And note the distinct use of the word 'Freeview'.
Freesat, Sky, Virgin, and BT methods of distribution offer more
bandwidth,
and therefore will not be anywhere as bad.

The OP said Freesat or Freeview.

Not sure how something that is and remains free can be a rip off?


Licence fee = not free


Unless you are watching in black and white no marginal cost at all.


....and in narrowscreen too??!!



+Because it's sold by the broadcasters as 'better/amazing/fantastic/pin
sharp/other generic superlative'.


4k is - I am looking at it.


There's a 4K station on air now?



....when its not.


Freeview HD at 4K might not be.

Freesat at 4k might be as long as they give a whole transponder to each
channel.

Thus, it is a lie, a rip off.


Watched Sunday Brunch on Freeview HD earlier. (Yes, I know, ironic! eyes
roll It was sunday morning and my head was fried! )
During 'settee' interviews was annoyed by the false halo added to
interviewees caused by the brightly coloured wall behind them.
.....'transmission chain artifacts'.


You sure your TV is set up right?


Yes.




Paul Ratcliffe August 9th 15 09:38 PM

4k TV on Freesat or Freeview?
 
On Sun, 9 Aug 2015 11:30:23 -0700 (PDT), R. Mark Clayton
wrote:

Watched Sunday Brunch on Freeview HD earlier. (Yes, I know, ironic! eyes
roll It was sunday morning and my head was fried! )
During 'settee' interviews was annoyed by the false halo added to
interviewees caused by the brightly coloured wall behind them.
.....'transmission chain artifacts'.


You sure your TV is set up right?


Are you REALLY that clueless?

Vir Campestris August 9th 15 09:53 PM

4k TV on Freesat or Freeview?
 
On 09/08/2015 12:28, R. Mark Clayton wrote:
More recent methods send the full frame every so often and the changes every frame time. This works great for static images or for video where things in the view change, but can generate artefacts when the camera pans or zooms.


*cough* motion compensation

That's a bit of a simplification isn't it?

Andy

David Kennedy[_2_] August 9th 15 10:02 PM

4k TV on Freesat or Freeview?
 
On 02/08/2015 12:22, Michael Chare wrote:
4K TVs are becoming available at more reasonable prices and there are some 4k
internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert to 4K?


Surely for the foreseeable future 4k is going to be totally reliant on
upscaling the transmission or on 4k discs? [whenever they arrive although I
believe some Blu-ray is already 4k]

http://www.theregister.co.uk/2014/01/20/feature_4k_confusion_over_specs_and_standards/

http://www.techradar.com/news/home-cinema/high-definition/4k-tv-resolution-what-you-need-to-know-1048954

--
David Kennedy

http://www.anindianinexile.com

Andy Furniss[_2_] August 10th 15 12:16 AM

4k TV on Freesat or Freeview?
 
R. Mark Clayton wrote:
On Wednesday, 5 August 2015 21:02:39 UTC+1, Andy Furniss wrote:
R. Mark Clayton wrote:

FWIW, UHD IS more than four times the bandwidth.

No it is four times the resolution.


Yes, but the source bitrate is 8x as you have to account for
current HD only being 25 fps or 50 fields per sec. UHD doesn't use
interlacing so 50fps doubles the source bandwidth on top of the res
increase. This means for sport that the vertical res increase is
(more than?) 4 times HD. The "more than" may be debatable - but I
think interlaced gets extra filtering to prevent interline
twitter.


You are still thinking about building a rasterised image with the
picture built up in [alternate] lines every [other] frame time.


I was thinking more on the capture side than transmission/display.

I have no idea if it still applies in the world of HD kit, but
historically at least, interlaced would, at camera level, get some sort
of extra vertical low pass filter to prevent artifacts downstream.

Even today in s/w, by default, ffmpeg will low pass vertically if you
convert progressive to interlaced.
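The default vertical low-pass mentioned above is visible in ffmpeg's own interlacing filter. A sketch of how one might build such a conversion command in Python (the filenames are hypothetical; 'interlace' is a real ffmpeg filter whose lowpass option defaults to on, precisely to tame interline twitter downstream):

```python
def progressive_to_interlaced_cmd(infile, outfile, lowpass=True):
    """Build an ffmpeg command that weaves a progressive source into
    interlaced output (e.g. 50p -> 25i).

    ffmpeg's 'interlace' filter applies a vertical low-pass by default;
    lowpass=0 disables it, at the risk of interline twitter.
    """
    vf = f"interlace=lowpass={1 if lowpass else 0}"
    return ["ffmpeg", "-i", infile, "-vf", vf, outfile]

print(" ".join(progressive_to_interlaced_cmd("src_50p.mp4", "out_25i.mp4")))
```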


More recent methods send the full frame every so often and the
changes every frame time. This works great for static images or for
video where things in the view change, but can generate artefacts
when the camera pans or zooms.


R. Mark Clayton[_2_] August 10th 15 12:22 PM

4k TV on Freesat or Freeview?
 
On Sunday, 9 August 2015 19:40:42 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message

SNIP


SNIP



+Because it's sold by the broadcasters as 'better/amazing/fantastic/pin
sharp/other generic superlative'.


4k is - I am looking at it.


There's a 4K station on air now?


No. I am looking at a 4k monitor. It is better (a lot better) than full HD (and there is a full HD panel right next to it and a 1600x1200 CRT next to that). It is pin sharp.

I am not going to rush out and buy a 4k TV, just because they have come on the market, but if my existing [full HD] failed then I would buy a 4k one.

No 4k content [yet], well just plug in one of these: -
http://www.rikomagic.com/en/product/...59_pid_19.html



....when its not.


Freeview HD at 4K might not be.

Freesat at 4k might be as long as they give a whole transponder to each
channel.

Thus, it is a lie, a rip off.


Watched Sunday Brunch on Freeview HD earlier. (Yes, I know, ironic! eyes
roll It was sunday morning and my head was fried! )
During 'settee' interviews was annoyed by the false halo added to
interviewees caused by the brightly coloured wall behind them.
.....'transmission chain artifacts'.


You sure your TV is set up right?


Yes.


Not from the results you describe, but there is some "uprated" SD out there.

Brian Gregory August 12th 15 09:06 PM

4k TV on Freesat or Freeview?
 
On 03/08/2015 12:07, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available at more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Twelve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that it's transmittable over Freeview, you
would have to dispense with so much information that you would render 'UHD'
pointless.


You could use a whole DVB-T2 multiplex and the H.265 codec.

It would arguably be a better rendition of 4K than the current HD
Freeview is of 2K.


i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either using several channel
spaces over satellite, or proper* broadband.
* meaning something better than BT's standard twisted pair phone line. Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.



--

Brian Gregory (in the UK).
To email me please remove all the letter vee from my email address.

Brian Gregory August 12th 15 09:11 PM

4k TV on Freesat or Freeview?
 
On 03/08/2015 19:22, _Unknown_Freelancer_ wrote:

Freeview HD is.... (sorry, I've run out of expletives) DOG ****E!


Then what tirade of unprintable words do you use to describe Freeview SD?

All I know is that Freeview HD is way way better than Freeview SD.

--

Brian Gregory (in the UK).
To email me please remove all the letter vee from my email address.

Brian Gregory August 12th 15 09:17 PM

4k TV on Freesat or Freeview?
 
On 04/08/2015 20:42, _Unknown_Freelancer_ wrote:


Why would you de-interlace 576i? At all?
If you want to intentionally make PAL look bad, you de-interlace it.


Modern displays that don't just flash light as the electron spot passes
the pixel on the screen do not display motion properly unless you
de-interlace. You see double images as things move.

--

Brian Gregory (in the UK).
To email me please remove all the letter vee from my email address.

_Unknown_Freelancer_ August 12th 15 10:31 PM

4k TV on Freesat or Freeview?
 
"Brian Gregory" wrote in message
...
On 04/08/2015 20:42, _Unknown_Freelancer_ wrote:


Why would you de-interlace 576i? At all?
If you want to intentionally make PAL look bad, you de-interlace it.


Modern displays that don't just flash light as the electron spot passes
the pixel on the screen do not display motion properly unless you
de-interlace. You see double images as things move.

--


You're an "expert".



_Unknown_Freelancer_ August 12th 15 10:38 PM

4k TV on Freesat or Freeview?
 
"Brian Gregory" wrote in message
...
On 03/08/2015 19:22, _Unknown_Freelancer_ wrote:

Freeview HD is.... (sorry, I've run out of expletives) DOG ****E!


Then what tirade of unprintable words do you use to describe Freeview SD?

All I know is that Freeview HD is way way better than Freeview SD.



Point.
Missed.




By a considerable distance.


My point is that SOURCE SD contains more detail than DTT Freeview HD.

"SOURCE SD".
i.e. As it's recorded on DigiBeta cassette.
i.e. Uncompressed.


Of course Freeview HD is better than Freeview SD.
As soon as the analogue channels were switched off, they turned the SD bit
rates right down.... because then no-one had anything to compare digital SD
to any more.
i.e. No-one could say "this digital stuff is worse than the old stuff." The
proof had been switched off!

The secondary purpose of turning the SD bit rate down was to make everyone
think HD was therefore wonderful.
"Well.... it's got to be, hasn't it? It's better than that digital SD."

A third reason for turning the bit rates down was to free up bandwidth for
more channels.



So obviously, Freeview HD _IS_ better than Freeview SD.
But if they actually transmitted SD at source quality... which TBH, wouldn't
take much doing, it is my opinion that it would contain more detail than the
over-compressed VHS quality the public is being sold as "HD" over DTT.


Read the thread before you wade in with statements like that.
Might prevent you looking a fool.



_Unknown_Freelancer_ August 12th 15 10:42 PM

4k TV on Freesat or Freeview?
 
"Brian Gregory" wrote in message
...
On 03/08/2015 12:07, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available at more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Twelve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that it's transmittable over Freeview, you
would have to dispense with so much information that you would render 'UHD'
pointless.


You could use a whole DVB-T2 multiplex and the H.265 codec.

It would arguably be a better rendition of 4K than the current HD Freeview
is of 2K.


Yes, you could.

But it would contain f.all detail.

Yes, you're transmitting the information for four times more dots, but there
just is not enough bandwidth available over DTT to allow it to carry any
level of Ultra High _Definition_. ....with particular emphasis on the word
'Definition'.

All you're doing is making the picture bigger.
Not improving the quality.
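The "bigger picture, not better picture" argument above can be put into rough numbers. This sketch assumes a whole ~40 Mbit/s DVB-T2 multiplex for one UHD channel and a ~10 Mbit/s 1080i25 HD service for comparison (both figures are illustrative assumptions, and H.265 compresses roughly twice as efficiently as H.264, which closes some of the gap):

```python
def bits_per_pixel(mux_mbps, width, height, fps):
    """Average coded bits available per displayed pixel per frame."""
    return mux_mbps * 1e6 / (width * height * fps)

# Whole assumed ~40 Mbit/s T2 mux for 2160p50 vs ~10 Mbit/s for 1080i25:
uhd = bits_per_pixel(40, 3840, 2160, 50)
hd  = bits_per_pixel(10, 1920, 1080, 25)
print(f"{uhd:.3f} vs {hd:.3f}")  # prints: 0.096 vs 0.193
```

Even with an entire multiplex to itself, the UHD service would have about half the bits per pixel of the assumed HD service, which is the nub of the "more dots, no more definition" complaint.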





i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either using several channel
spaces over satellite, or proper* broadband.
* meaning something better than BT's standard twisted pair phone line. Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.



--

Brian Gregory (in the UK).
To email me please remove all the letter vee from my email address.




Java Jive[_3_] August 13th 15 07:28 PM

4k TV on Freesat or Freeview?
 
On Wed, 12 Aug 2015 21:31:41 +0100, "_Unknown_Freelancer_" /dev/null
wrote:

"Brian Gregory" wrote in message

Modern displays that don't just flash light as the electron spot passes
the pixel on the screen do not display motion properly unless you
de-interlace. You see double images as things move.


This, as stated, is ambiguous and therefore perhaps not quite actually
incorrect, but it is at least highly misleading in that it implies the
problem is due to the modern displays, whereas actually the problem is
with the legacy CRTs and interlaced video!

First, I note that you supply no supporting *evidence* for your
assertion. Note the stress on the word 'evidence'. There is, for
example, this in Wikipedia, but it can't count as evidence because we
have no technical details as to how the picture was made. For
example, was it a photograph of a TV showing the scene? It doesn't
seem to be, because there are no tell-tale signs such as screen glare,
dust on the screen, Newton's rings, etc. Or was it a Print-Screen
grab off the creator's desktop while his PC was displaying paused
video? This seems to me to be much more likely, but then that doesn't
count as evidence, because it's not a picture of an LCD TV displaying
the picture as it natively would. And that's not to even mention that
the picture is a JPEG, which is a lossy compression format, so it is
certain that significant detail, that might have been important and
told us something useful, has been lost.

https://commons.wikimedia.org/wiki/F...r_wheel%29.jpg

We need something better ...

Around 2008, I created a myth-busting series of three pages for my
website:

http://www.macfh.co.uk/JavaJive/Audi.../CRTvsLCD.html

Note that the first page contains full technical details about how the
comparisons were made, including such as important details such as
camera shutter speed, which was 1/100s. This was the shortest time
that could still obtain a usable exposure with both types of TV while
still being fast enough in theory to capture the process of a field
being displayed.

To create those pages, I took many photographs such as these:

http://www.macfh.co.uk/Test/Piste-CRT.png (6.2MB)
http://www.macfh.co.uk/Test/Piste-LCD.png (7.2MB)

Note that the original photos were taken in Canon RAW format, not
JPEG, so that all detail seen by the camera was preserved, and that
these were next converted losslessly into TIFFs. As TIFFs cannot be
displayed in a browser, today I created the above lossless PNGs from
those original TIFFs. Therefore, AFAIAA, no detail has been lost from
that originally captured by the camera.

Note too that the vertical resolution of the image is comfortably more
than twice the 625 scan line frequency, which, according to sampling
theory, should therefore be enough to detect individual scan lines.
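That sampling-theory point can be illustrated with a toy model (each scan line modelled as one bright sample plus one dark gap; the numbers are illustrative, not measurements of the actual photos):

```python
import numpy as np

SCAN_LINES = 625
# Toy model: each scan line is one bright sample followed by one dark gap.
pattern = np.tile([1.0, 0.0], SCAN_LINES)

def camera(signal, n_pixels):
    """Crude camera-pixel model: average the signal into equal-size bins."""
    return signal.reshape(n_pixels, -1).mean(axis=1)

# At two samples per scan line the line/gap structure survives...
fine = camera(pattern, 2 * SCAN_LINES)
# ...but at one sample per scan line each pixel averages a line with its gap.
coarse = camera(pattern, SCAN_LINES)

print(np.ptp(fine), np.ptp(coarse))  # -> 1.0 0.0 (full contrast vs. none)
```

So a camera with comfortably more than 2 x 625 pixels of vertical resolution should, in principle, be able to resolve individual scan lines if they were there to be seen.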

Note also that this is an aperture-grill CRT. It is possible that a
shadow-mask CRT would be different, but I suspect not significantly
so, as the few other images I've been able to find on the web showing
such material are essentially similar in appearance to those linked
above.

The second picture of the LCD displaying the same scene is there
merely for interest of comparison, but note that the forerunner
ski-racer on the piste has no motion artifacts such as combing.

However, only the first picture of the CRT displaying the scene is
needed to prove that the problem lies not with the LCD, but with the
CRT. The question is, now that you have an actual picture to work
from, can you see WHY?

The first and most glaringly obvious question is: Where are the
horizontal scan lines? If your explanation above is correct, in the
bright area of the picture, where the current field has recently been
drawn, we should see a series of interlaced bright (current field) and
dark (previous field) lines, but we do not. Even in the dark area of
the screen, where the camera could be argued to be less 'dazzled',
whatever that may be supposed to mean, and where we should be able to
see horizontal faint lines (previous field) interlaced with
near-as-dammit black lines (field before previous field), we still
cannot see individual horizontal scan lines. We have a camera with
sufficient vertical resolution to be able to capture them, yet,
although we can see quite clearly the vertical lines arising from the
aperture grille mechanism, horizontal scan lines are nowhere
individually discernible.

So what seems to be happening is that, whatever theory may say about
how CRTs work, in practice they are set up so that each horizontal
scan line is broad enough to overwrite exactly half of each
neighbouring line, so that when they are all drawn the picture is
contiguous, with no individual horizontal scan lines discernible. But,
effectively, this amounts to a halving of vertical resolution, and
THAT is why CRTs show fewer motion artifacts than equivalently sized
LCDs. It is not that the CRT displays a truer picture and the LCD is
creating artifacts - the artifacts have always been there, they are
an inherent part of interlaced video - rather it is that the CRT
hasn't got the vertical resolution to show these artifacts, but the
LCD has.
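The "fat scan line" argument above can be sketched numerically. This is a toy simulation, not a measurement of any real CRT: it weaves two fields of a moving bar into a frame, then models line overlap as an idealised 2-line average:

```python
import numpy as np

H, W, SHIFT = 16, 16, 4   # frame size and horizontal motion between fields

def field(offset):
    """One interlaced field: a bright bar whose left edge sits at `offset`."""
    f = np.zeros((H // 2, W))
    f[:, offset:offset + 6] = 1.0
    return f

# Weave the two fields into one frame, as a faithful progressive display would.
frame = np.zeros((H, W))
frame[0::2] = field(0)      # even lines: field 1
frame[1::2] = field(SHIFT)  # odd lines: field 2, the bar has moved right

# Combing metric: mean absolute difference between neighbouring scan lines.
combing = np.abs(np.diff(frame, axis=0)).mean()

# "Fat spot" model: each line half-overwrites its neighbour (2-line average).
crt_like = (frame + np.roll(frame, 1, axis=0)) / 2
combing_crt = np.abs(np.diff(crt_like, axis=0)).mean()

print(combing, combing_crt)  # the overlapped (CRT-like) frame combs far less
```

In this idealised model the overlap wipes out the inter-line differences entirely; a real tube would land somewhere in between, but the direction of the effect is the point.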

You can hardly blame the LCDs for displaying more faithfully the
source signals that are being fed to them, including any artifacts in
those sources. If you want them to display better pictures, feed them
better sources with fewer artifacts!

You're an "expert".


:-)
--
================================================== ======
Please always reply to ng as the email in this post's
header does not exist. Or use a contact address at:
http://www.macfh.co.uk/JavaJive/JavaJive.html
http://www.macfh.co.uk/Macfarlane/Macfarlane.html

Java Jive[_3_] August 13th 15 07:31 PM

4k TV on Freesat or Freeview?
 
Bearing in mind the pictures that I have just posted elsewhere in this
thread, can you explain what test you have done?

I ask because I can see no evidence at all that my, admittedly old and
first generation, LCD is doing any de-interlacing.

On Thu, 06 Aug 2015 23:43:43 +0100, Andy Furniss [email protected] wrote:

I do know that my TV de-interlaces as I can test it with a computer.

--
================================================== ======
Please always reply to ng as the email in this post's
header does not exist. Or use a contact address at:
http://www.macfh.co.uk/JavaJive/JavaJive.html
http://www.macfh.co.uk/Macfarlane/Macfarlane.html

Andy Furniss[_2_] August 14th 15 12:24 AM

4k TV on Freesat or Freeview?
 
Java Jive wrote:
Bearing in mind the pictures that I have just posted elsewhere in
this thread, can you explain what test you have done?


It's connected via hdmi and advertises interlaced and progressive modes
so I can see the difference between playing the same test sequences.

When in interlaced mode I can see that it weaves static = I can see a
full res test pattern. If something moves eg. I move a mouse over the
pattern I can see that it is blending the mouse cursor and an area
around the motion.

Using test videos like fast scrolling text I can see that the TV, when
in interlaced mode is de-interlacing them to field rate. I can do the
same thing in s/w when in progressive mode on TV or monitor so I know
what it looks like de-interlaced.

The TV, because it is motion adaptive turns the vid back to full vert
res weaved if I pause it.

It takes a bit of messing around to actually use the TV correctly in an
interlaced mode as computers don't tend to do field sync. I use linux
and an ffmpeg filter via mpv that works around this - all you need is
vsync at field rate (which is normal). You also need to convert to 422
chroma in an interlaced aware way - but unless anyone actually needs all
the detail I won't go as far as giving full command lines and
justifications.
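The weave-static/bob-moving behaviour described above can be sketched as a toy de-interlacer. This is a minimal illustration, not the TV's actual algorithm, and it naively treats the two fields as if they were same-parity:

```python
import numpy as np

def motion_adaptive_deinterlace(prev_field, curr_field, threshold=0.1):
    """Toy motion-adaptive de-interlacer.

    Where the fields match (static picture), weave them for full vertical
    resolution; where they differ (motion), repeat the current field's
    lines (bob) to avoid combing.
    """
    h, w = curr_field.shape
    moving = np.abs(curr_field - prev_field) > threshold
    frame = np.empty((2 * h, w))
    frame[0::2] = curr_field                                # current field lines
    frame[1::2] = np.where(moving, curr_field, prev_field)  # weave or bob
    return frame

# Static content keeps both fields (full vertical resolution)...
static = np.arange(32.0).reshape(4, 8)
assert np.array_equal(motion_adaptive_deinterlace(static, static)[1::2], static)
# ...while moving content falls back to line-doubling the current field.
moved = static + 1.0
assert np.array_equal(motion_adaptive_deinterlace(static, moved)[1::2], moved)
```

Pausing the video makes everything "static" by this test, which is why a motion-adaptive set snaps back to full-resolution weave on pause, as described above.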

I ask because I can see no evidence at all that my, admittedly old
and first generation, LCD is doing any de-interlacing.


I don't see any evidence it doesn't - but then you paused to get your
pics and I don't know whether that changes anything on your setup(s).
I don't know what motion was going on in that shot or what res to expect
- but it looks progressive and assuming the source was truly interlaced
- which for sport is highly likely then that means it got de-interlaced.

IIRC from a previous thread your panel was higher res than SD so it must
be scaling. If it is just scaling one field at a time that counts as
deinterlacing - though it would likely do more than that or it would bob
up and down.

As for LCD motion artifacts you mention in your other post - early LCDs
were poor and may have had poor de-interlacers, without seeing it
running it's hard to know what you mean.

Comparing progressive modes/content I count my 120Hz "capable" LCD
monitor as quite crap (in a blurry/not fast enough way) when compared to
my plasma TV. If I drag around the window I am typing this into, it
doesn't take much speed to make the text unreadable which doesn't happen
on the plasma - or on a CRT monitor.

On Thu, 06 Aug 2015 23:43:43 +0100, Andy Furniss [email protected] wrote:

I do know that my TV de-interlaces as I can test it with a
computer.


Java Jive[_3_] August 14th 15 01:51 AM

4k TV on Freesat or Freeview?
 
On Thu, 13 Aug 2015 23:24:52 +0100, Andy Furniss [email protected] wrote:

Java Jive wrote:
Bearing in mind the pictures that I have just posted elsewhere in
this thread, can you explain what test you have done?


It's connected via hdmi and advertises interlaced and progressive modes
so I can see the difference between playing the same test sequences.

[snip]


Thanks for the explanation.

I ask because I can see no evidence at all that my, admittedly old
and first generation, LCD is doing any de-interlacing.


I don't see any evidence it doesn't


I beg to differ. I go into some detail on the subject, referencing a
particular image of an athlete's arm in the CRT vs LCD web-page. I am
convinced that neither that nor any other video is de-interlaced by my
TV. It's a very early model Panasonic with analogue tuners.

- but then you paused to get your
pics and I don't know whether that changes anything on your setup(s).
I don't know what motion was going on in that shot or what res to expect
- but it looks progressive and assuming the source was truly interlaced
- which for sport is highly likely then that means it got de-interlaced.


The main motion would be the forerunner ski-racer on the piste - a
forerunner is a not-quite-yet-top or a formerly-top-but-now aging
ski-racer who descends the slope at near full speed to check it's safe
for the main racers who will follow when the competition starts. It's
a bit difficult to estimate how fast the skier would be travelling
from being stationary at the start gate, but a back-of-an-envelope
calc suggests probably about 50-55 fps, or around 35mph (on some
slopes they can reach 100mph). At the scale of the picture 10px ~
1ft, that's about 10px per interlaced field. I would have thought
that if there were motion artifacts to be seen, we would be able to
see them in this picture, but we can't.
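For what it's worth, the back-of-an-envelope figures above check out, taking the post's 10 px ~ 1 ft scale and the 50 fields/s rate as given:

```python
MPH_TO_FT_PER_S = 5280 / 3600   # one mph in feet per second
FIELDS_PER_S = 50               # 625-line interlaced video field rate
PX_PER_FOOT = 10                # approximate scale of the photo (from the post)

speed_ft_per_s = 35 * MPH_TO_FT_PER_S             # the "50-55 fps" above
px_per_field = speed_ft_per_s / FIELDS_PER_S * PX_PER_FOOT

print(round(speed_ft_per_s, 1), round(px_per_field, 1))  # -> 51.3 10.3
```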

So why can't we? Well ...

IIRC from a previous thread your panel was higher res than SD so it must
be scaling.


.... it's a little bigger than SD horizontally, but a little less
vertically, so yes, it's certainly doing some scaling, but ...

If it is just scaling one field at a time that counts as
deinterlacing - though it would likely do more than that or it would bob
up and down.


.... well it's not really de-interlacing, as that would involve working
with more than one field at a time, and the picture of the athlete's
arm in the website pages suggests that it is not.

So it's certainly buffering, and it's certainly scaling, it HAS to be
doing both, but de-interlacing, I'm pretty sure not.

So why no artifacts? Well this LCD doesn't have the vertical
resolution it should, 492 vs 576, so perhaps they are being lost in
being scaled, or perhaps they are doing the same trick as with CRTs,
and making each scan line overlap over its neighbours in the previous
field.

As for LCD motion artifacts you mention in your other post - early LCDs
were poor and may have had poor de-interlacers, without seeing it
running it's hard to know what you mean.


Well, I wasn't saying that my LCD shows artifacts, in fact I've never
seen it show the combing effects claimed on interlaced input, only
dot-crawl from using CV as the input source. Merely I was suggesting
why in general LCDs might legitimately show such motion artifacts if
fed an interlaced signal, because their technology allows them to
display what they are fed more faithfully than CRTs.
--
================================================== ======
Please always reply to ng as the email in this post's
header does not exist. Or use a contact address at:
http://www.macfh.co.uk/JavaJive/JavaJive.html
http://www.macfh.co.uk/Macfarlane/Macfarlane.html

R. Mark Clayton[_2_] August 14th 15 12:09 PM

4k TV on Freesat or Freeview?
 
On Wednesday, 12 August 2015 21:42:29 UTC+1, _Unknown_Freelancer_ wrote:
"Brian Gregory" wrote in message
...
On 03/08/2015 12:07, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Twelve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that its transmittable over freeview, you
would have to dispense with so much information that you would render
'UHD'
pointless.


You could use a whole DVB-T2 multiplex and the H.265 codec.

It would arguably be a better rendition of 4K than the current HD Freeview
is of 2K.


Yes, you could.

But it would contain f.all detail.


I don't think you understand how video compression works. In a [typical] video stream there will be areas of the pictures which are the same as neighbouring ones and / or the same from frame to frame. It is these that are compressed, NOT the fine detail.

Even simple [and lossless] run length encoding will result in a substantial reduction in the file size for an image.

MP3 compression of audio from original CD to 128kbps results in compression of around 91%. Purists say you can tell the difference and so for a while I abjured it, however even though I have quite good ears, I can't tell the difference.

Obviously if one is able to compress in two dimensions and time, one will achieve much higher compression with relatively little loss. So the method described above of using a [single] full [satellite] transponder and H.265 compression will result in a very good result at 4k resolution for almost all content.
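The run-length-encoding point is easy to demonstrate; a minimal sketch over one image row with large flat areas (the values are illustrative):

```python
def rle_encode(row):
    """Lossless run-length encoding: a list of [value, run_length] pairs."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

def rle_decode(runs):
    return [v for v, n in runs for _ in range(n)]

# A row with large flat areas, like sky or a studio backdrop.
row = [0] * 40 + [128] * 15 + [255] * 45
encoded = rle_encode(row)

assert rle_decode(encoded) == row   # lossless round trip
print(len(row), len(encoded))       # -> 100 3 (100 samples become 3 runs)
```

A real video codec uses transforms, prediction and entropy coding rather than raw RLE, but the principle is the same: flat or repeated areas cost almost nothing to send.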


Yes, you're transmitting the information for four times more dots, but there
just is not enough bandwidth available over DTT to allow it to carry any
level of Ultra High _Definition_. ....with particular emphasis on the word
'Definition'.

All you're doing is making the picture bigger.
Not improving the quality.





i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either using several channel
spaces over satellite, or proper* broadband.
* meaning something better than BT's standard twisted pair phone line.
Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.



--

Brian Gregory (in the UK).
To email me please remove all the letter vee from my email address.



_Unknown_Freelancer_ August 14th 15 01:03 PM

4k TV on Freesat or Freeview?
 
Snippity snip snip delete (well, seems that's what the cool kids do)

4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Twelve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that its transmittable over freeview,
you
would have to dispense with so much information that you would render
'UHD'
pointless.

You could use a whole DVB-T2 multiplex and the H.265 codec.

It would arguably be a better rendition of 4K than the current HD
Freeview
is of 2K.


Yes, you could.

But it would contain f.all detail.


I don't think you understand how video compression works. In a [typical]
video stream there will be areas of the pictures which are the same as
neighbouring ones and / or the same from frame to frame. It is these that

are compressed, NOT the fine detail.

Even simple [and lossless] run length encoding will result in a substantial
reduction in the file size for an image.

MP3 compression of audio from original CD to 128kbps results in compression
of around 91%. Purists say you can tell the difference and so for a while
I abjured it, however even though I have quite good ears, I can't tell the
difference.

Obviously if one is able to compress in two dimensions and time one will
achieve much higher compression with relatively little loss. So the method
described above of using a [single] full [satellite] transponder and H.265

compression will result in a very good result at 4k resolution for most all
content.



I actually laughed out loud at my monitor when I read your first sentence!

sarcasm
n decades* working in live television, and n+1 decades programming computers
in up to sixteen different languages.
I really do not have the slightest clue.
/sarcasm


My dear chap, I am very well aware of the constructs employed in standards
deployed by the Moving Picture Experts Group.
Ground zero for it was the technique employed in a jpg picture, and then
applying that over a time period.

Yes, when turning down the bitrate for a video stream, the first items to be
sacrificed ARE the non-moving areas*.
Keep turning down the bit rate, as has to be done for DTT, and very soon you
start to affect the detail contained within anything moving.
In fact, whilst an object is moving, its detail is lost.
Only once the 'thing' stops moving is its detail restored. (Again, this is
detail lost because of compression, NOT motion blur.) It's restored quickly.
So much so that if you're not paying particular attention to the moving
thing, you miss the instant when a vague fuzzy shape becomes a pitch, with
its skid marks and divots.
Thus, grass on any field-based sport shown on DTT turns to VHS mush whilst
the camera is tracking play.
Only when the camera becomes still is any detail added to the pitch.

* What goes against your suggestion was a 'reality' programme on ITV32 a
couple of years back.
Which showed a QR code at the top of the screen constantly. This would link
to chat rooms and a website.
The QR code was stationary, and on screen during the entire programme.
i.e. It was present on every frame and did not move.
No-one's phone could recognise it because the transmission chain crushed all
the detail out of it.
It was a stationary item, compressed to grey mush.
HD?
No 'definition'.

I digress.
If you saw such programmes at source quality, and then compared the DTT off
air signal, you would understand just how much detail Arqiva are removing
from the national viewing pleasure.

To get a 12Gb/s signal out over DTT, you are going to have to drop a vast
amount of detail.
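To put those figures in context: a DVB-T2 multiplex carries very roughly 40 Mb/s of usable payload (that figure is an approximation of mine, not from the post), so fitting a 12 Gb/s feed into one implies roughly a 300:1 reduction:

```python
SOURCE_BITRATE = 12e9   # 12 Gb/s UHD studio feed, per the post
HD_SOURCE = 1.485e9     # HD-SDI, the "just under 1.5 Gb/s" quoted earlier
T2_MUX = 40e6           # rough usable DVB-T2 multiplex payload (assumption)

print(SOURCE_BITRATE / T2_MUX)     # -> 300.0 (compression ratio needed)
print(SOURCE_BITRATE / HD_SOURCE)  # the UHD source is ~8x the HD source rate
```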

Take a football match being covered in 4K. (Ive seen some.)
Now get your main gantry camera to frame on a stationary wide angle* facing
across the pitch.
*As wide as the lens will go. Not pointing at anything in particular.
Such is the detail in that picture you can make out individual facial
expressions of people in the crowd in the opposite stand (usually around
100 metres away).
It's a fair wager that detail will _never_ be present in any "Freeview 4K".


Instead of wasting so much bandwidth on such guff, why not just turn up the
bandwidth for present HD channels, and make Freeview better quality than Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.




Bear in mind CD audio is sampled at 44.1 kHz. The entire broadcast world uses
48 kHz sample rates for everything. So that CD has already lost the very high
frequencies. Anything above 22.05 kHz, to be precise.
Then compressing that in to a 128Kb/s mp3 file..... Any sound supervisor
worth their salt CAN recognise an mp3.
There are several who refuse to use mp3 files in their programmes, insisting
on .wav source files.

If you take a 96 kb/s mp3 file, it's easy to hear how bad it is.
128 kb/s is only just a little above that.
i.e. That file is only just above the point where it is deafeningly
obvious how bad the audio is.
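The CD figures are easy to verify: 16-bit stereo at 44.1 kHz works out at 1411.2 kb/s, which also matches the ~91% reduction to 128 kb/s MP3 mentioned earlier in the thread, and the Nyquist ceiling is half the sample rate:

```python
SAMPLE_RATE_HZ = 44_100   # CD sampling rate
BITS_PER_SAMPLE = 16
CHANNELS = 2

cd_kbps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE * CHANNELS / 1000  # 1411.2 kb/s
nyquist_khz = SAMPLE_RATE_HZ / 2 / 1000                       # 22.05 kHz ceiling
mp3_reduction = 1 - 128 / cd_kbps                             # at 128 kb/s

print(cd_kbps, nyquist_khz, round(100 * mp3_reduction, 1))  # -> 1411.2 22.05 90.9
```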



H.265 is H.264 tweaked.
The macroblock size is increased (the grid the encoder chops the
picture into), and they rejigged the colour space. That's it.
It's really not worth getting all moist about H.265 as being the best thing
since the Altair 8800.
It's not some amazing solution that will allow cinema quality pictures to be
pushed down a dial-up connection.

Just like that 128 kb/s mp3 file, it's H.264, but just above the point where
the masses can tell it's crap.




Roderick Stewart[_3_] August 14th 15 06:37 PM

4k TV on Freesat or Freeview?
 
On Fri, 14 Aug 2015 12:03:38 +0100, "_Unknown_Freelancer_" /dev/null
wrote:

Instead of wasting so much bandwidth on such guff, why not just turn up the
bandwidth for present HD channels, and make Freeview better quality than Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


I've encountered people with Freeview HD receivers selecting channels
1 to 4 even when I've pointed out to them that the same programmes are
available in HD on channels 101 to 104, and they are apparently quite
happy with what they are watching. Maybe they just can't be bothered
to type the extra digits, or don't see any advantage.

Rod.

Vir Campestris August 14th 15 10:57 PM

4k TV on Freesat or Freeview?
 
On 14/08/2015 12:03, _Unknown_Freelancer_ wrote:
snip
Instead of wasting so much bandwidth on such guff, why not just turn up the
bandwidth for present HD channels, and make Freeview better quality than Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


Oink. Flap. Oink. Flap.


Bear in mind CD audio is sampled at 44.1 kHz. The entire broadcast world uses
48 kHz sample rates for everything. So that CD has already lost the very high
frequencies. Anything above 22.05 kHz, to be precise.
Then compressing that in to a 128Kb/s mp3 file..... Any sound supervisor
worth their salt CAN recognise an mp3.
There are several who refuse to use mp3 files in their programmes, insisting
on .wav source files.

If you take a 96 kb/s mp3 file, it's easy to hear how bad it is.
128 kb/s is only just a little above that.
i.e. That file is only just above the point where it is deafeningly
obvious how bad the audio is.

H265 is H264 tweaked.
The macroblock size is increased (the grid which the encoder chops the
picture in to), and they rejigged the colour space. Thats it.
Its really not worth getting all moist about h265 as being the best thing
since the Altair 8800.
It's not some amazing solution that will allow cinema quality pictures to be
pushed down a dial-up connection.

Just like that 128Kb/s mp3 file, its h264, but just above the point where
the masses can tell its crap.


I can hear the defects in 128k MP3, but not in 256k MP3 nor in 128k WMA.
I haven't tried on AAC. DAB, OTOH? 80k MP2? Yuck.

And as for Freeview? My techie son and I have learned not to mention
the faults when my wife is there. She just gets cross.

Andy

Andy Furniss[_2_] August 15th 15 01:17 AM

4k TV on Freesat or Freeview?
 
Java Jive wrote:

I ask because I can see no evidence at all that my, admittedly
old and first generation, LCD is doing any de-interlacing.


I don't see any evidence it doesn't


I beg to differ. I go into some detail on the subject, referencing
a particular image of an athlete's arm in the CRT vs LCD web-page. I
am convinced that neither that nor any other video is de-interlaced
by my TV. It's a very early model Panasonic with analogue tuners.


Looking at that image and reading your site I think what I would call
de-interlacing differs from what you would.

I consider any processing to display interlaced content on a progressive
display as de-interlacing. It may be that it is very simple like line
doubling fields and a bit of filtering to compensate for the different
spatial position of the fields, but it's still processing that is needed
and wouldn't be if a native interlaced display were being driven. Doing
line doubling hurts resolution; just weaving fields together is good for
res with static portions of the frame but gives artifacts on motion. Being
clever and adding edge interpolation/motion detection to get the best of
both is what "advanced" de-interlacers do.

Judging by the arm shot your TV looks like it's doing fields - I accept
what you write about the pairs not matching, but that could be
additional processing that's nothing to do with interlacing.

I don't know the details of your screen, but it could be, like the
monitor I am currently looking at, only 6 bit and using spatial and or
temporal dithering to fake 8 bit colour, which could explain why the
lines differ 1:1 chroma but the brighter edge steps look paired.

As well as colour, LCDs also have to pull tricks to get motion to "work"
- just holding pixels on for the whole frame/field period is great for
flicker, but really bad for motion, making it look blurred, so they
"somehow" have to try and work around that.


- but then you paused to get your pics and I don't know whether
that changes anything on your setup(s). I don't know what motion
was going on in that shot or what res to expect - but it looks
progressive and assuming the source was truly interlaced - which
for sport is highly likely then that means it got de-interlaced.


The main motion would be the forerunner ski-racer on the piste - a
forerunner is a not-quite-yet-top or a formerly-top-but-now aging
ski-racer who descends the slope at near full speed to check it's
safe for the main racers who will follow when the competition starts.
It's a bit difficult to estimate how fast the skier would be
travelling from being stationary at the start gate, but a
back-of-an-envelope calc suggests probably about 50-55 fps, or around
35mph (on some slopes they can reach 100mph). At the scale of the
picture 10px ~ 1ft, that's about 10px per interlaced field. I would
have thought that if there were motion artifacts to be seen, we would
be able to see them in this picture, but we can't.

So why can't we? Well ...

IIRC from a previous thread your panel was higher res than SD so it
must be scaling.


... it's a little bigger than SD horizontally, but a little less
vertically, so yes, it's certainly doing some scaling, but ...


I notice some of your pics are from tiny screens (OK the 22" isn't so
tiny) I wonder given how much portables used to overscan whether they
could be doing 1:1 and losing lots off the top & bottom - just a thought.

If it is just scaling one field at a time that counts as
deinterlacing - though it would likely do more than that or it
would bob up and down.


... well it's not really de-interlacing, as that would involve
working with more than one field at a time, and the picture of the
athlete's arm in the website pages suggests that it is not.


Multi-field deinterlacing is "advanced" - it's possible but sub-optimal
to call processing one field de-interlacing. Processing may involve more
than just line doubling, eg. edge detection and smoothing to hide the
half-res steps on diagonals.

So it's certainly buffering, and it's certainly scaling, it HAS to
be doing both, but de-interlacing, I'm pretty sure not.

So why no artifacts? Well this LCD doesn't have the vertical
resolution it should, 492 vs 576, so perhaps they are being lost in
being scaled, or perhaps they are doing the same trick as with CRTs,
and making each scan line overlap over its neighbours in the
previous field.


I doubt weave would be lost in scaling - IME it looks far worse when scaled.

As for overlapping - well yes, though I would say totally overwrite
rather that overlap. I wonder whether a pic of a full size CRT would
look the same as your portable. Saying that, though, I recall reading
years ago that the RF output of 1st-gen game consoles/early home computers used
to pull a trick on interlaced CRTs to make them progressive (IIRC there
is a pulse/something to mark top/bottom field and they just repeated one
rather than alternating). It did say this worked on most but not all TVs
and I recall wondering "why no gaps", so perhaps all CRTs have a fat
spot/spots.


As for LCD motion artifacts you mention in your other post - early
LCDs were poor and may have had poor de-interlacers, without seeing
it running it's hard to know what you mean.


Well, I wasn't saying that my LCD shows artifacts, in fact I've
never seen it show the combing effects claimed on interlaced input,
only dot-crawl from using CV as the input source. Merely I was
suggesting why in general LCDs might legitimately show such motion
artifacts if fed an interlaced signal, because their technology
allows them to display what they are fed more faithfully than CRTs.


Perhaps it's to do with things other than interlacing - like it's hard
to get slow/always on LCDs to do motion properly.



Andy Furniss[_2_] August 15th 15 01:47 AM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:

Take a football match being covered in 4K. (Ive seen some.) Now get
your main gantry camera to frame on a stationary wide angle* facing
across the pitch. *As wide as the lens will go. Not pointing at
anything in particular. Such is the detail in that picture you can
make out individual facial expressions of people in the crowd in the
opposite stand (usually around 100metres away). Its a fair wager that
detail will _never_ be present in any "Freeview 4K".


Instead of wasting so much bandwidth on such guff, why not just turn
up the bandwidth for present HD channels, and make Freeview better
quality than Sky satellite or BT TV???


I am curious what bitrate you think is enough for HD?

AIUI BT HD is max 10mbit for their (BTW) premium offering or 7.5 1440
standard. For UHD they require a connection of 44mbit (I really hope
they are not allowing for record one and watch another in that!).

I don't know what Sky HD uses, though I have "come across" some SD
transport stream motorsport rips that don't seem any higher than the
same content from the BBC.

i.e. Make HD.... the best HD. FOR FREE.


I agree that should happen - but what Brian wrote was -

"You could use a whole DVB-T2 multiplex and the H.265 codec.
It would arguably be a better rendition of 4K than the current HD
Freeview is of 2K."

I think 40mbit hevc 2160p50 should surely be better than the current HD
offering - maybe in the case of 1080i25 high motion bits even better
than "quality/raw HD" deinterlaced/scaled up for a UHD TV.

I guess it should really go to HD - but there's a T2 mux spare on my
transmitter with just nulls and QVC.

Hmm, maybe I should find the keys to my lab and test -

Would grainy 65mm 2160p50 film scans with professionally made (for VQEG)
1080i25 derivatives do :)


_Unknown_Freelancer_ August 15th 15 02:02 AM

4k TV on Freesat or Freeview?
 
"Roderick Stewart" wrote in message
...
On Fri, 14 Aug 2015 12:03:38 +0100, "_Unknown_Freelancer_" /dev/null
wrote:

Instead of wasting so much bandwidth on such guff, why not just turn up
the
bandwidth for present HD channels, and make Freeview better quality than
Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


I've encountered people with Freeview HD receivers selecting channels
1 to 4 even when I've pointed out to them that the same programmes are
available in HD on channels 101 to 104, and they are apparently quite
happy with what they are watching. Maybe they just can't be bothered
to type the extra digits, or don't see any advantage.

Rod.


Either blissfully oblivious/thick, or can't be arsed to press two more
buttons.
.....one digit memory



_Unknown_Freelancer_ August 15th 15 02:04 AM

4k TV on Freesat or Freeview?
 
"Vir Campestris" wrote in message
...
On 14/08/2015 12:03, _Unknown_Freelancer_ wrote:
snip
Instead of wasting so much bandwidth on such guff, why not just turn up
the
bandwidth for present HD channels, and make Freeview better quality than
Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


Oink. Flap. Oink. Flap.


Preaching to the converted here!
I know it will never happen.
BUT, instead of wasting truck loads of cash on pointless DTT 4K, why not
just offer 'the best HD'?




Bear in mind CD audio is sampled at 44.1 kHz. The entire broadcast world
uses 48 kHz sample rates for everything. So that CD has already lost the
very high frequencies. Anything above 22.05 kHz, to be precise.
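(For reference, that cutoff follows from the Nyquist limit: the highest frequency a sample rate can represent is half that rate. A trivial sketch, with a made-up helper name:)

```python
# Nyquist limit: maximum representable frequency is half the sample rate.
def nyquist_khz(sample_rate_khz):
    return sample_rate_khz / 2

print(nyquist_khz(44.1))  # 22.05 kHz - CD
print(nyquist_khz(48.0))  # 24.0 kHz - broadcast
```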
Then compressing that into a 128 kb/s MP3 file..... Any sound supervisor
worth their salt CAN recognise an MP3.
There are several who refuse to use MP3 files in their programmes,
insisting on .wav source files.

If you take a 96 kb/s MP3 file, it's easy to hear how bad it is.
128 kb/s is only just a little above that.
i.e. That file is only just above the point where it is deafeningly
obvious how bad the audio is.

H.265 is H.264 tweaked.
The block size is increased (the grid which the encoder chops the
picture into), and they rejigged the colour handling. That's it.
It's really not worth getting all moist about H.265 as being the best
thing since the Altair 8800.
It's not some amazing solution that will allow cinema quality pictures to
be pushed down a dial-up connection.

Just like that 128 kb/s MP3 file, it's H.264, but just above the point
where the masses can tell it's crap.


I can hear the defects in 128k MP3, but not in 256k MP3 nor in 128k WMA. I
haven't tried on AAC. DAB, OTOH? 80k MP2? Yuck.

And as for Freeview? My techie son and I have learned not to mention the
faults when my wife is there. She just gets cross.


Let's just say, perhaps yours is not the only abode where that situation
arises!



Andy




_Unknown_Freelancer_ August 15th 15 02:16 AM

4k TV on Freesat or Freeview?
 
"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:

Take a football match being covered in 4K. (I've seen some.) Now get
your main gantry camera to frame on a stationary wide angle* facing
across the pitch. *As wide as the lens will go. Not pointing at
anything in particular. Such is the detail in that picture you can
make out individual facial expressions of people in the crowd in the
opposite stand (usually around 100 metres away). It's a fair wager that
detail will _never_ be present in any "Freeview 4K".


Instead of wasting so much bandwidth on such guff, why not just turn
up the bandwidth for present HD channels, and make Freeview better
quality than Sky satellite or BT TV???


I am curious what bitrate you think is enough for HD?


I don't know.
I really can't be arsed to carry out loads of tests.
But I do know that when I watch Freeview HD, I get annoyed!
.....yes I know what the doctor will tell me!

I get annoyed because it looks so, well, ****!
And this '****' is sold as "HD".
It's a bleedin' con!



AIUI BT HD is max 10mbit for their (BTW) premium offering or 7.5 1440
standard. For UHD they require a connection of 44mbit (I really hope
they are not allowing for record one and watch another in that!).

I don't know what Sky HD uses, though I have "come across" some SD
transport stream motorsport rips that don't seem any higher than the
same content from the BBC.


AFAIK Sky and Sky Sports use different bit rates for different channels, and
occasionally change them depending on content.
I know Sky Sports F1 has (or did have) a much higher bit rate so it looked
great.
......but then they went and spent ALL of the money.
So things may have changed.
There's probably a nerds' thread on DigitalSpy where someone lists all the
present TS details!
I can't be bothered to trawl right now!



i.e. Make HD.... the best HD. FOR FREE.


I agree that should happen - but what Brian wrote was -

"You could use a whole DVB-T2 multiplex and the H.265 codec.
It would arguably be a better rendition of 4K than the current HD
Freeview is of 2K."


Like, that would happen?!!! (Rhetorical question, btw!)



I think 40mbit hevc 2160p50 should surely be better than the current HD
offering - maybe in the case of 1080i25 high motion bits even better
than "quality/raw HD" deinterlaced/scaled up for a UHD TV.


Hmmmm.....pushing the boundaries there, methinks.
i.e. Back to that 128 kb/s MP3 file again.



I guess it should really go to HD - but there's a T2 mux spare on my
transmitter with just nulls and QVC.

Hmm, maybe I should find the keys to my lab and test -

Would grainy 65mm 2160p50 film scans with professionally made (for VQEG)
1080i25 derivatives do :)




Andy Furniss[_2_] August 15th 15 11:15 AM

4k TV on Freesat or Freeview?
 
Andy Furniss wrote:
I recall reading years ago that the RF output of 1st-gen game consoles/early
home computers used to pull a trick on interlaced CRTs to make them
progressive (IIRC there is a pulse/something to mark top/bottom field
and they just repeated one rather than alternating). It did say this
worked on most but not all TVs and I recall wondering "why no gaps",
so perhaps all CRTs have a fat spot/spots.


Which makes me wonder what "paused" means/does in the world of analogue kit.

It may well be that taking photos of paused images is only ever going to
get one field as the kit sending the signal is repeating the same field
over and over.

Java Jive[_3_] August 15th 15 02:01 PM

4k TV on Freesat or Freeview?
 
On Sat, 15 Aug 2015 00:17:28 +0100, Andy Furniss [email protected] wrote:

Java Jive wrote:

I ask because I can see no evidence at all that my, admittedly
old and first generation, LCD is doing any de-interlacing.

I don't see any evidence it doesn't


I beg to differ. I go into some detail on the subject, referencing
a particular image of an athlete's arm in the CRT vs LCD web-page. I
am convinced that neither that nor any other video is de-interlaced
by my TV. It's a very early model Panasonic with analogue tuners.


Looking at that image and reading your site I think what I would call
de-interlacing differs from what you would.


Yes. I think that is becoming clear ...

I consider any processing to display interlaced content on a progressive
display as de-interlacing. It may be that it is very simple, like line
doubling fields and a bit of filtering to compensate for the different
spatial position of the fields, but it's still processing that is needed
and wouldn't be if a native interlaced display were being driven. Line
doubling hurts resolution; just weaving fields together is good for
res with static portions of the frame but artifacts on motion. Being
clever and adding edge interpolation/motion detection to get the best of
both is what "advanced" de-interlacers do.


I have to say that I prefer my own definitions ... I don't see how
anything can truly be called de-interlacing unless electronically it
combines the content of *two or more successive* fields to produce an
image that is different from what it would have been had the lines of
each field been drawn on the screen in the appropriate place as
received. As we both seem to acknowledge, there are other things that
an LCD does, such as buffering and scaling, which are necessary for it
to display content correctly, but as long as it is only electronically
processing *one* field at a time, and not electronically combining it
with the content of, temporally speaking, neighbouring fields, before
drawing it on the screen, I don't think it makes any sense to call
that de-interlacing. If you were going to use that definition, you'd
more or less have to say that a CRT de-interlaces as well, and the
definition of the word thus becomes too wide and general to be useful.

Judging by the arm shot your TV looks like it's doing fields - I accept
what you write about the pairs not matching, but that could be
additional processing that's nothing to do with interlacing.


But the *simplest* and *cheapest*, and therefore most probably
correct, of the many possible explanations is that it is simply
drawing the fields on the screen pretty much as received.

I don't know the details of your screen


The pictures were from the set taken for the ancillary artifact
demonstration, and as it was not possible to do a field-exact, and
therefore meaningful, visual side by side comparison for this - I
had no way of releasing the shutter on a particular given field, and
therefore could not guarantee that pictures taken on different TVs
were of exactly the same field - all I could hope to do was show that
both types of TV showed the dot-crawl type of motion artifact.
Therefore, I simply used the most convenient LCD model to hand, which
was a Panasonic TX-22LT2, a larger model in the same range as the one
used for the side-by-side comparison of the original experiment, a
Panasonic TX-15LT2.

but it could be, like the
monitor I am currently looking at, only 6-bit and using spatial and/or
temporal dithering to fake 8 bit colour, which could explain why the
lines differ 1:1 chroma but the brighter edge steps look paired.


Can't comment on that.

As well as colour, LCDs also have to pull tricks to get motion to "work"
- just holding pixels on for the whole frame/field period is great for
flicker, but is really bad for making motion look blurred, so they
"somehow" have to try and work around that.


From your descriptions and others' I think it likely that modern LCDs
do more processing than they did at the time of the original
demonstration, either by default or via menu options, and I probably
ought to update some of the wording of the page accordingly, but at
the time I never saw any evidence of more complex processing for
either of the two LCDs used in it.

I still have the TX-15LT2, though its screen is scratched after I
contrived to pull it off the bedside furniture :-( but I no longer
have the other two TVs. Nor do I have any of the original content
used. This being so, I can't do any further investigations, unless
they can be done from the original photographs, most, probably all, of
which I still have.

I notice some of your pics are from tiny screens (OK the 22" isn't so
tiny). I wonder, given how much portables used to overscan, whether they
could be doing 1:1 and losing lots off top & bottom - just a thought.


Yes, quite possibly, though, as described above, no longer having most
of the kit, I can't check this.

Multi-field de-interlacing is "advanced" - it's possible but sub-optimal
to call processing one field de-interlacing. Processing may involve more
than just line doubling, e.g. edge detection and smoothing to hide the
half-res steps on diagonals.


See above ...

So it's certainly buffering, and it's certainly scaling, it HAS to
be doing both, but de-interlacing, I'm pretty sure not.

So why no artifacts? Well this LCD doesn't have the vertical
resolution it should, 492 vs 576, so perhaps they are being lost in
being scaled, or perhaps they are doing the same trick as with CRTs,
and making each scan line overlap over its neighbours in the
previous field.


I doubt weave would be lost in scaling - IME it looks far worse when scaled.


Perhaps the same trick as CRTs then, halving the vertical resolution.

As for overlapping - well yes, though I would say totally overwrite
rather than overlap.


However we describe it, the result would be the same.

I wonder whether a pic of a full size CRT would
look the same as your portable. Saying that, though, I recall reading
years ago that the RF output of 1st-gen game consoles/early home computers used
to pull a trick on interlaced CRTs to make them progressive (IIRC there
is a pulse/something to mark top/bottom field and they just repeated one
rather than alternating). It did say this worked on most but not all TVs
and I recall wondering "why no gaps", so perhaps all CRTs have a fat
spot/spots.


One machine is not a statistically significant sample, but if I had to
generalise based on those photographs and the very few others I've
ever managed to find on the web, I would guess that they're all pretty
much like mine. However, if anyone here with experience of,
particularly, factory setting up, or even just repairing, CRTs could
describe for us how they set such things as the focus, their
description might illuminate this point considerably.

As for LCD motion artifacts you mention in your other post - early
LCDs were poor and may have had poor de-interlacers, without seeing
it running it's hard to know what you mean.


Well, I wasn't saying that my LCD shows artifacts; in fact I've
never seen it show the combing effects claimed on interlaced input,
only dot-crawl from using CV as the input source. I was merely
suggesting why, in general, LCDs might legitimately show such motion
artifacts if fed an interlaced signal: because their technology
allows them to display what they are fed more faithfully than CRTs.


Perhaps it's to do with things other than interlacing - like it's hard
to get slow/always on LCDs to do motion properly.


I do remember that the very earliest LCDs used as laptop screens used
to leave motion trails, but I've never seen that on either of my LCD
TVs, nor on recent laptops, such as the Dell that currently functions
as my best 'TV'!
--
================================================== ======
Please always reply to ng as the email in this post's
header does not exist. Or use a contact address at:
http://www.macfh.co.uk/JavaJive/JavaJive.html
http://www.macfh.co.uk/Macfarlane/Macfarlane.html

_Unknown_Freelancer_ August 15th 15 10:42 PM

4k TV on Freesat or Freeview?
 
"Andy Furniss" [email protected] wrote in message
o.uk...
Andy Furniss wrote:
I recall reading years ago that rf output ist gen game consoles/early
home computers used to pull a trick on interlaced CRTs to make them
progressive (IIRC there is a pulse/something to mark top/bottom field
and they just repeated one rather than alternating). It did say this
worked on most but not all TVs and I recall wondering "why no gaps",
so perhaps all CRTs have a fat spot/spots.


In having a fat raster you ensured there were no black lines between scan
lines.

A bit like how helical-drum video recorders always moved the tape slower
than one head width per rotation.
i.e. The head would always overwrite part of the previous pass.


Years back computers were low-res anyway... even with 625 PAL kit.
So it didn't really make any difference if both fields were precisely the
same.
Saved on processing too.... which there wasn't too much of.



Which makes me wonder what "paused" means/does in the world of analogue
kit.

It may well be that taking photos of paused images is only ever going to
get one field as the kit sending the signal is repeating the same field
over and over.


Depends on the kit concerned.
Some had an option switch. This would select whether your freeze/slow motion
was field based or frame based.

Field based would send the present field to both fields on the display.
Frame based would send the present frame (both parts of it) to the display.

(IIRC) The Sony BVW-75 was the first analogue deck to include an onboard TBC
and frame memory.

When you paused or played the tape slowly, there would be a horizontal stripe
somewhere with no picture content.
(Depending on the kit) It was either a grey stripe or just looked like a bit
of screwed up picture.

When recording to tape at normal speed, the track laid down by the rotating
head would be stretched further along the tape. This was a good thing = more
bandwidth.
When paused or played at less than normal speed, that stretch which
occurred when recording could not occur. The playback head could not read
all of one recorded track, because the tape was not moving.
Therefore there was a point when the head passed from one track, over the
blank guard band, and into a neighbouring track.
This is what caused the grey or screwed-up stripe.

To remove this Sony added a frame store and called it DMC. Nothing to do
with a popular American rap crew of the time, I'd like to add, but Dynamic
Motion Control.

Whilst video RF was above a threshold data would be written to the frame
store.
When RF fell below that threshold no data would be written.
i.e. That part of the frame store remained unchanged.

Then reading from the framestore gave a perfect picture, with none of the
previous problems of analogue video tape.
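The RF-gated frame-store update described above can be sketched like this (hypothetical data and threshold; the real BVW-75 did this in hardware on analogue video, this only shows the idea):

```python
RF_THRESHOLD = 0.5  # made-up units; "good enough signal off tape"

def update_frame_store(store, samples, rf_levels):
    """Overwrite store[i] only where the off-tape RF level is above the
    threshold; elsewhere keep the contents from the last good pass."""
    for i, (sample, rf) in enumerate(zip(samples, rf_levels)):
        if rf > RF_THRESHOLD:
            store[i] = sample
    return store

store = ["old"] * 4
samples = ["new0", "new1", "new2", "new3"]
rf = [0.9, 0.2, 0.8, 0.1]  # dips where the head crosses the guard band
print(update_frame_store(store, samples, rf))
# ['new0', 'old', 'new2', 'old'] - the stripe is filled from the store
```

Reading the display picture from the frame store, rather than straight off tape, is what hid the stripe.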


Yes, before Betacam SP, Quad could do slow motion without distortion...
but it was a piece of **** for the time.
Hold a VT on a freeze for too long and the head would slice through the
tape! Genius! "Sorry, can't run that news item right now, tape's snapped
again."

And if the engineer hadn't bothered to line up properly, the resulting
pictures looked like the screen was a composite of four different horizontal
bands of picture, all with different luminance and chroma. ....like one of
those kids flip books.

Towards the end of the VHS era most domestic decks began to include this
sort of technology.
AFAIK, S-VHS decks did by default.... well they had to give you something
for the extra £200 you were paying!






_Unknown_Freelancer_ August 15th 15 10:52 PM

4k TV on Freesat or Freeview?
 
"_Unknown_Freelancer_" /dev/null wrote in message
o.uk...
"Vir Campestris" wrote in message
...
On 14/08/2015 12:03, _Unknown_Freelancer_ wrote:
snip
Instead of wasting so much bandwidth on such guff, why not just turn up
the
bandwidth for present HD channels, and make Freeview better quality than
Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


Oink. Flap. Oink. Flap.


Preaching to the converted here!
I know it will never happen.
BUT, instead of wasting truck loads of cash on pointless DTT 4K, why not
just offer 'the best HD'?



Actually, I had a thought on this today.

IF Arqiva gave the masses 'the best HD', it would pull the rug out from
under Virgin, Sky and BT.
Because they would have nothing better to sell, because the masses would be
getting it better for free.

There would no doubt be some idiot court case or petition handed to Ofcom to
have the DTT bit rates slashed back to ****!

So, perhaps, there is a reason DTT is rubbish quality.... the ill logic of
driving profit!

On a similar note, there is discussion and chatter of putting iPlayer behind
a paywall. (DISCUSSION and CHATTER ONLY.)
The next thing we know, ITV come out saying they will HAVE TO put ITVPlayer
behind a paywall if the BBC do!
i.e. An even bigger truck load of bovine faeces!



alan_m August 16th 15 09:28 AM

4k TV on Freesat or Freeview?
 
On 09/08/2015 14:55, _Unknown_Freelancer_ wrote:


Point is though, yes, (at source) 4K is four times the resolution of HD, but
it produces eight times the data of HD.


What's more important for the viewer is how much of that data is lost in
the processing during transmission.

The UK public demand more channels rather than better (technical)
quality channels so the broadcaster always squeezes more channels into
the available bandwidth using more aggressive lossy algorithms.

Would there be a need for 4K broadcasts at all if HD were allocated the
same broadcast bit rates as 4K?



--
mailto: news {at} admac {dot] myzen {dot} co {dot} uk

alan_m August 16th 15 09:39 AM

4k TV on Freesat or Freeview?
 
On 15/08/2015 21:52, _Unknown_Freelancer_ wrote:

Because they would have nothing better to sell, because the masses would be
getting it better for free.


There are many products sold on the basis that the bigger the advertised
number the better the product.




--
mailto: news {at} admac {dot] myzen {dot} co {dot} uk

Roderick Stewart[_3_] August 16th 15 09:56 AM

4k TV on Freesat or Freeview?
 
On Sat, 15 Aug 2015 21:52:54 +0100, "_Unknown_Freelancer_" /dev/null
wrote:

Actually, I had a thought on this today.

IF Arqiva gave the masses 'the best HD', it would pull the rug from Virgin,
Sky and BT.
Because they would have nothing better to sell, because the masses would be
getting it better for free.

There would no doubt be some idiot court case or petition handed to Ofcom to
have the DTT bit rates slashed back to ****!

So, perhaps, there is a reason DTT is rubbish quality.... the ill logic of
driving profit!


Perhaps that's why they also have corner logos on most broadcasts, and
squish the end credits to one side and speed them up so they're too
fast to read while some **** tells you half the plot of the following
episode, or some other programme entirely, about 10dB louder than the
music. The only way to see TV programmes nowadays free from any
deliberate blemish is to pay for DVDs or watch them online.

An online service such as Amazon or Netflix costs about half as much
as the TV licence, so roll on the day when the only payment we have to
make is for what we're actually watching.

Rod.

Andy Burns[_9_] August 16th 15 12:14 PM

4k TV on Freesat or Freeview?
 
alan_m wrote:

The UK public demand more channels rather than better (technical)
quality channels so the broadcaster always squeezes more channels into
the available bandwidth using more aggressive lossy algorithms.


Maybe Mr Corbyn will have a referendum on quality vs quantity of
freeview channels :-P


