HomeCinemaBanter: a home cinema forum


Why interlaced HDTV?



 
 
#41
August 24th 05, 06:31 AM
JC (external usenet poster)

On Tue, 23 Aug 2005 09:23:44 +0100, Kennedy McEwen
wrote:

No it isn't, it is something you filter out very quickly and the size of
the screen has nothing to do with it - how many kids from the 60's and
70's sat a couple of feet from their 26" CRTs watching TV while their
parents told them "don't sit too close, johnny, your eyes will go
square!"? None of them saw any flicker, yet the angular screen size was
far bigger than anything viewed at normal distance.


Well I was a kid from the 70's with exactly this experience, but I must
say that I've always noticed mild flicker on colour TVs since we got
our first in around 1980. Flicker is definitely more noticeable as
screen sizes get larger: 32" 50Hz sets are unwatchable on bright
scenes for me, which I guess is why most are 100 Hz. We had a 21" 50Hz
4:3 set for years which was OK, but when I upgraded to a 28" 16:9 50 Hz
set I ended up watching most TV on the portable in the bedroom and
ended up replacing it with an LCD almost immediately.

Ever had American visitors to your home for a week or two? At first
they complain that TV flickers in this country, but by the time they go
home they are marvelling about the picture quality.


After watching an LCD I'm now like this even with some 100 Hz TVs. Yes,
you adjust after a while with smaller screens, but I've never been able
to cope with anything over 28".

Hence the number of 100 Hz TVs on the market and all the
artifacts that their frame stores cause.


Marketing. Looks good in the showroom when you have a bank of TVs
stretching out to the extreme periphery of your vision where you have
most sensitivity to flicker because you haven't regularly watched it and
learned to filter it out.


I don't think it's just marketing. Some of the 100 Hz TVs have
horrible frame store artifacts (a friend has a Philips set where you
can actually see the interlace "jaggies" on any moving image), but the
reduced flicker is noticeable.

What difference does that make - it is just manipulation of digital data
so it doesn't make a ha'penny difference if it is implemented in a
£50,000 Quantel box or in a £5 chip.


The real point is that with a move to progressive production and
progressive display, we don't need interlace in the middle ruining
perfectly good pictures. Any legacy conversion that needs to be done
can be done as well or better by the broadcaster than inside the TV.

However the adoption of a progressive
broadcast system allows the migration to all progressive production
over time eliminating this problem completely.

But the problem doesn't need to be present - there is no reason why a
flat panel progressive screen cannot display an interlace signal
accurately. There is nothing intrinsically superior about a progressive
source which has the same overall bandwidth as the alternate interlace
system, and generally it has inferior resolution, as demonstrated by the
1080i/720p debate.


A progressive source will in general digitally compress better than an
equivalent interlaced source in a given bandwidth. In addition, the
move to progressive production allows more fluid use of features such
as high speed/slow motion, frame accurate editing etc. Coupled with
the fact that 720p has double the temporal resolution of 1080i making
it the only real choice for any material with movement (sports, "arty"
camerawork, action movies, pop videos etc) this in my mind is a pretty
convincing argument in favor of 720p (..and of course I must add that
1080p would be better still).


But nobody's going to be using an interlaced screen for HD. Even an HD
CRT should be capable of 50 or 100 Hz progressive refresh without
interlace, and I'd be very surprised to see an HD CRT set in Dixons etc
in a year's time.

That is exactly the point: nobody would need them - if high quality
backwards compatibility were delivered. Whilst that is certainly
possible, indeed just as simple to achieve, it isn't what most flat
panels provide. Consequently the push for progressive standards
alienates about half a century of existing video heritage.


I don't follow this. The push for progressive standards is as much
about moving forward with technology as the move to high definition.

This is a new system based on higher resolution displays and more
efficient codecs. Since the existing systems are sadly lacking in the
temporal resolution department (25 fps really is very poor) it seems
logical to fix this at the same time. This no more alienates existing
video heritage than any other part of HD. HD may consign much of what
you and I are familiar with to a museum, but that's progress for you.

Progressive production and display is the future. To tie our HD
broadcast standards to the legacy interlace is even more crazy than
tying DAB to Layer 2.

I would agree if we were discussing a comparison of 1080p versus 1080i,
but the option is 720p versus 1080i, so what you are calling for is to
tie our HD broadcast standard to little more than the legacy static
resolution of the interlaced system we have had for the past half
century, when we could be quadrupling it! So much for the term
"progressive" - "marginally incremental" would be more appropriate!


I really agree that we should be looking to 1080p but as production is
moving in that area anyway it would appear logical to adopt the
broadcast standard that's the closest technical match.

With 720p you have a downsampling of the source from 1080 to 720
at the broadcaster and then this is maintained all the way through the
transmission system to the progressive TV.

With 1080i you first have to throw away half the temporal information,
transmit this at 1080i and then the domestic TV has to reconstruct a
1080 line progressive frame in memory from two 540 line halves of the
interlace each taken 1/50 of a second apart. Since these frames may
have completely different information (depending on the amount of
movement etc) this then leads to an apparent quality reduction. Add to
that the inefficiencies of digitally compressing this system and for
any given bit rate 1080i will look on average the same or even worse
than 720p, except on medium to fast movement where 1080i will look
significantly worse.

As I say, that isn't my experience. 1080i is superb when displayed
correctly - which is often not even attempted because it requires a much
higher resolution screen to do it properly.


The HD I've seen has been almost exclusively on high (and low) end
domestic equipment - native 1080 and 768 (why??) progressive LCD and
Plasma screens. On this equipment my experience has been that
progressive material appears to have the advantage.

It's interesting what you say about the progressive DMD system and I'm
sure you're right that it and LCD/Plasma could be made to display a
native interlace cleanly. But with all the above advantages of a
progressive source would this not be the equivalent of making a wonky
road for a square wheeled car? ;-)

Rgds
Jonathan

#42
August 24th 05, 06:46 AM
JC (external usenet poster)

On Tue, 23 Aug 2005 23:54:55 +0100, Roderick Stewart
wrote:

Exactly. I've been watching television with a 50Hz flicker rate since the
coronation and it's never bothered me, yet all of a sudden it's supposed to
be a such a problem that we need to spend lots of money on 100Hz displays.


I've been watching TV for 35 years and I've always noticed it. It's
only with the move to larger screen sizes that it becomes a major
problem. 100 Hz TVs have also been in the shops for over 15 years so
people obviously feel they get some benefit from them, despite the
fact that conventional interlaced signals cause visible artifacts on
100 Hz TVs. The idea of a 60" or larger panel flickering away on my
wall makes me feel sick just thinking about it.

Meanwhile, another faction is proposing a *reduction* in picture
intermittency rate from 50Hz to 25Hz (the most noticeable effect of
so-called "progressive" scanning), and somehow this isn't a problem at all!


Umm no. Existing TV is 25 frames interlaced to 50 half fields. 720p is
50 full frames and would be displayed progressively as such. On an LCD
for example each pixel would only change when the information changed.

This is a true doubling of the existing frame rate and with a
progressive display can completely eliminate flicker.
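The field/frame arithmetic above can be put as a back-of-envelope comparison (a rough sketch only: nominal active resolutions, ignoring chroma subsampling and compression entirely):

```python
# Raw pixels delivered per second for the formats under discussion.
# An interlaced signal sends only half the lines in each 50Hz update.

def pixels_per_second(width, height, rate, interlaced):
    """Raw pixel throughput; interlace halves the lines per update."""
    lines_per_update = height // 2 if interlaced else height
    return width * lines_per_update * rate

sd_576i  = pixels_per_second(720, 576, 50, interlaced=True)
hd_720p  = pixels_per_second(1280, 720, 50, interlaced=False)
hd_1080i = pixels_per_second(1920, 1080, 50, interlaced=True)

print(sd_576i, hd_720p, hd_1080i)  # 10368000 46080000 51840000
# 720p50 refreshes every pixel of the frame 50 times a second;
# 1080i50 delivers slightly more raw pixels, but each line only
# 25 times a second.
```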

Rgds
Jonathan


#43
August 24th 05, 10:33 AM
Kennedy McEwen (external usenet poster)

In article , JC
writes

Umm no. Existing TV is 25 frames interlaced to 50 half fields. 720p is
50 full frames and would be displayed progressively as such. On an LCD
for example each pixel would only change when the information changed.

This is a true doubling of the existing frame rate and with a
progressive display can completely eliminate flicker.

Changing from 50Hz interlace to 50Hz progressive in itself makes
absolutely no difference to the level of flicker - the screen refresh is
still 50Hz in both cases.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#44
August 24th 05, 11:24 AM
Kennedy McEwen (external usenet poster)

In article , JC
writes

After watching an LCD I'm now like this even with some 100 Hz TVs.


You need to eat more vegetables then, especially carrots!


I don't think it's just marketing. Some of the 100 Hz TVs have
horrible frame store artifacts (a friend has a Philips set where you
can actually see the interlace "jaggies" on any moving image), but the
reduced flicker is noticeable.

The fact that they sell to the unwashed masses means very little in
terms of image quality - slow rise time LCD panels without any colour
management have been selling in volume too, but the picture quality is
complete crap.

A progressive source will in general digitally compress better than an
equivalent interlaced source in a given bandwidth.


Yes, if you ignore the fact that interlace is already 50% data
compression to start with! This has been debated extensively on this
group.

Coupled with
the fact that 720p has double the temporal resolution of 1080i


No it doesn't! 720p has exactly the same temporal resolution as 1080i,
they both have 50Hz refresh rates. The fact is that 1080i provides
additional information in that frame time, in the form of significantly
enhanced horizontal resolution and additional vertical resolution
through the interlace.

making
it the only real choice for any material with movement


What crap - interlace sources have been the format of choice for
movement for over half a century.

That is exactly the point: nobody would need them - if high quality
backwards compatibility were delivered. Whilst that is certainly
possible, indeed just as simple to achieve, it isn't what most flat
panels provide. Consequently the push for progressive standards
alienates about half a century of existing video heritage.


I don't follow this.


That is clear, from your previous posts. Perhaps you need to consider
it more.

The push for progressive standards is as much
about moving forward with technology as the move to high definition.

But progressive is NOT a move forward with technology in itself,
particularly when the option is between 720p and 1080i. Both formats
provide similar vertical resolution in the presence of motion, but the
interlaced option provides much higher horizontal resolution in all
situations and much higher vertical resolution where motion is limited.
Since we are already adopting a digital video coding format that drops
resolution when motion is present, on the grounds that it cannot be
perceived, the better system at the point of consumption is 1080i.

Since the existing systems are sadly lacking in the
temporal resolution department (25 fps really is very poor) it seems
logical to fix this at the same time.


This is flawed logic. 25 frames per second, but 50 fields per second:
the temporal resolution is exactly the same as a 50 frame per second
system. Yes, the full *spatial* resolution is not available
simultaneously with the full vertical resolution, but that is no worse
than most digital codecs, which drop horizontal and vertical resolution
when full temporal resolution is required.

This no more alienates existing
video heritage than any other part of HD. HD may consign much of what
you and I are familiar with to a museum, but that's progress for you.

If it means that it cannot be rebroadcast without introduction of
artefacts then it is not progress, it is anarchy.

I really agree that we should be looking to 1080p but as production is
moving in that area anyway it would appear logical to adopt the
broadcast standard that's the closest technical match.

Which is 1080i - simply drop every other line in alternate fields: no
interpolative downsampling (with the consequential loss of resolution
inherent in all interpolation techniques) required.

With 720p you have a downsampling of the source from 1080 to 720
at the broadcaster


Losing vertical resolution in the process - unlike sampling a 1080i
field from a 1080p frame, getting a 720p frame is not an integer spatial
division!
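The integer-versus-interpolated distinction can be sketched quickly (a toy illustration: `extract_field` and `downsample_to_720` are hypothetical helper names, and the scaler uses naive linear interpolation rather than a broadcast-grade filter):

```python
import numpy as np

def extract_field(frame_1080p, top=True):
    """1080p -> one 540-line 1080i field: simply take every other
    line. No interpolation, so each kept line is untouched."""
    return frame_1080p[0::2] if top else frame_1080p[1::2]

def downsample_to_720(frame_1080p):
    """1080 -> 720 lines is a 3:2 ratio, so every output line is a
    weighted mix of two input lines (plain linear interpolation)."""
    h = frame_1080p.shape[0]
    src = np.linspace(0, h - 1, 720)       # fractional source positions
    lo = np.floor(src).astype(int)
    hi = np.minimum(lo + 1, h - 1)
    w = (src - lo)[:, None]
    return (1 - w) * frame_1080p[lo] + w * frame_1080p[hi]

# Demo frame whose pixel values equal their line number:
frame = np.arange(1080.0)[:, None] * np.ones((1, 4))
field = extract_field(frame)
scaled = downsample_to_720(frame)
print(field.shape, scaled.shape)  # (540, 4) (720, 4)
```

Every line of the extracted field is an original line of the 1080p frame; almost every line of the 720 version is a blend of two source lines.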

and then this is maintained all the way through the
transmission system to the progressive TV.

That part is true: it maintains crap in crap out all the way through to
the display!

With 1080i you first have to throw away half the temporal information,


Wrong - and this is the mistake that seems to underpin most of the
"progressive is superior to interlace" logic! You do NOT throw away
half of the temporal information. What you throw away is the
information which requires *BOTH* the full temporal and the full spatial
resolution. That is very much *LESS* than a quarter of the information
in a real world image - and, given the temporal response of the eye, is
even further reduced at the point of viewing. This is why interlace was
adopted in the first place - halving the transmission bandwidth resulted
in a very small amount of the perceivable information being lost.
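The "less than a quarter" bound follows from simple arithmetic (a deliberately crude model: treat the vertical-frequency/temporal-frequency plane as a uniformly occupied unit square, which real scenes never are):

```python
# Interlace keeps everything except the corner of the frequency plane
# that needs BOTH the upper half of the vertical detail AND the upper
# half of the temporal rate simultaneously.
lost_fraction = 0.5 * 0.5
kept_fraction = 1.0 - lost_fraction
print(lost_fraction, kept_fraction)  # 0.25 0.75
# 25% is the worst case for a synthetic signal filling the whole plane;
# real imagery occupies that high-frequency corner far less.
```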

transmit this at 1080i and then the domestic TV has to reconstruct a
1080 line progressive frame in memory from two 540 line halves of the
interlace each taken 1/50 of a second apart.


As already explained, that is trivial to accomplish if done correctly.

Since these frames may
have completely different information (depending on the amount of
movement etc) this then leads to an apparent quality reduction.


Back to your error - these frames (fields actually) do not have
*completely* different information. Even in a high motion content
scene, the vast majority of the information is identical in both fields.

The spatial resolution of a 1080 format is more than double that of a
720 format. Your logic appears to argue that since the temporal
resolution of interlace is half that of progressive this cancels out the
spatial advantage - but the temporal resolution of progressive is NOT
twice that of interlace. The difference between the two lies only in
the region where both the full spatial and temporal resolution is
required to carry the information. At best, on a synthetic signal which
fully occupies that information space, you are looking at a loss of 25%
of the information in interlace compared to progressive. In practice it
is *MUCH* less than that in real world images, if nothing else because
the camera optics don't resolve at full contrast at full resolution.


The HD I've seen has been almost exclusively on high (and low) end
domestic equipment - native 1080 and 768 (why??) progressive LCD and
Plasma screens.


Which domestic display is currently available that has a 1080 line
vertical resolution?

It's interesting what you say about the progressive DMD system and I'm
sure you're right that it and LCD/Plasma could be made to display a
native interlace cleanly. But with all the above advantages of a
progressive source would this not be the equivalent of making a wonky
road for a square wheeled car? ;-)

No, it would be the equivalent of making a vehicle that is capable of
running on existing roads as well as on new roads. Just like the
dimensions of the world's most advanced transport vehicle, the space
shuttle, can be traced back to the width of two horses' hindquarters,
even though a horse has probably never been within miles of it. It's
called backwards compatibility.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#45
August 24th 05, 01:05 PM
(external usenet poster)

You know, I reckon you're both partly right (and both partly wrong!)

A _good_ deinterlacer doesn't do the simple processing that you're
suggesting - it can take a 1080i source and recreate a full 1080p
picture without any additional artefacts - so long as the information
in the 1080i source isn't ambiguous.

Interlacing makes high frequency spatial and high frequency temporal
information ambiguous - at the limit, 1 line of picture information and
a 25Hz strobe are one and the same thing. However, most pictures just
don't have enough movement or detail to cause a major problem.

Most deinterlacers integrated into consumer products are simply crap.
Excellent ones currently cost a lot of money, and work very well most
of the time.
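The two simplest deinterlacing strategies behind this trade-off, weave and bob, can be sketched as follows (a minimal illustration only, not a production deinterlacer; a good one switches between such modes per pixel using motion detection):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields into one full-height frame.
    Perfect on static content; produces 'combing' on motion because
    the fields were sampled 1/50 s apart."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    """Bob: stretch a single field to full height by averaging
    adjacent lines. No motion artefacts, but vertical resolution
    is halved."""
    h, w = field.shape
    frame = np.empty((2 * h, w), dtype=float)
    frame[0::2] = field
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2.0
    frame[-1] = field[-1]
    return frame

# On a static frame, weave reconstructs the original exactly:
static = np.arange(8.0)[:, None] * np.ones((1, 3))
rebuilt = weave(static[0::2], static[1::2])
print((rebuilt == static).all())  # True
```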

At a given (reasonable) bitrate, interlacing _improves_ the overall
coding efficiency (counter intuitive, but true) so allowing a higher
artefact-free resolution. That's why it is used for HD. One reason the
EBU want to use 720p is because the bitrate required for a given amount
of coding artefacts is slightly lower than for 1080i - but of course
the resolution is half that of 1080i! The other "good" reasons are all
true (no deinterlacing required at the display, more consistent coding
performance at a given bitrate) but it's questionable whether they
outweigh the dramatic reduction in resolution of 720p vs 1080i.

IME 50Hz progressive (on a CRT) flickers even more than 50Hz
interlaced. IMO the artefacts introduced by all frame-rate-changing
processing (e.g. conversion to 100Hz) in domestic equipment are so bad
that I'll happily put up with the slight flicker of 50Hz, but this is
obviously highly subjective.

OTOH it's not difficult to display 50Hz progressive material on an
interlaced display - both 50 interlaced fields or 100 interlaced fields
are acceptable solutions. The former doesn't have to compromise
resolution if the resolution of the display is higher than the
resolution of the broadcast. The latter may introduce some slight
motion artefacts, but they're more like those introduced by showing
film on an interlaced display (i.e. very subtle and consistent) than
like showing video on a modern 100Hz display (i.e. sometimes annoying
and very content dependent).

Just my £0.02.

I'm sure we'll get what we're given, and count ourselves lucky to get
any HD at all when it finally arrives - especially if it's FTA,
dog-free, and the bitrates are adequate to prevent artefacts. 1080i vs
720p is irrelevant if I have to pay Sky, or they DOG the thing, or the
bitrates are squeezed like with SD at present.

Cheers,
David.

#46
August 24th 05, 01:47 PM
Roderick Stewart (external usenet poster)

In article , Jc wrote:
I don't follow this. The push for progressive standards is as much
about moving forward with technology as the move to high definition.


Calling a non-interlaced signal "progressive" is simply a rhetorical
use of words to give the impression of progress to those who don't
understand what is really being described.

Unfortunately the ignorant are numerous, so those who extol their
inferior system behind this particular banner may eventually win, and
the realism of television pictures will be the worse for it.

Rod.

#47
August 24th 05, 01:47 PM
Roderick Stewart (external usenet poster)

In article , Jc wrote:
Meanwhile, another faction is proposing a *reduction* in picture
intermittency rate from 50Hz to 25Hz (the most noticeable effect of
so-called "progressive" scanning), and somehow this isn't a problem at all!


Umm no. Existing TV is 25 frames interlaced to 50 half fields. 720p is
50 full frames and would be displayed progressively as such.


To display such a signal with the whole picture changing 50 times per second
would require twice the transmission bandwidth of the equivalent interlaced
signal, which is why interlace has been used by all broadcasters since the
invention of television.

Updating only one field's worth of the picture, or half the number of picture
lines, 50 times per second (instead of all of them half as often) is quite
sufficient to present the illusion of smooth movement as if the whole picture
were updated this often, whereas scanning the image "progressively" only
updates the picture information 25 times per second, similar to the update rate
of film, which is either 24 or 25 frames per second.

In standard unadulterated television signals from television cameras the
picture information is slightly different between the two successive *fields*
that make up each *frame*, because the information is sampled at different
times, this being an inevitable consequence of the way scanned tube cameras
worked. Modern chip cameras *can* be made to integrate the light over 1/25th
second and then make two interlaced fields with identical pictorial
information, but the result looks intolerably jerky if you do this. If you then
also set the electronic shutter time to 1/50th second or less, it only looks as
jerky as film, because half the action is missing, but more jerky than if you
just left it alone.

Picture update rate is not the same as the brightness "flicker rate", which can
be anything you want it to be without requiring any change in the bandwidth
needed to transmit the signal or the storage capacity needed to record it.
Flicker only depends on how you design the display mechanism.

You can eliminate flicker by displaying the image a thousand times a second or
using a method that illuminates it continuously, and you won't see any
variations in brightness, but if the picture information is only being updated
every 25th of a second, moving objects will move in 25 jerky little steps every
second instead of smoothly. 50 jerky little steps every second isn't perfect
either, but it looks a lot better and it's what we've had for the last 70
years, so it seems madness to throw this away for ever.

Rod.

#48
August 24th 05, 02:18 PM
(external usenet poster)

Roderick Stewart wrote:
: Both types of display are in use today, but one day they will probably all
: be flat panel types with linear characteristics and extra circuitry to
: correct for gamma corrected signals. When this happens, we will have the
: odd situation that all television displays contain circuitry to undo
: pre-distortion that is applied to all video signals to compensate for a
: type of display that is no longer in use. We'll probably still call it
: "gamma correction" even though there will be nothing in the system with an
: innate gamma characteristic for which to correct!

I understood your point. The historical reason for gamma
correction is, as you say, to compensate for the non-linearity
of CRTs. What wasn't realised at the time is that it also has
the fortuitous property of allowing the use of many fewer bits
when digitised. So, certainly, when all displays are linear
we'll still use a non-linear transmission format, and we'll
still call it gamma, but it won't be an "odd situation" in that
there will still be a sound technical reason for doing so.
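The bit-saving property of gamma coding can be shown with a quick numerical sketch (toy figures: an 8-bit quantiser and a plain power-law curve of exponent 1/2.2, not any particular broadcast transfer standard):

```python
import numpy as np

# Quantise a dark-to-bright ramp of linear light to 8 bits, once
# directly and once through a gamma curve, then compare the worst
# relative error in the shadows, where banding is most visible.

linear = np.linspace(0.001, 1.0, 10000)

def through_8bit(x):
    """Round to the nearest of 256 levels and back."""
    return np.round(x * 255) / 255

lin_rec = through_8bit(linear)                      # linear quantisation
gam_rec = through_8bit(linear ** (1 / 2.2)) ** 2.2  # gamma-coded path

dark = linear < 0.05
err_lin = np.max(np.abs(lin_rec[dark] - linear[dark]) / linear[dark])
err_gam = np.max(np.abs(gam_rec[dark] - linear[dark]) / linear[dark])
print(err_lin > err_gam)  # True: gamma coding wastes far fewer codes
                          # on highlights, so shadows band much less
```

This is the "sound technical reason" in miniature: the eye's roughly logarithmic brightness response means equal code steps in the gamma domain are closer to equally visible steps.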

Richard.
http://www.rtrussell.co.uk/
To reply by email change 'news' to my forename.
#49
August 24th 05, 07:57 PM
JC (external usenet poster)

On Wed, 24 Aug 2005 09:33:47 +0100, Kennedy McEwen
wrote:

Umm no. Existing TV is 25 frames interlaced to 50 half fields. 720p is
50 full frames and would be displayed progressively as such. On an LCD
for example each pixel would only change when the information changed.

This is a true doubling of the existing frame rate and with a
progressive display can completely eliminate flicker.

Changing from 50Hz interlace to 50Hz progressive in itself makes
absolutely no difference to the level of flicker - the screen refresh is
still 50Hz in both cases.


I'm not sure how simple I have to make this but with a progressive
standard it's easy to refresh the screen as many or as few times as
you like without introducing additional artifacts. With interlace, any
change from the native refresh leads to artifacts. A progressive
broadcast on a modern pixel based screen can lead to the highest field
rate with no flicker or other artifacts.

Rgds
Jonathan

#50
August 24th 05, 08:27 PM
JC (external usenet poster)

On Wed, 24 Aug 2005 12:47:34 +0100, Roderick Stewart
wrote:

To display such a signal with the whole picture changing 50 times per second
would require twice the transmission bandwidth of the equivalent interlaced
signal, which is why interlace has been used by all broadcasters since the
invention of television.


In the analogue world yes. With the move to digital standards it does
require more bandwidth but not double.

Updating only one field's worth of the picture, or half the number of picture
lines, 50 times per second (instead of all of them half as often) is quite
sufficient to present the illusion of smooth movement as if the whole picture
were updated this often, whereas scanning the image "progressively" only
updates the picture information 25 times per second, similar to the update rate
of film, which is either 24 or 25 frames per second.


Wrong. 720p is 50 FULL FRAMES per second. That's the same refresh as
the equivalent 50 Hz interlace, but with TWICE the vertical
resolution.

In standard unadulterated television signals from television cameras the
picture information is slightly different between the two successive *fields*
that make up each *frame*, because the information is sampled at different
times, this being an inevitable consequence of the way scanned tube cameras


Exactly, which is why progressive display of interlace leads to a
reduction in visible resolution.

worked. Modern chip cameras *can* be made to integrate the light over 1/25th
second and then make two interlaced fields with identical pictorial
information, but the result looks intolerably jerky if you do this. If you then
also set the electronic shutter time to 1/50th second or less, it only looks as
jerky as film, because half the action is missing, but more jerky than if you
just left it alone.


Exactly, which is why there is a move to 1080p/720p 50 FRAME
progressive production and transmission, removing this problem and in
the process helping solve all the other problems that interlace
causes. A 1080p production can also be easily converted to interlaced
576i for legacy transmission.

Picture update rate is not the same as the brightness "flicker rate", which can
be anything you want it to be without requiring any change in the bandwidth
needed to transmit the signal or the storage capacity needed to record it.
Flicker only depends on how you design the display mechanism.


As I said above, changing the refresh rate on any interlaced system
will always be a compromise, as two half-FIELDS captured at different
times have to be merged in memory and effectively redisplayed multiple
times.

Imagine taking two separate photographs at half resolution, a short
time apart, of something that's moving, and then trying to merge them
to get double resolution. The moving object will be in a different
position in the second field from the first. Now imagine rapidly
flicking between these two images, or showing them merged, before
moving on to the next two.

With a progressive system you just reshow the same FRAME multiple
times or in the case of a memory type display such as LCD, just change
the pixels as required.
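That frame-repetition point amounts to almost no machinery at all (a trivial sketch; `refresh_sequence` is a made-up helper name):

```python
def refresh_sequence(frames, display_hz, source_hz=50):
    """Progressive source on a faster display: each source frame is
    simply shown display_hz/source_hz times. No time-shifted fields
    are merged, so no new artifacts are introduced."""
    repeat = display_hz // source_hz
    shown = []
    for f in frames:
        shown.extend([f] * repeat)
    return shown

print(refresh_sequence(["A", "B"], 100))  # ['A', 'A', 'B', 'B']
```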

You can eliminate flicker by displaying the image a thousand times a second or
using a method that illuminates it continuously, and you won't see any
variations in brightness, but if the picture information is only being updated
every 25th of a second, moving objects will move in 25 jerky little steps every
second instead of smoothly. 50 jerky little steps every second isn't perfect
either, but it looks a lot better and it's what we've had for the last 70
years, so it seems madness to throw this away for ever.


We're moving from 50 half fields per second (as you say, 25 FRAMES) to
50 full FRAMES per second. I'd hardly call a doubling of the FRAME
rate throwing something away.

Rgds
Jonathan

 








