Digital TV Banter

Digital TV Banter (https://www.digitaltvbanter.co.uk/forum.php)
-   uk.tech.digital-tv (Digital TV - General) (https://www.digitaltvbanter.co.uk/forumdisplay.php?f=4)
-   -   4k TV on Freesat or Freeview? (https://www.digitaltvbanter.co.uk/showthread.php?t=34439)

Roderick Stewart[_3_] August 6th 15 11:44 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 00:49:06 +0100, Phil Cook
wrote:

[re "sharpness" controls]
So, to the right-hand end it emphasises edges, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece of
wire from another box.
To the left, it goes the other way. It smooths out edges, it un-emphasizes
them.
i.e. Removes detail.

In analogue telly it was an HF tweak.
Putting gain in emphasised vertical edges. Attenuating HF smoothed them out.

And you're winding it all the way down to remove/smooth over the crap caused
by the DTT transmission chain.
As before, you removed the 'D' from 'HD'.


No, the sharpness slider is *adding* something to the received picture
across its whole range, except at 0 where you are seeing what is
broadcast.


That may be true of a particular TV set, but it is possible for such a
control to remove sharpness, i.e. make it less sharp than what is
broadcast, just as it is possible to make a treble control that makes
something sound more muffled than the original.

A treble control is a simpler situation in that it is just amplifying
high frequency components of what is already present in the signal by
a ratio that can be greater or less than unity, but the principle is
the same for sharpness controls, even though the "sharpness" signal is
artificially constructed*. It is possible either to add it to, or
subtract it from, the original, the latter making the picture look
less sharp than what is broadcast.

*In a TV aperture corrector, which does something similar to a
sharpness control, the usual way was to delay the signal twice and use
the signal after the first delay as the main output, the direct and
twice-delayed versions then being effectively copies of the original
ahead of, and behind it in time. Addition of these two signals created
a blurred version of the original (but centred on it, rather than
delayed as a simple HF filter would have done), and so subtracting
this blurred version from the original would produce an artificial
signal which was brightest where the amplitude of the original was
changing most rapidly, i.e. where there were edges between areas of
different brightness. In an aperture corrector, this sharpness signal
would normally be added in such a way as to increase the apparent
sharpness by adding bright edges, but it could just as easily be added
in the opposite polarity for effect.

In the vertical direction, for practical reasons the amount of delay
had to be whole lines (implemented by two 64 microsecond acoustic
delays carrying a 30MHz amplitude modulated signal), so the amount of
delay in the horizontal direction was chosen for a visual match.

When image processing software is used in a computer to apply
artificial sharpening to a blurred image, the technique sometimes goes
by the name of "unsharp masking", I suppose because it is generating
an "unsharp" (i.e. blurred) version of the original, and then taking
the difference between the unsharp version and the original to create
the edge enhancement signal, but despite a few extra adjustments being
available, it seems to be doing the same thing that television cameras
have been doing for decades.
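
For anyone who fancies playing with the idea, here is a minimal sketch
of the two-delay trick in Python/numpy - my own illustration of the
principle only, not the workings of any particular camera or TV:

    import numpy as np

    def aperture_correct(line, gain):
        # The average of the samples either side is a blurred copy centred
        # on the current sample (like the direct and twice-delayed signals
        # averaged around the once-delayed main output).
        padded = np.pad(line, 1, mode="edge")
        blurred = (padded[:-2] + padded[2:]) / 2.0
        # Original minus blurred peaks where brightness changes fastest,
        # i.e. at edges - the artificially constructed "sharpness" signal.
        detail = line - blurred
        # Positive gain adds bright edge enhancement; negative gain applies
        # it in the opposite polarity, softening the picture.
        return line + gain * detail

    line = np.array([10.0, 10, 10, 10, 80, 80, 80, 80])  # one hard edge
    print(aperture_correct(line, 1.0))   # overshoots at the edge = sharper
    print(aperture_correct(line, -1.0))  # edge smeared out = softer

Feed it a negative gain and you get exactly the "less sharp than what is
broadcast" case described above.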

Rod.

UnsteadyKen[_4_] August 6th 15 12:25 PM

4k TV on Freesat or Freeview?
 

In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, i.e. turning sharpness processing off does not reduce
the resolution and make the picture go all blurry, as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.

--
Ken O'Meara

R. Mark Clayton[_2_] August 6th 15 03:17 PM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:09:05 UTC+1, _Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message
o.uk...
SNIP


It's 2015 - I don't have an interlaced display any more!

De-interlace and scale is what my (and I guess most people's) TV does to
25i. I could let it do it or I can do it in s/w myself.


How do you know this?
Do you have the source code from the manufacturer?

You don't need it.

For a CRT, interlacing relies on the persistence of the phosphor, so alternate lines are drawn in each field (50 fields/s for 576i in the EU, 60 in the US). The primary reason for doing this is to reduce the flicker that would be very obvious if the whole frame were drawn every time (so 25fps in the EU).

Later CRT TVs would remember the contents of every line and redraw the whole screen every frame time (and SECAM sets may have had this feature longer). CRT monitors topped out around the mid noughties.

More recent flat screen panels rely on a different method. Basically a pixel will stay in a particular state until it is told to do something different.

Andy Furniss[_3_] August 6th 15 10:43 PM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message


De-interlace and scale is what my (and I guess most people's) TV
does to 25i. I could let it do it or I can do it in s/w myself.


How do you know this? Do you have the source code from the
manufacturer?


No, but that doesn't mean they don't.

There must be many chips sold for the purpose (I know they also do more
complicated processing as well).

I do know that my TV de-interlaces as I can test it with a computer.

Manufacturers spend a lot of time and effort over their kit before
they put it to market. Ok, so there is the occasional lemon model,
but on the whole, most kit does what it says on the tin.

Just because an OLED screen only came out of the factory in January
2015 does not mean it cannot interpret an interlaced scan as-is.
Without the source code, for all we know (when watching 1080i) it may
well actually only update all the odd lines in one pass, and then all
the even in the next.


Should be easy enough to take a pic to prove/disprove. Many TV reviews
seem to test "the deinterlacer" so I assume many TVs don't work by
simulating a CRT.

In which case, leaving source interlaced stuff as interlaced IS the
best thing to do.

Let the equipment decide what to do with it.


Oh I can and do let my TV do its thing - observing the quality of its
de-interlacing and scaling also lets me say that I can equal/beat it
with my own processing. It's not a top-end TV, but not budget either; it
got good reviews.

As previously, SMPTE didn't make up their standards for fun. And I'm
pretty sure the manufacturers didn't toss a coin when writing the
code to decide how it should handle different display modes.

And in 'scaling' you're again compromising your own pictures. Once
you scale, you've ruined your copy for good.


Well someone/something has got to scale to get SD on an HD panel.

Who said anything about changing the master copy - I can choose a different
scaler, e.g. Lanczos, at display time thanks to open source geeks and OpenGL.

I could also use ffmpeg. I can deinterlace on the fly, but I think a
motion compensated de-int that runs at 0.1 fps will beat it. I have the
choice to do whatever I want.

i.e. You resample all the colour spacing, you contrive any
representation.

Again, leave it as you got it, let the screen decide what to do with
it.


Or do better....

Leap forward fifteen years. Screens will still have backward
compatibility, and will be capable of ridiculous resolutions....
but at that point in time the manufacturers will have better means of
making older formats work well on their screens.


True, and some of the methods they will do in realtime are likely already
proposed in existing papers, people's Masters/PhDs. It's just that they
are currently far too slow.

Although I notice certain 'arty types' in sports production
actually add a 'film effect' to some items. Apparently it's 'art'.
....with a capital F, methinks!


Not filmic, that would imply deint to 25p (eww). TV/me would of
course do 50p.


No. Because it still leaves Tx at 50i. You're not going to change the
whole transmission chain for one VT package, are you?


It was you that mentioned filmic, which I assumed meant 25p - I can't
recall suggesting changing Tx!

FWIW on FreeviewHD 25p is flagged as progressive.


The park run samples I pointed to are source quality, aren't they?


TL;DR! But I don't think that's relevant anyway. I'm comparing source
SD to DTT 'HD'. Lab test vs real world.


Well I've got sport recordings that are 10mbit and so it's not like they
never go that high. I must admit that Park Run at 5mbit is horrible, but
at 10 it's OK.

Whatever I try I can't get the raw 576i to look as good as a
10mbit encode of the 1080i.


But without the source HD material, how do you know what
detail/definition has been lost to compare?


I do have the HD source - that's what the 10mbit x264 was made from!

Comparing HD to HD wasn't the point, of course 10mbit is not as good as
the raw - but it's way better than the raw SD.

Generally compression occurs by smoothing edges (losing detail), and
then finding repeat patterns in a frame (which buggers up captions).


Different cameras of course (same lens), detail -

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_exports/README.txt


Please do not misunderstand this and interpret this as me saying
SD > HD. Simply not the case. It's once a picture has gone through
the terrestrial transmission chain.


I don't see why my 10mbit x264 encode should beat pro kit -
assuming of course they would give that much bitrate to similar
content.



Because it's compressed to ****e by Arqiva! Because the broadcasters
don't want to pay any more £s!


I thought the BBC coded the main HD mux and Arqiva did COM 7/8.


Again, why do you need to de-interlace?? Why are you intent on
de-interlacing? When SMPTE created these new standards they
added interlaced for distinct reasons. i.e. They didn't do it for
a laugh, or while they were down the bierkeller!


Reluctantly to halve the bandwidth while giving decent temporal
res.


But you won't be halving the data rate.


I meant that the reason interlaced still survives and gets standards
despite e.g. the EBU trying to get rid of it for HD is because it's half
the data rate compared to 50p.

https://www.ebu.ch/en/technical/trev...editorial.html

I know it's the other end of the spectrum, but the maths is still the
same (or "math", if you're "murican"):

1080i50 (1080 lines interlaced, 25fps, alternate lines refreshed at 50
fields/s) = 1.5Gb/s, because you are sending 1080/2 = 540 lines 50
times a second.

1080p50 (1080 progressive lines, refreshed at 50fps) = 3Gb/s, because
you are sending all 1080 lines 50 times a second.
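
(For anyone checking those figures against the real serial rates, they
fall straight out of the full raster. A rough sketch in Python, assuming
the usual 4:2:2 10-bit sampling and the 2640 x 1125 total raster of the
25Hz HD systems:

    # Full raster including blanking; 4:2:2 10-bit means 20 bits per pixel
    # (10-bit luma plus 10-bit alternating colour-difference).
    SAMPLES_PER_LINE = 2640   # total samples per line, 25Hz systems
    LINES_PER_FRAME = 1125    # total lines including vertical blanking
    BITS_PER_PIXEL = 20

    def rate_gbps(frames_per_second):
        return (SAMPLES_PER_LINE * LINES_PER_FRAME * BITS_PER_PIXEL
                * frames_per_second / 1e9)

    print(rate_gbps(25))  # 1080i50: 25 full frames/s as 50 fields -> 1.485
    print(rate_gbps(50))  # 1080p50: 50 full frames/s -> 2.97

which gives the nominal 1.5Gb/s and 3Gb/s.)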


I can do sums.

So, unless you're converting 1080i50 (off air telly) to 1080p25, you
are doubling your data rate by converting to 1080p50.


I am fully aware of that.
....and ruining the source by deinterlacing it needlessly.


Or doing as good as/better than the realtime deint my TV does.

And it's precisely the same if you're doing this with SD off air.
Converting 576i to 576p doubles your data rate.


I know that - just trying to get raw SD at its best to compare with the
10mbit HD - not as a general policy for viewing - as I said I let my TV
deint that.


BUT.... you may well notice the occasional f.ckup with 'regional
news'. ENG have gone out and shot something, usually out of focus,
and recorded with more audio distortion than a Foo Fighters concert!
Content goes back to the edit, and the clown at the keyboard, who has
a non-linear (computer) edit in front of him, hasn't bothered to check
it on a proper telly. i.e. They've only watched it on their computer
screen. When it goes to air any horizontal motion shivers....
because they've got the field dominance the wrong way round!


Well they should really have just done a quick check with yadif=1 :-)
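
With ffmpeg that check needs nothing exotic - something along the lines
of

    ffmpeg -i package.mxf -vf yadif=1 check.mp4

(file names invented for illustration). yadif=1 outputs one frame per
field, so reversed field dominance shows up straight away as motion
stepping backwards and forwards.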

I mean surely most people now see interlaced on a progressive
display = it's de-interlaced. If I put my Panny plasma into an
interlaced mode it de-interlaces (motion adaptively). It doesn't
become an interlaced display. I can de-interlace in s/w to achieve
the same effect on a dumber display (my 1920x1080 computer
monitor).


As per the first point above, without the manufacturer's source code,
how do you know that, just because your telly only fell off Dixons'
shelf yesterday, it doesn't still update the display in an interlaced
fashion?


Answered above for me, but it's unlikely even if there are TVs that
simulate CRTs they would do it for 576i on an HD panel.


And in encoding to 50p you double the required data rate.


True - but then weaved frames are also "extra" complicated so I
don't


No, they're not. It's just half the lines, with a flag.


I think in practice mpeg2 and h264 encoders do full weaved frames rather
than fields - but anyway it was just me thinking out loud about how to
compare x265 with x264.

It seems currently ffmpeg doesn't re-weave the output of its hevc
decoder. x265 also warns that interlace support is experimental if you
try to use it.


Better HD yes - I am still not convinced it's quite as bad as SD,
though - maybe I don't watch enough TV (usually motorsport) -
perhaps park run is misleading (I obviously don't have access to
much else to compare), but to me 10mbit x264 HD wins over raw SD
for that.



A 'lab test' is not as good as a 'real world' test. Our real world
test is FTA Freeview HD. Compared to source HD, you're all being
ripped off severely!


Well yes, but I am comparing for the claim that it's worse than raw SD
so if anybody ever broadcasts park run I will record it and see real
world rather than my "lab" :-)

UHD will be an even bigger rip off, with even more detail lost!


Still no update on BT Wholesale connections WRT BT UHD.

Given that their HD offerings were 7.5mbit 1440 or "premium" 1920 at
10mbit it will be interesting to see what their UHD is.

Andy Furniss[_3_] August 6th 15 11:03 PM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:

So you're turning the detail down to zero, removing picture
definition. It's a safe bet your telly makes Mary Berry look twenty
years younger!


As discussed in the past on here or utb - being kind to people on TV can
be deliberate also (soft focus/avoiding too close).

.....because in removing detail, you're removing definition. You're
applying a real-time Photoshop!


I can see single line b/w/r/g/b/y/m/c pattern detail on my TV with
sharpness full down = it does not mean a blur filter is somehow engaged
for me, and putting sharpness down is often recommended in review/test
articles.

So, what is the point of HD for you?? You are removing the 'D' from
it.



Paul Ratcliffe August 7th 15 10:21 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 13:25:21 +0100, UnsteadyKen
wrote:

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, i.e. turning sharpness processing off does not reduce
the resolution and make the picture go all blurry, as you claimed.


It certainly did on the old tube cameras. Turning contours off made
the picture as soggy as anything and essentially unusable. I think we
used to do that during registration line-up, but it has been rather a
long time since then...

Dave W August 7th 15 10:55 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 13:25:21 +0100, UnsteadyKen
wrote:


In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, i.e. turning sharpness processing off does not reduce
the resolution and make the picture go all blurry, as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.


I congratulate you for giving a splendid link demonstrating the
effect, and winning victory in my eyes.
--
Dave W

R. Mark Clayton[_2_] August 9th 15 11:28 AM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:02:39 UTC+1, Andy Furniss wrote:
R. Mark Clayton wrote:

FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


Yes, but the source bitrate is 8x as you have to account for current HD
only being 25 fps or 50 fields per sec. UHD doesn't use interlacing so
50fps doubles the source bandwidth on top of the res increase. This
means for sport that the vertical res increase is (more than?) 4 times
HD. The "more than" may be debatable - but I think interlaced gets extra
filtering to prevent interline twitter.
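
To put numbers on the 8x - active pixels only, ignoring blanking and bit
depth, a back-of-envelope sketch:

    # Uncompressed source pixel rates, active picture only
    hd_1080i = 1920 * 1080 * 25    # 25 full frames/s, sent as 50 fields
    uhd_2160p = 3840 * 2160 * 50   # four times the pixels, twice the rate
    print(uhd_2160p / hd_1080i)    # -> 8.0, i.e. 8x the source data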


You are still thinking about building a rasterised image with the picture built up in [alternate] lines every [other] frame time.

More recent methods send the full frame every so often and the changes every frame time. This works great for static images or for video where things in the view change, but can generate artefacts when the camera pans or zooms.

R. Mark Clayton[_2_] August 9th 15 11:41 AM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:32:06 UTC+1, _Unknown_Freelancer_ wrote:
Snip


As it happens we only bought a [small] Freeview HD TV (for the kitchen)
relatively recently. The picture on the HD channels is noticeably better
than SD.



'SMALL'
Pictures ALWAYS look better on a small screen.
....like (in the old days) telly was always a better picture when you were at
the caravan
....because the screen is smaller, all the mistakes and artifacts are
smaller, much harder to notice.

iPlayer looks fantastic on my phone... with its 3" screen!

You could have a small telly for your living room, so you don't notice the
crap.
.....but it's smaller! No-one would do that.


Not quite correct. We have a 42" screen and view it from about 3.5m; the picture is good AND you can easily tell HD from SD as there is a lot more detail. Viewed from our dining area (about 7.5m away) you can't, because the eye can't resolve the image sufficiently.



Our main TV is Freesat and the smearing etc. that you mention does not
occur very much on that, however BT TV (over the internet) of a football
match suffered in exactly the way you describe.


Freesat has far more bandwidth than Freeview.
Therefore it is not subject to ridiculous compression, resulting in VHS
quality.




FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


I wrote the nugget above in reply to RMC's post on 3rd August:
"UHD is certainly no more than four times the bandwidth of 1080i, which is
broadcast in the teens of Mbps (SD usually less than half that)."

i.e. He didn't understand my point that 4K required so much bandwidth.

Correct, it is four times the resolution.
HD x 2 in each direction.

The SMPTE solution was to carve the picture into quarters.
So a 4K video source has four video outputs. Each 1080p.
1080p requires 3Gb/s

Four images at 3Gb/s = 12Gb/s data rate for one 4K image.
Eight times the data rate for 1080i50 HD.
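
As a sanity check, using the nominal per-link rates from earlier in the
thread (2.97Gb/s for 1080p50, 1.485Gb/s for 1080i50):

    quad_link = 4 * 2.97      # four 1080p50 quarters on 3G-SDI, Gb/s
    print(quad_link)          # -> 11.88, the nominal "12Gb/s"
    print(quad_link / 1.485)  # -> 8.0 times the 1080i50 rate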


I missed the higher frame rate. Compression over a frame four times the size should perform slightly better than compression over four frames; however, this assumes the same algorithm.



And you want to compress 12Gb/s down to teens of Mb/s ????
Just what do you think will happen to your detail? ....to your picture
definition?
You think it will be 'Ultra High' do you?


Well I am sitting in front of a 4k monitor running at 60Hz. The signal
comes down one DisplayPort cable and IIRC it can manage 4k @ 30Hz on HDMI.


Ah, Mac.


Mac - spit!

No, a new PC with AMD A8-7600, Asrock MB & Iiyama B2888UHSU display.

I have a universal solution to all Apple problems:
www.wickes.co.uk/p/139250

DisplayPort (allegedly) has a 32Gb/s bandwidth
https://en.wikipedia.org/wiki/DisplayPort


Only got DP 1.2, which gives max 17Gbps (for true 4k). Actually using ~16Gbps for 10bit colour on 4k@60Hz.
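
Those figures roughly check out - a back-of-envelope sketch, assuming
DP 1.2's four HBR2 lanes at 5.4Gb/s with 8b/10b line coding and 30 bits
per pixel for 10-bit RGB:

    raw = 4 * 5.4                 # DP 1.2 HBR2: four lanes at 5.4 Gb/s
    payload = raw * 8 / 10        # 8b/10b coding -> 17.28 Gb/s usable
    pixels = 3840 * 2160 * 60     # 4k at 60Hz
    video = pixels * 30 / 1e9     # ~14.9 Gb/s before blanking
    print(payload, video)         # blanking pushes video to roughly 16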

_Unknown_Freelancer_ August 9th 15 01:38 PM

4k TV on Freesat or Freeview?
 
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, i.e. turning sharpness processing off does not reduce
the resolution and make the picture go all blurry, as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.

--
Ken O'Meara


Ken,

Apologies for the delay in my response. I had to go to work. 39 hours in
three days + 300 miles.

If you wish to rest easy in a smug oblivious shallow victory, then please,
enjoy it.
Get yourself a certificate printed so you can hang it on the back of
the toilet door too.

But really, I've worked in and around live TV for n decades*. I've taken
verbal abuse from the Keys, Grey, Shrieves, Grey's brother and others too
insignificant to mention. As a result I've got thicker skin than an
armadillo. You really think I care??

* where n is an integer between 1 and 8.


BTT, if I must legitimise your 'victory', here is why I wrote what I wrote:
In the beginning I wrote that your telly must make Mary Berry appear 20
years younger, because you turn your sharpness setting all the way down.

To which your reply was:
Oh give over, you have it totally arse about face, a setting of zero on most
TV sets turns the Photo Shoppish artificial edge enhancement processing off.

This _appears_ to be you saying that the sharpness control does not then
become an 'unsharp' control when turned 'fully down'; that because it
starts at zero, it cannot be 'unsharpening' the image.


To which I started to "blather on about sliders in a desperate attempt to
cover my arse".
When in my mind, I was attempting to bring a 'virtual control' (your telly's
sharpness metric) into the physical world, by comparing it to something
many have encountered in the past on analogue equipment.

Which is why I used the example of three tellys, all with identical
internals, but with different markings on the exterior by the sharpness
control.

The point being that no matter what scale is displayed by the control, be it
0 to 10, 1 to 11, or -5 to +5, the effect is exactly the same on the inside
of the box.
Therefore, the numbers shown by the control relate to absolutely nothing on
the inside of the box.
They do not actually measure anything, i.e. gain/attenuation, filter input
value, flux capacitor voltage, etc.

Thus, just because your telly's sharpness control starts at zero, this does
not necessarily mean that there is no 'unsharpening' being applied to your
pictures.
i.e. The zero on your settings display is just a number. It is just a means
for non-technical types to interpret how much of an effect has been applied.

So raise a glass; victory, it appears (to some), is yours, Ken.
https://www.google.ca/search?site=&s...tificate+maker




