Digital TV Banter (https://www.digitaltvbanter.co.uk/forum.php)
-   uk.tech.digital-tv (Digital TV - General) (https://www.digitaltvbanter.co.uk/forumdisplay.php?f=4)
-   -   4k TV on Freesat or Freeview? (https://www.digitaltvbanter.co.uk/showthread.php?t=34439)

Michael Chare[_5_] August 2nd 15 11:22 AM

4k TV on Freesat or Freeview?
 
4K TVs are becoming available at more reasonable prices, and there are
some 4K internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to 4K?

--
Michael Chare

David[_9_] August 2nd 15 12:28 PM

4k TV on Freesat or Freeview?
 
Hi
I thought 4K TV was being supplied via the internet.

Regards
David





"Michael Chare" wrote in message ...

4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to 4K?

--
Michael Chare

Angus Robertson - Magenta Systems Ltd August 2nd 15 01:06 PM

4k TV on Freesat or Freeview?
 
So when, if ever, will any of the Freesat or Freeview channels
convert to 4K?


Sky is apparently launching a new 4K and/or UHD compatible SkyQ box later
this year, and has plenty of satellite bandwidth to cope with the higher (up
to four times HD) demands of UHD (remember how the BBC and Sky conjured up 20
or more new HD channels for the last Olympics).

The BBC did UHD test transmissions using DVB-T2 from three main
transmitters last year, but has been giving up its satellite transponders
to save money, so seems unlikely to want to offer free-to-air UHD in the
near future.

So I'd guess the only free-to-air UHD will be state-subsidised European and
Arabic broadcasters, and a German shopping channel.

Angus



the dog from that film you saw August 2nd 15 02:56 PM

4k TV on Freesat or Freeview?
 
On 02/08/2015 14:06, Angus Robertson - Magenta Systems Ltd wrote:
So when, if ever, will any of the Freesat or Freeview channels
convert to 4K?


Sky is apparently launching a new 4K and/or UHD compatible SkyQ box later
this year and has plenty of satellite bandwidth to cope with the higher (up
four times HD) demands of UHD (remember how the BBC and Sky conjured up 20
or more new HD channels for the last Olympics).

The BBC did UHD test transmissions using DVB-T2 from three main
transmitters last year, but has been giving up it's satellite transponders
to save money, so seems unlikely to want to offer free to air UHD in the
near future.

So I'd guess the only free to air UHD will be state subsidised European and
Arabic broadcasters, and a German shopping channel.

Angus





Worth waiting before buying a TV though.
If Sky uses 120fps for their broadcasts as rumoured, there's probably
nothing on sale now that will show them at their best.

--
Gareth.
That fly.... Is your magic wand.

Michael Chare[_5_] August 2nd 15 07:12 PM

4k TV on Freesat or Freeview?
 
On 02/08/2015 15:56, the dog from that film you saw wrote:

worth waiting before buying a tv though.
if sky uses 120fps for their broadcasts as rumoured there's probably
nothing on sale now that will show them at their best.


Yes, I think it is necessary to wait a couple of years. From what I have
read, some of the present TVs blur moving pictures.

I would rather receive by satellite than by broadband, as there are
fewer contention problems.


--
Michael Chare

R. Mark Clayton[_2_] August 2nd 15 08:18 PM

4k TV on Freesat or Freeview?
 
On Sunday, 2 August 2015 20:12:17 UTC+1, Michael Chare wrote:
On 02/08/2015 15:56, the dog from that film you saw wrote:

worth waiting before buying a tv though.
if sky uses 120fps for their broadcasts as rumoured there's probably
nothing on sale now that will show them at their best.


Yes, I think it is necessary to wait a couple of years. From what I have
read, some of the present TVs blur motion pictures.

I would rather receive by satellite rather than broadband as there are
less contention problems.


--
Michael Chare


Wait if you can, but if your TV breaks down, then the additional cost of 4K will provide more future-proofing.

_Unknown_Freelancer_ August 3rd 15 11:07 AM

4k TV on Freesat or Freeview?
 
"Michael Chare" wrote in message
...
4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Twelve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that it's transmittable over Freeview, you
would have to dispense with so much information that you would render 'UHD'
pointless.
i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either using several channel
spaces over satellite, or proper* broadband.
* meaning something better than BT's standard twisted-pair phone line. Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.
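
As a rough sanity check on those figures, a sketch only: it assumes 10-bit
4:2:2 sampling (20 bits per pixel of active picture), and the nominal SDI
link rates in brackets (1.485 Gb/s HD-SDI, 2.97 Gb/s 3G-SDI, quad-link 3G
for 4K) are the standard numbers; blanking and ancillary data make up the
difference.

    # Rough uncompressed bitrate arithmetic (illustrative only).
    def active_gbps(width, height, fps, bits_per_pixel=20):
        # Active picture only; the SDI link also carries blanking/ancillary data.
        return width * height * fps * bits_per_pixel / 1e9

    print(f"1080i50 ~{active_gbps(1920, 1080, 25):.2f} Gb/s (HD-SDI link: 1.485 Gb/s)")
    print(f"1080p50 ~{active_gbps(1920, 1080, 50):.2f} Gb/s (3G-SDI link: 2.97 Gb/s)")
    print(f"2160p50 ~{active_gbps(3840, 2160, 50):.2f} Gb/s (quad 3G-SDI: 4 x 2.97 = 11.88 Gb/s)")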



R. Mark Clayton[_2_] August 3rd 15 01:51 PM

4k TV on Freesat or Freeview?
 
On Monday, 3 August 2015 12:07:38 UTC+1, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Tweleve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that its transmittable over freeview, you
would have to dispense with so much information that you would render 'UHD'
pointless.
i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either usings serveral channel
spaces over satellite, or proper* broadband.
* meaning something better than BTs standard twisted pair phone line. Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.


UHD is certainly no more than four times the bandwidth of 1080i, which is broadcast in the teens of Mbps (SD usually less than half that).

With more sophisticated compression BT expect to be able to get UHD down a 30-40Mbps pipe, which curiously enough is what you normally get from FTTC. This will also fit onto a single satellite transponder. These currently carry up to ten SD channels or two to three HD ones.

Different argument for Freeview, although Freeview HD looks good.
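
A quick back-of-envelope check of those figures, using only the numbers
quoted above (the per-channel HD rate is an assumed midpoint of "teens of
Mbps", not a measured value):

    # Back-of-envelope check of the figures above (all approximate).
    uncompressed_uhd_gbps = 12.0      # quad 3G-SDI source rate quoted earlier in the thread
    for target_mbps in (30, 40):
        ratio = uncompressed_uhd_gbps * 1000 / target_mbps
        print(f"{target_mbps} Mbps UHD implies roughly {ratio:.0f}:1 compression")

    hd_mbps = 15                      # assumed midpoint of "teens of Mbps" per HD channel
    print(f"2-3 HD channels x ~{hd_mbps} Mbps = {2 * hd_mbps}-{3 * hd_mbps} Mbps per transponder,")
    print("so a single 30-40 Mbps UHD service would take most of one transponder.")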

Roderick Stewart[_3_] August 3rd 15 03:30 PM

4k TV on Freesat or Freeview?
 
On Mon, 3 Aug 2015 06:51:47 -0700 (PDT), "R. Mark Clayton"
wrote:

With more sophisticated compression BT expect to be able
to get UHD down a 30-40Mbps pipe, which curiously enough
is what you normally get from FTTC.


Your version of "normally" might not apply to all FTTC users.

Rod.

_Unknown_Freelancer_ August 3rd 15 06:22 PM

4k TV on Freesat or Freeview?
 
"R. Mark Clayton" wrote in message
...
On Monday, 3 August 2015 12:07:38 UTC+1, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Tweleve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that its transmittable over freeview, you
would have to dispense with so much information that you would render
'UHD'
pointless.
i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either usings serveral
channel
spaces over satellite, or proper* broadband.
* meaning something better than BTs standard twisted pair phone line.
Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.


UHD is certainly no more than four times the bandwidth of 1080i, which is
broadcast in the teens of Mbps (SD usually less than half that).

With more sophisticated compression BT expect to be able to get UHD down a
30-40Mbps pipe, which curiously enough is what you normally get from FTTC.
This will also fit onto a single satellite transponder. These currently
carry up to ten SD channels or two to three HD ones.

Different argument for Freeview, although Freeview HD looks good.







"Freeview HD looks good."

"...looks good" ????



The war, my lord, is lost!

Freeview HD is.... (sorry, I've run out of expletives) DOG ****E!

Next time there's a field based sport on telly, watch it.
Instead of watching 'the action', watch the detail in the grass when the
camera moves around.

The grass turns to one big green VHS* quality mush. All the detail in the
grass disappears.
When I say VHS quality, I mean the tape you left in the deck for that
'emergency crash in to record' occasion!

The detail only returns to the grass when the camera stops moving.
.....once the mpg encoding has enough bandwidth to restore picture detail.


Next, take any 'shiny floor show' on ITV.... well, just about any programme
on ITV which isn't a vacuous soap or period drama.
Wait for the confetti drop at the end of the programme.... And just like
Cillit Bang, the detail is gone!
.....too many different items moving in too many different directions for mpg
encoding to be able to maintain any reasonable level of detail, resulting in
a picture resembling a YouTube video in 2005!

Any captions or graphics, any channel. They should all have distinct sharp
edges.
Look again, and you'll see a border around everything, about 1 to 3 pixels
wide.
.....because the transmitted mpg stream doesn't have enough bandwidth to
detail the edges of graphics.

"HD" is not HD.
Yes, it's 'High', lots of pixels and vertical lines, but no 'Definition'.

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD (at
source) ever was.

Thus, the 'digital TV' rip-off.
Freeview HD does not look good.




FWIW, UHD IS more than four times the bandwidth.

A 4K video source produces four 3Gb/s streams.
......that was the method SMPTE came up with.

A 4K camera has four video output BNCs, each containing a 3Gb/s stream.
4 x 3 = 12Gb/s

As opposed to a 1080 HD camera, which produces one 1.5Gb/s stream from one
BNC socket.

And you want to compress 12Gb/s down to teens of Mb/s ????
Just what do you think will happen to your detail? ....to your picture
definition?
You think it will be 'Ultra High' do you?


"....better compression....blah"
Its called HEVC, or h.265.
.....Which is also ********!

All they did was take h.264 and make the macro blocks bigger, and tweak
colour spacing.
i.e. the size of the squares the encoder carves the picture in to.

So when you get a 4K ITV shiny floor show (HA HA HA HAAAAAAAA! Shortly
after hell freezes over), the size of the squares the confetti drop causes
will be four times bigger.
i.e. it will still be dog ****e!


On the plus side, the UHD pictures you see will by that point be source HD
quality at least!
i.e. You will see at home what we presently see at source in HD.


At present, if a football match is shot in 4K, it is possible to make out
the detail in people's faces _in the opposite stand_.
It is actually possible to lip-read whilst the camera is sat on a wide shot,
such is the level of detail in 4K.
If you compress it..... well, what's the point?



FTR, I live right next to a major BT exchange.
There isn't even a cabinet between me and the exchange.
I get 13Mb/s!! (twisted-pair phone line)
As I said previously, to get decent bandwidth to the home requires either
fibre or co-ax.
......not a lot of that about, and in some areas, it just ain't gonna happen.



Michael Chare[_5_] August 3rd 15 06:47 PM

4k TV on Freesat or Freeview?
 
On 03/08/2015 19:22, _Unknown_Freelancer_ wrote:


FTR, I live right next to a major BT exchange.
There isnt even a cabinet between me and the exchange.
I get 13Mb/s!! (twisted pair phone line)
As I said previously, to get decent bandwidth to the home requires either
fibre or co-ax.
.....not a lot of that about, and in some areas, just aint gonna happen.


As far as home connections are concerned, unfortunately BT are
determined to use early 20th-century technology. The Internet speed to
my house by metal phone line is about a tenth of what you have.
Fortunately I have a real fibre connection (not what BT call fibre)
which runs 10 times faster than yours.


--
Michael Chare

Paul Ratcliffe August 3rd 15 07:12 PM

4k TV on Freesat or Freeview?
 
On Mon, 3 Aug 2015 19:22:14 +0100, _Unknown_Freelancer_ /dev/null wrote:

The war, my lord, is lost!

Freeview HD is.... (sorry, Ive run out of expletives) DOG ****E!


Yes, Camera 1 out of the scanner/studio is a different world to what
comes back off air. Always was, whatever the system.

FTR, I live right next to a major BT exchange.
There isnt even a cabinet between me and the exchange.
I get 13Mb/s!! (twisted pair phone line)


Your cable is faulty then. On one I monitor, it is currently syncing
at 19.4Mb/s and the line is almost exactly a mile long.

_Unknown_Freelancer_ August 3rd 15 09:50 PM

4k TV on Freesat or Freeview?
 
"Paul Ratcliffe" wrote in message
...
On Mon, 3 Aug 2015 19:22:14 +0100, _Unknown_Freelancer_ /dev/null wrote:

The war, my lord, is lost!

Freeview HD is.... (sorry, Ive run out of expletives) DOG ****E!


Yes, Camera 1 out of the scanner/studio is a different world to what
comes back off air. Always was, whatever the system.



I know.
But the point I was making is that so-called 'HD' is not what it says on the
tin, because RMC said "Freeview HD looks good" in the previous post.
It's barely SD.
It's VHS. S-VHS if you must.

The digital revolution/HD upgrade was supposed to improve things.
Well, it should have, if the bean counters hadn't reined in the bit rates
the broadcasters use.
.....and this is because a private company (Arqiva) now run our terrestrial
networks.... who want pennies per bit.

I never expect the off-air signal to be 'grade 1'.
But I do expect it to be far superior to what I recorded on magnetic tape
in 1995.
Instead we get YouTube 2005 quality.... a new kind of ****e.



FTR, I live right next to a major BT exchange.
There isnt even a cabinet between me and the exchange.
I get 13Mb/s!! (twisted pair phone line)


Your cable is faulty then. On one I monitor, it is currently syncing
at 19.4Mb/s and the line is almost exactly a mile long.


No, it's not faulty.
It's been tested. No faults.
It's 20 years old, so it will not be replaced.
Virgin do not have any equipment in the exchange. So no other 'strong
contenders'.
Neither Virgin nor BT will lay fibre or co-ax into my street. (Already
asked both.)
So my neighbours and I are stuck with it.




Andy Furniss[_3_] August 4th 15 10:21 AM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD
(at source) ever was.


I agree that the bitrates are too low, but on something like park-run
from below, a 10 Mbit h264 still looks better than a de-interlaced,
scaled-up raw 576i. I could try harder WRT deint/scale quality - but I
doubt that would really help. Maybe the source is not up to modern
standards? Those samples do seem to get a lot of "research" done on them
- but then I am just an idly/lazily half-interested amateur.

Of course watching the raw 1080i25 is even nicer - but it still needs
de-interlacing and that + the shutter speed still means the grass/snow
in the foreground is blurred during the pan.

At least UHD will avoid the de-interlace, and in the future 120fps will help
with the shutter-speed blur.

I bet 10mbit hevc would look better than the same rate 264 on parkrun -
but I would have to cheat as I don't think free lib/ffmpeg will code
interlaced yet. I know hevc just does fields anyway - maybe deint first
to 50p wouldn't be cheating too much, but it's currently so slow I could
never play it at full speed on my old PC.

ftp://vqeg.its.bldrdoc.gov/HDTV/

for raw video + details of.
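
(For anyone wanting to repeat that comparison, a minimal sketch with
ffmpeg/x264 follows. The file names, frame sizes and pixel format are
placeholders for whichever of the raw exports above you actually download,
and the 10 Mbit figure matches the test described here rather than any
broadcast setting.)

    # Sketch only: compare an upscaled raw 576i with a ~10 Mbit H.264 1080i encode.
    import subprocess

    def run(args):
        print(" ".join(args))
        subprocess.run(args, check=True)

    # 1) De-interlace (yadif, one frame per field -> 50p) and upscale the raw SD clip.
    run(["ffmpeg", "-f", "rawvideo", "-pix_fmt", "yuv420p", "-s", "720x576", "-r", "25",
         "-i", "parkrun_576i25.yuv",                  # placeholder file name
         "-vf", "yadif=1,scale=1920:1080",
         "-c:v", "libx264", "-qp", "0",               # near-lossless reference of the upscale
         "sd_upscaled.mkv"])

    # 2) Encode the raw HD clip at ~10 Mbit/s H.264, kept interlaced.
    run(["ffmpeg", "-f", "rawvideo", "-pix_fmt", "yuv420p", "-s", "1920x1080", "-r", "25",
         "-i", "parkrun_1080i25.yuv",                 # placeholder file name
         "-flags", "+ildct+ilme",                     # interlaced DCT / motion estimation
         "-c:v", "libx264", "-b:v", "10M",
         "hd_10mbit.mkv"])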




Andy Furniss[_3_] August 4th 15 10:50 AM

4k TV on Freesat or Freeview?
 
Andy Furniss wrote:

ftp://vqeg.its.bldrdoc.gov/HDTV/

for raw video + details of.


To be more specific I am looking at the 576i25 and 1080i25 park run
(converted to 420) - though watching 422 makes no difference, it
wouldn't be a fair test as I had to convert to 420 to make a
representative h264 anyway.

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_...ekas_Exports_/



_Unknown_Freelancer_ August 4th 15 07:42 PM

4k TV on Freesat or Freeview?
 
Sorry, was a wee bit busy earlier.


"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD
(at source) ever was.


I agree that the bitrates are too low, but on something like park-run
from below, a 10 mbit h264 is still looks better than a de-interlaced
scaled up raw 576i. I could try harder WRT deint/scale quality - but I
doubt that would really help. Maybe the source is not up to modern
standards? Those samples do seem to get a lot of "research" done on them
- but then I am just an idly/lazily half interested amateur.



Why would you de-interlace 576i? At all?
If you want to intentionally make PAL look bad, you de-interlace it.

Although I notice certain 'arty types' in sports production actually add a
'film effect' to some items.
Apparently it's 'art'.
.....with a capital F, methinks!


The key to interlacing's success was that it refreshed the screen (albeit at
half resolution) fifty times a second*.
The same 'higher refresh rate' rule applies today. It gives clarity in any
fast motion, such as sports.

*FWIW, the half-resolution, higher refresh rate was chosen at the time
because they couldn't get phosphor to glow long enough for a progressive scan
without producing flicker.

Source-quality SD contains more detail than present DTT HD.

Please do not misunderstand this and interpret this as me saying SD > HD.
Simply not the case.
It's once a picture has gone through the terrestrial transmission chain.

A few years ago, someone at ITV49 had a great idea to make a 'reality'
programme interactive by placing a QR code in the top right of the screen.
It worked perfectly in the edit. The website and chat rooms were all set up.
Then the programme went to air..... no-one's phone could interpret the QR
code.... because it had been compressed to a grey blur!

Some muppet just had not thought that one through, and it's never been seen
since!

And this was something that wasn't even moving.
It was a permanent stationary graphic.
At no time was there sufficient capacity to send any definition detail. So
it stayed as a grey smudge.



Of course watching the raw 1080i25 is even nicer - but it still needs
de-interlacing and that + the shutter speed still means the grass/snow
in the foreground is blurred during the pan.



Again, why do you need to de-interlace? At all?
Why are you intent on de-interlacing?
When SMPTE created these new standards they added interlaced for
distinct reasons.
i.e. They didn't do it for a laugh, or while they were down the bierkeller!

The shutter speed does not cause the grass to be 'blurred during the pan'.
There is 'motion blur', which is what occurs when the camera moves. Super
motion and HiMo cameras negate this because of their increased acquisition
frame rates, which are then played back at 50i.

And there is artifacting blur, because the broadcaster has compressed the
source to VHS quality, and ALL the grass becomes one green smudge. Because
there is insufficient bandwidth to transmit a reasonable amount of detail.

There is a distinct difference between the two.




At least UHD will avoid the de-interlace and in the future 120fps help
with the shutter speed blur.


DO NOT be so sure that any forthcoming UHD channels will transmit, or even
originate, at 120fps.

I know, and fully understand, that is the intention, the long-term goal.
But at present, it's just about getting 4K out before the other broadcasters
do.
Four thousand horizontal pixels in a progressive scan constitutes UHD.
120fps does not constitute, nor is it a requirement of, UHD.

As with everything else, bits = pennies, hence not necessarily leaping
straight to 120fps.




I bet 10mbit hevc would look better than the same rate 264 on parkrun -


No.
HEVC macroblock size is eight times bigger than that of h.264. (64x64)
Yes there would be far less data, but there would also be far bigger square
artifacts*. Eight times bigger.
*See also YouTube 2005.

but I would have to cheat as I don't think free lib/ffmpeg will code
interlaced yet.


Yes, it does.
http://x265.readthedocs.org/en/defau...-intra-options

CTRL+F 'interlace'


I know hevc just does fields anyway


According to the manual (above), it can do interlaced.
Crazy, I know, but it seems it will.

- maybe deint first


You are obsessed with progressive scan!!!

I know 'it's better', and just like 'garlic bread', 'it's the future'.
But if something is sourced in an interlaced standard, and the equipment you
play back on can handle interlaced images, why ruin it?

to 50p wouldn't be cheating too much,


And in encoding to 50p you double the required data rate.

but it's currently so slow I could
never play it at full speed on my old PC.

ftp://vqeg.its.bldrdoc.gov/HDTV/

for raw video + details of.


To be more specific I am looking at the 576i25 and 1080i25 park run
(converted to 420) - though watching 422 makes no difference, it
wouldn't be a fair test as I had to convert to 420 to make a
representative h264 anyway.

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_...ekas_Exports_/


Changing pixel format, or colour space, will make zero difference to
definition or detail contained within an image.
It relates to colour handling.
i.e. How you record the brightness of three neighbouring pixels.

Yes, some colour spaces are not compatible with some encoders.
But that was in mind when certain encoders were constructed.

Even reduced to 1:0:0 (monochrome), an image would still contain detail,
edge definition, and clarity.
e.g. A graphic consisting of words would still contain precise edge detail
in monochrome, in 1:0:0.
A moving shot of a pitch would still contain the detail and definition of
the playing surface.
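
As a rough illustration of that point (a sketch only, assuming 8-bit samples
and the standard J:a:b sampling ratios), the sums below show that subsampling
only changes how many bits go on colour; the full-resolution luma, where the
edge detail lives, is the same in every case:

    # Bits per pixel at 8-bit depth for common chroma samplings.
    # Luma is one full-resolution sample per pixel in every case; only the
    # colour-difference (Cb/Cr) resolution changes.
    bpp = {
        "4:4:4": 8 + 8 + 8,   # full-resolution Cb and Cr
        "4:2:2": 8 + 4 + 4,   # Cb/Cr halved horizontally
        "4:2:0": 8 + 2 + 2,   # Cb/Cr halved horizontally and vertically
        "4:0:0": 8,           # monochrome: luma only
    }
    for name, bits in bpp.items():
        mbps = 1920 * 1080 * 25 * bits / 1e6
        print(f"{name}: {bits:2d} bits/pixel -> ~{mbps:.0f} Mb/s uncompressed at 1080/25")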

Only once you push that through an inter-frame encoder, that is when you
lose detail.
That is when graphics suddenly get an aura of their own. (Derek Acorah not
included.)
By which I mean an extra unintended pixel shadow 1 to 3 pixels wide around
the whole edge of any graphics.


Going back to my original point: if anyone transmitted _source quality_
SD now (using the same bandwidth which is applied to HD, i.e. uncompressed
SD), it would appear to contain more detail than the DTT dog faeces we get
presently.

Therefore, instead of nob waving and launching half-cocked 4K, why doesn't
someone have the bright idea of transmitting 'better HD'?
"True HD" even?
It would save trucks full of money for all concerned, for a start!




Dave W August 4th 15 10:38 PM

4k TV on Freesat or Freeview?
 
On Mon, 3 Aug 2015 19:22:14 +0100, "_Unknown_Freelancer_" /dev/null
wrote: (amongst other things)

Any captions or graphics, any channel. They should all have distinct sharp
edges.
Look again, and you'll see a border around everything, about 1 to 3 pixels
wide.
....because the transmitted mpg stream doesnt have enough bandwidth to
detail the edges of graphics.

"HD" is not HD.
Yes, its 'High', lots of pixels and vertical lines, but no 'Definition'.

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD (at
source) ever was.

Thus, the 'digital tv' rip off.
Freeview HD does not look good.


My TV, fed from HD Freesat, has to have "sharpness" turned to zero to
get rid of the horrible borders and exaggerated detail. The HD looks
good to me. In fact too good, exposing the make-up applied to female
presenters to make them look good in SD.
--
Dave W


Andy Furniss[_3_] August 4th 15 11:48 PM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:
Sorry, was a wee bit busy earlier.


"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:

The quality you see on HD Freeview equates to SD quality at
source. i.e. What you see on telly is only as good as broadcast
quality SD (at source) ever was.


I agree that the bitrates are too low, but on something like
park-run from below, a 10 mbit h264 is still looks better than a
de-interlaced scaled up raw 576i. I could try harder WRT
deint/scale quality - but I doubt that would really help. Maybe
the source is not up to modern standards? Those samples do seem to
get a lot of "research" done on them - but then I am just an
idly/lazily half interested amateur.



Why would you de-interlace 576i? At all? If you want to
intentionally make PAL look bad, you de-interlace it.


It's 2015 - I don't have an interlaced display any more!

De-interlace and scale is what my (and I guess most people's) TV does to
25i. I could let it do it, or I can do it in s/w myself.

Although I notice certain 'arty types' in sports production actually
add a 'film effect' to some items. Aparently its 'art'. ....with a
capital F me thinks!


Not filmic - that would imply deint to 25p (eww); TV/me would of course do 50p.

The key to its success was that it refreshed the screen (albeit at
half resolution) fifty times a second*. The same 'higher refresh
rate' rule applies today. It gives clarity in any fast motion, such
as sports.

*FWIW, the half resolution higher refresh rate was chosen at the time
because they couldnt get phosphor to glow long enough for a
progressive scan without producing flicker.

Source quality SD contains more detail than present DTT HD.


The park run samples I pointed to are source quality, aren't they?

Whatever I try I can't get the raw 576i to look as good as a 10mbit
encode of the 1080i.

Different cameras of course (same lens), detail -

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_exports/README.txt


Please do not misunderstand this and interpret this as me saying SD
HD. Simply not the case. Its once a picture has gone through the

terretrial transmission chain.


I don't see why my 10mbit x264 encode should beat pro kit - assuming of
course they would give that much bitrate to similar content.

A few years ago, someone at ITV49 had a great idea to make a
'reality' programme interactive by placing a QR code in the top
tight of the screen. It worked perfectly in the edit. The website and
chat rooms were all set up. Then the programme went to air.....
no-ones phone could interpret the QR code.... because it had been
compressed to a grey blur!

Some muppet just had not thought that one through, and its never
been seen since!

And this was something that wasnt even moving. It was a permanent
stationary graphic. At no time was there sufficient capacity to send
any definition detail. So it stayed as a grey smudge.


That is amusing, and I don't doubt that DTT bitrates are terrible. Back
when I had both analogue and DTT on a CRT it was plain to see people had
hairy arms etc. that disappeared on the mpeg2 version. Of course that
was SD to SD. I am still unconvinced that 702x576 even raw will beat HD
even if it is somewhat bitrate-crippled.

Of course watching the raw 1080i25 is even nicer - but it still
needs de-interlacing and that + the shutter speed still means the
grass/snow in the foreground is blurred during the pan.



Again, why do you need to de-interlace ?? Wht are you intent on
de-interlacing? When SMPTE created these new standards they set
added interlaced for distinct reasons. i.e. They didnt do it for a
laugh, or while they were down the beer keller!


Reluctantly to halve the bandwidth while giving decent temporal res.

Do you pros still use interlaced displays then? At work/at home/for HD?

I mean surely most people now see interlaced on a progressive display =
it's de-interlaced. If I put my Panny plasma into an interlaced mode it
de-interlaces (motion adaptively). It doesn't become an interlaced
display. I can de-interlace in s/w to achieve the same effect on a
dumber display (my 1920x1080 computer monitor).

The shutter speed does not cause the grass to be 'blurred during the
pan'. There is 'motion blur', which is what occurs when the camera
moves. Super motion and HiMo cameras negate this because of their
increased aquistion frame rates, which is then played back at 50i.


OK - you are in the trade, but I do recall shutter speed getting a
mention in one of the BBC R&D high framerate papers.

And there is artificting blur, because the broadcaster has
compressed the source to VHS quality, and ALL the grass becomes one
green smudge. Because there is insufficient bandwidth to transmit a
reasonable amount of detail.

There is a distinct difference between the two.


Fair enough, though the blur on the raw 1080i25 park run is clearly not
the latter - but total mush it isn't, either.

At least UHD will avoid the de-interlace and in the future 120fps
help with the shutter speed blur.


DO NOT be so sure that any forthcoming UHD channels will transmit,
or even originate, at 120fps.

I know, and full well understand that is the intention, the long
term goal. But at present, its just about getting 4K out before the
other broadcasters do. Four thousand horizontal pixels in a
progressive scan constitutes UHD. 120fps does not constitute, nor is
a requirement of, UHD.

As with everything else, bits = pennies, hence not neccesarily
leaping straight to 120fps.


Yes, I'm not upgrading till it's the norm (if I live that long) :-)

I bet 10mbit hevc would look better than the same rate 264 on
parkrun -


No. HEVC macro block size is eight times bigger than that of h.264.
(64x64) Yes there would be far less data, but there would also be
far bigger square artificts*. Eight times bigger. *See also YouTube
2005.

but I would have to cheat as I don't think free lib/ffmpeg will
code interlaced yet.


Yes, it does.
http://x265.readthedocs.org/en/defau...-intra-options



CTRL+F 'interlace'


OK, I'll have to revisit that. I do recall there being an issue
somewhere, so I guess it is/was ffmpeg - possibly that it can't/couldn't
actually play interlaced properly as it is/was used to getting weaved
frames from decoders, not fields.

I know hevc just does fields anyway


According to the manual (above), it can do interlaced. Crazy, I
know, but it seems it will.


What I meant by fields was it doesn't (AFAIK) do anything complicated
for interlaced like mpeg2/h264 did - like they really wanted not to do
interlaced, but did the simplest thing = coding fields and nothing more.

- maybe deint first


You are obsessed with progressive scan!!!

I know 'its better', and just like 'garlic bread', 'its the future'.
But if something is sourced in an interlaced standard, and the
equipment you play back on can handle interlaced images, why ruin
it?


My TV de-interlaces, I can do the same (possibly better) to play back on
a monitor that doesn't.

to 50p wouldn't be cheating too much,


And in encoding to 50p you double the required data rate.


True - but then weaved frames are also "extra" complicated so I don't
think it would need 2x bits for the same quality. It would be a different
test though.


but it's currently so slow I could never play it at full speed on
my old PC.

ftp://vqeg.its.bldrdoc.gov/HDTV/

for raw video + details of.


To be more specific I am looking at the 576i25 and 1080i25 park run
(converted to 420) - though watching 422 makes no difference, it
wouldn't be a fair test as I had to convert to 420 to make a
representative h264 anyway.

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_...ekas_Exports_/


Changing pixel format, or colour space, will make zero difference to
definition or detail contained within an image. It relates to
colour handling. i.e. How you record the brightness of three
neighbouring pixels.

Yes, some colour spaces are not compatible with some encoders. But
that was in mind when certain encoders were constructed.

Even reduced to 1:0:0 (monochrome), an image would still contain
detail, edge definition, and clarity. e.g. A graphic consisting of
words would still contain precise edge detail in monochrome, in
1:0:0. A moving shot of a pitch would still contain the detail and
definition of the playing surface.

Only once you push that through an inter-frame encoder, that is when
you loose detail. That is when graphics suddenly get an aura of
their own. (Derek Acorah not included.) By which I mean an extra
unintended pixel shadow 1 to 3 pixels wide around the whole edge of
any graphics.


Going back to my original point, that if anyone transmitted _source
quality_ SD now (using the same bandwidth which is applied to HD.
i.e. uncompressed SD), it would appear to contain more detail than
the DTT dog faeces we get presently.

Therefore, why, instead of nob waving, launching half cocked 4K, why
doesnt someone have the bright idea of transmitting 'better HD'??
"True HD" even?? It would save trucks full of money for all
concerned for a start!


Better HD yes - I am still not convinced it's quite as bad as SD, though
- maybe I don't watch enough TV (usually motorsport) - perhaps park run
is misleading (I obviously don't have access to much else to compare),
but to me 10mbit x264 HD wins over raw SD for that.

R. Mark Clayton[_2_] August 5th 15 04:22 PM

4k TV on Freesat or Freeview?
 
On Monday, 3 August 2015 19:22:22 UTC+1, _Unknown_Freelancer_ wrote:
"R. Mark Clayton" wrote in message
...
On Monday, 3 August 2015 12:07:38 UTC+1, _Unknown_Freelancer_ wrote:
"Michael Chare" wrote in message
...
4K TVs are becoming available and more reasonable prices and there are
some 4k internet streaming channels.

So when, if ever, will any of the Freesat or Freeview channels convert
to
4K?

--
Michael Chare


4K ....over Freeview?
One hopes you are 'having a laugh'. Seriously.

Consider that this stuff originates at 12Gb/s.
Tweleve gigabits of data, per second.
In comparison, present HD originates at just under 1.5Gb/s.

And to compress this enough so that its transmittable over freeview, you
would have to dispense with so much information that you would render
'UHD'
pointless.
i.e. The loss in quality would be so bad that it would be comparable to
existing HD. Therefore, just what is the point?

The only way to transmit 'acceptable' UHD is either usings serveral
channel
spaces over satellite, or proper* broadband.
* meaning something better than BTs standard twisted pair phone line.
Either
the co-ax that Virgin is installing, or fibre from both BT and Virgin.

4K over Freeview?
Hopefully, never.


UHD is certainly no more than four times the bandwidth of 1080i, which is
broadcast in the teens of Mbps (SD usually less than half that).

With more sophisticated compression BT expect to be able to get UHD down a
30-40Mbps pipe, which curiously enough is what you normally get from FTTC.
This will also fit onto a single satellite transponder. These currently
carry up to ten SD channels or two to three HD ones.

Different argument for Freeview, although Freeview HD looks good.







"Freeview HD looks good."

"...looks good" ????



The war, my lord, is lost!

Freeview HD is.... (sorry, Ive run out of expletives) DOG ****E!

Next time there's a field based sport on telly, watch it.
Instead of watching 'the action', watch the detail in the grass when the
camera moves around.

The grass turns to one big green VHS* quality mush. All the detail in the
grass disappears.
When I say VHS quality, I mean the tape you left in the deck for that
'emergency crash in to record' occasion!

The detail only returns to the grass when the camera stops moving.
....once the mpg encoding has enough bandwidth to restore picture detail.


As it happens we only bought a [small] Freeview HD TV (for the kitchen) relatively recently. The picture on the HD channels is noticeably better than SD.

Our main TV is Freesat and the smearing etc. that you mention does not occur very much on that; however, BT TV (over the internet) showing a football match suffered in exactly the way you describe.



Next, take any 'shiney floor show' on ITV.... well, just about any programme
on ITV which isnt a vacuous soap or period drama.
Wait for the confetti drop at the end of the programme.... And just like
Cillit Bang, the detail is gone!
....too many different items moving in too many different directions for mpg
encoding to be able to maintain any reasonable level of detail, resulting in
a picture resembling a YouTube video in 2005!

Any captions or graphics, any channel. They should all have distinct sharp
edges.
Look again, and you'll see a border around everything, about 1 to 3 pixels
wide.
....because the transmitted mpg stream doesnt have enough bandwidth to
detail the edges of graphics.

"HD" is not HD.
Yes, its 'High', lots of pixels and vertical lines, but no 'Definition'.

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD (at
source) ever was.

Thus, the 'digital tv' rip off.
Freeview HD does not look good.




FWIW, UHD IS more than four times the bandwidth.


No, it is four times the resolution.


A 4K video source produces four 3Gb/s streams.
.....that was the method SMPTE came up with.

A 4K camera has four video outputs BNCs. Each containing a 3Gb/s stream.
4 x 3 = 12Gb/s

As opposed to a 1080 HD camera, which produces one 1.5Gb/s stream from one
BNC socket.

And you want to compress 12Gb/s down to teens of Mb/s ????
Just what do you think will happen to your detail? ....to your picture
definition?
You think it will be 'Ultra High' do you?


Well I am sitting in front of a 4k monitor running at 60Hz. The signal comes down one DisplayPort cable and IIRC it can manage 4k @ 30Hz on HDMI.



"....better compression....blah"
Its called HEVC, or h.265.
....Which is also ********!

All they did was take h.264 and make the macro blocks bigger, and tweak
colour spacing.
i.e. the size of the squares the encoder carves the picture in to.

So when you get a 4K ITV shiney floor show (HA HA HA HAAAAAAAA! Shortly
after hell freezes over), the size of the squares the confetti drop causes
will be four times bigger.
i.e. it will still be dog ****e!


On the plus side, the UHD pictures you see will by that point be source HD
quality at least!
i.e. You will see at home what we presently see at source in HD.


At present, if a football match is shot in 4K, it is possible to make out
the detail in peoples faces _in the opposite stand_.
It is actually possible to lip read whilst the camera is sat on a wide shot,
such is the level of detail in 4K.
If you compress it..... well, whats the point?



FTR, I live right next to a major BT exchange.
There isnt even a cabinet between me and the exchange.
I get 13Mb/s!! (twisted pair phone line)


That is because dozy BT would put the ADSL2 connection on direct-to-exchange lines (now sorted, I believe).

As I said previously, to get decent bandwidth to the home requires either
fibre or co-ax.


Which BT installed where I am in 2009 (OK the first in the UK), giving you a maximum of 76Mbps. Cable also passes most homes (not ours - it is on a private road) and will give you 100Mbps. I ordered the 38Mbps option, and allowing a little loss for protocol overheads, that's what I get.

.....not a lot of that about, and in some areas, just aint gonna happen.


Same with buses - if you choose to live in Hicksville, Montana you ain't going to get a two minute service.

Andy Furniss[_3_] August 5th 15 08:02 PM

4k TV on Freesat or Freeview?
 
R. Mark Clayton wrote:

FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


Yes, but the source bitrate is 8x, as you have to account for current HD
only being 25 fps or 50 fields per sec. UHD doesn't use interlacing, so
50fps doubles the source bandwidth on top of the res increase. This
means for sport that the vertical res increase is (more than?) 4 times
HD. The "more than" may be debatable - but I think interlaced gets extra
filtering to prevent interline twitter.


_Unknown_Freelancer_ August 5th 15 08:08 PM

4k TV on Freesat or Freeview?
 
"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:
Sorry, was a wee bit busy earlier.


"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:

The quality you see on HD Freeview equates to SD quality at
source. i.e. What you see on telly is only as good as broadcast
quality SD (at source) ever was.

I agree that the bitrates are too low, but on something like
park-run from below, a 10 mbit h264 is still looks better than a
de-interlaced scaled up raw 576i. I could try harder WRT
deint/scale quality - but I doubt that would really help. Maybe
the source is not up to modern standards? Those samples do seem to
get a lot of "research" done on them - but then I am just an
idly/lazily half interested amateur.



Why would you de-interlace 576i? At all? If you want to
intentionally make PAL look bad, you de-interlace it.


It's 2015 - I don't have an interlaced display any more!

De-interlace and scale is what my (and I guess most peoples) TV does to
25i I could let it do it or I can do it in s/w my self.


How do you know this?
Do you have the source code from the manufacturer?

Manufacturers spend a lot of time and effort over their kit before they put
it to market.
Ok, so there is the occasional lemon model, but on the whole, most kit does
what it says on the tin.

Just because an OLED screen only came out of the factory in January 2015
does not mean it cannot interpret an interlaced scan 'as is'.
Without the source code, for all we know (when watching 1080i) it may well
actually only update all the odd lines in one pass, and then all the even in
the next.

In which case, leaving source interlaced stuff as interlaced IS the best
thing to do.

Let the equipment decide what to do with it.

As previously, SMPTE didn't make up their standards for fun.
And I'm pretty sure the manufacturers didn't toss a coin when writing the code
to decide how it should handle different display modes.

And in 'scaling' you're again compromising your own pictures.
Once you scale, you've ruined your copy for good.
i.e. You resample all the colour spacing, you contrive any representation.

Again, leave it as you got it, let the screen decide what to do with it.

Leap forward fifteen years. Screens will still have backward
compatibility, and will be capable of ridiculous resolutions.... but at
that point in time the manufacturers will have better means of making older
formats work well on their screens.



Although I notice certain 'arty types' in sports production actually
add a 'film effect' to some items. Aparently its 'art'. ....with a
capital F me thinks!


Not filmic that would imply deint to 25p (eww) TV/me would of course do
50p.


No.
Because it still leaves Tx at 50i.
You're not going to change the whole transmission chain for one vt package,
are you?

Generally its an effect applied at the time of playout. It just removes one
field.
"In the old days" VT was sped up to 200%, recorded, and played back at 50%
to achieve the same result.



The key to its success was that it refreshed the screen (albeit at
half resolution) fifty times a second*. The same 'higher refresh
rate' rule applies today. It gives clarity in any fast motion, such
as sports.

*FWIW, the half resolution higher refresh rate was chosen at the time
because they couldnt get phosphor to glow long enough for a
progressive scan without producing flicker.

Source quality SD contains more detail than present DTT HD.


The part run samples I pointed to are source quality aren't they?


TL;DR!
But I don't think that's relevant anyway.
I'm comparing source SD to DTT 'HD'.
Lab test vs real world.


Whatever I try I can't get the raw 576i to look as good as a 10mbit
encode of the 1080i.


But without the source HD material, how do you know what detail/definition
has been lost to compare?
Generally compression occurs by smoothing edges (losing detail), and then
finding repeat patterns in a frame (which buggers up captions).


Different cameras of course (same lens), detail -

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_exports/README.txt


Please do not misunderstand this and interpret this as me saying SD
HD. Simply not the case. Its once a picture has gone through the

terretrial transmission chain.


I don't see why my 10mbit x264 encode should beat pro kit - assuming of
course they would give that much bitrate to similar content.



Because it's compressed to ****e by Arqiva!
Because the broadcasters don't want to pay any more £s!


A few years ago, someone at ITV49 had a great idea to make a
'reality' programme interactive by placing a QR code in the top
tight of the screen. It worked perfectly in the edit. The website and
chat rooms were all set up. Then the programme went to air.....
no-ones phone could interpret the QR code.... because it had been
compressed to a grey blur!

Some muppet just had not thought that one through, and its never
been seen since!

And this was something that wasnt even moving. It was a permanent
stationary graphic. At no time was there sufficient capacity to send
any definition detail. So it stayed as a grey smudge.


That is amusing, and I don't doubt that DTT bitrates are terrible. Back
when I had both analogue and DTT on a CRT it was plain to see people had
hairy arms etc. that disappeared on the mpeg2 version. Of course that
was SD - SD. I am still unconvinced that 702x576 even raw will beat HD
even if it is somewhat bitrate crippled.

Of course watching the raw 1080i25 is even nicer - but it still
needs de-interlacing and that + the shutter speed still means the
grass/snow in the foreground is blurred during the pan.



Again, why do you need to de-interlace ?? Wht are you intent on
de-interlacing? When SMPTE created these new standards they set
added interlaced for distinct reasons. i.e. They didnt do it for a
laugh, or while they were down the beer keller!


Reluctantly to halve the bandwidth while giving decent temporal res.


But you won't be halving the data rate.
I know it's the other end of the spectrum, but the maths is still the same:
(or "math", if you're "murican")
1080i50 (1080 interlaced lines, 25 fps, alternate lines refreshed 50 times a second) =
1.5Gb/s (because you are sending 1080/2 = 540 lines 50 times a second)
1080p50 (1080 progressive lines, refreshed at 50fps) = 3Gb/s (because you
are sending all 1080 lines 50 times a second)

So, unless you're converting 1080i50 (off-air telly) to 1080p25, you are
doubling your data rate by converting to 1080p50.
......and ruining the source by deinterlacing it needlessly.

And it's precisely the same if you're doing this with SD off air.
Converting 576i to 576p doubles your data rate.
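
(The same sums written out, a sketch of the arithmetic above that counts
active lines only and ignores blanking, audio and overheads:)

    # Lines delivered per second: interlaced sends half the lines per refresh.
    def lines_per_second(active_lines, refresh_hz, interlaced):
        return (active_lines // 2 if interlaced else active_lines) * refresh_hz

    print("1080i50:", lines_per_second(1080, 50, True), "lines/s")    # 540 x 50 = 27,000
    print("1080p50:", lines_per_second(1080, 50, False), "lines/s")   # 1080 x 50 = 54,000 (2x)
    print(" 576i50:", lines_per_second(576, 50, True), "lines/s")     # 288 x 50 = 14,400
    print(" 576p50:", lines_per_second(576, 50, False), "lines/s")    # 576 x 50 = 28,800 (2x)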



Do you pros still use interlaced displays then? At work/at home/for HD?



That matters not.
Each different person has a different job to do, so they only want a picture
to suit their needs.
Vision just need it to make sure it's the right colour/exposure/black level
and back-focused.
Sound... (HA).... just want to see what's going to air, and a clue as to
what's next.
VT.... same as above.
Production.... "wait, what?"

It matters not to anyone really.
If this job is 720p, all the tellys (whatever they be) resync
automatically.
If the next is 1080i, same again.
A programme is not made for our viewing pleasure. We just want a picture to
meet our particular requirements.
Everything just works.
And that's the way it's 'supposed to be' all the way down the transmission
chain.

BUT.... you may well notice the occasional f.cup with 'regional news'.
ENG have gone out and shot something, usually out of focus, and recorded
with more audio distortion than a Foo Fighters concert!
Content goes back to the edit, and the clown at the keyboard, who has a
non-linear (computer) edit in front of him, hasn't bothered to check it on a
proper telly.
i.e. They've only watched it on their computer screen.
When it goes to air any horizontal motion shivers.... because they've got
the field dominance the wrong way round!


I mean surely most people now see interlaced on a progressive display =
it's de-interlaced. If I put my Panny plasma into an interlaced mode it
de-interlaces (motion adaptively). It doesn't become an interlaced
display. I can de-interlace in s/w to achieve the same effect on a
dumber display (my 1920x1080 computer monitor).


As per the first point above, without the manufacturer's source code, how do
you know that just because your telly only fell off Dixons' shelf yesterday,
it still doesn't update the display in an interlaced fashion?


The shutter speed does not cause the grass to be 'blurred during the
pan'. There is 'motion blur', which is what occurs when the camera
moves. Super motion and HiMo cameras negate this because of their
increased aquistion frame rates, which is then played back at 50i.


OK - you are in the trade, but I do recall shutter speed getting a
mention in one of the BBC R&D high framerate papers.


Ah, BBC R&D!
Way back in the last century, they really used to know their onions, and
even make their own onions too!


And there is artificting blur, because the broadcaster has
compressed the source to VHS quality, and ALL the grass becomes one
green smudge. Because there is insufficient bandwidth to transmit a
reasonable amount of detail.

There is a distinct difference between the two.


Fair enough, though the blur on the raw 1080i25 park run is clearly not
the latter, though total mush it isn't.



But ITV37 is.
And this is the stuff they tell you is better 'because it's in HD'.


At least UHD will avoid the de-interlace and in the future 120fps
help with the shutter speed blur.


DO NOT be so sure that any forthcoming UHD channels will transmit,
or even originate, at 120fps.

I know, and full well understand that is the intention, the long
term goal. But at present, its just about getting 4K out before the
other broadcasters do. Four thousand horizontal pixels in a
progressive scan constitutes UHD. 120fps does not constitute, nor is
a requirement of, UHD.

As with everything else, bits = pennies, hence not neccesarily
leaping straight to 120fps.


Yes, I'm not upgrading till it's the norm (if I live that long) :-)

I bet 10mbit hevc would look better than the same rate 264 on
parkrun -


No. HEVC macro block size is eight times bigger than that of h.264.
(64x64) Yes there would be far less data, but there would also be
far bigger square artificts*. Eight times bigger. *See also YouTube
2005.

but I would have to cheat as I don't think free lib/ffmpeg will
code interlaced yet.


Yes, it does.
http://x265.readthedocs.org/en/defau...-intra-options



CTRL+F 'interlace'


OK, I'll have to revisit that. I do recall there being an issue
somewhere, so I guess it is/was ffmpeg - possibly that it can't/couldn't
actually play interlaced properly as it is/was used to getting weaved
frames from decoders, not fields.


AFAIK, the only difference is an interlace flag on every other frame.
I think it's '-show_frames' (or something similar) for ffmpeg to dump the raw
data.

An interlaced frame then only contains half of the full screen of lines. The
flag indicates odd or even.
Thus, half the data rate of progressive.
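
(For reference, the option is ffprobe's -show_frames; a minimal sketch of
reading the per-frame interlace flags is below - the input file name is a
placeholder.)

    # Sketch: dump per-frame interlace flags with ffprobe (file name is a placeholder).
    import json, subprocess

    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_frames", "-of", "json", "recording.ts"],
        capture_output=True, text=True, check=True)

    for frame in json.loads(out.stdout)["frames"][:10]:
        # interlaced_frame says whether the frame is coded interlaced;
        # top_field_first gives the field order.
        print(frame.get("interlaced_frame"), frame.get("top_field_first"))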



I know hevc just does fields anyway


According to the manual (above), it can do interlaced. Crazy, I
know, but it seems it will.


What I meant buy fields was it doesn't (AFAIK) do anything complicated
for interlaced like mpeg2/h264 did - like they really wanted not do
interlaced, but did simply = coding fields and nothing more.

- maybe deint first


You are obsessed with progressive scan!!!

I know 'its better', and just like 'garlic bread', 'its the future'.
But if something is sourced in an interlaced standard, and the
equipment you play back on can handle interlaced images, why ruin
it?


My TV de-interlaces, I can do the same (possibly better) to play back on
a monitor that doesn't.

to 50p wouldn't be cheating too much,


And in encoding to 50p you double the required data rate.


True - but then weaved frames are also "extra" complicated so I don't


No, they're not.
It's just half the lines, with a flag.

think it would need 2x bits for the sameq. It would be a different test
though.


but it's currently so slow I could never play it at full speed on
my old PC.

ftp://vqeg.its.bldrdoc.gov/HDTV/

for raw video + details of.


To be more specific I am looking at the 576i25 and 1080i25 park run
(converted to 420) - though watching 422 makes no difference, it
wouldn't be a fair test as I had to convert to 420 to make a
representative h264 anyway.

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_...ekas_Exports_/


Changing pixel format, or colour space, will make zero difference to
definition or detail contained within an image. It relates to
colour handling. i.e. How you record the brightness of three
neighbouring pixels.

Yes, some colour spaces are not compatible with some encoders. But
that was in mind when certain encoders were constructed.

Even reduced to 1:0:0 (monochrome), an image would still contain
detail, edge definition, and clarity. e.g. A graphic consisting of
words would still contain precise edge detail in monochrome, in
1:0:0. A moving shot of a pitch would still contain the detail and
definition of the playing surface.

Only once you push that through an inter-frame encoder, that is when
you loose detail. That is when graphics suddenly get an aura of
their own. (Derek Acorah not included.) By which I mean an extra
unintended pixel shadow 1 to 3 pixels wide around the whole edge of
any graphics.


Going back to my original point, that if anyone transmitted _source
quality_ SD now (using the same bandwidth which is applied to HD.
i.e. uncompressed SD), it would appear to contain more detail than
the DTT dog faeces we get presently.

Therefore, why, instead of nob waving, launching half cocked 4K, why
doesnt someone have the bright idea of transmitting 'better HD'??
"True HD" even?? It would save trucks full of money for all
concerned for a start!


Better HD yes - I am still not convinced it's quite as bad as SD, though
- maybe I don't watch enough TV (usually motorsport) - perhaps park run
is misleading (I obviously don't have access to much else to compare),
but to me 10mbit x264 HD wins over raw SD for that.



A 'lab test' is not as good as a 'real world' test.
Our real-world test is FTA Freeview HD.
Compared to source HD, you're all being ripped off severely!

UHD will be an even bigger rip-off, with even more detail lost!



_Unknown_Freelancer_ August 5th 15 08:11 PM

4k TV on Freesat or Freeview?
 
Dave W" wrote in message
.. .
On Mon, 3 Aug 2015 19:22:14 +0100, "_Unknown_Freelancer_" /dev/null
wrote: (amongst other things)

Any captions or graphics, any channel. They should all have distinct sharp
edges.
Look again, and you'll see a border around everything, about 1 to 3 pixels
wide.
....because the transmitted mpg stream doesnt have enough bandwidth to
detail the edges of graphics.

"HD" is not HD.
Yes, its 'High', lots of pixels and vertical lines, but no 'Definition'.

The quality you see on HD Freeview equates to SD quality at source.
i.e. What you see on telly is only as good as broadcast quality SD (at
source) ever was.

Thus, the 'digital tv' rip off.
Freeview HD does not look good.


My TV, fed from HD Freesat, has to have "sharpness" turned to zero to
get rid of the horrible borders and exaggerated detail. The HD looks
good to me. In fact too good, exposing the make-up applied to female
presenters to make them look good in SD.


So you're turning the detail down to zero, removing picture definition.
It's a safe bet your telly makes Mary Berry look twenty years younger!
......because in removing detail, you're removing definition.
You're applying real-time Photoshop!

So, what is the point of HD for you??
You are removing the 'D' from it.



_Unknown_Freelancer_ August 5th 15 08:31 PM

4k TV on Freesat or Freeview?
 
"Freeview HD looks good."

"...looks good" ????



The war, my lord, is lost!

Freeview HD is.... (sorry, I've run out of expletives) DOG ****E!

Next time there's a field based sport on telly, watch it.
Instead of watching 'the action', watch the detail in the grass when the
camera moves around.

The grass turns to one big green VHS* quality mush. All the detail in the
grass disappears.
When I say VHS quality, I mean the tape you left in the deck for that
'emergency crash in to record' occasion!

The detail only returns to the grass when the camera stops moving.
....once the MPEG encoding has enough bandwidth to restore picture detail.


As it happens we only bought a [small] Freeview HD TV (for the kitchen)
relatively recently. The picture on the HD channels is noticeably better
than SD.



'SMALL'
Pictures ALWAYS look better on a small screen.
.....like (in the old days) telly always seemed a better picture when you were
at the caravan
.....because the screen is smaller, all the mistakes and artifacts are
smaller, much harder to notice.

iPlayer looks fantastic on my phone... with its 3" screen!

You could have a small telly for your living room, so you don't notice the
crap.
......but it's smaller! No-one would do that.



Our main TV is Freesat and the smearing etc. that you mention does not
occur very much on that, however BT TV (over the internet) of a football
match suffered in exactly the way you describe.


Freesat has far more bandwidth than Freeview.
Therefore it is not subject to ridiculous compression, resulting in VHS
quality.




FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


I wrote the nugget above in reply to RMC's post on 3rd August:
"UHD is certainly no more than four times the bandwidth of 1080i, which is
broadcast in the teens of Mbps (SD usually less than half that)."

i.e. He didn't understand my point about how much bandwidth 4K requires.

Correct, it is four times the resolution.
HD x 2 in each direction.

The SMPTE solution was to carve the picture into quarters.
So a 4K video source has four video outputs, each 1080p.
1080p requires 3Gb/s

Four images at 3Gb/s = 12Gb/s data rate for one 4K image.
Eight times the data rate for 1080i50 HD.
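
Spelling that out as a back-of-the-envelope Python sketch (my own scribble,
using the nominal SDI link rates - 1.485Gb/s for an HD-SDI link carrying
1080i50 and 2.97Gb/s per 3G-SDI link carrying 1080p50):

# Back-of-the-envelope: quad-link UHD vs 1080i50, using nominal SDI link rates.

HD_SDI_1080I50 = 1.485   # Gb/s, HD-SDI link carrying 1080i50
SDI_3G_1080P50 = 2.970   # Gb/s, 3G-SDI link carrying 1080p50

uhd_quad_link = 4 * SDI_3G_1080P50          # four 1080p quadrants make one 4K picture
print("UHD quad-link:", uhd_quad_link, "Gb/s")              # ~11.9 Gb/s, i.e. "12Gb/s"
print("Ratio vs 1080i50:", uhd_quad_link / HD_SDI_1080I50)  # ~8x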



And you want to compress 12Gb/s down to teens of Mb/s ????
Just what do you think will happen to your detail? ....to your picture
definition?
You think it will be 'Ultra High' do you?


Well I am sitting in front of a 4k monitor running at 60Hz. The signal
comes down one DisplayPort cable and IIRC it can manage 4k @ 30Hz on HDMI.


Ah, Mac.
I have a universal solution to all Apple problems:
www.wickes.co.uk/p/139250

DisplayPort (allegedly) has up to 32Gb/s of bandwidth
https://en.wikipedia.org/wiki/DisplayPort






UnsteadyKen[_4_] August 5th 15 10:06 PM

4k TV on Freesat or Freeview?
 

In article:

"_Unknown_Freelancer_" says...

So you're turning the detail down to zero, removing picture definition.
Its a safe bet your telly makes Mary Berry look twenty years younger!
.....because in removing detail, youre removing definition.
You're applying a real time photo-shop!

Oh give over, you have it totally arse about face, a setting of zero on
most TV sets turns the Photo Shoppish artificial edge enhancement
processing off.
Have you really never tried it?

http://hifi-writer.com/wpblog/?page_id=3517

--
Ken O'Meara

_Unknown_Freelancer_ August 5th 15 11:12 PM

4k TV on Freesat or Freeview?
 
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

So you're turning the detail down to zero, removing picture definition.
Its a safe bet your telly makes Mary Berry look twenty years younger!
.....because in removing detail, youre removing definition.
You're applying a real time photo-shop!

Oh give over, you have it totally arse about face, a setting of zero on
most TV sets turns the Photo Shoppish artificial edge enhancement
processing off.
Have you really never tried it?

http://hifi-writer.com/wpblog/?page_id=3517

--
Ken O'Meara


Imagine you've got this analogue slider,
.....similar to the old 'colour' slider, where it was monochrome at one end,
just right somewhere near the middle, and comparable to LSD at the other
end.

So, to the right hand end it emphasises edges, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece of
wire from another box.
To the left, it goes the other way. It smooths out edges, it un-emphasizes
them.
i.e. Removes detail.

In analogue telly it was an HF tweak.
Putting in HF gain emphasised vertical edges; attenuating HF smoothed them out.

And you're winding it all the way down to remove/smooth over the crap caused
by the DTT transmission chain.
As before, you removed the 'D' from 'HD'.




UnsteadyKen[_4_] August 5th 15 11:40 PM

4k TV on Freesat or Freeview?
 

In article:

"_Unknown_Freelancer_" says...

Imagine you've got this analogue slider,
....similar to the old 'colour' slider, where it was monochrome at one end,

So, to the right hand end it emphasises egdes, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece of
wire from another box.
To the left, it goes the other way. It smooths out edges, it un-emphasizes
them.
i.e. Removes detail.

Well yes, but the sharpness control doesn't have a left hand end; it
only has a middle and right hand. It starts at zero processing, which
equals the image displayed as received - it doesn't do minus zero.


--
Ken O'Meara

Phil Cook August 5th 15 11:49 PM

4k TV on Freesat or Freeview?
 
On 06/08/2015 00:12, _Unknown_Freelancer_ wrote:
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

So you're turning the detail down to zero, removing picture definition.
Its a safe bet your telly makes Mary Berry look twenty years younger!
.....because in removing detail, youre removing definition.
You're applying a real time photo-shop!

Oh give over, you have it totally arse about face, a setting of zero on
most TV sets turns the Photo Shoppish artificial edge enhancement
processing off.
Have you really never tried it?

http://hifi-writer.com/wpblog/?page_id=3517

--
Ken O'Meara


Imagine you've got this analogue slider,
.....similar to the old 'colour' slider, where it was monochrome at one end,
just right somewhere near the middle, and comparable to LSD at the other
end.

So, to the right hand end it emphasises egdes, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece of
wire from another box.
To the left, it goes the other way. It smooths out edges, it un-emphasizes
them.
i.e. Removes detail.

In analgue telly it was an HF tweak.
Putting gain in emphasised vertical edges. Attenuating HF smoothed them out.

And you're winding it all the way down to remove/smooth over the crap caused
by DTT transmission chain.
As before, you removed the 'D' from 'HD'.


No, the sharpness slider is *adding* something to the received picture
across its whole range, except at 0 where you are seeing what is
broadcast.
--
Phil Cook

_Unknown_Freelancer_ August 6th 15 10:21 AM

4k TV on Freesat or Freeview?
 
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

Imagine you've got this analogue slider,
....similar to the old 'colour' slider, where it was monochrome at one
end,

So, to the right hand end it emphasises egdes, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece
of
wire from another box.
To the left, it goes the other way. It smooths out edges, it
un-emphasizes
them.
i.e. Removes detail.

Well yes, but the sharpness control doesn't have a left hand end, it

only has a middle and right hand, it starts at zero processing which
equals image displayed as received, it doesn't do minus zero.


It's just a number.
It is not a measure of anything.
It's just a means by which non-technical users can understand how much
change to a filter setting has been applied.

Reverting to analogue.... what if a slider on one telly had markings
from 0 to 20,
then an identical telly had -10 to 10,
and a further identical one had 1 to 11 ....this of course would only be
sold by Marshall amps to rock bands!

Zero difference between them.
All apply the same amount of gain/attenuation.
The numbers are no measure of anything, they're just there for the
non-technical human end user.





_Unknown_Freelancer_ August 6th 15 10:22 AM

4k TV on Freesat or Freeview?
 
"Phil Cook" wrote in message
...
On 06/08/2015 00:12, _Unknown_Freelancer_ wrote:
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

So you're turning the detail down to zero, removing picture definition.
Its a safe bet your telly makes Mary Berry look twenty years younger!
.....because in removing detail, youre removing definition.
You're applying a real time photo-shop!

Oh give over, you have it totally arse about face, a setting of zero on
most TV sets turns the Photo Shoppish artificial edge enhancement
processing off.
Have you really never tried it?

http://hifi-writer.com/wpblog/?page_id=3517

--
Ken O'Meara


Imagine you've got this analogue slider,
.....similar to the old 'colour' slider, where it was monochrome at one
end,
just right somewhere near the middle, and comparable to LSD at the other
end.

So, to the right hand end it emphasises egdes, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece
of
wire from another box.
To the left, it goes the other way. It smooths out edges, it
un-emphasizes
them.
i.e. Removes detail.

In analgue telly it was an HF tweak.
Putting gain in emphasised vertical edges. Attenuating HF smoothed them
out.

And you're winding it all the way down to remove/smooth over the crap
caused
by DTT transmission chain.
As before, you removed the 'D' from 'HD'.


No, the sharpness slider is *adding* something to the received picture
across its whole range, except at 0 where you are seeing what is
broadcast.
--
Phil Cook


.....and you have the manufacturer's source code??



_Unknown_Freelancer_ August 6th 15 10:23 AM

4k TV on Freesat or Freeview?
 

FTR, big feature posted on Broadcastnow.co.uk today about BT Sport's
forthcoming 4K sports channel


http://www.broadcastnow.co.uk/featur...091309.article



Roderick Stewart[_3_] August 6th 15 11:44 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 00:49:06 +0100, Phil Cook
wrote:

[re "sharpness" controls]
So, to the right hand end it emphasises egdes, makes them more vivid.
In the middle, it displays what it received. Be that OTA or off a piece of
wire from another box.
To the left, it goes the other way. It smooths out edges, it un-emphasizes
them.
i.e. Removes detail.

In analgue telly it was an HF tweak.
Putting gain in emphasised vertical edges. Attenuating HF smoothed them out.

And you're winding it all the way down to remove/smooth over the crap caused
by DTT transmission chain.
As before, you removed the 'D' from 'HD'.


No, the sharpness slider is *adding* something to the received picture
across its whole range, except at 0 where you are seeing what is
broadcast.


That may be true of a particular TV set, but it is possible for such a
control to remove sharpness, i.e. make it less sharp than what is
broadcast, just as it is possible to make a treble control that makes
something sound more muffled than the original.

A treble control is a simpler situation in that it is just amplifying
high frequency components of what is already present in the signal by
a ratio that can be greater or less than unity, but the principle is
the same for sharpness controls, even though the "sharpness" signal is
artificially constructed*. It is possible either to add it to, or
subtract it from, the original, the latter making the picture look
less sharp than what is broadcast.

*In a TV aperture corrector, which does something similar to a
sharpness control, the usual way was to delay the signal twice and use
the signal after the first delay as the main output, the direct and
twice-delayed versions then being effectively copies of the original
ahead of, and behind it in time. Addition of these two signals created
a blurred version of the original (but centred on it, rather than
delayed as a simple HF filter would have done), and so subtracting
this blurred version from the original would produce an artificial
signal which was brightest where the amplitude of the original was
changing most rapidly, i.e. where there were edges between areas of
different brightness. In an aperture corrector, this sharpness signal
would normally be added in such a way as to increase the apparent
sharpness by adding bright edges, but it could just as easily be added
in the opposite polarity for effect.

In the vertical direction, for practical reasons the amount of delay
had to be whole lines (implemented by two 64 microsecond acoustic
delays carrying a 30MHz amplitude modulated signal), so the amount of
delay in the horizontal direction was chosen for a visual match.

When image processing software is used in a computer to apply
artificial sharpening to a blurred image, the technique sometimes goes
by the name of "unsharp masking", I suppose because it is generating
an "unsharp" (i.e. blurred) version of the original, and then taking
the difference between the unsharp version and the original to create
the edge enhancement signal, but despite a few extra adjustments being
available, it seems to be doing the same thing that television cameras
have been doing for decades.
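
If anyone wants to play with the idea, here is a toy Python version of that
centred-blur-and-subtract trick on a single scan line (my own illustration,
numpy assumed; the line delays become a symmetric three-tap average, and the
'amount' can go either way - positive adds the edge signal, negative
subtracts it):

import numpy as np

def aperture_correct(line, amount):
    """Add (or subtract) an artificial 'sharpness' signal to one scan line.

    blurred - the original averaged with copies shifted one sample either way,
              i.e. a blur centred on the original rather than delayed.
    edges   - original minus blurred: brightest where the signal changes fastest.
    amount  - >0 sharpens (adds bright edges), <0 softens, 0 passes through.
    """
    padded = np.pad(line.astype(float), 1, mode="edge")
    blurred = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    edges = line - blurred
    return line + amount * edges

# A hard black-to-white transition:
scan_line = np.array([0, 0, 0, 0, 255, 255, 255, 255], dtype=float)
print(aperture_correct(scan_line, +1.0))   # over/undershoot either side of the edge
print(aperture_correct(scan_line, 0.0))    # unchanged
print(aperture_correct(scan_line, -1.0))   # edge smeared out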

Rod.

UnsteadyKen[_4_] August 6th 15 12:25 PM

4k TV on Freesat or Freeview?
 

In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, IE, turning sharpness processing off does not reduce
the resolution and make the picture go all blurry,as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.

--
Ken O'Meara

R. Mark Clayton[_2_] August 6th 15 03:17 PM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:09:05 UTC+1, _Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message
o.uk...
SNIP


It's 2015 - I don't have an interlaced display any more!

De-interlace and scale is what my (and I guess most peoples) TV does to
25i I could let it do it or I can do it in s/w my self.


How do you know this?
Do you have the source code from the manufacturer?

You don't need it.

For a CRT, interlacing relies on the persistence of the phosphor: alternate lines are drawn in each field (50 fields/s for 576i in the EU, 60 in the US). The primary reason for doing this is to reduce the flicker that would be very obvious if the whole frame were only drawn 25 times a second.

Later CRT TVs would remember the contents of every line and redraw the whole screen every frame time (and SECAM sets may have had this feature longer). CRT monitors topped out at [email protected] around the mid noughties.

More recent flat screen panels rely on a different method. Basically a pixel will stay in a particular state until it is told to do something different.
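
For the avoidance of doubt about what a panel (or its de-interlacer chip) has
to do somewhere, here is a toy Python/numpy sketch - nobody's firmware, and
it only covers the simplest, no-motion case of turning two fields into one
frame:

import numpy as np

def weave(top_field, bottom_field):
    """Simplest possible de-interlace: interleave two fields into one frame.

    top_field holds the odd-numbered picture lines, bottom_field the even ones
    (or vice versa, depending on field dominance). Real TVs do something motion
    adaptive; this is only the 'no motion' case.
    """
    lines, width = top_field.shape
    frame = np.empty((lines * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# Two 540-line fields woven into a 1080-line frame:
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.full((540, 1920), 255, dtype=np.uint8)
print(weave(top, bottom).shape)   # (1080, 1920)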

Andy Furniss[_3_] August 6th 15 10:43 PM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:
"Andy Furniss" [email protected] wrote in message


De-interlace and scale is what my (and I guess most peoples) TV
does to 25i I could let it do it or I can do it in s/w my self.


How do you know this? Do you have the source code from the
manufacturer?


No, but that doesn't mean they don't.

There must be many chips sold for the purpose (I know they also do more
complicated processing as well)

I do know that my TV de-interlaces as I can test it with a computer.

Manufacturers spend a lot of time and effort over their kit before
they put it to market. Ok, so there is the occasional lemon model,
but on the whole, most kit does what it says on the tin.

Just because an OLED screen only came out of the factory in January
2015 does not mean it cannot interpret an interlaced scan 'as is'.
Without the source code, for all we know (when watching 1080i) it may
well actually only update all the odd lines in one pass, and then all
the even in the next.


Should be easy enough to take a pic to prove/disprove. Many TV reviews
seem to test "the deinterlacer" so I assume many TVs don't work by
simulating a CRT.

In which case, leaving source interlaced stuff as interlaced IS the
best thing to do.

Let the equipment decide what to do with it.


Oh I can and do let my TV do its thing - observing the quality of its
de-interlacing and scaling also lets me say that I can equal/beat it
with my own processing. It's not top end TV, but not budget either, it
got good reviews.

As previously, SMPTE didn't make up their standards for fun. And I'm
pretty sure the manufacturers didn't toss a coin when writing the
code to decide how it should handle different display modes.

And in 'scaling' you're again compromising your own pictures. Once
you scale, you've ruined your copy for good.


Well someone/something has got to scale to get SD on an HD panel.

Who said anything about changing the master copy - I can choose a different
scaler, e.g. Lanczos, at display time thanks to open source geeks and OpenGL.

I could also use ffmpeg, and I can deinterlace on the fly, but I think a
motion compensated de-int that runs at 0.1 fps will beat it. I have the
choice to do whatever I want.

i.e. You resample all the colour spacing, you contrive any
representation.

Again, leave it as you got it, let the screen decide what to do with
it.


Or do better....

Leap forward fifteen years. Screens will still have backward
compatibility, and will be capable of ridiculous resolutions....
but at that point in time the manufacturers will have better means of
making older formats work well on their screens.


True, and some of the methods they will do in realtime are likely already
proposed in existing papers and people's masters/PhDs. It's just that they
are currently far too slow.

Although I notice certain 'arty types' in sports production
actually add a 'film effect' to some items. Apparently it's 'art'.
....with a capital F, methinks!


Not filmic - that would imply deint to 25p (eww). The TV/me would of
course do 50p.


No. Because it still leaves Tx at 50i. You're not going to change the
whole transmission chain for one vt package, are you?


It was you that mentioned filmic, which I assumed meant 25p - I can't
recall suggesting changing Tx!

FWIW on FreeviewHD 25p is flagged as progressive.


The park run samples I pointed to are source quality, aren't they?


TL;DR! But I don't think that's relevant anyway. I'm comparing source
SD to DTT 'HD'. Lab test vs real world.


Well I've got sport recordings that are 10mbit and so it's not like they
never go that high. I must admit that Park Run at 5mbit is horrible, but
at 10 it's OK.

Whatever I try I can't get the raw 576i to look as good as a
10mbit encode of the 1080i.


But without the source HD material, how do you know what
detail/definition has been lost to compare?


I do have the HD source - that's what the 10mbit 264 was made from!

Comparing HD to HD wasn't the point, of course 10mbit is not as good as
the raw - but it's way better than the raw SD.

Generally compression occurs by smoothing edges (losing detail), and
then finding repeat patterns in a frame (which buggers up captions).


Different cameras of course (same lens), detail -

ftp://vqeg.its.bldrdoc.gov/HDTV/SVT_exports/README.txt


Please do not misunderstand this and interpret this as me saying
SD > HD. Simply not the case. It's once a picture has gone through
the terrestrial transmission chain.


I don't see why my 10mbit x264 encode should beat pro kit -
assuming of course they would give that much bitrate to similar
content.



Because it's compressed to ****e by Arqiva! Because the broadcasters
don't want to pay any more £s!


I thought the BBC coded the main HD mux and Arqiva did COM 7/8.


Again, why do you need to de-interlace?? Why are you intent on
de-interlacing? When SMPTE created these new standards they
added interlaced for distinct reasons. i.e. They didn't do it for
a laugh, or while they were down the bierkeller!


Reluctantly to halve the bandwidth while giving decent temporal
res.


But you won't be halving the data rate.


I meant that the reason interlaced still survives and gets standards,
despite e.g. the EBU trying to get rid of it for HD, is because it's half
the data rate compared to 50p.

https://www.ebu.ch/en/technical/trev...editorial.html

I know it's the other end of the spectrum, but the maths is still the
same (or "math", if you're 'murican):
1080i50 (1080 lines interlaced, 25fps, alternate lines refreshed at 50
fields/s) = 1.5Gb/s, because you are sending 1080/2 = 540 lines 50 times
a second.
1080p50 (1080 progressive lines, refreshed at 50fps) = 3Gb/s, because you
are sending all 1080 lines 50 times a second.


I can do sums.
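
For what it's worth, here they are as a quick Python scribble - raw
active-picture payload only, 10-bit 4:2:2 assumed, so the figures come out a
bit under the 1.5/3 Gb/s SDI link rates quoted above (the links also carry
blanking and ancillary data) - but the 2:1 interlaced/progressive ratio is
the point:

# Raw (uncompressed) active-picture payload per second, to show the 2:1
# ratio between interlaced and progressive at the same picture size.

BITS_PER_PIXEL = 20  # 10-bit luma + 10 bits of shared chroma (4:2:2)

def gbits_per_second(width, lines_per_pass, passes_per_second):
    return width * lines_per_pass * passes_per_second * BITS_PER_PIXEL / 1e9

print("576i25 :", gbits_per_second(720, 288, 50))    # 288 lines, 50 fields/s
print("1080i50:", gbits_per_second(1920, 540, 50))   # 540 lines, 50 fields/s
print("1080p50:", gbits_per_second(1920, 1080, 50))  # 1080 lines, 50 frames/s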

So, unless you're converting 1080i50 (off air telly) to 1080p25, you
are doubling your data rate by converting to 1080p50


I am fully aware of that.
..
.....and ruining the source by deinterlacing it needlessly.


Or doing as good as/better than the realtime deint my TV does.

And it's precisely the same if you're doing this with SD off air.
Converting 576i to 576p doubles your data rate.


I know that - just trying to get raw SD at its best for comparison with
10mbit HD - not as a general policy for viewing - as I said, I let my TV
deint that.


BUT.... you may well notice the occasional f.cup with 'regional
news'. ENG have gone out and shot something, usually out of focus,
and recorded with more audio distortion than a Foo Fighters concert!
Content goes back to the edit, and the clown at the keyboard, who has
a non-linear (computer) edit in front of him, hasn't bothered to check
it on a proper telly. i.e. They've only watched it on their computer
screen. When it goes to air any horizontal motion shivers....
because they've got the field dominance the wrong way round!


Well they should really have just done a quick check with yadif=1 :-)

I mean surely most people now see interlaced on a progressive
display = it's de-interlaced. If I put my Panny plasma into an
interlaced mode it de-interlaces (motion adaptively). It doesn't
become an interlaced display. I can de-interlace in s/w to achieve
the same effect on a dumber display (my 1920x1080 computer
monitor).


As per the first point above, without the manufacturer's source code, how
do you know that, just because your telly only fell off Dixons' shelf
yesterday, it still doesn't update the display in an interlaced
fashion?


Answered above for me, but it's unlikely even if there are TVs that
simulate CRTs they would do it for 576i on an HD panel.


And in encoding to 50p you double the required data rate.


True - but then weaved frames are also "extra" complicated so I
don't


No, they're not. It's just half the lines, with a flag.


I think in practice mpeg2 and h264 encoders do full weaved frames rather
than fields - but anyway it was just me thinking out loud about how to
compare x265 with x264.

It seems currently ffmpeg doesn't re-weave the output of its hevc
decoder. x265 also warns that interlace support is experimental if you
try to use it.


Better HD yes - I am still not convinced it's quite as bad as SD,
though - maybe I don't watch enough TV (usually motorsport) -
perhaps park run is misleading (I obviously don't have access to
much else to compare), but to me 10mbit x264 HD wins over raw SD
for that.



A 'lab test' is not as good as a 'real world' test. Our real world
test is FTA Freeview HD. Compared to source HD, you're all being
ripped off severely!


Well yes, but I am comparing to test the claim that it's worse than raw
SD, so if anybody ever broadcasts park run I will record it and see real
world rather than my "lab" :-)

UHD will be an even bigger rip off, with even more detail lost!


Still no update on BTW wholesale connect sin WRT BT UHD.

Given that their HD offerings were 7.5mbit 1440 or "premium" 1920 at
10mbit it will be interesting to see what their UHD is.

Andy Furniss[_3_] August 6th 15 11:03 PM

4k TV on Freesat or Freeview?
 
_Unknown_Freelancer_ wrote:

So you're turning the detail down to zero, removing picture
definition. Its a safe bet your telly makes Mary Berry look twenty
years younger!


As discussed in the past on here or utb - being kind to people on TV can
be deliberate also (soft focus/avoiding too close).

.....because in removing detail, youre removing definition. You're
applying a real time photo-shop!


I can see single-line b/w/r/g/b/y/m/c pattern detail on my TV with
sharpness turned fully down - it does not mean a blur filter is somehow
engaged for me, and turning sharpness down is often recommended in
review/test articles.

So, what is the point of HD for you?? You are removing the 'D' from
it.



Paul Ratcliffe August 7th 15 10:21 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 13:25:21 +0100, UnsteadyKen
wrote:

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, IE, turning sharpness processing off does not reduce
the resolution and make the picture go all blurry,as you claimed.


It certainly did on the old tube cameras. Turning contours off made
the picture as soggy as anything and essentially unusable. I think we
used to do that during registration line-up, but it has been rather a
long time since then...

Dave W August 7th 15 10:55 AM

4k TV on Freesat or Freeview?
 
On Thu, 6 Aug 2015 13:25:21 +0100, UnsteadyKen
wrote:


In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, IE, turning sharpness processing off does not reduce
the resolution and make the picture go all blurry,as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.


I congratulate you for giving a splendid link demonstrating the
effect, and winning victory in my eyes.
--
Dave W

R. Mark Clayton[_2_] August 9th 15 11:28 AM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:02:39 UTC+1, Andy Furniss wrote:
R. Mark Clayton wrote:

FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


Yes, but the source bitrate is 8x as you have to account for current HD
only being 25 fps or 50 fields per sec. UHD doesn't use interlacing so
50fps doubles the source bandwidth on top of the res increase. This
means for sport that the vertical res increase is (more than?) 4 times
HD. The "more than" may be debatable - but I think interlaced gets extra
filtering to prevent interline twitter.


You are still thinking about building a rasterised image with the picture built up in [alternate] lines every [other] frame time.

More recent methods send the full frame every so often and just the changes in the frames between. This works well for static shots or for video where only parts of the view change, but can generate artefacts when the whole picture moves, such as when the camera pans or zooms.
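
Very roughly, and ignoring motion compensation entirely, the idea looks
something like this toy Python sketch (not any real codec, just the
keyframe-plus-differences principle):

import numpy as np

def encode(frames, gop=12):
    """Toy inter-frame coding: a full frame every `gop` frames, diffs in between."""
    stream, previous = [], None
    for i, frame in enumerate(frames):
        if i % gop == 0:
            stream.append(("I", frame.copy()))        # full picture
        else:
            stream.append(("P", frame - previous))    # only what changed
        previous = frame.copy()
    return stream

def decode(stream):
    frames, current = [], None
    for kind, payload in stream:
        current = payload.copy() if kind == "I" else current + payload
        frames.append(current.copy())
    return frames

# Static scene with one moving blob: the P "frames" are mostly zero,
# which is why low-motion content compresses so well.
frames = [np.zeros((8, 8), dtype=np.int16) for _ in range(4)]
for i, f in enumerate(frames):
    f[2, i] = 100
decoded = decode(encode(frames))
print(all(np.array_equal(a, b) for a, b in zip(frames, decoded)))  # True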

R. Mark Clayton[_2_] August 9th 15 11:41 AM

4k TV on Freesat or Freeview?
 
On Wednesday, 5 August 2015 21:32:06 UTC+1, _Unknown_Freelancer_ wrote:
Snip


As it happens we only bought a [small] Freeview HD TV (for the kitchen)
relatively recently. The picture on the HD channels is noticeably better
than SD.



'SMALL'
Pictures ALWAYS look better on a small screen.
....like (in the old days) telly was alway a better picture when you were at
the caravan
....because the screen is smaller, all the mistakes and artifacts are
smaller, much harder to notice.

iPlayer looks fantastic on my phone... with its 3" screen!

You could have a small telly for your living room, so you dont notice the
crap.
.....but its smaller! No-one would do that.


Not quite correct. We have a 42" screen and view it from about 3.5m; the picture is good AND you can easily tell HD from SD as there is a lot more detail. Viewed from our dining area (about 7.5m away) you can't, because the eye can't resolve the image sufficiently.



Our main TV is Freesat and the smearing etc. that you mention does not
occur very much on that, however BT TV (over the internet) of a football
match suffered in exactly the way you describe.


Freesat has far more bandwidth that Freeview.
Therefore is not subject to rediculous compression, resulting in VHS
quality.




FWIW, UHD IS more than four times the bandwidth.


No it is four times the resolution.


I wrote the nugget above in reply to RMCs post on 3rd August:
"UHD is certainly no more than four times the bandwidth of 1080i, which is
broadcast in the teens of Mbps (SD usually less than half that)."

i.e. He didnt understand my point that 4K required so much bandwidth.

Correct, it is four times the resolution.
HD x 2 in each direction.

The SMPTE solution was to carve the picture in to quarters.
So a 4K video source has four video outputs. Each 1080p.
1080p requires 3Gb/s

Four images at 3Gb/s = 12Gb/s data rate for one 4K image.
Eight times the data rate for 1080i50 HD.


I missed the higher frame rate. Compression over a frame four times the size should perform slightly better than compression over four separate frames; however, this assumes the same algorithm.



And you want to compress 12Gb/s down to teens of Mb/s ????
Just what do you think will happen to your detail? ....to your picture
definition?
You think it will be 'Ultra High' do you?


Well I am sitting in front of a 4k monitor running at 60Hz. The signal
comes down one DisplayPort cable and IIRC it can manage 4k @ 30Hz on HDMI.


Ah, Mac.


Mac - spit!

No - a new PC with AMD A8-7600, Asrock MB & Iiyama B2888UHSU display.

I have a universal solution to all Apple problems:
www.wickes.co.uk/p/139250

Display port (allegedly) has a 32Gb/s bandwidth
https://en.wikipedia.org/wiki/DisplayPort


Only got DP 1.2, which gives max 17Gbps (for true 4k). Actually using ~16Gbps for 10bit colour on [email protected]

_Unknown_Freelancer_ August 9th 15 01:38 PM

4k TV on Freesat or Freeview?
 
"UnsteadyKen" wrote in message
...

In article:

"_Unknown_Freelancer_" says...

The numbers are no measure of anything, they're just there for the
non-technical human end user.

I know that, I have come across controls before, and it was you who
started blathering about numbered sliders and all that guff in a
desperate attempt to cover your arse when it was pointed out you were
talking bollox, IE, turning sharpness processing off does not reduce
the resolution and make the picture go all blurry,as you claimed.

Therefore I think I may lay claim to victory in this particular
pointless pedantic nitpicking contest and you should send a prize of
your choosing to the usual address forthwith.

--
Ken O'Meara


Ken,

Apologies for the delay in my response. I had to go to work. 39 hours in
three days + 300 miles.

If you wish to rest easy in a smug, oblivious, shallow victory, then please,
enjoy it.
Get yourself a certificate printed too, so you can hang it on the back of
the toilet door.

But really, I've worked in and around live TV for n decades*. I've taken
verbal abuse from the Keys, Grey, Shrieves, Grey's brother and others too
insignificant to mention. As a result I've got thicker skin than an
armadillo. You really think I care??

* where n is an integer between 1 and 8.


BTT, if I must legitimise your 'victory', here is why I wrote what I wrote:
In the beginning I wrote that your telly must make Mary Berry appear 20
years younger, because you turn your sharpness setting all the way down.

To which your reply was:
Oh give over, you have it totally arse about face, a setting of zero on most
TV sets turns the Photo Shoppish artificial edge enhancement processing off.

This _appears_ to be you saying that the sharpness control does not then
become an 'unsharp' control when turned 'fully down' - that because it starts
at zero, that is proof that it does not 'unsharpen' the image.


To which I started to "blather on about sliders in a desperate attempt to
cover my arse".
When in my mind, I was attempting to bring a 'virtual control' (your telly's
sharpness metric) into the physical world, by comparing it to something
many have encountered in the past on analogue equipment.

Which is why I used the example of three tellys, all with identical
internals, but with different markings on the exterior by the sharpness
control.

The point being that, no matter what scale is displayed by the control, be it
0 to 10, 1 to 11, or -5 to +5, the effect is exactly the same on the inside of
the box.
Therefore, the numbers shown by the control relate to absolutely nothing on
the inside of the box.
They do not actually measure anything, i.e. gain/attenuation, filter input
value, flux capacitor voltage, etc.

Thus, just because your telly's sharpness control starts at zero, this does
not necessarily mean that there is no 'unsharpening' being applied to your
pictures.
i.e. The zero on your settings display is just a number. It is just a means
for non-technical types to interpret how much of an effect has been applied.

So raise a glass, victory, it appears (to some), is yours Ken.
https://www.google.ca/search?site=&s...tificate+maker




