uk.tech.digital-tv (Digital TV - General)
4k TV on Freesat or Freeview?



 
 
#71 - August 15th 15, 12:04 AM - _Unknown_Freelancer_

"Vir Campestris" wrote in message
...
On 14/08/2015 12:03, _Unknown_Freelancer_ wrote:
snip
Instead of wasting so much bandwidth on such guff, why not just turn up the
bandwidth for present HD channels, and make Freeview better quality than Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


Oink. Flap. Oink. Flap.


Preaching to the converted here!
I know it will never happen.
BUT, instead of wasting truckloads of cash on pointless DTT 4K, why not
just offer 'the best HD'?




Bear in mind CD audio is sampled at 44.1 kHz, while the entire broadcast
world uses 48 kHz sample rates for everything. So that CD has already lost
the very high frequencies. Anything above 22.05 kHz, to be precise.
Then compress that into a 128 kb/s MP3 file..... Any sound supervisor
worth their salt CAN recognise an MP3.
There are several who refuse to use MP3 files in their programmes, insisting
on .wav source files.

If you take a 96 kb/s MP3 file, it's easy to hear how bad it is.
128 kb/s is only a little above that.
i.e. That file is only just above the point where it is deafeningly
obvious how bad the audio is.
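(For the record, the arithmetic behind those figures, as a quick Python
back-of-envelope - the numbers are the standard ones, the script itself
is purely illustrative:)

# Nyquist: a sample rate can only represent frequencies up to half itself.
for rate_hz in (44_100, 48_000):
    print(f"{rate_hz} Hz sampling -> anything above {rate_hz / 2:.0f} Hz is lost")

# And how little data a 128 kb/s stereo MP3 keeps per sample, vs 16-bit CD PCM.
bitrate = 128_000             # bits per second, both channels combined
samples_per_sec = 44_100 * 2  # stereo samples per second
print(f"128 kb/s MP3: ~{bitrate / samples_per_sec:.2f} bits/sample vs 16 for CD")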

H.265 is H.264 tweaked.
The macroblock size is increased (the grid the encoder chops the picture
into), and they rejigged the colour space. That's it.
It's really not worth getting all moist about H.265 as being the best thing
since the Altair 8800.
It's not some amazing solution that will allow cinema-quality pictures to be
pushed down a dial-up connection.

Just like that 128 kb/s MP3 file, it's H.264, but just above the point where
the masses can tell it's crap.
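(On the 'bigger grid' point: H.264 partitions frames into 16x16 macroblocks,
while H.265 starts from coding tree units of up to 64x64, which the encoder
can then subdivide. A rough Python illustration of the grid counts, nothing
more:)

import math

def blocks(width, height, block):
    # Whole blocks needed to cover the frame, rounding up at the edges.
    return math.ceil(width / block) * math.ceil(height / block)

print("1080p, 16x16 macroblocks (H.264):", blocks(1920, 1080, 16))  # 8160
print("2160p, 64x64 CTUs (H.265 max):   ", blocks(3840, 2160, 64))  # 2040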


I can hear the defects in 128k MP3, but not in 256k MP3 nor in 128k WMA. I
haven't tried AAC. DAB, OTOH? 80k MP2? Yuck.

And as for Freeview? My techie son and I have learned not to mention the
faults when my wife is there. She just gets cross.


Let's just say, perhaps yours is not the only abode where that situation
arises!



Andy



#72 - August 15th 15, 12:16 AM - _Unknown_Freelancer_

"Andy Furniss" [email protected] wrote in message
o.uk...
_Unknown_Freelancer_ wrote:

Take a football match being covered in 4K. (I've seen some.) Now get
your main gantry camera to frame on a stationary wide angle* facing
across the pitch. *As wide as the lens will go. Not pointing at
anything in particular. Such is the detail in that picture you can
make out individual facial expressions of people in the crowd in the
opposite stand (usually around 100 metres away). It's a fair wager that
detail will _never_ be present in any "Freeview 4K".


Instead of wasting so much bandwidth on such guff, why not just turn
up the bandwidth for present HD channels, and make Freeview better
quality than Sky satellite or BT TV???


I am curious what bitrate you think is enough for HD?


I don't know.
I really can't be arsed to carry out loads of tests.
But I do know that when I watch Freeview HD, I get annoyed!
.....yes I know what the doctor will tell me!

I get annoyed because it looks so, well, ****!
And this '****' is sold as "HD".
It's a bleedin' con!



AIUI BT HD is max 10 Mbit/s for their (BTW) premium offering, or 7.5 Mbit/s
for standard 1440. For UHD they require a 44 Mbit/s connection (I really hope
they are not allowing for record-one-and-watch-another in that!).

I don't know what Sky HD uses, though I have "come across" some SD
transport stream motorsport rips that don't seem any higher than the
same content from the BBC.


AFAIK Sky and Sky Sports use different bit rates for different channels, and
occasionally change them depending on content.
I know Sky Sports F1 has (or did have) a much higher bit rate, so it looked
great.
......but then they went and spent ALL of the money.
So things may have changed.
There's probably a nerds' thread on DigitalSpy where someone lists all the
present TS details!
I can't be bothered to trawl right now!



i.e. Make HD.... the best HD. FOR FREE.


I agree that should happen - but what Brian wrote was -

"You could use a whole DVB-T2 multiplex and the H.265 codec.
It would arguably be a better redition of 4K than the current HD
Freeview is of 2K."


Like, that would happen?!!! (Rhetorical question, btw!)



I think 40 Mbit/s HEVC 2160p50 should surely be better than the current HD
offering - maybe in the case of 1080i25 high-motion bits even better
than "quality/raw HD" deinterlaced/scaled up for a UHD TV.


Hmmmm.....pushing the boundaries there, methinks.
i.e. Back to that 128 kb/s MP3 file again.
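(To put a number on how hard 40 Mbit/s has to work, a rough Python sketch -
assuming 8-bit 4:2:0, i.e. 12 bits per pixel before compression:)

# Raw bit rate of 2160p50 at 8-bit 4:2:0 (12 bits/pixel) vs a 40 Mbit/s channel.
raw_bps = 3840 * 2160 * 12 * 50   # ~4.98 Gbit/s uncompressed
channel_bps = 40_000_000
print(f"raw: {raw_bps / 1e9:.2f} Gbit/s")
print(f"compression needed: {raw_bps / channel_bps:.0f}:1")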



I guess it should really go to HD - but there's a T2 mux spare on my
transmitter with just nulls and QVC.

Hmm, maybe I should find the keys to my lab and test -

Would grainy 65mm 2160p50 film scans with professionally made (for VQEG)
1080i25 derivatives do?

#73 - August 15th 15, 09:15 AM - Andy Furniss[_3_]

Andy Furniss wrote:
I recall reading years ago that RF-output 1st-gen game consoles/early
home computers used to pull a trick on interlaced CRTs to make them
progressive (IIRC there is a pulse/something to mark top/bottom field
and they just repeated one rather than alternating). It did say this
worked on most but not all TVs and I recall wondering "why no gaps",
so perhaps all CRTs have a fat spot/spots.


Which makes me wonder what "paused" means/does in the world of analogue kit.

It may well be that taking photos of paused images is only ever going to
get one field, as the kit sending the signal is repeating the same field
over and over.
#74 - August 15th 15, 12:01 PM - Java Jive[_2_]

On Sat, 15 Aug 2015 00:17:28 +0100, Andy Furniss wrote:

Java Jive wrote:

I ask because I can see no evidence at all that my, admittedly
old and first generation, LCD is doing any de-interlacing.

I don't see any evidence it doesn't


I beg to differ. I go into some detail on the subject, referencing
a particular image of an athlete's arm in the CRT vs LCD web-page. I
am convinced that neither that nor any other video is de-interlaced
by my TV. It's a very early model Panasonic with analogue tuners.


Looking at that image and reading your site I think what I would call
de-interlacing differs from what you would.


Yes. I think that is becoming clear ...

I consider any processing to display interlaced content on a progressive
display as de-interlacing. It may be very simple, like line-doubling fields
plus a bit of filtering to compensate for the different spatial position of
the fields, but it's still processing that is needed and wouldn't be if a
native interlaced display were being driven. Line doubling hurts resolution;
just weaving fields together is good for res on static portions of the frame
but gives artifacts on motion. Being clever and adding edge
interpolation/motion detection to get the best of both is what "advanced"
de-interlacers do.
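(The two basic approaches in a minimal numpy sketch, purely illustrative -
real de-interlacers add the motion/edge logic described above:)

import numpy as np

def weave(top, bottom):
    # Interleave two fields into one frame: full static res, combs on motion.
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top       # top field supplies the even lines
    frame[1::2] = bottom    # bottom field supplies the odd lines
    return frame

def bob(field):
    # Line-double a single field: no combing, but half the vertical res.
    return np.repeat(field, 2, axis=0)

top = np.zeros((288, 720), dtype=np.uint8)     # 576i top field
bottom = np.ones((288, 720), dtype=np.uint8)   # 576i bottom field
print(weave(top, bottom).shape, bob(top).shape)  # (576, 720) both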


I have to say that I prefer my own definitions ... I don't see how
anything can truly be called de-interlacing unless electronically it
combines the content of *two or more successive* fields to produce an
image that is different from what it would have been had the lines of
each field been drawn on the screen in the appropriate place as
received. As we both seem to acknowledge, there are other things that
an LCD does, such as buffering and scaling, which are necessary for it
to display content correctly, but as long as it is only electronically
processing *one* field at a time, and not electronically combining it
with the content of, temporally speaking, neighbouring fields, before
drawing it on the screen, I don't think it makes any sense to call
that de-interlacing. If you were going to use that definition, you'd
more or less have to say that a CRT de-interlaces as well, and the
definition of the word thus becomes too wide and general to be useful.

Judging by the arm shot your TV looks like it's doing fields - I accept
what you write about the pairs not matching, but that could be
additional processing that's nothing to do with interlacing.


But the *simplest* and *cheapest*, and therefore most probably
correct, of the many possible explanations is that it is simply
drawing the fields on the screen pretty much as received.

I don't know the details of your screen


The pictures were from the set taken for the ancillary artifact
demonstration, and as it was not possible to do a field-exact, and
therefore meaningful, visual side by side comparison for this - I
had no way of releasing the shutter on a particular given field, and
therefore could not guarantee that pictures taken on different TVs
were of exactly the same field - all I could hope to do was show that
both types of TV showed the dot-crawl type of motion artifact.
Therefore, I simply used the most convenient LCD model to hand, which
was a Panasonic TX-22LT2, a larger model in the same range as the one
used for the side-by-side comparison of the original experiment, a
Panasonic TX-15LT2.

but it could be, like the
monitor I am currently looking at, only 6-bit and using spatial and/or
temporal dithering to fake 8-bit colour, which could explain why the
lines differ 1:1 chroma but the brighter edge steps look paired.
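(The dithering trick referred to there, sketched in Python - the values are
illustrative only: a 6-bit panel alternates its two nearest levels frame by
frame and lets the eye average them into an in-between 8-bit shade:)

# 6-bit levels 0..63 map (roughly) to 8-bit as level * 4. To fake the 8-bit
# value 130, which sits between levels 32 (=128) and 33 (=132), alternate the
# two every frame; over time the eye sees ~130.
lo, hi = 32, 33
frames = [lo if i % 2 == 0 else hi for i in range(50)]   # one second at 50 Hz
print(f"perceived level ~{4 * sum(frames) / len(frames):.0f} of 255")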


Can't comment on that.

As well as colour, LCDs also have to pull tricks to get motion to "work"
- just holding pixels on for the whole frame/field period is great for
flicker, but really bad for making motion look blurred, so they somehow
have to try and work around that.


From your descriptions and others' I think it likely that modern LCDs
do more processing than they did at the time of the original
demonstration, either by default or via menu options, and I probably
ought to update some of the wording of the page accordingly, but at
the time I never saw any evidence of more complex processing for
either of the two LCDs used in it.

I still have the TX-15LT2, though its screen is scratched after I
contrived to pull it off the bedside furniture :-( but I no longer
have the other two TVs. Nor do I have any of the original content
used. This being so, I can't do any further investigations, unless
they can be done from the original photographs, most, probably all, of
which I still have.

I notice some of your pics are from tiny screens (OK, the 22" isn't so
tiny). I wonder, given how much portables used to overscan, whether they
could be doing 1:1 and losing lots off top & bottom - just a thought.


Yes, quite possibly, though, as described above, no longer having most
of the kit, I can't check this.

Multi-field deinterlacing is "advanced" - it's possible but sub-optimal
to call processing one field de-interlacing. Processing may involve more
than just line doubling, e.g. edge detection and smoothing to hide the
half-res steps on diagonals.


See above ...

So it's certainly buffering, and it's certainly scaling, it HAS to
be doing both, but de-interlacing, I'm pretty sure not.

So why no artifacts? Well this LCD doesn't have the vertical
resolution it should, 492 vs 576, so perhaps they are being lost in
being scaled, or perhaps they are doing the same trick as with CRTs,
and making each scan line overlap over its neighbours in the
previous field.


I doubt weave would be lost in scaling - IME it looks far worse when scaled.


Perhaps the same trick as CRTs then, halving the vertical resolution.

As for overlapping - well yes, though I would say totally overwrite
rather than overlap.


However we describe it, the result would be the same.

I wonder whether a pic of a full-size CRT would
look the same as your portable. Saying that, though, I recall reading
years ago that RF-output 1st-gen game consoles/early home computers used
to pull a trick on interlaced CRTs to make them progressive (IIRC there
is a pulse/something to mark top/bottom field and they just repeated one
rather than alternating). It did say this worked on most but not all TVs
and I recall wondering "why no gaps", so perhaps all CRTs have a fat
spot/spots.


One machine is not a statistically significant sample, but if I had to
generalise based on those photographs and the very few others I've
ever managed to find on the web, I would guess that they're all pretty
much like mine. However, if anyone here with experience of,
particularly, factory setting up, or even just repairing, CRTs could
describe for us how they set such things as the focus, their
description might illuminate this point considerably.

As for LCD motion artifacts you mention in your other post - early
LCDs were poor and may have had poor de-interlacers, without seeing
it running it's hard to know what you mean.


Well, I wasn't saying that my LCD shows artifacts, in fact I've
never seen it show the combing effects claimed on interlaced input,
only dot-crawl from using CV as the input source. Merely I was
suggesting why in general LCDs might legitimately show such motion
artifacts if fed an interlaced signal, because their technology
allows them to display what they are fed more faithfully than CRTs.


Perhaps it's to do with things other than interlacing - like it's hard
to get slow/always on LCDs to do motion properly.


I do remember that the very earliest LCDs used as laptop screens used
to leave motion trails, but I've never seen that on either of my LCD
TVs, nor on recent laptops, such as the Dell that currently functions
as my best 'TV'!
--
========================================================
Please always reply to ng as the email in this post's
header does not exist. Or use a contact address at:
http://www.macfh.co.uk/JavaJive/JavaJive.html
http://www.macfh.co.uk/Macfarlane/Macfarlane.html
#75 - August 15th 15, 08:42 PM - _Unknown_Freelancer_

"Andy Furniss" [email protected] wrote in message
o.uk...
Andy Furniss wrote:
I recall reading years ago that RF-output 1st-gen game consoles/early
home computers used to pull a trick on interlaced CRTs to make them
progressive (IIRC there is a pulse/something to mark top/bottom field
and they just repeated one rather than alternating). It did say this
worked on most but not all TVs and I recall wondering "why no gaps",
so perhaps all CRTs have a fat spot/spots.


In having a fat raster you ensured there were no black lines between scan
lines.

A bit like how helical-drum video recorders always moved the tape slower than
one head-width per rotation,
i.e. the head would always overwrite part of the previous pass.


Years back computers were low-res anyway... even with 625 PAL kit.
So it didn't really make any difference if both fields were precisely the
same.
Saved on processing too.... of which there wasn't too much.



Which makes me wonder what "paused" means/does in the world of analogue
kit.

It may well be that taking photos of paused images is only ever going to
get one field, as the kit sending the signal is repeating the same field
over and over.


Depends on the kit concerned.
Some had an option switch. This would select whether your freeze/slow motion
was field-based or frame-based.

Field-based would send the present field to both fields on the display.
Frame-based would send the present frame (both parts of it) to the display.

(IIRC) The Sony BVW-75 was the first analogue deck to include an onboard TBC
and frame memory.

When you paused or played tape slowly, there would be a horizontal stripe
somewhere with no picture content.
(Depending on the kit) It was either a grey stripe or just looked like a bit
of screwed-up picture.

When recording to tape at normal speed, the stripe recorded by the rotating
head would be stretched further along the tape. This was a good thing = more
bandwidth.
When paused or played at less than normal speed, that stretching which
occurred when recording could not occur. The playback head could not read
all of one scan line, because the tape was not moving.
Therefore there was a point when the head passed from one track, over the
blank interleave, and into a neighbouring track.
This is what caused the grey or screwed-up stripe.

To remove this Sony added a frame store and called it DMC. Nothing to do
with a popular American rap crew of the time, I'd like to add, but Dynamic
Motion Control.

Whilst the video RF was above a threshold, data would be written to the frame
store.
When the RF fell below that threshold, no data would be written,
i.e. that part of the frame store remained unchanged.

Then reading from the framestore gave a perfect picture, with none of the
previous problems of analogue videotape.
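(The gating logic as described, in a Python sketch - the names and the
threshold are mine, purely illustrative:)

import numpy as np

RF_THRESHOLD = 0.5   # arbitrary illustrative level

def update_framestore(store, incoming_lines, rf_levels):
    # Overwrite a stored line only where the head read it cleanly (RF above
    # threshold); where RF dropped out, the previous pass's line is kept.
    for i, (line, rf) in enumerate(zip(incoming_lines, rf_levels)):
        if rf >= RF_THRESHOLD:
            store[i] = line
    return store

store = np.zeros((576, 720), dtype=np.uint8)          # last good frame
incoming = np.full((576, 720), 128, dtype=np.uint8)   # new (partial) pass
rf = np.random.rand(576)                              # per-line RF level
update_framestore(store, incoming, rf)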


Yes, before Betacam SP, Quad band could do slow motion without distortion...
but it was a piece of **** for the time.
Hold a VT on a freeze for too long and the head would slice through the
tape! Genius! "Sorry, can't run that news item right now, tape's snapped
again."

And if the engineer hadn't bothered to line up properly, the resulting
pictures looked like the screen was a composite of four different horizontal
bands of picture, all with different luminance and chroma. ....like one of
those kids' flip books.

Towards the end of the VHS era most domestic decks began to include this
sort of technology.
AFAIK, S-VHS decks did by default.... well, they had to give you something
for the extra £200 you were paying!





#76 - August 15th 15, 08:52 PM - _Unknown_Freelancer_

"_Unknown_Freelancer_" /dev/null wrote in message
o.uk...
"Vir Campestris" wrote in message
...
On 14/08/2015 12:03, _Unknown_Freelancer_ wrote:
snip
Instead of wasting so much bandwidth on such guff, why not just turn up the
bandwidth for present HD channels, and make Freeview better quality than Sky
satellite or BT TV???
i.e. Make HD.... the best HD. FOR FREE.


Onik. Flap. Oink. Flap.


Preaching to the converted here!
I know it will never happen.
BUT, instead of wasting truckloads of cash on pointless DTT 4K, why not
just offer 'the best HD'?



Actually, I had a thought on this today.

IF Arqiva gave the masses 'the best HD', it would pull the rug from under
Virgin, Sky and BT.
Because they would have nothing better to sell, because the masses would be
getting better quality for free.

There would no doubt be some idiot court case or petition handed to Ofcom to
have the DTT bit rates slashed back to ****!

So, perhaps, there is a reason DTT is rubbish quality.... the ill logic of
driving profit!

On a similar note, there is discussion and chatter of putting iPlayer behind
a paywall. (DISCUSSION and CHATTER ONLY.)
The next thing we know, ITV come out saying they will HAVE TO put ITVPlayer
behind a paywall if the BBC do!
i.e. An even bigger truckload of bovine faeces!


#77 - August 16th 15, 07:28 AM - alan_m

On 09/08/2015 14:55, _Unknown_Freelancer_ wrote:


Point is though, yes, (at source) 4K is four times the resolution of HD, but
it produces eight times the data of HD.


What's more important for the viewer is how much of that data is lost in
the processing during transmission.

The UK public demand more channels rather than better (technical)
quality channels, so the broadcasters always squeeze more channels into
the available bandwidth using more aggressive lossy algorithms.

Would there be a need for 4K broadcast data if HD were allocated the same
broadcast bitrates as 4K?
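(Rough Python arithmetic on that, purely illustrative - the same channel
rate stretches four times further per pixel over 1080p than over 2160p at
the same frame rate:)

channel_bps = 40_000_000   # e.g. the 40 Mbit/s figure mentioned up-thread
for name, w, h in (("1080p50", 1920, 1080), ("2160p50", 3840, 2160)):
    bpp = channel_bps / (w * h * 50)   # bits available per pixel per frame
    print(f"{name}: {bpp:.3f} bits/pixel")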



--
mailto: news {at} admac {dot} myzen {dot} co {dot} uk
#78 - August 16th 15, 07:39 AM - alan_m

On 15/08/2015 21:52, _Unknown_Freelancer_ wrote:

Because they would have nothing better to sell, because the masses would be
getting better quality for free.


There are many products sold on the basis that the bigger the advertised
number, the better the product.




--
mailto: news {at} admac {dot} myzen {dot} co {dot} uk
#79 - August 16th 15, 07:56 AM - Roderick Stewart[_3_]

On Sat, 15 Aug 2015 21:52:54 +0100, "_Unknown_Freelancer_" /dev/null
wrote:

Actually, I had a thought on this today.

IF Arqiva gave the masses 'the best HD', it would pull the rug from Virgin,
Sky and BT.
Because they would have nothing better to sell, because the masses would be
getting it better for free.

There would no doubt be some idiot court case or petition handed to Ofcom to
have the DTT bit rates slashed back to ****!

So, perhaps, there is a reason DTT is rubbish quality.... the ill logic of
driving profit!


Perhaps that's why they also have corner logos on most broadcasts, and
squish the end credits to one side and speed them up so they're too
fast to read while some **** tells you half the plot of the following
episode, or some other programme entirely, about 10 dB louder than the
music. The only way to see TV programmes nowadays free from any
deliberate blemish is to pay for DVDs or watch them online.

An online service such as Amazon or Netflix costs about half as much
as the TV licence, so roll on the day when the only payment we have to
make is for what we're actually watching.

Rod.
#80 - August 16th 15, 10:14 AM - Andy Burns[_9_]

alan_m wrote:

The UK public demand more channels rather than better (technical)
quality channels so the broadcaster always squeezes more channels into
the available bandwidth using more aggressive lossy algorithms.


Maybe Mr Corbyn will have a referendum on quality vs quantity of
Freeview channels :-P

 



