Digital TV Banter - a Sky, cable and digital TV forum


uk.tech.digital-tv (Digital TV - General) (uk.tech.digital-tv) Discussion of all matters technical in origin related to the reception of digital television transmissions, be they via satellite, terrestrial or cable. Advertising is forbidden, with no exceptions.

Duty cycle and transmitters



 
 
#11 - September 30th 17, 04:39 PM - Bill Wright

On 30/09/2017 10:36, Clive Page wrote:


> I'm sure you are right, they must be near their maximum power most of
> the time. But in the old analogue days the transmitter power was
> usually several hundred kilowatts for main stations, simply because you
> need to get a signal-to-noise ratio of 100:1 or more to avoid the viewer
> seeing noise in the picture. With digital signals the signal-to-noise
> ratio needed is much lower, just a few to one, so that the bit error rate
> is negligible. Sorry, don't have time to look up the accurate figures.

20dB near enough.

Bill
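Putting Clive's 100:1 figure (a voltage ratio) next to Bill's 20dB, a quick decibel conversion makes the comparison concrete. A minimal Python sketch; the 20dB DVB-T threshold is only nominal, since the real figure depends on constellation and code rate:

```python
import math

def power_ratio_to_db(ratio):
    """Power ratio -> decibels."""
    return 10 * math.log10(ratio)

def voltage_ratio_to_db(ratio):
    """Voltage (amplitude) ratio -> decibels (power goes as voltage squared)."""
    return 20 * math.log10(ratio)

# Analogue: ~100:1 signal-to-noise (voltage) before picture noise disappears.
analogue_snr_db = voltage_ratio_to_db(100)    # 40 dB

# DVB-T: ~20 dB carrier-to-noise for a negligible post-FEC bit error rate.
digital_cnr_db = 20.0

margin_db = analogue_snr_db - digital_cnr_db  # 20 dB in hand
print(f"analogue ~{analogue_snr_db:.0f} dB, digital ~{digital_cnr_db:.0f} dB, "
      f"i.e. {10 ** (margin_db / 10):.0f}x less transmitted power for similar coverage")
```

That 20dB margin is the factor-of-100 power saving the later posts in the thread quantify per transmitter site.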
#12 - September 30th 17, 04:49 PM - Bill Wright

On 30/09/2017 12:36, David Woolley wrote:

> Also remember that the interfering multiplex will be from the same
> transmitter, so not a case of a very strong signal blocking a very weak
> one, although multiplexes do vary in power.

No, the problem arises when strong but garbled muxes from a nearby but
screened tx are adjacent to weak but clean muxes from a distant tx.

> (The effect of the overload is modelled in determining which
> multiplexes, and how much power, can be used at a site.)
>
> Satellite channels alternate the polarisation between adjacent
> multiplexes, to allow the adjacent multiplexes to be rejected.

In fact muxes on the two polarities overlap. There's no significant
space between muxes on each polarity.

Bill

#13 - September 30th 17, 06:46 PM - R. Mark Clayton

On Saturday, 30 September 2017 09:46:38 UTC+1, Brian Gaff wrote:
> I was pondering something the other day about the use of bandwidth and how
> hard the transmitters had to work.
> In the old analogue days where a single channel took up a whole 8 meg
> chunk, much of the time being AM the transmitter was not working that hard,
> but today as they are multiplexes and tuning over them with an analogue
> radio shows the whole 8 megs to be basically a broadband constant noise, it
> seems to me that the power circuits and most of the rest are working flat
> out all the time.
> I do not know if the transmitters themselves were all replaced at each site
> or just more added for the extra multiplexes, but if the former I'd have
> thought it unlikely they would have been able to cope with that high power
> level continuously.
> Also in the receiver department of TVs, we used to have guard channels so
> bleed over did not happen, but nowadays this seems to not be the case, as
> once again the next channel starts very near where the previous one stopped.
> How do they stop massive intermodulation in the tuner, and hence a drop in
> the quality of the digital stream?
> Obviously it works, but it all seems counter-intuitive.
>
> I assume that the sat channels now do similar things to squeeze channels in,
> as well.
> People I talk to are however starting to notice some of the weirdnesses of
> the digital system more these days. Strange lack of small movements in large
> areas of grass or single-colour areas, occasional little triangles on busy
> sequences, and the terrible quality of many SD channels apparently.
> I also notice a bad tendency toward the audio artefacts one hears in
> lossy compressed audio that has been fed through too many encode/decode
> cycles for its own good.
> As I have said before, I'd imagine if every channel was broadcasting a
> picture of white noise on a screen then the system just could not cope at
> all!
> Brian

Maybe they are, but the power levels on analogue were many times what they
are now - e.g. https://en.wikipedia.org/wiki/Winter...itting_station
was 0.5 MW, now 100 kW per channel. And a TV signal was IIRC FM and pretty
busy when there is a program.
#14 - September 30th 17, 07:00 PM - NY

"R. Mark Clayton" wrote in message
...
> was 0.5Mw, now 100kW per channel. And a TV signal was IIRC FM and pretty
> busy when there is a program

Analogue TV was AM, with one of the two sidebands low-pass filtered
(so-called vestigial sideband modulation). The sound was FM on a carrier
spaced a few MHz from the video carrier (I forget the spacing) - presumably
on the sideband which has had the higher frequencies filtered out.

As other people have mentioned, 625-line TV was negatively modulated, so the
highest power is for the sync pulses, then for black, with the white parts
of the signal having least modulation. This was done to reduce the visible
effect of spark interference (eg from badly-suppressed car ignitions),
because black dots are less noticeable than white ones. 405-line TV used the
same type of VSB AM but with positive modulation, so spark interference
showed as white speckle on the picture. I'm not sure whether 405-line sound
was AM or FM.
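The negative-versus-positive modulation point can be illustrated with a toy demodulator. The carrier levels used here (100% at sync, about 76% at black, about 20% at peak white) are the standard negative-modulation figures for 625-line System I; the linear decode mapping and the reuse of the same levels for positive modulation are simplifications for illustration only:

```python
def decode_negative(carrier):
    """Negative modulation (625-line System I): higher carrier = darker.
    Returns brightness in 0.0 (black) .. 1.0 (peak white)."""
    return max(0.0, min(1.0, (0.76 - carrier) / (0.76 - 0.20)))

def decode_positive(carrier):
    """Positive modulation (e.g. 405-line): higher carrier = brighter.
    Same illustrative levels, inverted sense."""
    return max(0.0, min(1.0, (carrier - 0.20) / (0.76 - 0.20)))

mid_grey_carrier = 0.48          # carrier amplitude that decodes to mid grey
spike = mid_grey_carrier + 0.5   # an additive ignition spike on top of it

# The same spike clips toward black on negative modulation...
print(decode_negative(mid_grey_carrier), decode_negative(spike))
# ...but toward peak white on positive modulation.
print(decode_positive(mid_grey_carrier), decode_positive(spike))
```

With negative modulation the spike drives the decode to 0.0 (a black dot); with positive modulation the identical spike decodes as 1.0 (a white dot), which is why the sense was flipped for 625-line.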

#15 - September 30th 17, 08:35 PM - Woody


"R. Mark Clayton" wrote in message
...
[quoted text snipped - same as #13 above]

It's all about measuring.

Analogue TV Tx power was measured on peak sync, but the video typically
averaged about 10dB lower than that. For Emley Moor the e.r.p. was 870kW,
and with 13dB antenna gain and 1dB cable loss the Tx output was 50kW,
that is 2 x 25kW in parallel. The average video power level would be
around 87kW e.r.p.

DVB is a COFDM of 1705 carriers, which to a power meter looks like a
block of continuous level.

Now with DTV the e.r.p. at Emley Moor is 174kW, so the real (averaged) e.r.p.
is actually 3dB higher. The same calculation applies to Winter Hill.

You also have to remember that a modern TV can work on about 20dB less
signal voltage (assuming 100% quality) to get a perfect stable
picture, so it's win-win all round really.
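Those figures can be checked with a one-line e.r.p. calculation. A sketch; note that the quoted 870kW and 174kW both imply a net gain nearer 12.4dB than the round 13dB minus 1dB used here:

```python
def erp_kw(tx_kw, antenna_gain_db, feeder_loss_db):
    """Effective radiated power: Tx output scaled by net antenna gain."""
    return tx_kw * 10 ** ((antenna_gain_db - feeder_loss_db) / 10)

# Analogue, measured on peak sync: 2 x 25 kW in parallel into a ~13 dB
# aerial through ~1 dB of feeder.
peak_kw = erp_kw(50, 13, 1)   # ~790 kW (quoted as 870 kW e.r.p.)
avg_kw = peak_kw / 10         # video averages ~10 dB below peak sync

# DTT: ~10 kW of COFDM through the same aerial.
dtt_kw = erp_kw(10, 13, 1)    # ~160 kW (quoted as 174 kW)

# The constant COFDM power sits ~3 dB (2x) above the old *average* video power.
print(f"peak {peak_kw:.0f} kW, average {avg_kw:.0f} kW, DTT {dtt_kw:.0f} kW")
```

Whatever the exact gain, the ratio is what matters: the digital mux radiates about twice the old average video power, continuously.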


--
Woody

harrogate3 at ntlworld dot com




#17 - October 1st 17, 12:15 AM - Mark Carver

On 30/09/2017 21:35, Woody wrote:

> DVB is a COFDM of 1705 carriers which to a power meter looks like a
> block of continuous level.

Was 1705 (aka 2k). That mode ceased at DSO; all UK DVB-T is now 6,817
carriers (aka 8k).

UK DVB-T2 is at 32k, I think?
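The active-carrier counts for these modes follow a simple pattern from the FFT size. A sketch; the +1 is a fencepost (carriers are numbered 0..1704 inclusive in 2k mode), and the formula matches DVB-T 2k/8k and DVB-T2 32k in normal-carrier mode, while T2's extended-carrier mode uses slightly more:

```python
def active_carriers(fft_size):
    """Active OFDM carriers for a DVB FFT size: carriers 0..N inclusive,
    where N scales with the FFT and the remaining bins are guard space
    at the band edges."""
    return (fft_size // 2048) * 1704 + 1

for name, size in [("2k", 2048), ("8k", 8192), ("32k", 32768)]:
    print(f"{name}: {active_carriers(size)} carriers")
# 2k: 1705 (pre-DSO DVB-T), 8k: 6817 (current UK DVB-T),
# 32k: 27265 (DVB-T2, normal-carrier mode)
```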

--
Mark
Please replace invalid and invalid with gmx and net to reply.
#18 - October 1st 17, 02:20 AM - Bill Wright

On 30/09/2017 20:00, NY wrote:

> As other people have mentioned, 625-line TV was negatively modulated, so
> the highest power is for the sync pulses, then for black, with the white
> parts of the signal having least modulation. This was done to reduce the
> visible effect of spark interference (eg from badly-suppressed car
> ignitions), because black dots are less noticeable than white ones.

Funny how we still had white dots.

> 405-line TV used the same type of VSB AM but with positive modulation so
> spark interference showed as white speckle on the picture. I'm not sure
> whether 405-line sound was AM or FM.

It was AM.

Bill

#19 - October 1st 17, 02:22 AM - Bill Wright

On 30/09/2017 21:35, Woody wrote:

> DVB is a COFDM of 1705 carriers which to a power meter looks like a
> block of continuous level.

Yes, well, it depends on the measurement bandwidth.

Bill
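Bill's point is that a noise-like COFDM signal has no single "level", only a spectral density, so a meter's reading scales with the bandwidth it measures through. A sketch using a purely hypothetical density figure:

```python
import math

def power_in_bw_dbm(density_dbm_per_hz, bandwidth_hz):
    """Total power of a flat, noise-like spectrum across a measurement bandwidth."""
    return density_dbm_per_hz + 10 * math.log10(bandwidth_hz)

density = -80.0  # hypothetical mux spectral density, dBm/Hz

# The same signal reads ~30 dB apart on a narrow analyser filter
# versus a meter that integrates the whole 7.61 MHz DVB-T bandwidth.
print(f"{power_in_bw_dbm(density, 8e3):.1f} dBm in an 8 kHz filter")
print(f"{power_in_bw_dbm(density, 7.61e6):.1f} dBm across the mux")
```

This is why DTT field meters quote a channel power integrated over the mux bandwidth rather than a simple carrier level.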
  #20  
Old October 1st 17, 07:46 AM posted to uk.tech.digital-tv
Brian Gaff
external usenet poster
 
Posts: 6,807
Default Duty cycle and transmitters

You mean no fun any more!
I can well remember watching the old 625-line UHF stuff from France with its
positive-going video and AM sound. Seemed totally daft, as at this distance
interference was very obvious and often disrupted the picture badly, and the
sound was hissy.
On the PAL 625 stuff from Belgium, as we used to see it on normal TVs with
modified intercarrier sound, the picture looked visibly less dirty, as
negative-going video was used of course.

At least digital seems to have made all the world use very similar systems,
and it must decomplicate TV manufacture for the world market no end.

I was interested in the actual powers of the transmitters, as obviously most
sites are not in the middle of their catchment areas, so the aerials all
have perceived gain due to the directional effect.

Brian

--
----- -
This newsgroup posting comes to you directly from...
The Sofa of Brian Gaff...

Blind user, so no pictures please!
"Woody" wrote in message
news
Top posted for Brian.

> Right, where do we start?
>
> Taking Emley Moor ITV as an example, in the olden days pre-DSO we had
> two transmitters of 25kW each running in parallel. Each transmitter
> comprised two klystrons, one for video putting out 25kW and one for sound
> putting out something lower, although I cannot remember the exact level.
> Although peak black could be higher, the peak power was measured on the
> sync pulse and the average video level was typically 10dB below peak. With
> an antenna gain of 13dB and a feeder loss of 1dB this gave an e.r.p. of
> 870kW peak, or an average of about 87kW. The klystrons ran the cathode
> at -21kV, which meant that the anode was very close to or at earth
> potential, making feed much simpler.
>
> DVB being a COFDM has a pretty constant level when measured by a typical
> power meter. The transmitters are IMSMC 10kW and run main/standby. Again
> with 13dB antenna gain and 1dB feeder loss this gives an e.r.p. of about
> 174kW - which is a 3dB increase on the original average power.
>
> Yes the transmitters are running constantly at that power, but as they are
> modern 'valves' which are being underrun and water-cooled under very tight
> processor control they are in fact much more reliable.
>
> Lower-powered sites such as Sheffield use Rohde & Schwarz or NEC solid
> state units, one for each mux, duplicated on some sites but not all.
>
> Relays use a single non-duplicated tray unit of 20W or 50W capability
> which, should it fail, is a simple swapout taking about 20 mins, as
> the replacement tray can be programmed before leaving base.
>
> There is no longer any need to go to site for periodic maintenance checks.
> As every transmitter sits on the Arqiva engineering network, an engineer
> can be detailed to do a PM on a site from his office. All parameters are
> visible, including power up and down the aerial, which shows if the outside
> bits are healthy.
>
> Oh what fun it is.
>
> --
> Woody
>
> harrogate3 at ntlworld dot com


"Brian Gaff" wrote in message
news
I was pondering something the other day about the use of bandwidth and how
hard the transmitters had to work.
In the old analogue days where a single channel took up a whole 8 meg
chunk, much of the time being AM the transmitter was not working that
hard, but today as they are multiplexes and tuning over them with an
analogue radio shows the whole 8 megs to be basically a broadband
constant noise, it seems to me that the power circuits and most of the
rest are working flat out all the time.
I do not know if the transmitters themselves were all replaced at each
site or just more added for the extra multiplexes, but if the former I'd
have thought it unlikely they would have been able to cope with that high
power level continuously.
also in the receiver department of tvs, we used to have guard channels so
bleed over did not happen, but nowadays this seems to not be the case as
once again the next channel starts very near where the previous one
stopped. How do they stop massive intermodulation in the tuner and
hence a drop in the quality of the digital stream?
Obviously it works, but it all seems to be counter intuitive.

I assume that the sat channels now do similar things to squeeze channels
in, as well.
People I talk to are however starting to notice some of the weirdnesses
of the Digital system more these days. Strange lack of small movements in
large areas of grass or single colour areas, occasional little triangles
on busy sequences and the terrible quality of many SD channels
apparently.
I also notice a bad tendency toward the audio artefacts one hears in
lossy compressed audio that has been fed through too many encode decode
cycles for their own good.
As I have said before, I'd imagine if every channel was broadcasting a
picture of white noise on a screen then the system just could not cope at
all!
Brian

--
----- -
This newsgroup posting comes to you directly from...
The Sofa of Brian Gaff...

Blind user, so no pictures please!





 



