
Duty cycle and transmitters



 
 
  #1  September 30th 17, 08:46 AM posted to uk.tech.digital-tv
Brian Gaff

I was pondering something the other day about the use of bandwidth and how
hard the transmitters had to work.
In the old analogue days, where a single channel took up a whole 8 meg
chunk and was mostly AM, the transmitter was not working that hard. But
today the channels are multiplexes, and tuning across them with an
analogue radio shows the whole 8 megs to be basically broadband constant
noise, so it seems to me that the power circuits and most of the rest are
working flat out all the time.
I do not know whether the transmitters themselves were all replaced at
each site or whether more were just added for the extra multiplexes, but
if the former, I'd have thought it unlikely they would have been able to
cope with that high power level continuously.
Also, in the receiver department of TVs, we used to have guard channels
so bleed-over did not happen, but nowadays this seems not to be the case,
as once again the next channel starts very near where the previous one
stopped. How do they stop massive intermodulation in the tuner and hence
a drop in the quality of the digital stream?
Obviously it works, but it all seems counter-intuitive.

I assume that the satellite channels now do similar things to squeeze
channels in as well.
People I talk to are, however, starting to notice some of the weirdnesses
of the digital system more these days: a strange lack of small movements
in large areas of grass or single-colour areas, occasional little
triangles on busy sequences, and the apparently terrible quality of many
SD channels.
I also notice a bad tendency toward the audio artefacts one hears in
lossy compressed audio that has been fed through too many encode-decode
cycles for its own good.
As I have said before, I'd imagine that if every channel was broadcasting
a picture of white noise on a screen then the system just could not cope
at all!
Brian

--
----- -
This newsgroup posting comes to you directly from...
The Sofa of Brian Gaff...

Blind user, so no pictures please!


  #2  September 30th 17, 09:36 AM posted to uk.tech.digital-tv
Clive Page

On 30/09/2017 09:46, Brian Gaff wrote:
I was pondering something the other day about the use of bandwidth and how
hard the transmitters had to work.


I'm sure you are right: they must be near their maximum power most of the
time. But in the old analogue days the transmitter power was usually
several hundred kilowatts for main stations, simply because you needed a
signal-to-noise ratio of 100:1 or more to avoid the viewer seeing noise
in the picture. With digital signals the signal-to-noise ratio needed is
much lower, just a few to one, so that the bit error rate is negligible.
Sorry, I don't have time to look up the accurate figures. So typically
the digital transmitter powers are in the tens of kilowatts rather than
hundreds, and I assume it's not too difficult to arrange a bit of a
safety margin so they don't overheat.
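
As a rough sanity check on that (the exact figures here are my
assumptions, not Clive's), the power saving implied by the lower
signal-to-noise requirement works out in a few lines of Python:

import math

def db_to_ratio(db_val):
    """Convert decibels to a linear power ratio."""
    return 10 ** (db_val / 10)

# Assumed figures for illustration only: analogue PAL is often quoted
# as wanting around a 100:1 voltage S/N (roughly 40 dB) for a clean
# picture, while DVB-T with its error correction is usually quoted at
# around 20 dB carrier-to-noise.
analogue_snr_db = 40.0
digital_cnr_db = 20.0

# With the same noise floor and path loss, the required transmitter
# power scales directly with the required S/N as a power ratio.
saving = db_to_ratio(analogue_snr_db - digital_cnr_db)
print(f"implied power saving: {saving:.0f}x")  # -> 100x (20 dB)
# In practice the drop was nearer 10x (hundreds of kW down to tens),
# since coverage planning keeps a margin for fading and interference.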

[big snip]

As I have said before, I'd imagine if every channel was broadcasting a
picture of white noise on a screen then the system just could not cope at
all!


I'm sure that's true, but I guess there is some countermeasure to detect white noise in the incoming stream so that the input feed is switched off when that happens.

--
Clive Page
  #3  September 30th 17, 10:05 AM posted to uk.tech.digital-tv
NY

"Clive Page" wrote in message
...
As I have said before, I'd imagine if every channel was broadcasting a
picture of white noise on a screen then the system just could not cope at
all!


I'm sure that's true, but I guess there is some countermeasure to detect
white noise in the incoming stream so that the input feed is switched off
when that happens.


Presumably there's some override for that, to allow for brief periods of
white noise where it is used for dramatic purposes to signal that a TV has
lost its aerial feed etc. Yes I know that's not how digital TVs would
behave, but it's part of drama mythology about technology, in the same way
that when someone hangs up a phone, the person at the other end hears a
dialling tone (in practice they'd hear silence and then maybe a continuous
tone).

  #4  September 30th 17, 11:36 AM posted to uk.tech.digital-tv
David Woolley

On 30/09/17 09:46, Brian Gaff wrote:
I was pondering something the other day about the use of bandwidth and how
hard the transmitters had to work.


This is not to do with duty cycle.

Analogue systems wasted a lot of power in the carrier, and the rest was
very unevenly spread across the spectrum. As already pointed out, the
total power had to be higher to achieve an adequate SNR.

Designing an analogue transmitter is probably more difficult from a
power point of view because, for example, dark scenes (I think I've got
the modulation sense right for the PAL systems) would require
considerably more power than light ones, and light ones would involve
a power spike at the end of each line.

On a digital multiplex, I imagine that each digital channel is allocated
a fixed bit rate. If that bit rate is too low, the picture will degrade.

Note that digital video codec standards don't specify how to encode the
video, only how to decode it, so it is a decision for the broadcaster and
the codec designer as to how they degrade the picture to meet the bit
rate limitations.

I would say the effect of noise-like sources was factored out before the
signal left the studio, and before it was recorded.

Even if they use adjacent multiplexes, the power density in the
interfering multiplex only needs to be reduced sufficiently that it is
small compared with the eye height on the wanted multiplex. With 64QAM,
that would be maybe 25 dB down.
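
For what it's worth, a figure of that order drops out of the 64QAM
constellation geometry. The following back-of-envelope sketch is my
reconstruction of the reasoning, not David's actual sum:

import math

# 64QAM uses 8 amplitude levels per axis: +/-1, +/-3, +/-5, +/-7.
levels = [1, 3, 5, 7]

# Mean constellation power over equally likely symbols; the I and Q
# axes are independent, so total power is twice the per-axis mean.
mean_power = 2 * sum(l * l for l in levels) / len(levels)  # = 42

# The "eye" half-opening is half the distance between adjacent levels,
# i.e. 1 unit, so its power is 1.
ratio_db = 10 * math.log10(mean_power / 1.0)
print(f"average power over eye half-opening: {ratio_db:.1f} dB")
# -> about 16.2 dB; allow another ~10 dB or so for the interference to
# be genuinely *small* compared with the eye, and you land in the
# region of 25 dB.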

Also remember that the interfering multiplex will be from the same
transmitter, so not a case of a very strong signal blocking a very weak
one, although multiplexes do vary in power.

(The effect of the overload is modelled in determining which
multiplexes, and how much power, can be used at a site.)

Satellite channels alternate the polarisation between adjacent
multiplexes, to allow the adjacent multiplexes to be rejected.
  #5  September 30th 17, 01:35 PM posted to uk.tech.digital-tv
Jim Lesurf

In article , David Woolley
wrote:
On 30/09/17 09:46, Brian Gaff wrote:
I was pondering something the other day about the use of bandwidth and
how hard the transmitters had to work.



Note that digital video codec standards don't specify how to encode the
video, only how to decode it, so it is a decision for the broadcaster and
the codec designer as to how they degrade the picture to meet the bit
rate limitations.


I would say the effect of noise-like sources was factored out before the
signal left the studio, and before it was recorded.


Actually, the signals are then reprocessed with the aim of getting what
information theorists would describe as a reasonably 'efficient' signal
pattern. One of the characteristics of this is that the result shares
many statistical properties with noise. Hence, statistically, it
doesn't - on average - matter much what the individual AV streams contain.
The stats of the output spectrum from the TX remain much the same.
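
That noise-like property is partly engineered in by an energy-dispersal
scrambler. Here is a minimal sketch of the kind of generator DVB uses,
a 15-bit LFSR with polynomial 1 + x^14 + x^15; the seed and tap details
are from memory, so treat them as assumptions to be checked against
EN 300 744:

def prbs_bits(n, seed=0b100101010000000):
    """Yield n bits from a 15-bit LFSR, polynomial 1 + x^14 + x^15."""
    state = seed                            # stage 1 is the MSB
    for _ in range(n):
        out = ((state >> 1) ^ state) & 1    # XOR of stages 14 and 15
        state = (state >> 1) | (out << 14)  # shift, feed the XOR back
        yield out

# XORing this sequence onto the transport stream whitens it: the
# modulator sees roughly equal ones and zeros whatever the individual
# AV streams contain, which is why the TX spectrum stats stay so flat.
bits = list(prbs_bits(100_000))
print(sum(bits) / len(bits))                # -> close to 0.5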


Even if they use adjacent multiplexes, the power density in the
interfering multiplex only needs to be reduced sufficiently that it is
small compared with the eye height on the wanted multiplex. With 64QAM,
that would be maybe 25 dB down.


Erm. Not quite sure what you're referring to here. On the spectra I have
in front of me, taken from our DVB antenna downlead, I can see many
examples of adjacent pairs of muxes whose levels differ by only around 3
to 6 dB. Yet I don't get reception problems, even when receiving the
*weaker* of the pair.

You won't get 'eye patterns' until the multiplex has been through some
stages of decoding - which provides some conversion 'gains'. One of the
advantages of the approach the UK adopted. Not sure off hand what the gain
values would be, but you may be thinking of what you'd get after these
steps, not of the RF mux powers.


Also remember that the interfering multiplex will be from the same
transmitter, so not a case of a very strong signal blocking a very weak
one, although multiplexes do vary in power.


Here I get adjacent channel pairs from the same TX.

What they *do* avoid is having 'triple' muxes - i.e. situations where one
mux has others at the same sort of level on *both* its adjacent sides. I
assume this is because of intermod problems which may arise, which would
then make the first stages of decoding harder.

Jim

--
Please use the address on the audiomisc page if you wish to email me.
Electronics https://www.st-andrews.ac.uk/~www_pa...o/electron.htm
Armstrong Audio http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc http://www.audiomisc.co.uk/index.html

  #6  September 30th 17, 03:11 PM posted to uk.tech.digital-tv
David Woolley

On 30/09/17 14:35, Jim Lesurf wrote:
In article , David Woolley
wrote:

Actually, the signals are then reprocessed with the aim of getting what
information theorists would describe as a reasonably 'efficient' signal
pattern. One of the characteristics of this is that the result shares
many statistical properties with noise. Hence, statistically, it
doesn't - on average - matter much what the individual AV streams contain.
The stats of the output spectrum from the TX remain much the same.


I assumed the subject had shifted to the fact that noise-like video
requires a higher bit rate, so would overload the throughput of the mux.
As you hint, a noise-like signal into the modulator is actually desirable.


Erm. Not quite sure what you're referring to here. On the spectra I have
in front of me, taken from our DVB antenna downlead, I can see many
examples of adjacent pairs of muxes whose levels differ by only around 3
to 6 dB. Yet I don't get reception problems, even when receiving the
*weaker* of the pair.


Are you measuring the adjacent muxes after the IF filters? I was saying
that the IF filters only need to reduce the adjacent mux by this sort of
level, but might have to reduce an adjacent analogue channel much more
to avoid the analogue signal interfering.

You won't get 'eye patterns' until the multiplex has been through some
stages of decoding - which provides some conversion 'gains'. One of the


Eye patterns happen after the analogue and DFT processing, but before
the error correction code processing, so there is no coding gain.

advantages of the approach the UK adopted. Not sure off hand what the gain
values would be, but you may be thinking of what you'd get after these
steps, not of the RF mux powers.


No. Not the RF mux powers, but the powers on the skirts of the IF filters.


Also remember that the interfering multiplex will be from the same
transmitter, so not a case of a very strong signal blocking a very weak
one, although multiplexes do vary in power.


Here I get adjacent channel pairs from the same TX.

What they *do* avoid is having 'triple' muxes - i.e. situations where one
mux has others at the same sort of level on *both* its adjacent sides. I
assume this is because of intermod problems which may arise, which would
then make the first stages of decoding harder.


Obviously, the more input power you have, the more intermodulation there
is, but the biggest intermodulation problems would have arisen from the
spot-frequency carriers of analogue signals. As you point out above,
digital signals are noise-like, so they have a very even power
distribution, so intermodulation products are low and also spread out.

My back-of-envelope calculation was actually based on aliasing in the
FFT that separates out the individual COFDM sub-carriers. The low end
of one multiplex is likely to alias into the high end of the adjacent
multiplex, and you want the aliased sub-carriers not to result in
erroneous slicing. Actually, because of the FEC, you can even accept
some of that. However, I would assume that only becomes an issue when
the eye is degraded by a low overall SNR.
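
A small numerical illustration of that folding (my own, with arbitrary
normalised figures): sampling just fast enough for the wanted multiplex,
a tone sitting just above Nyquist, i.e. at the bottom of the adjacent
mux, lands on the top sub-carrier bins.

import numpy as np

N, fs = 1024, 1024.0   # FFT size; sample rate scaled so bin k sits at k "Hz"
t = np.arange(N) / fs

f_adjacent = 520.0     # just above Nyquist (fs/2 = 512): the low end
                       # of the adjacent multiplex
x = np.cos(2 * np.pi * f_adjacent * t)

spectrum = np.abs(np.fft.rfft(x))
print(spectrum.argmax())  # -> 504, i.e. fs - 520: the interferer
# aliases onto a high sub-carrier of the wanted channel. As noted
# above, the FEC can absorb a few such corrupted carriers, provided
# the overall SNR keeps the eye open.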
  #7  September 30th 17, 03:41 PM posted to uk.tech.digital-tv
Jim Lesurf

In article , David Woolley
wrote:
Are you measuring the adjacent muxes after the IF filters? I was saying
that the IF filters only need to reduce the adjacent mux by this sort
of level, but might have to reduce an adjacent analogue channel much
more to avoid the analogue signal interfering.


I'm reporting the RF levels as seen with essentially a spectrum analyser
attached to the downlead that would feed the RX.


You won't get 'eye patterns' until the multiplex has been through some
stages of decoding - which provides some conversion 'gains'. One of the


Eye patterns happen after the analogue and DFT processing, but before
the error correction code processing, so there is no coding gain.


There can be process gain in the earlier stages. The DFT gives
correlation gain, for example. Methods like Viterbi tend just to shift
the deckchairs about in comparison, IIRC. :-)
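
That correlation gain is easy to see numerically. A quick sketch of my
own, using the 2k-mode FFT size as the example:

import numpy as np

rng = np.random.default_rng(0)
N = 2048                                  # DVB-T "2k mode" FFT size
t = np.arange(N)
tone = np.exp(2j * np.pi * 100 * t / N)   # one sub-carrier, in bin 100
noise = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Per-sample SNR is 1/2 (-3 dB); the DFT integrates the carrier
# coherently, improving the SNR in its bin by about 10*log10(N) = 33 dB.
X = np.fft.fft(tone + noise)
snr_bin = abs(X[100])**2 / np.mean(np.delete(abs(X)**2, 100))
print(10 * np.log10(snr_bin))             # -> roughly 30 dB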


advantages of the approach the UK adopted. Not sure off hand what the
gain values would be, but you may be thinking of what you'd get after
these steps, not of the RF mux powers.


No. Not the RF mux powers, but the powers on the skirts of the IF
filters.


OK. That said, I'm not sure how much 'hardware' filtering synthetic front
ends do these days.

Jim

--
Please use the address on the audiomisc page if you wish to email me.
Electronics https://www.st-andrews.ac.uk/~www_pa...o/electron.htm
Armstrong Audio http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc http://www.audiomisc.co.uk/index.html

  #8  September 30th 17, 03:48 PM posted to uk.tech.digital-tv
Woody

Top posted for Brian.

Right, where do we start?

Taking Emley Moor ITV as an example: in the olden days, pre-DSO, we
had two transmitters of 25 kW each running in parallel. Each
transmitter comprised two klystrons, one for video putting out 25 kW
and one for sound putting out something lower, although I cannot
remember the exact level. Although peak black could be higher, the
peak power was measured on the sync pulse, and the average video level
was typically 10 dB below peak. With an antenna gain of 13 dB and a
feeder loss of 1 dB this gave an e.r.p. of 870 kW peak, or an average
of about 87 kW. The klystrons ran the cathode at -21 kV, which meant
that the anode was very close to or at earth potential, making the
feed much simpler.

DVB, being COFDM, has a pretty constant level when measured by a
typical power meter. The transmitters are IMSMC 10 kW and run
main/standby. Again, with 13 dB antenna gain and 1 dB feeder loss this
gives an e.r.p. of about 174 kW - which is a 3 dB increase on the
original average power.
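
The arithmetic behind those e.r.p. figures is just transmitter power
scaled by the net aerial gain. A quick check (note that the quoted
870 kW and 174 kW imply a net gain nearer 12.4 dB than the round
13 - 1 = 12 dB):

def erp_kw(tx_kw, antenna_gain_db, feeder_loss_db):
    """Effective radiated power from TX power and net aerial gain."""
    return tx_kw * 10 ** ((antenna_gain_db - feeder_loss_db) / 10)

# Analogue: two 25 kW transmitters in parallel, 13 dB gain, 1 dB loss.
print(erp_kw(2 * 25, 13, 1))   # ~792 kW peak (quoted as 870 kW)
# Digital: a single 10 kW COFDM transmitter on the same antenna.
print(erp_kw(10, 13, 1))       # ~158 kW (quoted as about 174 kW)
# Either way, 174 kW against the old 87 kW *average* analogue power
# is a factor of two, i.e. the 3 dB increase mentioned above.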

Yes, the transmitters are running constantly at that power, but as they
are modern 'valves' which are being underrun and are water-cooled under
very tight processor control, they are in fact much more reliable.

Lower-powered sites such as Sheffield use Rohde & Schwarz or NEC
solid-state units, one for each mux, duplicated on some sites but not
all.

Relays use a single non-duplicated tray unit of 20 W or 50 W capability
which, should it fail, is a simple swap-out taking about 20 minutes, as
the replacement tray can be programmed before leaving base.

There is no longer any need to go to site for periodic maintenance
checks. As every transmitter sits on the Arqiva engineering network, an
engineer can be detailed to do a PM on a site from his office. All
parameters are visible, including power up and down the aerial, which
shows whether the outside bits are healthy.

Oh what fun it is.


--
Woody

harrogate3 at ntlworld dot com


"Brian Gaff" wrote in message
news
[Brian's original post quoted in full for top-posting - snipped; see #1]
  #9  September 30th 17, 03:55 PM posted to uk.tech.digital-tv
Woody


"Woody" wrote in message
news
Top posted for Brian.

Right, where do we start.

Taking Emley Moor ITV as an example, in the olden days pre D S O we
had two transmitters of 25KW each running in parallel. Each
transmitter comprised two Klystrons one for video putting out 25KW
and one for sound putting out something lower although I cannot
remember the exact level. Although peak black could be higher the
peak power was measured on the sync pulse and the average video
level was typically 10dB below peak. With an antenna gain of 13dB
and a feeder loss of 1dB this gave an e.r.p. of 870KW peak or an
average of about 87KW. The Klystrons ran the cathode at -21KV which
meant that the anode was very close to or at earth potential making
feed much simpler.

DVB being a COFDM has a pretty constant level when measured by a
typical power meter. The transmitters are IMSMC 10KW and run
main/standby. Again with 13dB antenna gain and 1dB feeder loss this
gives an e.r.p. of about 174KW - which is a 3dB increase on the
original average power.

Yes the transmitters are running constantly at that power but as
they are modern 'valves' which are being underrun and water cooled
under very tight processor control they are in fact much more
reliable.

Lower powered sites such as Sheffield use Rhode and Schwartz or NEC
solid state units, one for each mux, duplicated on some sites but
not all.

Relays use a single unit tray non-duplicated of 20W or 50W
capability which, should they fail, are a simple swapout which takes
about 20 mins as the replacement try can be programmed before
leaving base.

There is no longer any need to go to site for periodic maintenance
checks. As every transmitter sits on the Arqiva engineering network
an engineer can be detailed to do a PM on a site from his office.
All parameters are visible including power up and down the aerial
which shows if the outside bits are healthy.

Oh what fun it is.


--
Woody

harrogate3 at ntlworld dot com


I should have said that a klystron was a ceramic unit about 4 ft tall
and maybe 4-5 inches in diameter; the 'valves' now in use are an
inverted cone in shape, about 18 or so inches deep and maybe 15 inches
in diameter at the widest, with contact rings that supply power etc.
They run off -35 kV, and on stations like Emley there are UPS units (a
whole room full of 'em) that will keep all four of the main
transmitters (PSB1-3 and COM4) going for up to 10 minutes whilst the
backup generator starts.


--
Woody

harrogate3 at ntlworld dot com


  #10  September 30th 17, 04:21 PM posted to uk.tech.digital-tv
Bill Wright

On 30/09/2017 09:46, Brian Gaff wrote:
I was pondering something the other day about the use of bandwidth and how
hard the transmitters had to work.
In the old analogue days, where a single channel took up a whole 8 meg
chunk and was mostly AM, the transmitter was not working that hard. But
today the channels are multiplexes, and tuning across them with an
analogue radio shows the whole 8 megs to be basically broadband constant
noise, so it seems to me that the power circuits and most of the rest are
working flat out all the time.
I do not know whether the transmitters themselves were all replaced at
each site or whether more were just added for the extra multiplexes, but
if the former, I'd have thought it unlikely they would have been able to
cope with that high power level continuously.


Don't forget that because of the much lower carrier/noise requirement at
the receiver, transmission powers are generally significantly lower.

I know it's a crude way to assess this, but I can tell you that if four
analogue channels are removed from the input of a distribution amplifier
and replaced by six muxes at levels that reflect typical relative
transmission powers, then it is possible to increase the gain of the
amp substantially before crossmod and intermod effects occur.
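
That observation squares with how third-order products grow. A rough
sketch using the standard two-tone approximation (the OIP3 figure is a
made-up example, not a measured one):

def imd3_dbc(p_out_dbm, oip3_dbm):
    """Third-order intermod level relative to the carrier, in dBc.

    Standard two-tone approximation: absolute IMD3 power is
    3*Pout - 2*OIP3 dBm, so relative to the carrier it sits at
    2*(Pout - OIP3) dB.
    """
    return 2 * (p_out_dbm - oip3_dbm)

# Hypothetical distribution amp with a +30 dBm output intercept point:
for p_out in (0, 5, 10):
    print(f"{p_out} dBm out -> IMD3 at {imd3_dbc(p_out, 30)} dBc")
# Every extra dB of level costs 2 dB of intermod margin, so replacing
# four high-level analogue carriers with six lower-level muxes frees
# up a lot of usable gain before anything untoward appears.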

Also, in the receiver department of TVs, we used to have guard channels
so bleed-over did not happen, but nowadays this seems not to be the case,
as once again the next channel starts very near where the previous one
stopped. How do they stop massive intermodulation

You mean crossmod

in the tuner and hence a drop in
the quality of the digital stream?

It is remarkable how selective most receivers seem to be. Problems only
arise when the unwanted adjacent signal is stronger than the wanted one.
Stronger? Oh, anything from 6 to 20 dB before anything untoward happens.

Obviously it works, but it all seems to be counter intuitive.

I agree, but our intuition is based on outdated and obsolete parameters.

Bill
 



