Standby power supplies for signalling or signalling centres



I don't think the size of the control area in itself has a big effect on the severity of this type of incident, except perhaps if the centre itself is hit, which doesn't seem to have been the case this time and probably won't ever be for lightning strikes. If the problem is somewhere out on the network it doesn't matter too much whether the controlling panel is 100 yards or 100 miles away, and a local emergency panel wouldn't protect against damage to the trackside equipment and cabling - only against loss of communication to and from the main panel.

 

A traditional manual signal box might be less prone to this sort of problem as there is less to go wrong and a signaller present to implement emergency working (if still fit for duty after the incident).  But there are other causes such as a lone signaller going sick or (as in one recent case) getting locked in the toilet.  The loss of one box on a route can block it nearly as effectively as the loss of a more modern control system affecting the whole line.  Chain, weak link and all that. 

Edited by Edwin_m

True, but when the controlling panel is far distant, there are many intermediate locations between it and the infrastructure being controlled, each with its own set of vulnerabilities.

 

There's seldom, if ever, a single long cable linking them directly, and the greater the number of connections, the greater the chance of one being severed.

 

John

Edited by Dunsignalling

Surely modern control centres are linked to area/local nodes by meshed fibre connections, which are pretty resilient to lightning strike.

Correct. And before the fibre "networks", as now, we had diversely (duplicate) routed control circuits to provide resilience. However, then as now, that wasn't a total guarantee against equipment failure causing disruption from time to time, but resilience has improved significantly over the last few years.

 

Regards, Ian.


Surely modern control centres are linked to area/local nodes by meshed fibre connections, which are pretty resilient to lightning strike.

 

Lightning (or several million volts) is quite capable of frying fibre optic cable if it hits too close (hint: glass/plastic melts under high heat conditions).

 

However, usually the problem is not the fibre links from the control centre - it's the trackside transmission rings, which all use ordinary copper cabling between location cases etc. If lightning hits the ground in close proximity to those it can nuke stuff over a very large area.


Lightning can do quite a lot of damage through locally raising the earth potential. Normally that is not too much of a problem, but where railways are involved, the conductivity of the rails can result in some interesting potential differences between conductors that are locally earthed and others that are remotely earthed.
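As a very rough illustration of that earth potential rise (a sketch only: it assumes a point strike into uniform soil and uses purely made-up numbers, nothing measured on a real railway):

```python
# Rough back-of-envelope sketch of earth potential rise near a lightning strike.
# Assumes a point current injection into uniform soil (half-space model);
# all numbers are illustrative, not real railway values.

import math

def ground_potential(current_a, resistivity_ohm_m, distance_m):
    """Potential (volts, relative to remote earth) at distance_m from the strike point."""
    return resistivity_ohm_m * current_a / (2 * math.pi * distance_m)

strike_current = 30_000   # A, a typical-order lightning return-stroke peak
soil_resistivity = 100    # ohm-metres, assumed uniform soil

# A location case earthed 20 m from the strike, versus equipment referenced to
# a "remote" earth 2 km away (e.g. via the rails or a long cable screen).
v_local = ground_potential(strike_current, soil_resistivity, 20)
v_remote = ground_potential(strike_current, soil_resistivity, 2_000)

print(f"Local earth rises to ~{v_local/1000:.0f} kV")
print(f"Remote earth rises to ~{v_remote/1000:.1f} kV")
print(f"Potential difference across the cable ~{(v_local - v_remote)/1000:.0f} kV")
```

Even with these illustrative figures the difference between the locally and remotely earthed ends comes out in the tens of kilovolts, which is the "interesting" part.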

 

Jim


On a related topic, a colleague and I were working back from Bristol earlier this week when our attention turned to the exhaust outlets for what we presume were the standby generators for the now defunct Bristol Power Box. We were mulling over exactly what power units lurked inside the said building and whether it would turn out to be a treasure trove of Paxman, Maybach or similar.

 

Does anybody happen to know what engines they were?


Can’t help with any details but I do remember that the standby generator on the ground floor at Reading Panel made a pretty impressive sound when the S&T dept. performed their routine test on it (usually at around 0100 on a Sunday).


To go back to the original question about what could/should be done, there is clearly a difference between protecting existing installations and what is done in good practice for new installations. I've been involved with several new MRT systems and in general the following will apply.

 

Signalling centres and local signalling rooms are usually fed from a system-wide power supply system. This will take power from one or two bulk substations (BSS) that convert power from the local electricity authority at high voltage into a 22kV or 24kV distribution network. Each BSS will be fed by two incoming feeds taken from different parts of the power supply network in case one network is knocked out.

 

The railway system power supply system is duplicated (cables either side of the railway) in case one side is broken.

 

These power supplies feed into duplicate transformers at locations using power. Each transformer is sized to take the full load plus a margin, and both are normally in service together. Essential power supplies are distributed locally on a system that is separate from the 'domestic' use.
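As a crude sketch of why that layering of duplicated feeds, cables and transformers matters (it assumes independent failures, which a big storm can break, and uses made-up unavailability figures with no connection to any real installation):

```python
# Crude sketch: chance that a doubly-redundant supply chain is lost entirely.
# Assumes failures are independent and uses made-up unavailability figures.

p_incomer = 0.01       # chance one incoming grid feed is unavailable at any moment
p_cable = 0.005        # chance one side of the duplicated distribution ring is down
p_transformer = 0.002  # chance one of the duplicated transformers is down

# Both of a duplicated pair must fail for that stage to fail.
p_stage_incomers = p_incomer ** 2
p_stage_cables = p_cable ** 2
p_stage_transformers = p_transformer ** 2

# The supply is lost if any stage is lost (a series chain of parallel pairs).
p_supply_lost = 1 - (1 - p_stage_incomers) * (1 - p_stage_cables) * (1 - p_stage_transformers)

print(f"Unavailability of the duplicated chain: {p_supply_lost:.2e}")
print(f"Versus a single, non-duplicated chain:  "
      f"{1 - (1 - p_incomer) * (1 - p_cable) * (1 - p_transformer):.2e}")
```

With these illustrative numbers the duplicated chain is roughly two orders of magnitude less likely to be lost than a single chain.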

 

To cover the case of both transformers failing, a back-up diesel generator set is provided.

 

To smooth the power and keep supplies available while standby systems kick in, each critical system will have an associated Uninterruptible Power Supply (UPS) - i.e. a battery back-up. Again, for critical systems these will be duplicated.
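A minimal sketch of the bridging arithmetic behind that (illustrative load, start time and battery figures, assuming a constant load and ignoring inverter losses and battery derating):

```python
# Minimal sketch of UPS bridging: the battery only has to carry the load from
# mains failure until the standby generator starts and takes over, plus margin.
# All figures are illustrative assumptions, not from any real installation.

load_kw = 20.0            # critical load carried by the UPS
genset_start_s = 60.0     # assumed time for the diesel set to start and accept load
design_margin = 10.0      # design for 10x the expected bridging time

bridge_time_h = genset_start_s * design_margin / 3600.0
energy_needed_kwh = load_kw * bridge_time_h

battery_voltage = 110.0   # assumed nominal DC bus voltage
usable_fraction = 0.5     # only draw the battery down part-way to preserve life

capacity_ah = energy_needed_kwh * 1000.0 / (battery_voltage * usable_fraction)
print(f"Bridging time designed for: {bridge_time_h * 60:.0f} minutes")
print(f"Energy to be supplied:      {energy_needed_kwh:.1f} kWh")
print(f"Battery capacity needed:    ~{capacity_ah:.0f} Ah at {battery_voltage:.0f} V")
```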

 

Lightning protection will be built into the civil design along the entire line. Essentially this will be continuous conductors placed both sides of the route connected to a series of earthing rods at frequent intervals. When lightning strikes the protection conductor it is rapidly dissipated to earth through the multiple earthing rods. With this protection in place, damage to sensitive equipment is usually confined to cases where the earthing arrangements have been disturbed.
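For a feel of why earthing rods at frequent intervals help, a simplified sketch (it uses the standard single driven-rod formula, ignores mutual resistance between rods so the parallel figure is optimistic, and the rod dimensions and soil resistivity are assumptions):

```python
# Simplified sketch of earth electrode resistance: one driven rod versus many
# rods in parallel along the route. Single-rod formula:
#   R = rho / (2*pi*L) * (ln(8L/d) - 1)
# Mutual resistance between rods is ignored; numbers are illustrative only.

import math

def rod_resistance(resistivity_ohm_m, length_m, diameter_m):
    return resistivity_ohm_m / (2 * math.pi * length_m) * (math.log(8 * length_m / diameter_m) - 1)

rho = 100.0  # ohm-metres, assumed soil resistivity
single = rod_resistance(rho, length_m=2.4, diameter_m=0.016)

for n_rods in (1, 5, 20, 100):
    combined = single / n_rods   # ideal parallel combination, no mutual coupling
    print(f"{n_rods:3d} rod(s): ~{combined:6.1f} ohms to earth")
```

The more parallel paths to earth, the lower the overall resistance and the less the local ground potential rises for a given strike current.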

 

It would cost a fortune to bring the UK rail network into line with this.

I'm assuming that you are referring to signalling supply points (SSPs), where we also supply conductor rails. Some SSPs have 25kV supplies as secondary supplies, many have diesel alternators. UPSs were introduced as a SPAD mitigation measure to cover the blackout that occurs between mains failure and the secondary supply kicking in.


Specifically referring to power supply systems - for traction and system power - on modern metros: any resemblance to current UK practice will be coincidental.

 

The comments about duplicated transformers relate to the domestic supply at each station/control centre. Traction transformer rectifier units in my experience are not duplicated at each location as there is enough resilience in a well designed system to cope with loss of a transformer at one site. Transformers for traction are always independent of transformers for station/control centre/depot supplies in a well designed system.


I remember visiting Westbury Panel signal box before it opened (father worked for Westinghouse) and being shown round a room full of batteries. If I recall correctly these were the last resort, if the mains and the generator had failed. That would have been 1984 (I think).

The bank of batteries would probably be the DC supply of 130 volts nominal (B130 in S&T talk) used to operate the motorised point machines local to the signal box.

Edited by Pandora

On a related topic, a colleague and I were working back from Bristol earlier this week when our attention turned to the exhaust outlets for what we presume were the standby generators for the now defunct Bristol Power Box. We were mulling over exactly what power units lurked inside the said building and whether it would turn out to be a treasure trove of Paxman, Maybach or similar.

 

Does anybody happen to know what engines they were?

 

Bristol panel is not quite defunct.  It still controls from between Parson St and Nailsea to the boundary with Exeter panel just short of Cogload Jn.


One of the big problems with the railway standby systems is that they are regarded as fit and forget, which is not the way to treat them. Having been a DC power man for BT, I have seen the contrast between the two ways of looking after supplies.

In BT's case each exchange is provided with an N+1 rectifier system (i.e. the number of rectifiers you need to cover the load, plus an extra one to cover for failures and for battery recharge after a mains fail), supplying a number of battery 'strings', which are designed to cover the load for 4 hours before the battery volts drop below 46V DC (float being 54.5V for a nominal 50V supply). Each exchange has a standby alternator set, again sized to cover the load with at least 30% headroom (it's often much more at the rural exchanges). The power group go round (or certainly used to!) and do engine runs at least 6-monthly, where the mains is killed and the engine set allowed to take the exchange load, and then three or four 3kW fan heaters are added to the genny to give a much harder load test.

The battery strings are inspected at the same time, and any set of cells that is over 5 years old is replaced with new ones. The rectifiers are pretty much bullet-proof, and get replaced on failure (there are remote alarms back to the Network Operating Centre for most bits of telecoms kit), but as there is an extra one in the power plant the exchange won't go off the air, and indeed the batteries can take up the slack as well.
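A minimal sketch of that N+1 and 4-hour-autonomy arithmetic (the exchange load and rectifier rating are illustrative assumptions, the load is treated as constant, and no capacity derating is applied):

```python
# Minimal sketch of BT-style DC plant sizing: N+1 rectifiers plus a battery
# sized for 4 hours' autonomy on a nominal 50V system. Figures are illustrative.

import math

exchange_load_a = 400.0      # assumed steady exchange load on the 50V bus, amps
rectifier_rating_a = 100.0   # assumed rating of one rectifier module, amps
recharge_allowance_a = 80.0  # extra current allowed for recharging the battery after a mains fail

# N rectifiers must carry the load plus recharge current; fit one more for N+1.
n = math.ceil((exchange_load_a + recharge_allowance_a) / rectifier_rating_a)
n_plus_1 = n + 1

# Battery sized to hold the load for 4 hours before reaching the 46V end voltage.
autonomy_h = 4.0
battery_ah = exchange_load_a * autonomy_h   # crude amp-hour figure, no derating

print(f"Rectifiers needed (N):   {n}")
print(f"Rectifiers fitted (N+1): {n_plus_1}")
print(f"Battery capacity:        ~{battery_ah:.0f} Ah across the strings")
```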

 

Now on the railway, genny tests do take place, and there are UPSs, but from the little info I've gleaned since being on the railway, the batteries in the UPSs aren't routinely changed every 5 years like on BT, and as UPS batteries actually work much harder than those in proper DC power systems, I find that surprising. There is also a surprising number of genny tests that fail for one reason or another, and these failures usually cause the UPSs to fail in a domino effect (one caught fire not that long ago near here!).

Manual boxes aren't supplied with any engine sets; there are batteries, but they look like carriage batteries, and although float charged they don't seem to get any maintenance at all, which is strange.

REBs (those portable equipment buildings you see at the lineside) again don't seem to have gennies, and presumably also have UPSs, but again the UPS often fails when being tested....

 

I often wonder if anyone does UPS load calculations when new gear is added to existing sites, as I certainly used to do load checks with a clamp meter when adding kit, and it was surprising the number of places I got to that had loads on UPSs far in excess of what they should have been.....
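In the same spirit, a trivial sketch of the sort of check that means (the clamp-meter reading, UPS rating and 80% planning limit are all made-up illustrative figures):

```python
# Trivial sketch of a UPS loading check when adding new kit. All numbers are
# made up for illustration; a real check would also consider inrush, power
# factor and the battery autonomy at the new load.

ups_rating_va = 3000.0   # nameplate rating of the UPS
measured_load_a = 9.5    # clamp-meter reading on the UPS output
new_kit_load_a = 2.0     # estimated extra current for the kit being added
supply_voltage = 230.0

total_va = (measured_load_a + new_kit_load_a) * supply_voltage
loading_pct = 100.0 * total_va / ups_rating_va

print(f"Projected load: {total_va:.0f} VA ({loading_pct:.0f}% of rating)")
if loading_pct > 80:
    print("Over the usual ~80% planning limit - bigger UPS or shed load first.")
```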

 

Andy G


Hi Andy,

 

My railway career has spanned a tad short of 45 years, 40 of which have been on the "T" of S&T, and I couldn't agree more with what you say. The BR 'Strowger' telephone exchanges always followed GPO/BT practice in just about everything - even the N+1 rectifiers and batteries. And yes, we used to "load test" the exchange batteries - which was good for overtime, as from start to finish it would typically take 12 - 15 hours.

 

We also had standby batteries at all the radio installations (CSR/NRN etc.). The 'bean counters' were, as always, reluctant to spend money on renewals, as they had been fed the "fit and forget" line by the hierarchy saying the batteries would last a minimum of 10 years, if not longer! In Railtrack days, after a succession of failures where the standby batteries couldn't take the load for more than a few minutes instead of the design 12 hours (which resulted in substantial penalty payments to the TOCs for train delays), the penny finally dropped with the powers that be that we engineers were perhaps offering sound advice after all, and that we needed a regular regime of battery replacements on operationally critical command and control systems. Although most manufacturers/suppliers stated a working life of 7 years for their batteries, we actually managed to persuade our bosses to adopt a replacement programme of every 5 years.

 

As I've said elsewhere on RMweb, I stick by sound advice given to me when I started my railway career all those years ago (and it applies equally at home as well as at work), "look after your batteries and your batteries will look after you"!

 

Regards, Ian.

Edited by iands

I've just had a conversation with my brother who lives in York. He has a friend who is fairly senior in Network Rail, and apparently the signalling chaos a week ago was due to a lightning strike taking out York ROC. He obviously doesn't know the technical details of what was hit/fried etc. He lives about a mile from the station and his router was fried. He also had some interesting tales of being hit by lightning when he was a pilot for BOAC then BA. He reckons he had between 6 and 8 strikes during his career but no serious damage apart from a cracked fibreglass nose cone and a few bent panels. Apparently they had more problems with hail, which once took out the leading edge flaps on a jumbo.

 

 

Jamie


York ROC isn't very lucky; in its first year of use the basement flooded, taking out lots of comms equipment. Strange, as York doesn't have much of a history of flooding.....

 

Andy G



According to my brother an awful lot of people in York had their routers destroyed in the same storm. Many of them are still waiting to get replacements as the sudden demand has overwhelmed the ISP's supply chain. My brother was lucky as he used his own router and just went out and bought a new one the same day.

 

 

Jamie


"Strange as York doesn't have much of a history of flooding....."

 

Are you referring to just the station area, or York in general? If the latter, then I'm afraid your comment is somewhat wide of the mark.

 

Regards, Ian.


Lightning is a tricky beast to protect against. The NTE on your phone line used to have a gas discharge tube in it, but invariably there was no earth connected to allow the discharge to dissipate to earth, and it appears these haven't been fitted for some time now, so there is effectively no lightning protection at the subscriber's end.

I have my phone line connected onwards to a Box connection 301, and in the strips 237A I have gas discharge tubes plugged in, but most importantly I have a decent earth connection, which has so far protected my equipment.

 

There appears to be a whole range of telephone line surge/lightning protectors on the market, all quite costly, which is presumably why no-one ever fits them!

 

Andy G

Edited by uax6

"Strange as York doesn't have much of a history of flooding....."

 

Are you referring to just the station area, or York in general? If the latter, then I'm afraid your comment is somewhat wide of the mark.

 

Regards, Ian.

I took that in the ironic/sarcastic way it was meant!

