
Standby power supplies for signalling or signalling centres


imt


I am not trying to start a polemical attack on RailTrack - this is meant to be a sensible question.

 

Looking at the news, once again we have a situation (full details unknown) where signalling centres and signalling have been affected by extreme weather. Given that this is now the new normal, what can/could/should be done to prevent these problems? We DO still get air traffic system failures - but they are rare nowadays. In my day of safety-critical systems, everything had dual segregated power supplies, batteries or standby generators.

 

Is this a financial problem for the railways or are there practical problems that the uninitiated like me simply don't see or understand?


If you're referring to the signalling problems in the Leeds area today, I'm not sure we even know yet what the lightning struck and whether a backup power supply would have helped.

 

I'm sure someone else can add to the general detail of whether Network Rail's signalling centres have backup supplies in the first place.


Air traffic control doesn't have electronic infrastructure on the ground on the same scale as railways. Railways have track circuits, signals, axle counters and location cabinets, all attached to the ground (earthed) and all a path of least resistance. If any of these gets struck by lightning it will cause signalling failures which no battery or generator will help with, as the whole circuit gets the full discharge from the lightning strike. The disruption can last for a few minutes or the strike could start fires.


I remember visiting Westbury Panel signal box before it opened (father worked for Westinghouse) and being shown round a room full of batteries. If I recall correctly these were the last resort, if the mains and the generator had failed. That would have been 1984 (I think).


Most signalboxes have a UPS (uninterruptible power supply) and/or a generator fitted; this keeps the box going.

 

All SSI installations have a Furse unit (a combined fuse and surge protection unit); this limits the damage a lightning strike does.

 

It's usually the on-the-ground kit that suffers, or if you lose both of the SSI data links then it's game over. All the equipment is designed to confine a fault to a smaller area and prevent too much damage.
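Purely as an abstract illustration of that dual-data-link point (a toy sketch with invented names, not the real SSI protocol), the supervision logic amounts to something like this:

```python
from dataclasses import dataclass

@dataclass
class DataLink:
    name: str
    healthy: bool  # whether this link is currently passing telegrams

def interlocking_state(links: list[DataLink]) -> str:
    """Toy supervision of a duplicated data link: one healthy link keeps
    things running (degraded); only losing every link is 'game over'."""
    healthy = [link for link in links if link.healthy]
    if len(healthy) == len(links):
        return "NORMAL"
    if healthy:
        return f"DEGRADED (running on {healthy[0].name} only)"
    return "FAILED - both data links lost"

# Example: link A struck, link B still up -> degraded but alive.
links = [DataLink("link A", healthy=False), DataLink("link B", healthy=True)]
print(interlocking_state(links))   # DEGRADED (running on link B only)

links[1].healthy = False
print(interlocking_state(links))   # FAILED - both data links lost
```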

 

If you get repeated strikes, then you get the most problems

 

Dan


I am not trying to start a polemical attack on RailTrack - this is meant to be a sensible question.

 

Is this a financial problem for the railways or are there practical problems that the uninitiated like me simply don't see or understand?

I'm guessing that the lightning hit lineside equipment and not the signalling centre in York, or there would be even more disruption?


Even control centres that are provided with UPS and diesels are not totally immune to power supply outages. I know of two instances, neither on NR, of major control centres having both of these backups losing power to their control systems due to very exotic types of failures. The probability of such a loss becomes exceedingly low once UPS, then diesel, are added to probably two 'mains', but it can never be driven down to absolute zero.
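For a rough feel of what "exceedingly low but never zero" means, here is a back-of-envelope sketch with invented per-demand failure probabilities, assuming the layers fail independently (which, as the exotic failures above show, they never entirely do):

```python
# Back-of-envelope only: invented figures, not real reliability data.
p_one_mains_lost        = 1e-3   # one incoming mains feed unavailable
p_both_mains_lost       = p_one_mains_lost ** 2   # two independent feeds
p_ups_fails             = 1e-2   # UPS fails to carry the load
p_diesel_fails_to_start = 1e-1   # standby diesels are the usual weak link

# Total loss needs every layer to fail at once (independence assumed).
p_total_loss = p_both_mains_lost * p_ups_fails * p_diesel_fails_to_start
print(f"P(total supply loss on demand) ~ {p_total_loss:.0e}")  # ~1e-09
```

Any common-cause event (a fire in the supply room, a cable route severed, or indeed lightning) bypasses the multiplication entirely, which is why the figure never reaches zero.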


I am not trying to start a polemical attack on RailTrack - this is meant to be a sensible question.

 

Looking at the news, once again we have a situation (full details unknown) where signalling centres and signalling have been affected by extreme weather. Given that this is now the new normal, what can/could/should be done to prevent these problems? We DO still get air traffic system failures - but they are rare nowadays. In my day of safety-critical systems, everything had dual segregated power supplies, batteries or standby generators.

 

Is this a financial problem for the railways or are there practical problems that the uninitiated like me simply don't see or understand?

In all my railway career (45 years and counting) we've had some form of power back-up arrangements at PSBs/major Signalling Centres - how you use and configure it dictates how effective it will be. That said, an awful lot more has been done in the last few years in this area, once the "penny dropped" with the hierarchy on the correct way to mitigate "power failures".

 

I must admit I had to look up "polemical", but Railtrack hasn't been in existence since 2002.

 

You may be jumping to a conclusion in assuming that a lightning strike has disrupted the power supply to cause the signalling problems. The lightning strike could have hit many items of lineside kit to cause the disruption to the signalling systems and left the power supplies still intact (not unknown).

 

Regards, Ian.


Excellent reply from iands.

 

Network Rail, and before them BR and Railtrack, have had to spend large amounts of money on providing standby supplies for signalling locations, not so much due to the direct risk from bad weather but to protect railway operations from loss of incoming supply from the power companies. UPS is a later development to ensure a seamless transition from incoming mains to standby supply. Lightning strikes can, and have today, caused major signalling failures; it is easy to say that equipment must be protected from such events, but if that was possible it would surely already have been done, given the disruption (and cost) to the railway?


In a past life as an alarm engineer the firm I worked for supplied the burglar alarm system protecting the equipment rooms at the bottom of several police radio stations.

This was very much in the days of the Northern Ireland troubles.

The masts were, for obvious reasons, on hilltops and prone to lightning strikes, one duly taking a direct hit one day and I was despatched to assess the damage.

If I recall correctly the police radio technician said it had blown all his equipment as well as the fire service radio which was in the same building.

On opening the alarm control panel I was greeted by a charred circuit board and partially melted cables. Not only that but every detector on the system was the same as well as all the cables in the steel conduits around the walls of the building.

We had to replace the entire system including all cables. Mother nature can be far more destructive than, I think, many people really understand.


Doesn't matter if you've got twenty backup power supplies it won't make one iota of difference if the external plant takes a lightning strike.  Lightning plays by different rules and as others have indicated even the most sophisticated surge and protection equipment doesn't guarantee immunity.  I've seen equipment protected with everything known to man reduced to a smouldering pile of scrap by a lightning strike. 


I'm with Great Central. We can easily buy, for not very much, a multi-way socket that includes a lightning protector. Yeah, right. Several zillion volts being rebuffed nobly by a few pence-worth of electronics. I'm no scientist but, really.... So the best back-up system known to physics is a bit useless if the lightning lands on the interlocking itself, or a route into it.

 

When I was involved - as a user - in the design of signalling systems nearly 40 years ago, we knew we could virtually operate one from the moon. But resilience under damaged conditions is very different. I think Oxted IECC interlocking had been in use a matter of months before a lightning strike knackered it and the railway it enabled. So mega-boxes, which make operation cheaper on good days, become a liability on those days when the system can't play. In the '70s, schemes included local control panels, which could be invoked if the remote control were lost. The signalman in the controlling box would ask by phone for a route to be set and bingo. I did a check on the Lewisham panel in the early '80s and it worked fine - but how often did it need to be used? So these were not pursued in later schemes.

 

Given NR's need to spread the margarine as thinly as it dares, the bad days are infrequent, kind-of predictable if regrettable. But gold-plating costs.


I have seen what lightning can do and how indiscriminate it can be against both protected systems and those that are unprotected. One day it will leave something alone and the next it will turn it to charcoal. I have seen lightning strike a pole route with exposed copper wires and all the equipment still work perfectly afterwards. I have watched the arresters under the box take a hit. Believe me, I wasn't so close for the next strike. We are told lightning strikes the highest object. It doesn't. I've moved away from an overhead electrification stanchion only to see a telegraph pole struck thirty yards away. Whatever we invent, if lightning decides it's going through something, it will go through it, whatever protection arrangements are in place.


A few years ago a lightning strike hit Belford church tower, dislodging quite a bit of masonry, and bounced off on to the nearby NRN tower taking out the basestations, power supplies, cables etc. Took a few days to get it all restored.

 

Lightning will seek out the path of least resistance to earth

 

Regards, Ian.


But do we actually know what has been affected by a lightning strike? There is the obvious evidence of the aforementioned charcoal in place of a relay or other piece of equipment but sometimes a piece of equipment just ceases to work with no obvious damage.

 

What effect do the 'leaders' in this video have on sensitive solid-state equipment?

 

 

They exist more than tenfold for every lightning strike. What are they? If they meet up and create a path, then whatever they are flowing through has had it. But what of one passing through a circuit board for the short period that they exist?

 

Some people describe them as harmless. I for one would take a step back if someone asked for volunteers to step forward to test if they were harmless.


To go back to the original question about what could/should be done, there is clearly a difference between protecting existing installations and what is done in good practice for new installations. I've been involved with several new MRT systems and in general the following will apply.

 

Signalling centres and local signalling rooms are usually fed from a system-wide power supply system. This will take power from one or two bulk substations (BSS) that convert power from the local electricity authority at high voltage into a 22kV or 24kV distribution network. Each BSS will be fed by two incoming feeds taken from different parts of the power supply network in case one network is knocked out.

 

The railway system power supply system is duplicated (cables either side of the railway) in case one side is broken.

 

These power supplies feed into duplicate transformers at locations using power. Each transformer is sized to take the full load plus a margin, and the two are usually used together. Essential power supplies are distributed locally on a system that is separate from the 'domestic' use.

 

To cater for both transformers failing, a back-up diesel generator set will be used.

 

To smooth the power available and keep supplies available while standby systems kick in, each critical system will have an associated Uninterruptible Power Supply (UPS) - ie a battery back-up. Again for critical systems these will be duplicated.
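As a rough illustration of how such a UPS might be sized (entirely hypothetical figures, not taken from any real scheme), the battery only has to carry the essential load for the generator start time plus a margin:

```python
# Entirely hypothetical figures, for illustration only.
critical_load_kw = 20.0   # assumed essential signalling load
diesel_start_s   = 60.0   # assumed time for the standby set to start and take load
margin_factor    = 10.0   # allow for several failed start attempts / slow changeover

autonomy_s   = diesel_start_s * margin_factor           # 600 s of battery time
required_kwh = critical_load_kw * autonomy_s / 3600.0   # energy = power x time
print(f"Battery energy needed: {required_kwh:.1f} kWh")  # ~3.3 kWh
```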

 

Lightning protection will be built into the civil design along the entire line. Essentially this will be continuous conductors placed both sides of the route connected to a series of earthing rods at frequent intervals. When lightning strikes the protection conductor it is rapidly dissipated to earth through the multiple earthing rods. With this protection in place, damage to sensitive equipment is usually confined to cases where the earthing arrangements have been disturbed.

 

It would cost a fortune to bring the UK rail network into line with this.


Presumably tools like FMEA, FTA and ETA are used to analyse the dependability of the arrangements and understand the failure modes and consequences? One of the big mistakes a lot of people make in systems where high dependability is essential is thinking the answer is redundancy, without considering common-mode failures, which negate all the benefits of redundancy.
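A crude beta-factor sketch (invented numbers, purely to illustrate the common-mode point) shows how even a small common-cause fraction quickly dominates the naive "square the single-channel probability" estimate:

```python
# Invented numbers, just to show the effect of common-cause failures.
p_channel = 1e-2   # probability a single supply channel fails on demand
beta      = 0.05   # fraction of channel failures that are common-cause

p_naive        = p_channel ** 2                  # assumes full independence
p_independent  = (p_channel * (1 - beta)) ** 2   # both channels fail separately
p_common_cause = p_channel * beta                # one event takes out both
p_dual_channel = p_independent + p_common_cause

print(f"naive redundant estimate : {p_naive:.1e}")         # 1.0e-04
print(f"with common-cause term   : {p_dual_channel:.1e}")  # ~5.9e-04
```

Once the common-cause term is included, adding ever more parallel channels buys almost nothing; attacking the shared causes (routes, environment, maintenance errors) matters more.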

 

A few years ago I was tasked with making a recommendation to my then employer (one of the big household energy suppliers) on whether or not to provide energy to secure data centres. When I looked at it, the levels of dependability being demanded were pretty much 100%, with horrendous liquidated damages liability, and yet the profit to be made was nothing to get excited about. My recommendation, which was accepted, was not to go near it with a ten-foot pole, as there were alternative opportunities to make more at lower risk.


As an aside to this I once watched back up supplies in action. It was in 1972 during a series of strikes in the Electrical industry with rotating power cuts ongoing, some of which were predictable, some not. I was part of a tour group being shown round Trent power box. Part way through the tour, Loughborough had a power cut and with it the remote interlocking. Within seconds, IIRC, lights started to come back on on the panel and our guide said that the standby generator had kicked in as it was supposed to. I think that they had to manually reset the train describers with headcodes. All in all it was very impressive to me as a 19 year old student.

 

 

Jamie


As an aside to this I once watched back up supplies in action. <snip>

Jamie

 

 

Most of the ships I was on had standing instructions to the engineers that if the ship blacked out we were not to interfere: let it black out and let the power management system recover it. The automated system was far faster and more effective than a well-meaning human starting to finger-dab to save the day. The lights were restored pretty much immediately and then the system ramped up all the key systems, started main generators etc.


Panel Boxes etc do have stand-by generators, but the huge areas covered by modern control centres mean there are dozens, if not hundreds, of intermediate installations that are vulnerable to lightning etc. in addition to the "big house" itself. Everywhere there used to be a "box" there will be a cabin or a container full of the gubbins that enables its former business to be conducted from a hundred miles away.

 

Ever reducing staff levels "on the ground" also ensure that there is unlikely to be anyone "handy" to an incident to institute any alternative working arrangements when the worst does happen. In short, over great swathes of the network, even when a 'Plan B' is feasible, it may take some considerable time to put it into action.

 

Away from main centres you will be very fortunate if your "local" MOM is within a half hour's drive when anything kicks off. In many places, it can be easily double that. Many of them cover large territories and if he/she is at one extremity when a failure occurs at the other (Murphy's Law) there might be eighty miles to cover before they can do any good. Ditto S&T and PWay gangs.

 

The essential problem is that, the longer the chain of communication becomes, the further away any source of assistance is likely to be when a link does break.

 

John


Panel Boxes etc do have stand-by generators, but the huge areas covered by modern control centres mean there are dozens, if not hundreds, of intermediate installations that are vulnerable to lightning etc. in addition to the "big house" itself. <snip>

John

That reads almost like what I remember being reported about the big signalling outage on the GWML not long ago. IIRC some, or a, circuit board had gone and the nearest replacement was at Doncaster and there weren't sufficient qualified personnel available nearby. I seem to remember reports of someone driving from Didcot to Doncaster to get spare parts.

 

Jamie


To go back to the original question about what could/should be done, there is clearly a difference between protecting existing installations and what is done in good practice for new installations. I've been involved with several new MRT systems and in general the following will apply.

 

<snip>

 

It would cost a fortune to bring the UK rail network into line with this.

 

Thanks very much for the detailed thoughts - which is of course what I was after.  Presumably I didn't ask the question very well - for which I apologise.

 

I was concerned about the post-hoc cost - however, if this REALLY IS the new normal for summer in the UK, we are going to have to look at what needs doing and prioritise the requirements.

 

Thanks again.


Ever reducing staff levels "on the ground" also ensure that there is unlikely to be anyone "handy" to an incident to institute any alternative working arrangements when the worst does happen. <snip>

John

 

I can see your point.  My wife was a senior manager in British Gas when it was sold to "Sid".  The prediction then was that safety and maintenance would go to the wall in favour of profits to shareholders.  Her response when I asked her about it was that of Mr Urquhart - "You may think that, you may well think that - but I couldn't possibly comment".  She and a large number of her colleagues were provided with well padded "redundancy" not long after.


My experience is that the big issue with changeover and standby systems is the response of the loads to an event. In a large system, with many loads, it can be very challenging to ensure that no key load drops out and fails to reset.
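A toy sketch of the kind of staged restoration that helps with this (hypothetical loads and timings, not any real railway sequence) brings loads back in priority order with a hold-off between stages, so the standby supply never has to pick up the whole inrush at once:

```python
import time

# Hypothetical loads and timings, for illustration only.
RESTART_ORDER = [
    ("interlocking processors", 0.0),  # hold-off in seconds before this stage
    ("signalling power ring",   2.0),
    ("train describers",        2.0),
    ("non-essential lighting",  5.0),
]

def restore_loads(order=RESTART_ORDER):
    """Re-energise loads in priority order, pausing between stages so
    no key load drops out while the supply is still settling."""
    for load, hold_off in order:
        time.sleep(hold_off)
        print(f"re-energising {load}")

restore_loads()
```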

