Almost since railways began, they have enjoyed the privilege of being able to provide, operate and maintain their own telecommunications network. In the early days, these would have been telegraph systems using single-needle instruments, signalling with a code akin to Morse.
With the advent of telephones, most countries adopted a monopoly regime, often controlled by the postal authorities, to manage the spread of telephony services and to avoid the complicated interfaces that would have arisen if uncontrolled competition had been allowed to happen.
However, the railways (and sometimes the military) were permitted to operate independent networks, though even these faced restrictions on breakout connections into the public network.
It is only in recent times, under privatisation, that this monopoly has been disbanded, but still with a government-dictated set of rules (in the UK, administered by the Office of Communications – Ofcom) to ensure competitive fair play and maintenance of standards. Railways continue to maintain an independent network, as communications for both operational and commercial purposes remain a vital part of running a train service.
Even in the early days, telecom networks needed some form of overview management, if only to get faults fixed and to manage expansion. Traditionally, the focal point would have been the switchboard operator, who would interface between the telephone user and the maintenance engineer. With the widespread use of automatic switching by the 1960s, and the consequent reduction in switchboard services, fault control centres emerged, often staffed on a 24-hour basis, with elementary facilities to assess the performance of the network and to manage the fault rectification process.
From voice to data
The advent of data networks needed a much-enhanced form of network control and, inspired by systems existing on Southern Pacific in the USA, British Rail introduced a Communications Data Control (CDC) to oversee the data connections for its new TOPS (Total Operations Processing System) rolling stock control network, which it introduced in the 1970s. This was the interface between the mainframe computers of the time and the data terminals in freight yards, control offices and suchlike.
Whilst the transmission network had significantly grown in capacity, the technology was still analogue, with modems used to achieve the data requirements. Originally in London, the CDC moved to Nottingham with a duplicate office established in Crewe, thus giving the diversity and resilience required for continued operation should any emergency occur.
Many other data systems were introduced, covering a multitude of applications for which the CDC had responsibility for round-the-clock connectivity.
Exploiting the network
With the existence of a nationwide network, it was perceived by many that the BR telecom assets could be used for the greater good, using the network to become another nationwide public operator with the railway requirements being bought in at preferential rates.
This led to the creation of British Rail Telecom (BRT) and the eventual sell-off of the core network structure, but the venture was flawed for two basic reasons. Firstly, many local telecom assets remained with the rail companies, leading to confusion over who owned what and an uncomfortable interface between the two. Secondly, it brought home to senior rail management that telecoms was an essential part of rail operations, and entrusting it to what was effectively a third party introduced enormous risks.
Privatisation, Network Rail and telecommunications
Much has been written on rail privatisation and the pros and cons can be debated at length. In the UK, following some serious accidents, the infrastructure has effectively been re-nationalised to become Network Rail.
The lessons of BRT were learned but, equally, the national telecom assets would not fit comfortably into the route and geographic management structure produced for the track and signalling.
Thus, in 2011, Network Rail Telecom (NRT) was created to operate the cable, transmission, voice and data networks as a nationwide entity. A significant investment had been made to create the Fixed Telecom Network (FTN) that provides a nationwide, resilient digital transmission bearer borne upon fibre cables and used to support the GSM-R radio network.
Subsequently, an additional optical network – FTNx – has been constructed to support the increasing usage of IP-enabled devices.
Both FTN and FTNx are now used as a universal bearer for all railway telecom and data requirements, including links for SSI (Solid State Interlocking) signalling and for the SCADA (supervisory control and data acquisition) power control network from their respective control centres.
Transmission and cable hierarchy
To obtain near-guaranteed resilience, the FTN is structured in a series of transmission rings based on SDH (Synchronous Digital Hierarchy) technology. The core rings, known as STM16 (Synchronous Transport Module, level 16), have a 2.5Gbit/sec capacity, each capable of carrying the equivalent of 30,240 telephone circuits, with nodes at the main rail centres.
Below these in the hierarchy are the access rings, known as STM4 and STM1, with capacities of 622Mbit/sec and 155Mbit/sec respectively, and access points at all major stations, depots and office locations.
Even further down come the local access links using PDH (Plesiochronous Digital Hierarchy) at 2Mbit/sec and 64kbit/sec that incorporate analogue-to-digital conversion, where needed, for connection to data terminals, telephones, TV cameras and other devices.
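As a rough cross-check of the capacities quoted above, the voice-circuit equivalent of each STM level follows directly from the E1 structure of SDH. The short Python sketch below is purely illustrative arithmetic, not an NRT tool:

```python
# Rough SDH payload arithmetic (illustrative only).
# An STM-1 frame carries 63 E1 (2Mbit/s) tributaries, and each E1
# carries 30 usable 64kbit/s voice channels.
E1_PER_STM1 = 63
VOICE_PER_E1 = 30

def voice_circuits(stm_level: int) -> int:
    """Voice-circuit equivalent of an STM-n ring (n = 1, 4 or 16)."""
    return stm_level * E1_PER_STM1 * VOICE_PER_E1

for level in (1, 4, 16):
    print(f"STM-{level}: {voice_circuits(level):,} circuits")
```

Running this gives 1,890 circuits for STM1, 7,560 for STM4 and 30,240 for STM16, matching the figure quoted for the core rings.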
A similar hierarchy exists for the FTNx network, to enable full connectivity for IP based systems. The core level has a capacity of 80 x 100Gbit/sec on a single pair of fibres using DWDM (Dense Wavelength Division Multiplex). Railway usage is small in terms of this capacity and efforts are being made to exploit spare bandwidth for other user groups, particularly in remoter areas. NRT will retain ownership of the network, to avoid the situations that developed with BRT.
The NRT portfolio includes 18,000km of fibre cable, 22,000km of copper cable and 3,500 data connectivity nodes. The rings are routed via separate geographical paths and cover the entire rail network. Where a ring formation is not physically possible, such as in Cornwall, the alternative path is acquired from BT or another supplier on a rented basis.
The GSM-R radio network consists of two MSCs (Mobile Switching Centres) and 2,500 radio masts, each mast requiring a local REB (Relocatable Equipment Building) for the associated BTS (Base Transceiver Station) equipment. These BTSs are connected to a series of BSCs (Base Station Controllers), each of which can support up to 250 BTSs, with each BSC connected to both MSCs via the FTN network.
A total of 55,000 radio devices exist, mounted in train cabs (4,000 traction units are so equipped), on yellow plant, and used as personal mobile handsets. A recent addition has been to give the emergency services access to GSM-R for particular locations, the Severn Tunnel being an example.
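The GSM-R topology described above (BTSs grouped under BSCs, with a 250-BTS limit per BSC and every BSC linked to both MSCs for resilience) can be modelled in a few lines. This is an illustrative sketch with a hypothetical site name, not NRT software:

```python
# Illustrative model of the GSM-R topology described in the article.
from dataclasses import dataclass, field

MAX_BTS_PER_BSC = 250  # each BSC can support up to 250 BTSs

@dataclass
class BaseStationController:
    name: str
    # Every BSC connects to both MSCs via the FTN, giving resilience.
    msc_links: tuple = ("MSC-1", "MSC-2")
    bts_sites: list = field(default_factory=list)

    def add_bts(self, site: str) -> None:
        """Attach a BTS site, enforcing the per-BSC limit."""
        if len(self.bts_sites) >= MAX_BTS_PER_BSC:
            raise ValueError(f"{self.name} is at its {MAX_BTS_PER_BSC}-BTS limit")
        self.bts_sites.append(site)

bsc = BaseStationController("BSC-Example")
bsc.add_bts("Severn Tunnel West")  # hypothetical site name
```

The dual MSC links are the point of the design: losing one switching centre leaves every BSC, and hence every mast, still reachable.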
All this sounds like a lot, but it is small by comparison with the public mobile providers. However, with GSM-R (and any future radio network) being a fundamental element of the European Rail Traffic Management System (ERTMS), it can be seen how vital all of this is to rail operations.
The Network Management Centre
As can be imagined, the day-to-day management of the entire NRT operation needs to be professional and all-embracing. It is the nerve centre of the operation and Rail Engineer went to visit the site to see just what is involved.
Arranged into two sections, one for FTN, the other for FTNx, the array of screens enables the controllers to view the performance of all the transmission rings and the GSM-R sites. A key objective is the continued operation of the signalling and SCADA systems that Network Rail relies on for train operation; this monitoring forms part of the overall digital transformation that is taking place.
Currently, around 200,000 devices of all kinds are monitored. However, as the Digital Railway is introduced in the coming years, more capacity will be needed, so the NMC is being expanded to cater for the passenger experience, safety and security, e-ticketing, multimedia information and suchlike.
All sites are already monitored for environment, security access, air conditioning, power provision and door open/closed status, as well as equipment performance and alarms. SSI signalling links are automatically re-routed within 15ms should a problem occur. Manual re-routing of circuits is possible if the automatic routines are viewed as inappropriate.
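The re-routing behaviour described above, automatic failover to the alternative path with a manual override for the controllers, can be sketched as a simple decision function. This is an illustrative model only; the FTN's real protection switching happens in the transmission equipment itself, within the 15ms window quoted:

```python
# Sketch of the path-selection logic for a protected circuit
# (illustrative; not NRT's actual control software).
def select_path(primary_ok, standby_ok, manual_override=None):
    """Return the path that should carry traffic.

    manual_override may be 'primary' or 'standby', mirroring the NMC
    controllers' ability to re-route circuits by hand when the
    automatic routines are viewed as inappropriate.
    """
    if manual_override in ("primary", "standby"):
        return manual_override      # a controller decision wins
    if primary_ok:
        return "primary"
    if standby_ok:
        return "standby"            # automatic failover
    return "failed"                 # both paths down: escalate the repair

print(select_path(False, True))  # prints "standby"
```

The final case is the scenario the escalation procedures exist for: a cable strike on one path while the remaining path also develops a fault.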
Should a cable strike occur, the typical repair time is between four and five hours. If a length of cable is stolen (fortunately, thieves are learning that fibre has no value), it may take up to three days to effect a replacement, depending on where it happens and ease of access. In such circumstances, there is a risk of the remaining path also developing a problem – it does sometimes happen – and there are escalation procedures in place to speed up the repair.
The FTN control centre’s equipment and screens show circuit performance on graphical displays, and the monitoring is sensitive enough that a distortion in a fibre can be detected immediately, which could indicate a cut, or a theft about to happen.
Use of the FTNx network is expanding all the time. On the new Borders Railway in Scotland, all signalling and telecommunications links are based on IP-networking and come under the control of the NMC.
The GSM-R network has terminals at signalling centres to enable signallers to access the radio facilities. Similar terminals are available in the NMC to permit testing of the on-air radio performance from the BTS sites.
Things have come a long way since the early days of communication networks and having the means to monitor and control the systems is a vital element of network operation. It’s all part of the digital revolution.
Is Big Brother watching you? Maybe, but it is for your own good!