Imagine navigating a strange, busy station and boarding a train on your own when you are unable to read the signs and are unsure of where you need to go. Network Rail has been involved with the trial of a possible solution to help customers who are partially-sighted or blind to make end-to-end journeys involving rail.

As part of the Reading station redevelopment, Network Rail was keen to make sure the station was as user friendly and as accessible as possible for all customers. Part of this work involved working with Guide Dogs on the design of the station. Formally known as The Guide Dogs for the Blind Association, Guide Dogs provides mobility services for people who are blind or partially-sighted and also supports research into technology to assist such people. The work at Reading involved signage, tactile paving, audio announcements and braille maps. As part of its consultation work, Guide Dogs also introduced Network Rail to the Microsoft directional audio trial.

The system

A team from Guide Dogs, Microsoft and Future Cities Catapult was testing a system on a sample journey from Reading to London encompassing walking routes, shopping, bus and train travel. A small headset is paired with a Windows Phone application and uses cloud-based location and navigation data. This works with a network of Bluetooth and Wi-Fi beacons to create a personalised Microsoft soundscape transmitted through the wearer’s jawbone.

The application helps both orientation and navigation and also provides enhanced contextual information, such as points of interest and additional journey details, to help the user build up an understanding of their surroundings. This information is transmitted through bone-conducting technology, which means that sounds appear to come from outside of the user’s head. For example, if there is a coffee shop on the user’s right, ‘coffee shop’ will be read out from their right, allowing them to build up a mental image of their surroundings.

The headset is a modified pair of AfterShokz headphones that hooks over the wearer’s ears and rests on their jawbone, transmitting sound to their inner ear using vibrations. This means that the wearer can hear sound from the headphones and from their environment simultaneously, as the headset does not cover their ears. As a result, users remain very aware of their surroundings, giving them the confidence to make their own decisions rather than simply being guided by a pre-recorded script.


On the back of the headset there is a small 3D-printed box containing a Bluetooth receiver and transmitter, an accelerometer, a gyroscope and a compass, and a GPS chip so that the user’s position can be tracked. The user selects a destination using their smartphone and the headset provides audio cues – it emits a continuous pinging sound when the user is following the correct route and a swishing noise if they wander off-course. It can also inform them when they reach junctions and issue verbal directions.
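The on-route/off-course cue logic described above can be sketched in a few lines. This is a hypothetical illustration, not the actual Microsoft implementation: the function names, the 5-metre threshold and the use of a simple polyline route are all assumptions for the sake of the example.

```python
import math

def distance_to_segment(p, a, b):
    """Perpendicular distance (in metres) from point p to segment a-b,
    all given as (x, y) coordinates in a local metric frame."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def audio_cue(position, route, on_route_threshold=5.0):
    """Return 'ping' while the user stays within the threshold of the
    planned route, and 'swish' once they wander off-course."""
    nearest = min(distance_to_segment(position, a, b)
                  for a, b in zip(route, route[1:]))
    return "ping" if nearest <= on_route_threshold else "swish"

route = [(0, 0), (0, 50), (30, 50)]   # planned walking route as waypoints
print(audio_cue((1, 20), route))      # 1 m from the route -> "ping"
print(audio_cue((12, 20), route))     # 12 m off-course   -> "swish"
```

In practice the position estimate would come from fusing the headset’s GPS, compass and beacon data, but the cue selection itself reduces to a distance-from-route test of this kind.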

Buttons on the headset allow access to more-detailed or historical information about specific points of interest. So, for example, on passing a town hall, a press of the button will tell the user when it was built and what is happening there. All information is sourced from Microsoft Bing.

Microsoft’s audio technology means that the sound is directional, so if the attraction in question is several metres ahead to the right, the sound will appear to come from that direction. Each headset is tailored to the individual user, using a model of their head to enhance the accuracy of sound placement.
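The placement of a sound at a point of interest comes down to the bearing of that point relative to where the user is facing. The sketch below illustrates the idea with a simple constant-power stereo pan; real spatial audio of the kind described uses head-related transfer functions (the per-user head model mentioned above), so this is only an approximation, and all names here are assumptions.

```python
import math

def relative_bearing(user_pos, heading_deg, poi_pos):
    """Angle of the point of interest relative to the user's facing
    direction, in degrees, -180..180 (negative = left, positive = right).
    Heading 0 means facing +y ('north') in a local (x, y) frame."""
    dx = poi_pos[0] - user_pos[0]
    dy = poi_pos[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))
    return (bearing - heading_deg + 180) % 360 - 180

def pan_gains(rel_deg):
    """Constant-power left/right gains for a source at rel_deg,
    clamped to the frontal arc for simplicity."""
    theta = math.radians(max(-90.0, min(90.0, rel_deg)))
    angle = (theta + math.pi / 2) / 2       # map -90..90 deg to 0..pi/2
    return math.cos(angle), math.sin(angle)  # (left gain, right gain)

# A coffee shop 10 m ahead and to the right of a user facing north:
rel = relative_bearing((0, 0), 0.0, (7, 7))   # 45 degrees to the right
left, right = pan_gains(rel)                  # right channel louder
```

This is why, in the coffee-shop example earlier, the announcement appears to come from the user’s right: the relative bearing is positive, so the right-ear signal dominates.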

The Bluetooth beacons are matchbox-sized and, for the trial, were battery operated. For accuracy within buildings, both Wi-Fi (802.11) and Bluetooth (802.15) devices are used. The Wi-Fi devices communicate directly with the smartphone, while the Bluetooth beacons communicate via the headset and then on to the smartphone. Supporting information, such as that on points of interest, is pulled from Bing and/or processed on the Azure cloud platform as necessary.
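The article does not publish the trial’s positioning algorithms, but indoor location from beacons of this kind is commonly done by converting received signal strength to range estimates and then trilaterating. The sketch below shows those two standard techniques (a log-distance path-loss model and a linearised least-squares fix from three beacons) purely as an illustration of how a beacon network can yield a position; every name and constant here is an assumption.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate range (m) from received signal strength using the
    log-distance path-loss model; tx_power_dbm is the RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(beacons, distances):
    """Solve for (x, y) from three beacon positions and range estimates
    by subtracting the first circle equation from the other two, which
    leaves two linear equations in x and y."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Three beacons at known positions; ranges measured to a user at (4, 3):
beacons = [(0, 0), (10, 0), (0, 10)]
true_pos = (4, 3)
dists = [math.hypot(true_pos[0] - bx, true_pos[1] - by) for bx, by in beacons]
x, y = trilaterate(beacons, dists)   # recovers approximately (4, 3)
```

With noisy RSSI the ranges are imprecise, which is why the Reading trial spent weeks moving devices around the station to tune signal strength and triangulation.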

Bing is the Microsoft web search engine and Azure is a cloud-computing platform created by Microsoft for digital applications and services through a global network of Microsoft-managed data centres.

Reading station and Network Rail

Network Rail started to work with this new technology in September 2013. It quickly became clear that there was synergy between the trial and the Reading station development, as the technology could easily be installed as part of the station programme. Network Rail enabled the station part of the end-to-end journey by providing a Bluetooth route through the station from the ticket barrier to the platform. This involved installing hard-wired Wi-Fi devices and Bluetooth beacons along the route.

For the trial at Reading, eight Wi-Fi devices and twelve Bluetooth beacons were installed. Network Rail worked closely with Microsoft on a daily basis to move the devices and beacons around the station to achieve the best possible positional accuracy and communication links. The system was subject to extensive testing and set-up by Microsoft over a three-week period to understand the signal strength and triangulation. The application developers, located in Seattle, USA, were involved in refining and modifying the system for a station environment. Once the trial was confirmed a success, the system was announced in November 2014.

All of the journeys made by blind and partially-sighted people during the trial were made with a sighted person present as back-up.


Phase 2

Feedback from users was very positive, and the evidence from phase 1 will be used to make a safety argument and validation for a standalone phase 2. This will look at how the technology, and a large number of mains-powered beacons, can be installed throughout the station, not just for wayfinding but also for beneficial supplementary contextual information regarding retail outlets, waiting rooms, refreshments and additional journey details. Even retail staff may be provided with Bluetooth beacons so that partially-sighted people know where to obtain assistance to help them through a station.

Looking to the future

The long-term ambition for this audio technology is to bring other organisations and local authorities across the UK on board, so that more people living with sight loss, or indeed anyone living in a city, can benefit from its services. With two million people in the UK already living with impaired vision, the potential impact of this kind of project is great. The ability to travel independently can significantly affect a person’s social life and their ability to get a job.

It is possible that the technology could be used to enhance the usability of stations for all customers, not just the blind or partially-sighted. Complex, busy stations are not the easiest places to find one’s way around and can be off-putting and intimidating, particularly for the occasional user of rail transport. Anything that can be done to make rail travel easier and more attractive is welcome.

We would like to thank Matthew Jackson, senior programme manager, Thames Valley Area Infrastructure Projects, Stations and Civils, Network Rail, for his assistance with this article.