Just how do you see the ‘invisible hazard’? It might be obvious to the naked eye, but that doesn’t mean you’ll see it.

Even if you’re highly trained. Even if you’re aware of the safety risks. Even if you normally consider yourself alert and vigilant. Expectations have a huge part to play here. Our expectations have a disturbing way of interfering with our ability to take the safest course of action. If we don’t expect to see something, it can be ‘ghosted away’ from our attention.

To all intents and purposes, it ceases to exist. This has huge implications if we want to prevent accidents. Despite the fact that a potentially huge object – a ship or a plane, for example – may be in our field of vision, we may behave as though it simply isn’t there.

Failing to spot ships

For example, take the fatal collision between the USS Greeneville nuclear submarine and a Japanese fishing vessel, the Ehime Maru, in 2001. Captain Waddle of the Greeneville was performing an emergency surfacing manoeuvre in a demonstration for a group of VIP civilian visitors. The submarine shot to the surface and its rudder sliced through the hull of the 191-foot-long Ehime Maru, which sank within 10 minutes, with the tragic loss of nine lives. Was this a case of looking, but not seeing?

It would appear so from the National Transportation Safety Board’s 59-page report. Before performing the manoeuvre, Waddle carried out the necessary periscope scan. But though he looked right towards the Ehime Maru, he failed to see it. How could a normally vigilant captain with all his experience look right at a ship and not see that they were on a collision course?

Though the report also exhaustively details the role of distraction and miscommunication in the accident, the key to understanding the immediate cause is in the captain’s words: “I wasn’t looking for it, nor did I expect to see it.”

Failing to spot planes

If you’re a pilot coming in to land, you wouldn’t normally expect to see another plane parked on the runway directly in your path. This kind of event, called a ‘runway incursion’, is statistically a very rare occurrence. Yet it does happen, so pilots need to remain constantly alert to avoid a catastrophic collision. In fact, the world’s deadliest aviation accident involved a runway incursion, when two Boeing 747 passenger jets collided in Tenerife in 1977, claiming the lives of 583 people.

If you were landing an aircraft and saw a plane on your approach, you would obviously need to abort the manoeuvre immediately to avoid an accident. But even vigilant, experienced pilots can make critical errors with potentially terrible consequences.

One important study conducted by a NASA research scientist, Richard Haines, demonstrates how our attention can fail us at critical moments. Haines’ research focused on testing eight experienced pilots, with more than a thousand hours’ flight time each, on a Boeing 727 flight simulator. In the experiment, the pilots were given extensive training on the use of a ‘head-up display’. This meant critical information such as altitude, bearing, speed and fuel status was displayed on the windshield.

The pilots practised numerous landings in a range of weather conditions, with and without the head-up display. Once they were familiar with the standard landings, Haines inserted a nasty surprise. On this occasion, the pilots broke through the cloud cover and the runway became visible for their landing attempt just as before. They monitored their instrument readouts on the head-up display and made their decision as to whether to land or abort.

This time, however, they had to contend with Haines’ surprise in the form of a large jet turning onto the runway right in front of them. Astonishingly, two of the pilots failed to see the jet in the simulator.

When shown a video recording of his landing, one of the pilots said, “I honestly didn’t see anything on that runway.” Both pilots expressed surprise and concern that they had missed the ‘unmissable’ object in their path. Because they didn’t expect to see the jet there, they never saw it. They behaved as though it simply didn’t exist.

Everyday accidents

We have been talking about the potential for big accidents where our expectations fool us into believing a situation is safe when it isn’t. Thankfully, collisions between nuclear submarines and ships, or between two planes, are extremely rare.

But how many accidents on a much smaller scale could be prevented if we acknowledged the potentially dangerous role of expectations? This will no doubt mean humbling ourselves and accepting that even a high level of prior experience is no effective barrier to an accident.

With the understanding that our minds are vulnerable in this way, we may be able to perform more safely at work, as well as drive more safely on our roads. We can’t always predict the hazards of the future. But the assumption that we will have the same safe experience we did yesterday is likely to be tested at some point. Perhaps not today, this week, or even this year.

We won’t know exactly when danger will approach. Mindful of this, we can prepare ourselves better for the unexpected. Staying alert to the hazards we least expect could be the key to fewer incidents. At the very least, it should keep us on our toes.

Spot the unexpected, then speak up!

Sometimes you may spot something unexpected in the normal course of your duties. It could be a near miss, or close call. Although the situation may be recovered at the time, the safety risks could still be present. It’s all too easy to forget to speak up about it because you’re too busy, or perhaps you’re worried about the repercussions if you do. Even if there hasn’t been an accident yet, a similar situation may arise in future with less favourable consequences. Many reports to CIRAS start with a feeling that something unexpected could happen if a safety risk is left unaddressed. You can always report to CIRAS in complete confidence, knowing that your identity will be protected, whatever happens. Here is an example of someone spotting a hazard, then choosing to report to CIRAS:

Spotting the hazard

  • Observed somebody almost trapped in doors of departing train
  • Contributing factor: Malfunctioning screens making it difficult for drivers to see passengers on the platform
  • Internal reporting had not led to a fix

Reported to CIRAS

  • Reporter’s identity protected
  • Report led to screens being fixed
  • Train operator and Network Rail worked together to ensure safer dispatch

Impact

  • Improved driver visibility of platform
  • Less risk of passenger injury