
What Gets Measured Gets Done

Article from ProfessionalSafety: Journal of the American Society of Safety Engineers
By Brett Solomon (Senior Consultant at Sentis)

It is insufficient to talk about safety while measuring only process performance and production.  If workers are measured against production targets, that is where they will focus.  Most companies monitor lagging indicators such as lost-time injury frequency rate (LTIFR) or incident severity.

However, measuring lagging indicators has two downfalls: 1) it is reactive, since the incident has already occurred; and 2) it creates only an illusion of safety.  It is possible for someone to drive 100 mph all the time and never crash.  If we measured that driver’s LTIFR, it would indicate that s/he is a safe driver even though the opposite is true.

On the other hand, a driver can always stay under the speed limit and, for various reasons, be involved in several crashes.  This does not necessarily mean that person is a bad or unsafe driver.  Remember, the Deepwater Horizon crew was celebrating a significant LTI-free period on the very day of the Macondo well blowout.

Another problem with measuring safety is that it can quickly turn into a paper exercise.  As soon as workers are instructed to hand in 10 nonconformances, or a foreman must complete seven PTOs for the week, the reason behind the activity gets ignored and completing the required amount becomes the goal.

A related issue is safety rewards.  A company should recognize and reward people for their dedication and commitment to safety.  However, reaching a target such as 1 million LTI-free hours does not mean a company had no LTIs; it may simply mean none were reported.  Companies also must understand that some goals (what they measure) conflict with one another.

One company has taken two bold steps concerning this.  Historically, safety was on everyone’s scorecard.  If an incident occurred, the site was penalized.  It did not take long to notice that people were hiding incidents.  Having safety on the scorecards became counterproductive to what the company wanted to achieve: learning from mistakes so as not to repeat them.  To address this, the company made a change.  If an incident occurred, the site would still be penalized; however, if site managers proved to the executives that they understood the incident’s underlying causes, ensured that it would not recur and shared the lessons with other sites, their lost points would be redeemed.

This sent a decisive, positive message — safety is still an absolute value and if something happens, consequences remain.

Yet, it opened the door for people to start sharing their concerns and even admit their mistakes, because the process was not about punishing them.  The focus was on learning and saving lives.  This move changed a culture of fear of failure into one of open, honest reporting, sharing and learning across the group.

Eventually, the company removed LTIs from the scorecard because the metric is not a true indication of safety culture.  More emphasis is now placed on identifying leading indicators of safety.  Dekker (2006) argues that “safety is not the absence of something, but it is the presence of something.”  This calls for leaders to apply their minds when measuring safety.  OSH professionals must take the time to ascertain what behaviors and processes are needed to create the desired safety culture and monitor them proactively.