
Safety and Performance Excellence: The Magic Pill Called "Big Data"

Sept. 7, 2016
New technology can lead to better results and make it easier to determine leading indicators, but it also has its limitations.

Applying analytics to safety data is being touted as the perfect way to determine the right leading indicators for safety. Compared to blind choices and wild guesses, the use of statistical analysis certainly has some advantages. Admittedly, pursuing the wrong leading indicators can be expensive and dangerous. Modern analytics has progressed significantly with the computerization of mathematical processes and use of massive sets of data. Using this technology to find a starting place for determining leading indicators makes a lot of sense. That said, analytics is not necessarily the silver bullet or magic pill either. The early use of analytics to determine leading indicators potentially has at least five serious limitations:

Garbage In, Garbage Out

The data being used to seek out leading indicators often is taken from accident investigations, near-miss reports, behavioral observations, site and process audits and similar sources. All of these depend heavily on the judgment of the person gathering the data and vary tremendously with the methodology used, such as the root-cause analysis technique or the particular observation and audit process. The level of analysis and description differs greatly from one accident or near-miss report to another. Items on audit lists and behaviors on observation checklists can differ from site to site. Physical conditions can differ as well, so their audits may have very little in common. The old premise of data analysis holds: the quality of the output is never any better than the quality of the input.

Correlation versus Causation

In truth, data analysis can determine correlation but cannot determine causation. The fact that outcomes generally are better when certain things happen upstream is interesting and often a good starting place. However, it has been proven many times that imitating the success of another person, site or company is no guarantee of success for the imitator. Marshall Goldsmith pointed out the futility of imitation in his book, What Got You Here Won't Get You There. Looking for the cause of success or failure still relies on the lagging indicators on which we are trying to end our reliance. The truth is some sites are successful in spite of the way they manage and practice safety, and finding correlations between their methods and their outcomes has limited value.
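
To make the distinction concrete, here is a minimal sketch, in Python with entirely invented numbers, of how a hidden confounder (here called "engagement") can make an upstream activity ("toolbox talks") correlate strongly with incident rates even though, by construction, the activity has no causal effect at all:

# Minimal sketch, invented numbers only: a hidden confounder ("engagement")
# drives both an upstream activity and the safety outcome, so the two correlate
# strongly even though the activity has no causal effect in this toy model.
from statistics import correlation   # Python 3.10+
import random
random.seed(1)

n_sites = 200
engagement = [random.gauss(0, 1) for _ in range(n_sites)]                 # unmeasured confounder
toolbox_talks = [10 + 3 * e + random.gauss(0, 1) for e in engagement]     # engagement drives more talks
incident_rate = [5 - 2 * e + random.gauss(0, 1) for e in engagement]      # engagement, not talks, drives fewer incidents

# A strong negative correlation appears, although talks never enter the incident equation.
print(round(correlation(toolbox_talks, incident_rate), 2))

An analyst screening only for correlation would flag the activity as a promising leading indicator; only knowledge of how the data was generated reveals that the unmeasured variable is doing the work.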

Hawthorne Effect

The famous Hawthorne experiments found a correlation. They determined that focusing attention on something can temporarily improve performance. The issue is, when the attention begins to diminish, so does the improved performance. This is problematic for data analysis. A new safety program, training session, management directive or almost any other activity focusing attention on safety could show up in "big data" as a correlate of improved results. Based on that relationship, the conclusion could be made that this is a significant leading indicator of safety when it was, in fact, a temporary phenomenon. Having seen results from such efforts in the past, many organizations launched a veritable parade of new programs, one after the other, to keep up safety performance.

In the end, they found that these did not produce permanent change and actually lost impact once too many new "programs of the month" began to wear out the attention span of workers. It can be difficult to differentiate between permanent and temporary correlations if they are all labeled as safety programs.
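
As a simplified illustration of that trap, the sketch below (Python, invented numbers) models a single program launch whose benefit fades over a few months; a correlation computed across the whole period still flags the program as "working" even though the final months are back at baseline:

# Minimal sketch, invented numbers only: a program launch with a Hawthorne-style
# effect that decays over a few months. The pooled correlation still makes the
# program look like a leading indicator after the benefit has faded.
from statistics import correlation   # Python 3.10+
import math, random
random.seed(2)

months = list(range(1, 25))
launch, baseline = 7, 8.0                                   # program starts in month 7; ~8 incidents/month before it
program_on = [1 if m >= launch else 0 for m in months]
fading_effect = [4.0 * math.exp(-(m - launch) / 5.0) if m >= launch else 0.0 for m in months]
incidents = [baseline - eff + random.gauss(0, 0.5) for eff in fading_effect]

print("pooled correlation:", round(correlation(program_on, incidents), 2))   # noticeably negative
print("last 6 months     :", round(sum(incidents[-6:]) / 6, 1))              # back near the 8.0 baseline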

Assuming Site Sameness

The "big data" folks were quick to point out much of the data they used was either hidden or tucked away at the site level rather than gathered enterprise-wide. In their zeal to get enough data points to attain statistical significance, they easily could overlook the significant differences between these sites. Assuming an organization simply is the sum of its sites is like assuming a culture simply is the sum of the people in it. As a business and safety consultant with decades of experience, I have seldom seen a cookie-cutter approach work at multiple sites. Every site has at least small differences that can have a big impact on what works and what doesn't. Assuming the leading indicator determined from data analysis of multiple sites will be a good fit for each site is far from scientific or pragmatic.  Sites with vastly different approaches to safety have produced good lagging indicators. Changing one site to match another, just because the overall data suggests trends, actually can undo highly customized safety efforts that work well for that particular site or culture.

Stopping at Two-Dimensional Thinking

While the development of leading indicators can take safety metrics from one-dimensional (results metrics) to two-dimensional (cause-and-effect metrics), the real world in which safety is practiced is still three-dimensional. The Balanced Scorecard approach (multi-level leading indicators), which many organizations use to measure their overall performance, seems a better model. The early attempts to identify leading safety indicators tended to focus on activities such as training, meetings, and participation in safety audits and observations. Then the attention turned to worker behaviors and to safety culture. While none of these is sufficient alone, together they make up a process flow for how many organizations approach safety: safety activities drive safety culture, which drives safety performance, which, in turn, drives safety results. Looking at leading indicators in sequence, rather than just as a set of individual or unrelated metrics, is much more descriptive of the reality of most safety processes. So far, big data is looking for lone data points that correlate with results rather than for a sequence of correlated links in that chain.
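
A rough way to express that difference, again in Python with invented numbers, is to check correlations link by link along the chain instead of screening each metric only against final results. In a simple chained model, the head of the chain shows the weakest lone correlation with results even though it starts the whole sequence:

# Minimal sketch, invented numbers only: activities -> culture -> performance -> results.
# Each adjacent link correlates strongly, but the head of the chain correlates
# only weakly with final results, so a lone-metric screen would underrate it.
from statistics import correlation   # Python 3.10+
import random
random.seed(4)

n = 500
activities  = [random.gauss(0, 1) for _ in range(n)]
culture     = [a + random.gauss(0, 1) for a in activities]      # activities shape culture
performance = [c + random.gauss(0, 1) for c in culture]         # culture shapes performance
results     = [p + random.gauss(0, 1) for p in performance]     # performance shapes results

print("activities  -> culture    :", round(correlation(activities, culture), 2))
print("culture     -> performance:", round(correlation(culture, performance), 2))
print("performance -> results    :", round(correlation(performance, results), 2))
print("activities  -> results    :", round(correlation(activities, results), 2))   # weakest of the four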

While the application of new technologies to safety always has potential for improving results, it is important to keep these theoretical tools in a reasonably pragmatic framework. Subject-matter experts should work closely with safety professionals who understand the workplace realities. Analytics still is a science; transforming it into a technology that can benefit safety will require a realization of its limitations as well as its power.

Terry Mathis, founder and CEO of ProAct Safety,  served as a consultant and advisor for top organizations. A respected strategist and thought leader in the industry, Mathis has authored four books, numerous articles and blogs, and is known for his dynamic and engaging presentations. EHS Today has named him one of the "50 People Who Most Influenced EHS" four consecutive times.  Mathis can be reached at [email protected] or 800-395-1347.
