The Safety Effort Testing Lab
Several of my friends and university associates have criticized me for being critical of them and their approach to safety. It is ironic to use criticism to condemn criticism. But I was not being critical so much as analytical. Although I disagree with many of their conclusions and approaches, I carefully read everything they write. I find academic research interesting and potentially valuable. However, such activity falls on the science side of safety rather than the technology side. Science discovers the principle; technology applies it to the real world.
During my corporate career, I led multiple project teams tasked with solving a problem or exploring options to capitalize on an opportunity. Each team included academic experts in the field, but in no team were they the ones in charge. I think the structure of these teams says a lot about the best roles for academics and technologists.
Unapplied science is practically useless. Technology based on false assumptions is not only useless, but dangerous. The two must work together harmoniously to be truly effective.
People from both disciplines have tried to learn from each other and become holistic practitioners of good workplace safety. Few, if any, have achieved lasting success doing so. That is not to say these practitioners' knowledge has not helped reduce accidents. Several of these self-proclaimed safety experts have success stories related to their efforts, and almost all have failures as well.
I believe these failures are not strictly due to bad practices or faulty logic. Every academic who proclaims that his methodology is superior to all others still has a failure rate. The most successful academics who started big consulting firms blame their failures on the consultants in the field. The assumption is that the methodology is perfect but the field staff is not. This assumption is partially true: consultants can make mistakes and fail to execute the plan. But consultants can also take the exact same approach and succeed at one site and fail at another.
So what really differentiates success from failure? I suggest it is not the scientific basis of the approach. It is not the technology developed from the science. And it is not inconsistency in how consultants deliver their services. The difference is the safety culture of the site where the improvement effort takes place.
What works with one safety culture does not necessarily work with another. Each group of workers has had different experiences and has come to different conclusions about the safest way to work at their site. Each group of workers has a different relationship model among its members, which dictates what is acceptable to discuss and what is not.
Each culture is influenced by a different set of supervisors and managers who may have very different leadership styles and practices. Each culture is influenced by different working environmental conditions, including equipment interfaces and procedures. Each culture has a unique set of pressures around production numbers, quality, and timing. All of these factors make it nearly impossible to develop a methodology that works for everyone.
Many academics struggle to accept that there is no one-size-fits-all approach to safety. Science seeks universal truths, and those truths should address safety universally. But the devil is in the details. Although the science may be universally true, its application can be as varied as the cultures in which you try to apply it.
In a lecture years ago, an academic pointed out a principle of psychology called stated intent. The premise was that if people declared to others their intention to do something, they were more likely to do it. As for how stated intent could be used in safety, he suggested asking workers to fill out a card declaring their intent to wear a particular piece of PPE. When he asked the other panelists what they thought of this approach, one quipped that he knew exactly where the workers would tell him to put the card.
The science was accurate, but the technology didn't mesh well with the safety culture. If you want to build a structure, the physics are the same on Earth as on Mars. However, the environment is different, and the application of the physics must be adapted to the environment. Universal truths must be tempered by situational realities.
Once, a client company asked me to develop training for salespeople to sell a very technical product. They wanted to know whether they should teach technicians how to sell or teach salespeople the science behind the product. I told them that I had succeeded in teaching science to non-scientists, but never in teaching sales to non-salespeople. We brought in their sales force and taught them how to sell the new product, with great success.
I think this illustrates the challenge of marrying science and technology in safety. I have found it much easier to take people familiar with a culture and its members and teach them the science of safety than to teach scientists all the soft skills needed to implement safety processes within a specific safety culture. I have successfully taught a few consultants how to assess a safety culture and customize a safety approach that fits it. Most consultants, however, mastered only a few of the skills needed to deliver a customized safety improvement.
I have spent the past 28 years evaluating safety cultures and customizing approaches to help organizations achieve safety excellence. The five books, 250 blogs and podcasts, and over 200 articles I have published were based on my experiences with my client companies. I think the real world of safety is the perfect laboratory in which to study and perfect safety technology. The litmus test for any approach is whether it works in the real world, where it really matters.
Because I no longer work directly with clients or consultants, I lack real-world material and will no longer be writing a regular column. Thank you all for following. Above all, thank you for caring passionately about your safety and the safety of others.
Terry Mathis, founder and former CEO of ProAct Safety, has been a consultant and advisor to top organizations around the world for the past 28 years. He recently retired and was succeeded by Shawn Galloway, former president of ProAct Safety. Terry and Shawn worked closely over the past few years on numerous projects around the world and have co-authored five books together. Shawn can be reached at [email protected] or (800) 395-1347.