Managing the Socio-Technical System
The Socio-Technical System
One of the challenges of high-consequence industries is the daily exposure
to risk. Improving outcomes in these industries requires a detailed understanding
of this exposure. The term “socio-technical” refers to human beings operating
inside technical systems composed of departmental policies, procedures, training,
and equipment. Managing the risks of socio-technical systems involves an
understanding of the complex relationship between people and systems. In other
words, the systems we design should depend, in large part, on a detailed
understanding of the capabilities of frontline employees to react in high-risk
situations; conversely, the actions and decisions of the employees will often depend
on their knowledge, skills, and abilities in utilizing the tools that have been given
to them. Ultimately, success at preventing adverse events will depend on how well
organizations are able to hold the socio-technical system accountable – the policies,
procedures, and equipment, as well as the personnel.
A number of industries have built increasingly complex systems in certain high-risk situations to better manage the risk of catastrophic outcomes. For example, success in building and operating a nuclear power reactor requires multiple layers of barriers, redundancies, and recovery strategies to ensure acceptable levels of risk. These systems are replete with designs that tolerate human error. In most industries, however, organizations are not able to match the level of system design resources found in the nuclear industry. In other high-consequence industries, including healthcare, aviation, and fire and police services, we must continue to rely on the good judgment and experience of the professionals performing the job.
The Human Component
Often by necessity, frontline employees remain central to how risk must be managed. So what role does human behavior play? In essence, humans become components within the systems we design. In a perfect system, neither mechanical parts nor humans would fail. But again, this is not the world in which we live. If we expect mechanical parts to eventually wear out and sometimes malfunction, shouldn't we consider the limitations of our employees as well? A machine does not become fatigued, forgetful, distracted, rushed, over-stimulated, frightened, or bored. Yet each of us may pass through these states of mind on any given day.
A Symbiotic Relationship
System design and human behavior are symbiotic. That is, the relationship between the two can be mutually supportive or, at other times, can lead to complacency. The system we build around predictably fallible human beings will depend on how often they fail and how significant the consequences are when they do. Similarly, the choices we make as humans depend, at least in part, on how reliable we perceive the system around us to be.
The strategies employed in most high-consequence industries must dive deeper into the complex relationship between systems and behaviors, recognizing that the critical thinking skills of the human being are a central design component of our systems. And while humans continue to perform in ways no computer has been able to match, we also fail in predictable ways: distraction, fear, fatigue, drift, low risk perception, and lack of situational awareness all contribute to undesired outcomes.
Systems engineers in high-consequence industries are trained to predict human variability in critical circumstances, and then to design error-tolerant systems that compensate for these vulnerabilities. Organizations with well-designed socio-technical systems will recognize the vulnerability of single failure paths within their systems (that is, where they may be one human error or at-risk behavior away from causing harm) and work to build resiliency. Yet many organizations outside the nuclear industry expect employees to be perfect without recognizing the importance of these system design strategies, thinking that holding humans accountable for their outcomes through punishment will ensure that they never make a mistake or drift into at-risk behaviors. But this is not the world in which we live. Organizational accountability requires an understanding of system design, human behavior, and how to achieve maximum reliability within each.
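To make the value of eliminating single failure paths concrete, consider a simple worked example using assumed figures. If the only barrier between an error and harm fails with probability p, the chance of harm is simply p. Adding a second, independent barrier that fails with probability q reduces the chance of harm to the product of the two:

P(harm with one barrier) = p
P(harm with two independent barriers) = p × q

With illustrative values of p = q = 1 in 100, a single layer leaves roughly a 1-in-100 chance of harm, while two independent layers leave roughly 1 in 10,000. The numbers, and the assumption of independence, are for illustration only; real barriers often share common failure modes, which is precisely why resiliency must be deliberately designed rather than assumed.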
Predicting the “Unpredictable”
Perhaps counter to conventional wisdom, human behavior can actually be quantified and predicted in the aggregate. Although we may not be able to predict with certainty when a particular human being will fail, we usually know how humans can fail within a given system or environment. The different ways in which people make mistakes and drift into risky choices are often well understood, if not formally documented, within most organizations. (For example, we do not know which driver at a particular gas station will be the next to drive away with the pump nozzle still in the tank after re-fueling, but we know with near-certainty that, given sufficient time, it is going to happen.)
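A simple worked example, again using assumed figures, shows why aggregate prediction is possible even when individual prediction is not. If each re-fueling carries a small probability p that the driver forgets the nozzle, then over n re-fuelings the probability of at least one such event is

P(at least one event in n re-fuelings) = 1 − (1 − p)^n

With an assumed error rate of p = 1 in 10,000 and a busy station serving n = 50,000 re-fuelings in a year, that probability exceeds 99 percent. We cannot say which driver it will be, but we can be nearly certain that it will be someone.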
In the day-to-day activity of high-consequence industries across the globe, systems have been established for producing results, based on mission and operating performance goals. But the key to preventing unintended consequences, or adverse events, is to understand that this work is composed of imperfect systems interfaced with predictably fallible human beings. The challenge is not simply to rely on employees to be perfect, but to identify where systems and people are vulnerable and to work to optimize reliability in both areas. An effective organization will recognize that system design provides the framework for success, but that we can do no better than the limitations inherent in those designs. Theoretical models such as the one in Figure 1 are important for understanding what needs to be done. But at the end of the proverbial day, the answer to how we achieve success in managing the socio-technical system effectively will depend on collaboration.