High Reliability
Reluctance to Simplify
In our next lesson on High Reliability, we talk about a deceptively simple principle: the reluctance to simplify. When everything about your organization is dedicated to streamlining and making things more efficient, how do you maintain the ability to know all the real complexity of your processes?
High Reliability Organization Principle #2:
Reluctance to Simplify
(or, Why you always need to struggle with the devil in the details)
- 1. Resisting the temptation to KISS (Keep It Simple, Stupid)
- Examples of resisting over-simplifications
- Laboratory examples of resisting over-simplification
- 2. Expect the unexpected (admit your inability to predict)
- Examples of expecting failure
- Laboratory examples of expecting failure
- 3. Embrace Diversity of Experience and Opinion
- Examples of healthcare diversity
- Laboratory examples of diversity
- Conclusion
- References
- See the introductory lesson: What is High Reliability?
- See lesson on Principle 1: Preoccupation with Failure
December 2008
In this lesson, we discuss the High Reliability Organization principle of Reluctance to Simplify. This principle evolves naturally out of one of the other HRO principles, the Preoccupation with Failure. When an organization is intensely committed to detecting existing errors and anticipating the errors that might yet occur, that organization must also beware of simplifications and abstractions that blur the operational details and potential danger signs.
In essence, a high reliability organization exhibiting this principle is trying to fight its own instincts. Any organization is inherently focused on creating simplifications and efficiencies – otherwise there would be no organization at all. A reluctance to simplify means making certain that each simplification is warranted and that critical information is not buried in the process.
There are several ways this reluctance to simplify manifests itself in organizational behavior:
- First, organizations need to resist oversimplifications of the difficulties and problems they face, and reject the temptation to ignore them.
- Second, organizations must recognize that their systems can fail in ways that they will be unable to anticipate or imagine.
- Finally, HROs encourage a diversity of views and opinions, as a way to guard against oversimplification and blindness.
Healthcare is a constant tug of war between complexity and simplification. New technologies are always coming into the field, sometimes promising efficiencies, sometimes opening new frontiers of care. New technology and new procedures can make healthcare faster, cheaper, better. But they can also generate more complex treatments, ever tighter demands on performance, and new and unpredictable outcomes.
1. Resist the temptation to KISS (Keep It Simple, Stupid)
Dr. Karl Weick, one of the co-authors of Managing the Unexpected, describes how the reluctance to simplify manifests itself in action:
“People who engage in mindful organizing regard simplification as a threat to effectiveness. They pay attention to information that disconfirms their expectations and thwarts their desires. To do this they make a deliberate effort to maintain a more complex, nuanced perception of unfolding events. Labels and categories are continually reworked, received wisdom is treated with skepticism, checks and balances are monitored, and multiple perspectives are valued. The question that is uppermost… is whether simplified diagnoses force people to ignore key sources of unexpected difficulties.”[1]
A typical oversimplification that occurs in organizations is the response to an error, accident or incident. Often the organization engages in a “Blame and Train” response. Find someone to blame for the error, the “bad apple.” Then insist on training everyone better so that the next time the error won’t occur. Unfortunately, this simplistic response has time and again failed to prevent future errors. The “simple” source of an error is often just the end result of a number of systematic problems in the organization. Blaming the Challenger disaster on a bad O-ring did not save the Columbia; a more complex appreciation of the organizational problems at NASA might have.
Examples of Retaining Complex Views
Here’s an example, this time closer to home. Patient Safety is a hot-button issue right now, with more and more accreditation and regulation tied to safety behaviors. The simple fix to Patient Safety is to “Blame and Train” and to insist that everyone “do better.” But a more mature understanding of the problem is required:
“Christiana Care refused to simplify patient safety issues. When analyzing a particular patient safety problem, the Patient Safety Mentor Team could have simply blamed a physician or nurse for an incorrect order or practice and determined that an individual's error caused the problem. Instead, Christiana Care looked beyond the person directly involved in the error to examine the care processes and systems within which the error occurred.
“A reluctance to simplify has contributed to the success of each of their initiatives. By assembling an interdisciplinary team to explore septic shock, Christiana was able to identify multiple factors that contributed to poorer sepsis outcomes. When Christiana examined the safety of patients in their ICUs, they recognized that the solution was not merely a technical one. An electronic system for monitoring patients in the ICU was only part of what was necessary to enhance patient safety. Of equal importance was the need to introduce this technology and the staff who would support it in a way that addressed legitimate staff concerns about how eCare would be used and whether they should avoid it or integrate it into the care of their patients.”[2]
Examples of Retaining Complex Views in the Laboratory
Perhaps the easiest example of oversimplification is the current raging debate over “equivalent” QC, electronic QC, and the latest regulations for quality control. The regulations are quick to claim that new options for QC are “equivalent,” but as we have asked repeatedly, “Equivalent to what?” While EQC will certainly be more efficient, requiring only weekly or monthly QC, the scientific basis for EQC remains unproven and the safety of the practice is ambiguous at best.
Other laboratory oversimplifications include:
- Laboratory tests are commodities. Every test is essentially the same quality – the only differentiation is price.
- Achieving Compliance is the same as Assuring Quality. A laboratory that follows the rules and regulations automatically assures the quality of its testing.
- Following Manufacturer Guidelines is all that is necessary to achieve quality control. As long as the instrument or device is operating within manufacturer guidelines, it is achieving the necessary quality.
- If clinicians are not complaining, that means that tests are achieving their quality goals.
- The Quality Control procedure implemented in the laboratory can be the same for every test and every method. Using the same control rule and same number of controls on every test is an appropriate QC strategy for a laboratory.
- If one control is out, but the other controls are in, the test is not out-of-control.
- If one control is out, but when the control is repeated it is in, the test is not out-of-control. (A quick simulation after this list shows what this habit costs.)
- If one test is out-of-control, but all the other tests in the same run are in-control, the one test is not actually out-of-control.
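To see what the repeat-the-control simplification costs, consider a toy Monte Carlo sketch in Python. The numbers are assumptions for illustration only: a 2 SD systematic shift in the method and a simple ±2s rejection limit with a single control. Under these assumptions, the “repeat the control and accept if the repeat is in” habit roughly halves the chance of catching the shift:

```python
import random

random.seed(1)

def control_value(shift_sd: float) -> float:
    """One control observation, in SD units, from a process
    carrying a systematic shift of `shift_sd`."""
    return random.gauss(shift_sd, 1.0)

def detects(shift_sd: float, repeat_until_in: bool, limit: float = 2.0) -> bool:
    """Does a +/-2s limit flag a shifted run? Under the
    'repeat it and accept if the repeat is in' habit, a single
    in-control repeat overrides the original flag."""
    first_out = abs(control_value(shift_sd)) > limit
    if not first_out:
        return False
    if repeat_until_in and abs(control_value(shift_sd)) <= limit:
        return False  # the repeat "passed", so the shift goes unreported
    return True

n, shift = 100_000, 2.0  # hypothetical 2 SD systematic error
for policy in (False, True):
    hits = sum(detects(shift, policy) for _ in range(n))
    label = "repeat-until-in" if policy else "reject on first out"
    print(f"{label}: error detection ~ {hits / n:.2f}")
```

The shift is still present when the control is repeated; a single lucky in-control value merely hides it.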
We’ve covered these simplifications – and the countermeasures to them – in exhausting detail elsewhere. Six Sigma QC Design, Doing the Right Quality Right, and other best laboratory practices are methods of resisting the oversimplification of laboratory quality.
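As one illustration of the countermeasure, here is a minimal Python sketch of sigma-metric QC design reasoning. The performance numbers are hypothetical, and the rule-selection cutoffs are only a rough paraphrase of published sigma-metric guidance, not a definitive table:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma metric: how many method SDs fit within the allowable
    total error after accounting for bias (all values in percent)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggest_qc_strategy(sigma: float) -> str:
    """Illustrative rule selection; cutoffs are a rough sketch of
    sigma-metric QC design guidance, not a validated table."""
    if sigma >= 6:
        return "single rule, e.g. 1-3s with N=2 controls"
    if sigma >= 4:
        return "multirule, e.g. 1-3s/2-2s/R-4s with N=2 to 4"
    return "full multirule with N=4 to 8, and consider improving the method"

# Hypothetical methods on the same analyzer with very different QC needs.
for name, tea, bias, cv in [("Glucose", 10.0, 1.0, 1.5),
                            ("HbA1c", 6.0, 1.5, 2.0)]:
    s = sigma_metric(tea, bias, cv)
    print(f"{name}: sigma = {s:.1f} -> {suggest_qc_strategy(s)}")
```

The point of the exercise: two tests on the same analyzer can demand very different QC effort, and treating them identically is exactly the oversimplification this principle warns against.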
2. Expect the Unexpected: Anticipate the Unanticipatable? (Admit your inability to predict)
A second part of a Reluctance to Simplify is an acceptance of the limitations of your own knowledge. This goes back to the infamous Rumsfeldian discussion of "known unknowns" and "unknown unknowns."[3]
The oversimplified organization believes it knows all that it needs to know, and that it is even aware of the things that it doesn't know. All risks are known, thus risks can be handled and discounted. Witness the hubris of Wall Street investment banks, hedge fund managers, and the sub-prime mortgage industry, and you know the effect of such overconfidence.
The High Reliability Organization has a mature self-awareness. It recognizes its own shortcomings and limitations of foresight. The HRO assumes that it doesn’t know everything about current behavior, and will not know everything about future events:
“Here’s a simple example: people who work in the power plant don’t trust the simplifications found in drawings and blueprints. If they have an assignment to shut down something, such as the air supply in a nonoperating unit, they won’t do so until they actually walk down the whole system, looking for valves, added piping, or reroutes that have been made since the drawings were completed. Those recent add-ons that are missing from the drawings are potential sources of serious surprises.”[4]
When Sentara examined their medication dispensing problems, they could have assumed that all the failure modes were already known:
- “Past mistakes when stocking or withdrawing medications from the machines have always been noticed and fixed before harming the patient. Therefore, we can assume that future mistakes will always be noticed and fixed before causing harm.
- “We know all of the things that can go wrong when withdrawing medications from the machines, so we are sure we have checks to prevent those problems from harming the patient.”
[T]hey could have believed that communication problems were all the same and could be easily corrected by improved e-mail, staff announcements, or some other overly simplistic solution. Instead, they recognized that communication problems included:
- Lack of communication within hospital units.
- Communication problems between different departments and administration.
- Insufficient information about the current status of patients.
- Lack of communication between work shifts and information being lost during transition times of patient care teams.[5]
Laboratory Examples of acknowledging the unknown failures of the future
Laboratories are well-suited to this aspect of Reluctance to Simplify. They engage in robust, continuous monitoring of the quality of their processes, and they don’t wait for rule violations to act. Laboratories are not content to accept that past compliance and good performance assure future performance.
Some of the most encouraging conversations I’ve had over the last year were with laboratory professionals, both on the manufacturing side and in the laboratory itself, who were unwilling to give up daily controls – rejecting “equivalent” QC – even if the regulations allowed it. Instinctively, they wanted that regular check on their processes, because despite all the manufacturer assurances, despite all the data from past performance of the instrument, despite any regulatory waiver of responsibility, the next day, the next run, the next test result is unproven. I’ve sat down with laboratory managers and diagnostic engineers who wanted to enhance their quality monitoring by adding patient-data algorithms and additional statistical rules to their existing QC procedures, because they recognized that there were still blind spots in their quality management practices.
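As a sketch of what such a patient-data algorithm might look like, here is a minimal moving-average monitor in Python, in the spirit of average-of-normals approaches. The analyte, window size, and limits are hypothetical; a real implementation would be validated against the laboratory’s own patient population:

```python
from collections import deque

class PatientMovingAverage:
    """Moving average of patient results as a supplementary drift
    monitor. Window size and limits here are illustrative only."""

    def __init__(self, window: int, target: float, limit: float):
        self.values = deque(maxlen=window)
        self.target = target
        self.limit = limit

    def add(self, result: float) -> bool:
        """Add one patient result; return True when the windowed mean
        drifts outside target +/- limit (a flag for investigation)."""
        self.values.append(result)
        if len(self.values) < self.values.maxlen:
            return False  # not enough data yet
        mean = sum(self.values) / len(self.values)
        return abs(mean - self.target) > self.limit

# Hypothetical sodium results (mmol/L): a slow upward drift between
# scheduled QC events eventually trips the patient-data flag.
monitor = PatientMovingAverage(window=20, target=140.0, limit=2.0)
for i, result in enumerate([140 + i * 0.2 for i in range(40)]):
    if monitor.add(result):
        print(f"Drift flagged at patient result #{i + 1}")
        break
```

Because it runs on every patient result, a monitor like this can catch a slow drift between scheduled QC events, one more set of eyes on the blind spots.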
Embracing “traditional” statistical QC over electronic or “equivalent” QC is one strong sign that a laboratory acknowledges its inability to know all the possible failure modes of its processes. Electronic and “equivalent” QC practices are narrower in their ability to monitor error, covering fewer steps, and under the new regulations they will monitor those steps less frequently. Statistical QC covers more steps in the total testing process, giving the laboratory a better chance to detect that next unpredictable error.
3. Embrace Diverse Experiences and Opinions
One of the great dangers to an organization is the well-known and documented behavior of “groupthink.” In his 1972 book on the subject, Yale psychologist Irving L. Janis explained how “panels of experts could make colossal mistakes. People on these panels, he said, are forever worrying about their personal relevance and effectiveness and feel that if they deviate too far from the consensus, they will not be given a serious role. They self-censor personal doubts about the emerging group consensus if they cannot express these doubts in a formal way that conforms with apparent assumptions held by the group.”[6]
Crowds of homogeneous composition can lose perspective. Again, witness the stock analysts, economists, and financial experts, none of whom predicted the Crash of 2008. Or in the words of a classic Demotivator: None of us is as dumb as all of us.
“HROs resist simplification by seeking out different points of view; because differences, not commonalities, hold the key to detecting potential failures. Diversity also takes the form of checks and balances, from hiring new employees with varied prior experience to novel redundancies. Most often, redundancies involve duplication of work, but in HROs, redundancies also take the form of healthy skepticism driven by wariness about claimed competencies, and a respectful mindfulness about safety. Skepticism is also deemed necessary to counteract the complacency that many typical redundant systems foster.”[7]
In other words, diverse viewpoints bring in different information and different perspectives, all of which can challenge the current assumptions. Fresh thinking on an old problem sometimes provides a breakthrough, or points out an overlooked flaw. Out of many, we can find one solution.
Healthcare’s diversity of experience
No one can claim that the healthcare workplace isn’t filled with diverse perspectives. There is an extreme specialization of roles in healthcare, from medical technologist to pathologist, nurse to surgeon, administrator to clinician. However, the hierarchy and authority gradient often inhibit the expression of divergent viewpoints. The doctor’s supreme position often means that nurses and other technical workers lower on the ladder are reluctant to offer challenges. In healthcare settings where workers are encouraged to speak up, however, the teamwork that results can deliver superior care.
As a microcosm of healthcare, the laboratory suffers the same problems. “Bench” technologists can feel voiceless and powerless when their view contradicts the opinion of a doctor or pathologist. Within the laboratory, the low pay and de-skilling of workers threaten to remove the additional expertise that can help the laboratory handle unusual situations and scenarios. When lab employees are paid wages closer to fast food rates, it’s hard to expect more than French fries out of them.
An example of diversity in quality management in the laboratory might be a greater collaboration between the quality officer of the laboratory (whoever that might be, whether there's an official position or a de facto expert), the “bench” level technologists who interact with the methods and instrumentation itself, and the diagnostic support staff (outside personnel who may only visit rarely, but have greater knowledge of a particular instrument or condition). An honest exchange of experience can result in leaps forward in performance.
Another example is the increasing use of listservs and other online resources for operational advice. Getting advice on instrument problems and performance from a different laboratory setting is just the kind of outside information that can be useful.
A more technical example is participation in proficiency testing, external quality assurance, or peer group programs. All of these practices give the laboratory an outside perspective on analytical performance, as well as provide forums for additional advice and help.
Conclusion
“When a culture is less mindful and more willing to simplify, abstracting is done superficially in a more coarse-grained manner that confirms expectations, suppresses details, overstates familiarity, and postpones the recognition of persistent anomalies.”[8]
It’s clear by now that the Reluctance to Simplify is no simple matter. It’s really a paradox: how do you balance the speedy, simplifying actions that the business must make with the need to retain complexity, visibility, and a diverse, sometimes redundant, set of skills and perspectives? The old argument of efficiency versus quality, redux.
Remember that reluctance does not mean refusal or obstinacy. Reluctance to Simplify isn't a license to refuse all changes. It just means that gains in efficiency must be made only in ways that do not sacrifice necessary quality and safety.
References
1. Karl E. Weick, "Making Sense of Blurred Images: Mindful Organizing in Mission STS-107," in Organization at the Limit, eds. William H. Starbuck and Moshe Farjoun, Blackwell Publishing, Malden, MA, p. 168.
2. Becoming a High Reliability Organization: Operational Advice for Hospital Leaders. AHRQ Publication No. 08-0022, April 2008, Agency for Healthcare Research and Quality, Rockville, MD. Appendix F: Case Studies in High Reliability Applications: EICU and Sepsis Prevention at Christiana Care. Accessed December 10, 2008.
3. Westgard JO, Westgard SA. Equivalent Quality vs Equivalent Testing. LabMedicine 2005;36(10):626. See also Seely H. Pieces of Intelligence: The Existential Poetry of Donald H. Rumsfeld. New York: Free Press, 2003.
4. Karl E. Weick and Kathleen M. Sutcliffe, Managing the Unexpected, Second Edition, Wiley, San Francisco, CA, 2007, p. 55.
5. Becoming a High Reliability Organization: Operational Advice for Hospital Leaders. AHRQ Publication No. 08-0022, April 2008, Agency for Healthcare Research and Quality, Rockville, MD. Appendix E: Case Studies in High Reliability Applications: Medication Dispensing Machine Redesign and Executive Walkarounds at Sentara Leigh. Accessed December 10, 2008.
6. Robert J. Shiller, "Challenging the Crowd in Whispers, Not Shouts," The New York Times, November 2, 2008, Business section, p. 5.
7. Institute for Safe Medication Practices, "Safety requires a state of 'mindfulness' (Part I)," March 9, 2006 issue. http://www.ismp.org/Newsletters/acutecare/articles/20060309.asp?ptr=y Accessed December 10, 2008.