
The Health Care Leader's Role in Safety

November-December 2019

BY: KIM HOLLON, FACHE

Join me in a thought experiment: You are a railroad track operator whose job is to pull a manual lever that switches a train from one track to another. A train is barreling toward disaster, and pulling the lever would move it to a safer track. You were never properly trained, and so you fail to do so. A number of people die as a result. In this straightforward example, it is easy to see that the operator is at least partly at fault for the harm. In health care, when we fail to establish systems that protect our patients, the fault is less evident, but the harm is just as real.

The safety and quality of care are heavily influenced by an organization's leadership: what we focus on and how we develop systems that help people make the best choices. The knowledge leaders need to attain high reliability in health care is not uniformly understood; it is not something I was taught, nor something I have witnessed at any other facility. Information and education from other industries are available for us to learn from, but we must adapt what we learn to fit our industry.

My journey toward high reliability has been circuitous. My aim has been constant, but the path of my education, from mentor to mentor, has been a winding road. As a consequence, I sometimes wonder what new management concept or standard might move health care system teams toward zero harm more quickly. I know from experience that if I had known years ago what I know now, lives could have been saved and suffering avoided. Technically, the education on high reliability may always have been available to me, but it certainly wasn't easily found or adapted.

I have been a C-suite executive of a hospital or health system for over 30 years. I have been hardworking and conscientious, focused on caring for our vulnerable populations, improving quality and providing value, always with an understanding of the privilege we have to serve the sick. But if I am brutally honest with myself, I have been among the leadership of an industry that has been aware of significant safety problems since at least 1999 when the Institute of Medicine published To Err Is Human: Building a Safer Health System.1

SLOW PROGRESS ON WELL-KNOWN ISSUE
When I first read that 44,000 to 98,000 people died unnecessarily each year in U.S. hospitals, I was shocked, certain there must be problems with the study and that it couldn't possibly reflect conditions in our hospital. I quickly purchased the book and read it cover to cover, trying to understand the research and recommendations. I became convinced that we had to do something, but I didn't fully understand the recommendations, and I found few local peers who had better answers for what to do, or who had even read the study. A year later, I spoke to 100 young executives about quality improvement and asked how many of them had read To Err Is Human. Only one person raised a hand. That was the first of many times I have discussed quality or safety with health care leaders and boards across the country and realized that they lacked up-to-date knowledge about the importance of our role in ensuring safe, high-quality treatment of the patients under our care.

Looking back, I only thought I understood my role. That's because I thought we were focused on quality and safety and that we held people accountable for it. I knew from data that our quality measures were better than average. I believed I was that trained railroad worker who daily pulled the lever to avert an unintended consequence. I prided myself on the quality improvement that our teams had accomplished.

When I had the opportunity to join and lead Massachusetts-based Signature Healthcare, I took to heart the philosophy of Lean, a well-known management approach, and began a comprehensive transformation of our strategic planning, daily operating systems, communication, process improvement, inventory management and human resource systems. In many organizations, Lean management is implemented as a process improvement technique or set of tools, and few consultants would recommend a system-wide blitz implementation of Lean to this degree. However, I was eager to take what I was learning and apply it across the entire organization. So we started shift-based daily huddles, transparent public posting of departmental goals and performance metrics, workplace standardization, a standard problem-solving method to determine root causes and countermeasures, a suggestion system, and a monthly meeting process. We standardized a significant portion of all leaders' work to include engaging their teams in daily improvements to our processes and outcomes using Lean concepts and tools. I mandated that all leaders learn and adopt a new way to manage, turning their personal management systems upside down. They focused on observing employees and the processes they used to accomplish work. They took note of variation and waste, then coached staff to improve their own work rather than making changes from the top down.

Relearning how to lead after 30 years of success was difficult for me and for our entire team. During those implementation years, many of my closest allies kept asking why we were changing everything we did, because the systemic change was hard on the organization. I responded that as long as any of our patients received less than perfect care, we had a moral obligation to change how we managed, making it easier for our staff to reach zero harm.

Staying the course through a system-wide, no-exceptions-allowed leadership method change tested my resolve many times. Approximately 20 percent of our leadership team chose to leave rather than change their leadership style. The change for all levels of management — asking them to spend leadership time improving their understanding of the root causes of problems and waste, then coaching staff who perform the work to design their own improvements — has been hard but also transformative, both personally and for our organization.

We have encouraged employees to take ownership of improving the processes related to their jobs, and we have implemented over 6,000 suggestions per year. We have removed chaos from our environment through standardization, visual cues that reduce the chance of mistakes and a robust, standardized problem-solving method, generating exceptional improvements in quality. Our patients rarely suffer from infections, pressure ulcers, falls with injuries or other hospital-acquired conditions. In our ambulatory areas, we have dramatically improved diabetes and hypertension control, cancer screening rates, admission rates per 1,000 population and readmissions. For many public measures we are in the top 10 percent of performance, remarkable for an underfunded safety net health system. As a result of our improvement, the hospital and its medical group began to receive awards and recognition for quality. I believed we were becoming a highly reliable organization. Until … I realized that was just not true.

A CULTURE OF SAFETY
One Saturday morning in 2013, the day before I was to run a marathon, I read "High-Reliability Health Care: Getting There from Here."2 The article opened a new area of learning and transformed how I see my role as a leader. The next day, while running, I mulled over the authors' message that hospitals will never approach high reliability without a robust process improvement method, technology to help prevent errors and a culture of safety. I knew we had a great learning system and really good technology for error prevention. I believed we had a great culture, but I had no idea what a culture of safety was. As I admitted to myself that I had no working model of a culture of safety, nor of how to establish, improve and measure one, I had to acknowledge that my leadership was falling well short of what it should be. That may have been the first time I realized that I was the railroad worker who had failed to pull the lever that would have moved the train to the safer track, and that my failure was allowing harm to happen in spite of all the process improvements we had implemented.

As I learned more about high reliability and models of human error, it dawned on me that my belief that a hospital could error-proof itself and checklist its way to zero harm was fundamentally flawed. I began to admit I had not thought deeply about human errors, their causes, and how the organization influences them. I had heard of James Reason's Swiss Cheese Model of error prevention, but I thought his model related only to process improvement.3

Here's how I explained Reason's model, using a medication example: When a physician orders a medication on the computer, computerized entry prevents misreading the doctor's handwriting. When the pharmacist reviews the order, he or she can catch mistakes in dosage. When the nurse pulls the medication from the dispensing system, its safeguards help ensure it is the correct drug. And finally, when the nurse scans the barcodes on the patient's wristband and on the medication at the time of delivery, the double check confirms the right patient, right medication, right time and right dose. I thought each of those systems represented a defense layer, one of Reason's "slices of cheese." (Each "slice" is a barrier that blocks a problem, but each still contains holes, like Swiss cheese.) What I had missed was Reason's belief that the way we lead and influence behavior, thought and culture in an organization has just as much impact on errors as the physical and computer processes used to perform work. As the CEO, I was the architect and chief inspector of those organizational and cultural defenses, a job I was completely unaware of and untrained for. In some ways, I had figuratively been asleep at the switch.
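To make the layered-defense arithmetic concrete, here is a minimal sketch based on the medication example above. It is my illustration, not Reason's or the article's, and the catch rates are invented for the purpose. The point is that independent, imperfect barriers multiply their protection, and that losing any one slice, including a cultural one such as staff reliably performing the barcode scan, lets many more errors through.

    # Illustrative Swiss Cheese Model with hypothetical catch rates.
    # An error reaches the patient only if it slips through every layer.
    layers = {
        "computerized order entry": 0.90,
        "pharmacist review": 0.80,
        "dispensing safeguards": 0.75,
        "bedside barcode scan": 0.95,
    }

    def residual_risk(catch_rates):
        """Probability an error passes every layer (the holes line up)."""
        risk = 1.0
        for rate in catch_rates:
            risk *= 1.0 - rate  # fraction that slips through this slice
        return risk

    print(f"{residual_risk(layers.values()):.5f}")  # 0.00025, about 1 error in 4,000

    # If a rushed unit routinely skips the barcode scan, that slice is gone
    # and the residual risk rises twenty-fold, from 0.00025 to 0.00500.
    degraded = dict(layers, **{"bedside barcode scan": 0.0})
    print(f"{residual_risk(degraded.values()):.5f}")

In this toy model, no single layer is anywhere near perfect; the safety comes from the stack, and leadership determines how large the holes in each slice are and how often they line up.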

As I began to grasp the importance of this new role, I saw how systems theory impacted almost everything: it was a new lens through which to observe the health care delivery system. I began to read more widely and talked to experts about different aspects of safety: safety management systems, safety in health care, safe cultures, cognitive biases, human error theory and the design of a culture of safety. I also began to think more deeply about how Signature's leadership team organized for safety, including my personal biases regarding the relationship between boards and the CEO as it relates to safety and quality.

Typically, when I heard consultants say that boards set the expectation for quality and are important to high reliability, I scoffed at the notion, thinking it was a clever way for governance consultants to gain more work. In fact, I have often challenged people to explain exactly how a board impacts quality, and I've usually found the explanation lacking any implementable details. My personal experience is that board members are interested in quality and in serving the community, but their knowledge of medicine as a discipline and of health care as an industry is limited, and their understanding of quality and safety is rudimentary. I couldn't imagine how, with such limited knowledge, they could set a very high bar for safety.

In thinking more deeply about how Signature's board could add energy to our pursuit of zero harm, I decided we needed to talk more openly about constructive dissent in quality discussions and how executives and physician leaders can shut down probing questions. For about six months, we shortened all of our routine quality business matters and focused our board quality committee discussions on how the board members could become better coaches of the executives and physician leaders, and how we would measure success. Through trial and error, we also developed a checklist of questions that are asked at the end of each meeting, aimed at reducing the power distance and inviting any unspoken question. The result of this work has been surprising; our team has become less defensive in answering challenging questions and our board has begun to ask much better questions that help us think differently.

In addition to rethinking how I work with the board quality committee, I've begun asking very different questions when we experience employee or patient harm. I now see that human error that causes harm is almost always a consequence of the organizational system, not its root cause. We ask many more questions about what exactly that "system" is and what we as leaders can do to change it. More often than not, the system is missing essential safe supervisory practices, because most health care managers have developed their leadership habits within health care, and health care has few examples, if any, of high reliability at the institutional level.

A CLOSER LOOK
A great example of how supervision impacts safety can be found in eye injuries in health care. Two years ago, our most frequent mode of injury was a splash of fluid in the eye. These injuries rarely caused significant harm, but they were early warning signs that we were not practicing safety.

In reviewing the injuries, we noticed that in almost every instance the employee did not anticipate the splash and did not perceive any personal risk. We provided goggles for occasions when employees emptied containers, opened tubes in the lab or performed other "risky" processes, but employees did not have protective eyewear with them at all times, ready to use at a second's notice. When we began to discuss this as a leadership team, our managers initially did not support a rule that no one enters a patient's room without protective eyewear; they did not believe they could enforce it. For months we struggled with how to establish a protective eyewear policy when our employees and managers did not perceive the risk as high. We also discovered that eyewear worn for long periods has to be comfortable and has to protect against splashes coming from different angles, and finding comfortable options for staff who routinely wear glasses became a concern as well. We removed those perceived barriers by researching options and providing attractive, appropriate eyewear that staff were more likely to wear. But we continued to have injuries from lack of use.

After removing the barriers to wearing the eyewear, we began to work on the low perceived risk of injury by making certain that any time someone was injured anywhere within our system, everyone learned about the injury, how it happened and, in particular, whether the employee perceived any splash risk before the procedure. As we improved our leadership systems for communicating injury stories, we found that compliance improved. Now whenever there is an eye injury, we ask, "Does the manager have a system for the safety coaches to observe protective eyewear use on all shifts, and are the results reported to the team frequently?" Reducing barriers to doing the right thing, increasing employees' recognition of risk and reinforcing use through co-worker coaching are all organizational influences on human error, and all of them belong to leaders to design and implement. Signature leaders have become much better at examining our own behaviors, looking for the omission of these kinds of activities and preventing errors from becoming a consequence of our leadership failure.

As I have changed my leadership style and our organization has begun to change its collective leadership and culture, we have had surprising success. Since our initial Leapfrog safety grade of B, we have earned straight A's at every six-month rating. After several years of straight A's, we implemented a safety management system, integrated it into our Lean management system and reinforced it with standard leader work. To my surprise, we reduced our serious patient safety events by over 80 percent and have maintained that level of improvement for over three years. Experiencing that dramatic decline in harm affirmed what I was beginning to understand: that a culture of safety and robust process improvement are both necessary to reach zero harm. If anyone had told me 10 years ago that we could reduce our serious safety events by 80 percent, I would not have believed it, because I had no mental model of how different an organization could be.

Knowing what I know now, I have begun to think about the holes in defense systems outside the health care system itself that affect patient safety. The organizations and systems that influence how we lead in health care are themselves flawed. We do not adequately teach safety science in our graduate management programs; our industry's educational development systems do not provide the depth of education needed to support change; our regulatory agencies have not caught up to best practices in safety management; the media does not understand the intersection of safety and leadership well enough to help hold health care accountable; and insurers and employers do not know how to judge a safe organization. There are no safety certifications for health care boards, and the state and national health care associations seem more interested in protecting the status quo than in establishing meaningful measures or processes to speed the transfer of reliability practices.

With enlightened self-interest, boards should begin to call for increased public accountability, transparency and more rigorous external oversight. Just as the greatest athletes know they reach their potential only through a coach who can draw the most from their natural talent, we must increase the pressure for change through external influence. When we think about how vulnerable that makes our institutions, we should weigh that vulnerability against the vulnerability of our patients, who are suffering harm at unacceptable rates. If our industry has not solved this problem on its own in 20 years, it is not likely to solve it in the next 20. Our patients can't wait for us to improve at our current pace.

KIM HOLLON is president and chief executive officer of Brockton, Mass.-based Signature Healthcare, which comprises a safety net community hospital and an integrated medical group serving a diverse and socioeconomically challenged population south of Boston.

NOTES
  1. Institute of Medicine, Linda T. Kohn, Janet Corrigan and Molla S. Donaldson, eds., To Err Is Human: Building a Safer Health System (Washington, D.C.: National Academy Press, 2000).
  2. Mark R. Chassin and Jerod M. Loeb, "High-Reliability Health Care: Getting There from Here," Milbank Quarterly 91, no. 3 (September 2013): 459–90, https://doi.org/10.1111/1468-0009.12023.
  3. James Reason, A Life in Error: From Little Slips to Big Disasters (Burlington, Vt.: Ashgate, 2013), 74-75.


Copyright © 2019 by the Catholic Health Association of the United States

For reprint permission, contact Betty Crosby or call (314) 253-3490.