Introduction to Patient Safety


Factors that contribute to human error include the following:

  • Fatigue

  • Lack of sleep

  • Illness

  • Drugs or alcohol

  • Boredom, frustration

  • Cognitive shortcuts

  • Fear

  • Stress

  • Shift work

  • Reliance on memory

  • Reliance on vigilance

  • Interruptions and distractions

  • Noise

  • Heat

  • Clutter

  • Motion

  • Lighting

  • Too many handoffs

  • Unnatural workflow

  • Procedures or devices designed in an accident-prone fashion



As technology continues to advance and more and more electronic tools become available, it is necessary to understand how humans interact with that technology. Many errors occur when the interface between humans and machines is poorly designed. An example of well-designed technology is the smartphone: developers purposely designed the phone to be easy to use right out of the box, with easily recognized icons and intuitive features. In The Design of Everyday Things (Basic Books, 2013), Donald Norman offers an easy way to understand the impact of poor design on human actions. He describes affordances, features of a device or an environment that help a user perceive how to perform an action. An example is door design: a flat bar implies that a push is needed to open the door, whereas a handle implies a pull. Proper design of processes and equipment must be taken into account when making improvements. The Food and Drug Administration (FDA) offers advice on how to incorporate human factors into the design of equipment [28].



System Design


A system is a number of processes or steps that interact with each other to achieve a desired outcome. James Reason uses this definition to distinguish between active and latent errors; latent errors are those that result from poor system design [29]. The common approach to managing errors was to train and educate individuals and/or to punish them, driven by the expectation that individuals will execute flawlessly. What we have learned, however, is that errors are common: even the best-trained individual will find himself or herself in a position to make an error. Disciplining or removing the individual who made an error does not prevent someone else from making the same error if the contributing factors remain part of the system. Reason referred to these as latent errors: errors just waiting to happen. The causes of latent errors include poor design, situations where staff are constantly distracted, complex protocols, policies that do not support evidence-based practices, and pressures from management and others that cause individuals to take shortcuts.


Reason's Swiss Cheese Model


Many times there is a series of steps in a process intended to block an error from reaching the patient. Reason likened these barriers to slices of Swiss cheese (Fig. 3.1). The holes represent flaws in the system that may go undetected until an event occurs. The more layers, and the smaller the holes in each layer, the higher the chance of blocking an error. At times, however, all of the holes line up, and the error reaches a patient. Error-reduction efforts should focus on strengthening the design and defenses of the system so that the opportunity for an error to occur, and for any error to reach a patient, is minimized.
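The layered-defense argument can be made quantitative: if each barrier is treated as an independent filter, the chance that an error penetrates every layer is the product of each layer's "hole" probability, so both adding layers and shrinking holes drive the breach probability down multiplicatively. A minimal sketch (illustrative only, not from the chapter; the function name and probabilities are invented for the example):

```python
# Illustrative sketch of the Swiss cheese model, assuming independent layers.
# "hole_probs" gives, per layer, the probability that the layer fails to
# catch an error (the size of its "hole").

def breach_probability(hole_probs):
    """Probability that an error slips through every defensive layer."""
    p = 1.0
    for hole in hole_probs:
        p *= hole  # independence assumed: failures multiply
    return p

# Three layers, each catching 90% of errors (10% holes): breach ~ 1 in 1,000
few_layers = breach_probability([0.1, 0.1, 0.1])

# Five layers with smaller holes (5% each): breach ~ 3 in 10,000,000
more_layers = breach_probability([0.05] * 5)

print(few_layers, more_layers)
```

Real defensive layers are rarely independent (a staffing shortage widens the holes in several layers at once), which is why the model argues for diverse, well-designed defenses rather than simply more of the same check.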



Fig. 3.1
Reason's Swiss Cheese Model for Error


Normalization of Deviance (Amalberti)


Amalberti and colleagues introduced the concepts of violations and migration and provide a framework to understand and manage them [30]. Violations are deliberate deviations from standard protocols, and they may have bad or good results: bad when a patient is harmed, good when the protocol is violated because of its complexity yet the outcome for the patient is good. The problem is that unless someone is harmed, these violations are seldom acknowledged or tracked; in fact, they are sometimes encouraged and accepted, and they become the norm. Managers build systems and processes within which they expect clinicians and staff to work, assuming that operations stay in a safe space. Because of a myriad of external pressures or the complexity of procedures, individuals migrate away from that safe space, to the point where they may no longer be following the protocols, just to complete tasks as expected. Amalberti calls this phenomenon migration to an illegal normal space, and that is where many in healthcare function every day. The systems and processes put undue pressure on clinicians, resulting in work-arounds and violations, and the further someone drifts from the safe space, the greater the chance of serious harm. Managers are usually unaware that staff are performing in this space until something bad happens and there is an investigation; staff are unlikely to tell managers that they are working in the illegal normal space because they fear being punished. It is the responsibility of managers to understand staff performance and the pressures that may be forcing individuals into this space, and to correct the processes in the safe space so that people can use them as designed.



How Often Do Errors Occur?


How often errors occur remains unknown, and different error measurement strategies lead to very different results [12]. Mandated reporting by federal and state agencies, as well as nongovernmental groups such as the Joint Commission, may be useful to identify a subset of serious adverse events, particularly so-called "never" events such as wrong-site surgery. Adjusted hospital mortality rates likewise measure safety only at the very crude level of extreme events; this measure is even less useful in pediatrics, where overall mortality is lower and variation between hospitals is hard to measure [31]. Other sources of error detection range from regional or national malpractice claim data to mortality and morbidity conferences within a specific program. Many healthcare organizations use internal safety event reporting systems to measure safety within their own systems, but even in an organization with a very strong safety culture, such reporting will miss many events. At the other end of the spectrum, direct observation finds a higher rate of error than chart review, but both are extremely expensive and impractical to use outside of a research setting [32]. Automated review of discharge codes to detect adverse events has been shown to have relevance for pediatrics [33, 34]. The Institute for Healthcare Improvement (IHI) Global Trigger Tool detects adverse events at nearly ten times the rate of the AHRQ Patient Safety Indicators [12], and a modified pediatric version has also been developed [35]. We have very limited tools to measure harm in ambulatory care and in the patient's home, or to measure preventable harm and potential adverse events in all settings.



How to Make Healthcare Safer




An Institutional Response to Patient Safety

In March of 1995, the leaders of the Dana-Farber Cancer Institute (DFCI), along with many others around the country, woke to this headline:

"Big Doses of Chemotherapy Drug Killed Patient, Hurt 2d." The two patients, one a reporter for the Boston Globe, had received a fourfold overdose of chemotherapy, which caused life-ending damage to their hearts. The normal reaction at the time would have been to find out who was involved and discipline or dismiss them so that they could not hurt someone else at the institution. During the investigation by a number of agencies, including The Joint Commission; the Boards of Registration in Medicine, Nursing, and Pharmacy; and the Department of Public Health, it became evident that the clinical team involved included very capable and experienced individuals. The investigation also identified numerous deficiencies, including protocol violations, ineffective drug-error reporting, and inadequate oversight of quality assurance by hospital leaders.

The response from DFCI leadership included the following:



  • New rules were adopted mandating close supervision of physicians in fellowship training.


  • Nurses were required to double-check high-dose chemotherapy orders and to complete specialized training in new treatment protocols.


  • Interdisciplinary clinical teams reviewed new protocols and reported adverse events and drug toxicities.


  • A trustee-level quality committee was reorganized and strengthened.


  • Discussions were begun regarding the transfer of inpatient beds to nearby Brigham and Women’s Hospital.

However, as important as these changes were in decreasing the opportunity for error, the leaders of the organization, under Chief Operating Officer James Conway, learned that other, more profound changes contributed to improving safety.

First was the adoption of a systems approach and design to prevent errors. Understanding the contribution of human factors to the error, DFCI worked to design systems to prevent errors, including protocols and templates for chemotherapy ordering, as well as technology to assist in the process. The application of the principles of standardization and simplification was critical to this change.



  • Safety was no longer to be viewed as someone else's problem. All clinical staff and leaders, up to the Board, had responsibility and accountability for ensuring safe practices.


  • DFCI developed a learning system through which staff and others collected and analyzed information from reporting systems, pharmacy interventions, and safety rounds. This analysis helped to identify opportunities for improvement.


  • DFCI began the process of engaging patients in advisory councils that provided patients’ view of the system and what kind of improvements would help them be safer.


  • The staff at DFCI adopted the view that cancer care is very risky because of the condition of the patients and the medications used. As a result, clinicians and leaders committed to a relentless pursuit of continuous improvement. They recognized that mistakes will happen even in the best-designed systems and that it is the responsibility of all staff to identify these errors, mitigate their impact, disclose them to patients, and provide support to the clinicians involved.


Learning from Other Industries


We often hear that aviation and healthcare have much in common. There are differences, however: in aviation, the teams involved are smaller; the norms and processes for operating a plane have been standardized and are based on well-evaluated and well-practiced activities; and the equipment has been tested and will not react differently because of individual variation. Healthcare, on the other hand, involves teams with many players; best practices exist but may have to be individualized to the patient; there is more than one way to achieve the same result; and individual autonomy has been allowed. So why the comparisons [36]? John Nance, in Why Hospitals Should Fly, describes how a fictitious hospital takes the lessons learned in the aviation industry to achieve the same kind of reliability found in aviation [37, 38].

The comparisons between healthcare and aviation help us understand what should be in place to ensure that we provide the safest possible care for patients. Although many routines in healthcare have been standardized, healthcare providers also encounter highly unpredictable situations requiring rapid responses on a daily basis. In other high-risk industries, emergencies and departures from routine practice are unusual and to be avoided. In healthcare, it is not uncommon to encounter a patient with an unknown diagnosis, where the disease may be masked or complicated by comorbidities.

High-risk industries have developed a culture in which individuals share a common vision and work together as teams, communicate clearly and frequently, have flattened the hierarchy, see any defect as an opportunity to improve, and have developed a learning system so that improvements are shared with all who need to know. In healthcare, we identify these characteristics in a safety culture in which there is little tolerance for poor practice and staff are uniformly conscientious and careful [38].


Framework for Preventing Error/Maximizing Safety


To achieve long-lasting improvements in safety, it is necessary first to change the paradigm from treating safety improvement as a project to making it part of the organization's work in all ways, at all times. The second step is to use a framework that provides the skeleton upon which all of the work can be built. Two overarching components, a learning system and a culture, encompass a set of elements that must be in place and that depend on each other [39].

Common to each is the role of leadership. It is the responsibility of leaders at all levels of the organization to develop an environment of teamwork, psychological safety, and respect. Psychological safety is an environment in which people feel free to speak up, are respected, and are accepted [40]. Accountability means ensuring that individuals know their roles, are held to a standard of acting in a safe way, receive the appropriate training to act in that way, and are judged fairly. Teamwork and communication are key building blocks of safe care: healthcare providers develop a shared understanding, anticipate needs and problems, and agree on methods to manage these as well as conflict situations. Teamwork in high-risk industries has been empirically demonstrated to produce high-quality results [41]. Negotiation skills, the ability to gain genuine agreement on matters of importance to team members, patients, and families, are critical components of safety. Continuous learning refers to the organization's commitment to collect and learn from defects and to reflect on what changes are necessary to improve [42]. Improvement and measurement: to improve the processes we work in, organizations must adopt an improvement method that applies the appropriate techniques to the issues to be addressed. Measurement is a critical part of testing and implementing changes; measures tell a team whether the changes they are making actually lead to improvement. Reliability refers to the application of processes that ensure continued failure-free operation over time, in which patients receive evidence-based care. Transparency refers to respectfully sharing data and information with staff, patients, and families.

The patient safety movement urged us to move away from a culture of blame toward a blame-free culture, but the pendulum swung too far from one extreme to the other. Over time, we came to realize that we must strike a balance between blame and blame-free, between safety and accountability [43]. The biggest challenge in adopting this culture is implementation across the entire organization. Several guides are available: James Reason's decision tree for determining the culpability of unsafe acts [29], David Marx's Just Culture [44, 45], and the Fair Evaluation and Response Chart [46].

The Manchester Patient Safety Framework (MaPSaF) is a tool that helps National Health Service (NHS) organizations and healthcare teams in the UK assess their progress in developing a safety culture [47]. The framework can be applied in the acute care, ambulatory, mental health, and ambulance settings. The Agency for Healthcare Research and Quality (AHRQ) sponsored the development of patient safety culture assessment tools for hospitals, nursing homes, ambulatory outpatient medical offices, community pharmacies, and ambulatory surgery centers [48]. Similar to the Manchester tool, these allow organizations to assess the present state of their culture, identify where there are differences, identify strengths and opportunities for improvement, and conduct internal and external comparisons.

To change a culture, it is necessary to match strategy and culture. Ingrained attitudes and practices may be such that any new strategy is at odds with the prevailing culture. To build a different culture, one must act in the new way that is desired; by matching actions with beliefs, over time attitudes will change, and with them the culture.

Deming offered advice on improvement in his 14-point philosophy [49]. He included items such as making the vision clear: slogans may be memorable, but they may not clearly indicate the direction and what is expected of staff. He also advised that organizations continuously improve their processes and systems. This is the kind of change that will affect the culture of an organization; the phrase "act your way into believing" comes to mind.


Governance


Healthcare board members, senior executives, and physician leaders play key roles in patient safety. Patient safety depends on effective governance with highly engaged executive leadership teams working with highly engaged boards [50, 51]. Ensuring safe and harm-free care is a board responsibility, not one that is delegated to the executive leadership team. Table 3.2 illustrates Conway’s six key steps for boards [52]. Empirical studies have shown that boards demonstrating effective patient safety leadership have positive impacts on their organization’s safety performance and that boards that review and track their organization’s performance have better quality outcomes [53]. Although ensuring high-quality, safe care was already clearly within the fiduciary responsibility of hospital boards, the Affordable Care Act of 2010 emphasized that responsibility still further.


Table 3.2
Six key steps for boards

1. Setting specific, public, and transparent aims to reduce harm

2. Getting data and hearing stories that put a "human face" on harm data

3. Establishing and monitoring system-level measures to understand how the organization is achieving its aim(s)

4. Changing the environment, policies, and culture to maintain an environment that is respectful, fair, and just for patients, families, and staff

5. Learning, starting with the board: ensure the board and the staff are educated and knowledgeable about such topics as patient safety, leadership in patient safety, and strategies for improvement

6. Establishing executive accountability for clear quality improvement targets


Teamwork and Communication


Teamwork and communication are critical to healthcare delivery, which depends on multiple individuals and systems. Communication failure is a major contributing factor in 70% of sentinel events [54]. Multiple reviews have shown that various aspects of team function contribute to team performance [55–57]. Effective teamwork has been shown to be a critical ingredient in multiple aspects of patient safety, including the reduction of safety events, increasing safety culture, improving communication, improving staff satisfaction, and decreasing staff turnover [58]. There has been increasing recognition that patients and families can and should be core members of healthcare teams in addition to staff. Bedside multidisciplinary rounds and bedside report include patients and families in the care team. Inclusion of patients and families in other teams, such as process improvement or safety teams, is necessary to ensure that patient-centered care is designed with patients and families, not for them.

High functioning teams have a common purpose, a shared mental model of the situation and the goals, effective communication, a common understanding of how each team member can contribute to the outcome, mutual trust with good cohesion and respect among team members, effective leadership, good situational awareness, and the ability to resolve conflicts. All members of the team participate in the work, and all feel comfortable speaking up regardless of rank or role. Leadership within a team is clear but flexible, and the same individual does not always serve in the leadership role. Conflicts can be raised and resolved. Teams emphasize “we” and “us” not “I” and “me.”

Effective strategies to improve teamwork focus on the cognitive and interpersonal skills needed to manage a process within a system rather than specific technical knowledge and skills. Team training focuses on facilitating human interaction and provides opportunities to practice and develop the necessary skills [57]. The principles of team training began with crew resource management (CRM) in the aviation industry and were first applied in healthcare in the 1990s [59]. TeamSTEPPS™ is a team training program developed by AHRQ specifically for use in healthcare [60].

Specific communication strategies facilitate team function. Structured briefings are opportunities to increase situational awareness, set a common goal, share information, and improve teamwork. De-briefings after an event, a simulation, or routine patient care provide an opportunity for teams to assess their own performance and identify opportunities for improvement. Planned and unplanned huddles help reestablish situational awareness and review existing plans and assess the need to adjust the plan.


Aug 14, 2017 | Posted in ONCOLOGY
