Chapter 70
Blair L. Bigham and P. Daniel Patterson

“First, do no harm” (origin unclear)

Introduction

Thousands of patients are treated by EMS providers each day. For most of these patients, their exposure to the health care system will improve their well-being. Some, however, will experience unintentional harm or be put at risk of being harmed. The landmark Institute of Medicine report To Err is Human: Building a Safer Health System brought to light the effects these risks and harms can have on patients and systems throughout the health care industry [1]. Since its release, health care systems and practitioners from a broad spectrum of fields have worked toward understanding the threats to patient safety, researching the factors that contribute to unintentional harm, and developing methods to reduce, eliminate, or mitigate accidental harm.

The term adverse event describes an occurrence that results in unintended and detrimental morbidity or mortality (patient harm). Adverse events are thought to stem from systemic weaknesses, individual behaviors, or a combination of the two. It has been estimated that one-third of patients admitted to acute care hospitals experience at least one adverse event [2]. The uncontrolled and time-sensitive prehospital setting presents unique challenges that make adverse events all the more likely to occur.

The concept of EMS agencies as high-reliability organizations (organizations that operate relatively error free over long periods of time) is new. Long ago embraced by the nuclear power, aviation, and military industries, high-reliability organizations avoid catastrophes, consistently make safe decisions, and sustain high-quality, reliable operations. This chapter summarizes the challenges and risks of emergency medical care delivered in the field, presents mechanisms that can address these challenges and reduce these risks, and provides a framework for becoming highly reliable.

How accidents happen

There is considerable research and theory focused on the predictors of error and adverse events in high-risk settings. Health care has borrowed from this work and adopted many of the concepts and practices that improve safety. Programs widely used in health care, such as the Agency for Healthcare Research and Quality’s TeamSTEPPS, are based on this prior research and theory [3]. Many of these programs or interventions may be active in the hospitals where medical directors practice. Below is an overview of the most common and widely accepted concepts in safety, which may aid medical directors in their efforts to adopt, adapt, or develop programs specifically for their EMS organizations and systems.

The Swiss cheese model

Several factors can affect patient safety in EMS, and rarely does any one factor act alone to create an adverse event. These factors may be human, depending on people performing or omitting certain actions, or systemic, depending on procedures, administrative controls, or engineering and design. When people and systems function properly, these layers work to protect patients from hazards. However, weaknesses can occur. The Swiss cheese model [4] likens these weaknesses to the holes in slices of Swiss cheese: when many slices are stacked, the holes rarely line up so that one could peer through the whole stack, but when the slices align in just the right way, a trajectory through the cheese opens and an adverse event can occur (Figure 70.1).
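Why stacked imperfect barriers are so protective can be shown with a brief arithmetic sketch. The failure rates below are invented for illustration, and the assumption that the layers fail independently is a deliberate simplification:

```python
# Illustrative only: the chance that the holes in every defensive "slice"
# line up, assuming (unrealistically) that layers fail independently.
# The per-layer failure rates are invented for this example.

def p_adverse_event(layer_failure_rates):
    """Probability that every layer fails at once."""
    p = 1.0
    for rate in layer_failure_rates:
        p *= rate
    return p

# Four imperfect barriers, each failing 10% of the time:
print(p_adverse_event([0.1, 0.1, 0.1, 0.1]))  # ~1e-4, about 1 in 10,000

# Adding a fifth, equally imperfect barrier cuts the risk tenfold:
print(p_adverse_event([0.1] * 5))             # ~1e-5, about 1 in 100,000
```

One caution: when a single systemic flaw weakens several layers at once, the failures are no longer independent, and the true risk is far higher than the product suggests.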
The model attributes these holes to two conditions: active failures, unsafe acts committed by people, and latent conditions, systemic flaws in design or process that allow hazards to be present. When active failures and latent conditions align in just the right manner, an adverse event can occur. While human error often contributes to adverse events, humans are considered the last slice of cheese in the Swiss cheese model.

System factors

Because humans are, by nature, not highly reliable, organizations install additional slices of cheese to make processes safer. These system factors can include the workplace culture itself, written policies and procedures, training in processes and best practices, and technological or engineering solutions that account for human fallibility. Examples of system factors that may lead to ambulance collisions include policies that require lights and sirens use, poor training in emergency vehicle operation, a culture that glorifies speedy driving, and vehicles with poor reflective markings [5]. Examples of system-level safety improvements to these problems include evidence-based algorithms that recommend judicious lights and sirens use, provision of specialized vehicle operator training, a culture that emphasizes safety over speed, and ambulances with science-guided reflective markings [5].
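To make the idea of an algorithmic safeguard concrete, the fragment below sketches what a dispatch decision aid for lights-and-sirens use might look like in software. The criteria, function name, and threshold are hypothetical placeholders, not clinical guidance; a real implementation would encode a protocol approved by medical oversight:

```python
# Hypothetical sketch of a lights-and-sirens (L&S) decision aid.
# Complaint list and threshold are invented for illustration only.

TIME_CRITICAL_COMPLAINTS = {"cardiac arrest", "airway obstruction", "major trauma"}

def recommend_lights_and_sirens(complaint: str, minutes_saved: float) -> bool:
    """Recommend L&S only for a time-critical complaint where the expected
    time saved is meaningful; otherwise the collision risk of an L&S
    response outweighs the benefit."""
    if complaint not in TIME_CRITICAL_COMPLAINTS:
        return False
    return minutes_saved >= 2.0  # placeholder threshold, not evidence

print(recommend_lights_and_sirens("abdominal pain", minutes_saved=3.0))  # False
print(recommend_lights_and_sirens("cardiac arrest", minutes_saved=3.0))  # True
```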
Many different organizations work together to ensure EMS services are provided to the community. These include all the partner organizations that contribute to a tiered response: municipal fire and police agencies, ambulance dispatch centers, base hospitals providing medical oversight, and receiving hospitals. With these multiple groups come inherent opportunities for miscommunication and adverse events. Fragmented oversight of the system can lead to situations where the same adverse event goes unrecognized and recurs repeatedly. Interagency collaboration and training can improve team performance [6].

Human factors and ergonomics

Human traits that contribute to adverse events are known as human factors [7]. Examples of human factors that can negatively affect patient safety include complacency, fatigue, limitations of eyesight, and inattention [8]. However, it is important to remember that human factors also contribute to safety, as human action or inaction is often the last “slice of cheese” protecting patients. Examples include seeking clarification from a partner or developing strong habit patterns for checking medication concentrations.

Task fixation is a common human factor that can contribute to error in EMS. Commonly termed “tunnel vision,” it can occur during endotracheal intubation or 12-lead ECG acquisition. Providers become so focused on a task perceived to be important that changes in the patient’s condition, such as desaturation, or competing priorities, such as chest compressions, are excluded from thought. Many EMS procedures involve numerous sequential actions and decisions; once the step perceived as critical is completed, it is common for downstream actions to be forgotten. An example is failure to release a tourniquet after placing an intravenous cannula.

Another term used alongside human factors is ergonomics. It refers to physical human limitations and is most commonly employed in developing work environments that complement the human body. Applying ergonomic science has brought about color- and font-coded medications, advanced “track system” stairchairs, and cardiopulmonary resuscitation (CPR) metronomes.

A classic example of a common adverse event that was addressed using human factors and ergonomic science is the tourniquet. Previously made of latex similar in tone to light skin, tourniquets were frequently left applied by phlebotomists and other health care workers after collecting blood. Changing the color of tourniquets to bright blue dramatically reduced the incidence of retained tourniquets; the visual cue of the bright blue was all that was required to help providers remember the step of tourniquet release.

Communication is also a key component of safety [9]. Not being heard, or being heard incorrectly, can lead to a task not being performed, the wrong task being performed, or a task being performed in the wrong way. Examples include medication errors and procedures performed on the wrong limb. Callouts are used to ensure clear communication among all members of a team; yelling “Clear!” before discharging a defibrillator is an example of a callout. A readback occurs when the receiver repeats the message back to the sender: for example, after one provider orders a medication, the partner repeats the drug, dose, and route before drawing it up. When a readback does not happen, the sender should challenge the receiver to make sure he or she interpreted the message correctly, for example by asking the receiver to confirm the drug, dose, and route.
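The same closed-loop pattern can be expressed in code. The sketch below is a hypothetical model of readback verification, of the kind one might embed in a simulation or training tool; the function names and matching rule are our own:

```python
# Hypothetical model of closed-loop (readback) communication: an order
# counts as acknowledged only when the readback matches the order.

from typing import Optional

def normalize(message: str) -> str:
    """Collapse case and whitespace so trivial differences do not count."""
    return " ".join(message.lower().split())

def order_acknowledged(order: str, readback: Optional[str]) -> bool:
    """A missing readback should trigger a challenge from the sender;
    a mismatched readback means the message was heard incorrectly."""
    if readback is None:
        return False
    return normalize(order) == normalize(readback)

print(order_acknowledged("epinephrine 0.3 mg IM", "Epinephrine 0.3 mg IM"))  # True
print(order_acknowledged("epinephrine 0.3 mg IM", None))                     # False
print(order_acknowledged("epinephrine 0.3 mg IM", "epinephrine 3 mg IM"))    # False
```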
Judgment and clinical thinking

There is limited research describing the actions, inactions, and clinical decision making of EMS personnel in relation to safety. Prehospital care providers exercise clinical decision-making skills on every call, and two key outcomes of these decisions are working diagnoses and treatment plans. Protocols and guidelines are often used to help field clinicians arrive at an accurate working diagnosis, which increases the likelihood that a correct treatment plan will be initiated. However, error exists in this area. Physicians are estimated to make a misdiagnosis in 10–15% of cases, and this rate is likely higher in emergency medicine [10].

Over 100 biases contribute to error in emergency medicine, and many can be related to the cognitive pathways emergency workers use to arrive at decisions [11]. The first is the intuitive pathway, developed through repeated experience, in which patterns are recognized quickly and interventions are applied without much thought. While this pathway serves paramedics well, there will be times when intuition is wrong. Following an analytical pathway can improve reliability in decision making by applying conscious, deliberate thought processes to a clinical situation. While this may take longer, the process of careful examination and testing can improve upon the intuitive pathway. Analytical reasoning is resource intensive and requires a state of mind that can be clouded by workload and by human factors such as personal stress, sleep patterns, and diet [12].

By understanding how emergency physicians and prehospital care providers think in the clinical setting, we can begin to appreciate how patient safety is safeguarded by sound clinical decisions, and how poor decisions can lead to disaster. Remembering that nearly all clinical staff want to perform well and improve patients’ lives, it is important to examine poor clinical decisions from a system perspective and not place blame on individuals. Addressing clinical decision making is best done with educational strategies that train clinicians how to think critically. Described as the “ability to engage in purposeful, self-regulatory judgment” [13], critical thinking permits clinicians to make treatment decisions via the analytical pathway when needed, specifically to “double check” and override the intuitive pathway. An example of this metacognition is a paramedic who walks into a residence and sees a diaphoretic patient clutching his chest. Intuitively the paramedic may think, “Oh, this guy is having a STEMI!” but the analytical process of obtaining a 12-lead ECG and inquiring about risk factors and incident history may reveal a presentation more suggestive of aortic dissection, pulmonary embolism, or cocaine toxicity. Croskerry describes the development of critical thinking, which, while not innate, can be “taught and cultivated, but even accomplished critical thinkers remain vulnerable to occasional undisciplined and irrational thought” [12].

Patient safety in EMS

In contrast to hospital settings, there is a striking lack of epidemiological data pertaining to adverse events in the prehospital setting, despite a recognized need to better understand patient safety in EMS systems [14,15]. While there is some evidence documenting medical error by prehospital care providers [16], research from time-sensitive areas of the hospital, such as the critical care unit and the emergency department, can also shine a light on adverse events that likely occur in the field as well. In one retrospective chart review of 15,000 cases, the emergency department was the most prevalent location in the hospital for negligent adverse events [17], and others have made efforts to establish definitions and measurements for error in emergency medicine [18].

The unique environment

Emergency medical services personnel often work in small, poorly lit spaces in an environment that is chaotic, unfriendly, and challenging for time-sensitive health care interventions; indeed, it is often the dangerous nature of the environment that prompted the call for help. Unlike hospitals, emergency scenes can be filled with distractors that increase the odds that an adverse event will occur. Physical characteristics of these scenes include loud noises, poor lighting, uncontrolled movement of people and vehicles, and confined spaces. Language barriers, noise, stress, and medical conditions may limit effective communication between providers and their patients. Providers often work from compact bags rather than large, well-labeled cabinets and drawers, which reduces the opportunity to place visual cues or organize equipment for optimal performance. In addition to these challenging environmental factors, emotional stressors are often heightened by the presence of panicked family members and curious bystanders, and by a lack of human and medical resources. The time-sensitive nature of EMS care further compounds these physical and emotional stressors. EMS work can also be complicated by multiple handoffs, from BLS providers to ALS providers to air ambulance crews and finally to hospital staff. Lastly, EMS work is around the clock, and EMS workers often endure 12-, 14-, or 24-hour shifts with few opportunities for meals or rest [19]. This can lead to fatigue, which is known to play a role in adverse event rates [8,20]. The arena in which EMS providers work is rich with opportunities for adverse events attributable to provider or system errors. Importantly, unintentional error can have profoundly negative effects on EMS providers themselves [21,22].
Increased stress, time away from work, family disruption, job burnout, divorce, depression, and suicide among health care workers have all been correlated with adverse events [22].

Defining patient safety in EMS

There is no common language used to define adverse events in the EMS setting, making general discussion and comparison challenging. The World Health Organization defines patient safety as the “reduction of risk of unnecessary harm associated with health care to an acceptable minimum” [7]. The term “acceptable minimum” reflects collective notions of what is achievable given current knowledge, the resources available, and the context in which care was delivered, weighed against the risk of non-treatment or other treatment [7]. In other words, the acceptable minimum risk fluctuates with the context of the health care delivery system: what may be considered an unacceptable risk in an operating theatre may be an acceptable risk in the prehospital setting, and vice versa. Harm need not occur for patient safety principles to apply; potential risks of unintended harm, termed near misses, are of core interest because they represent opportunities to implement safer practices before harm has been inflicted. Examples of near misses include drawing up, but not administering, the wrong medication, or charging a defibrillator when a patient has a palpable pulse.
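These distinctions matter when events are recorded. As a sketch, an EMS incident-reporting system might represent them with a small taxonomy like the one below; the categories follow the definitions above, but the field and type names are our own invention, not a standard schema:

```python
# Hypothetical data model for an EMS safety-event report, distinguishing
# adverse events (harm reached the patient) from near misses (harm was
# possible but did not occur). Field names are illustrative only.

from dataclasses import dataclass
from enum import Enum, auto
from typing import List

class EventType(Enum):
    ADVERSE_EVENT = auto()  # unintended harm reached the patient
    NEAR_MISS = auto()      # unsafe act or condition; no harm resulted

@dataclass
class SafetyEventReport:
    event_type: EventType
    description: str
    contributing_factors: List[str]  # e.g., fatigue, lighting, handoff

report = SafetyEventReport(
    event_type=EventType.NEAR_MISS,
    description="Wrong medication drawn up; error caught before administration.",
    contributing_factors=["look-alike packaging", "poor ambient lighting"],
)
print(report.event_type.name)  # NEAR_MISS
```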
Culture of patient safety