


© Springer Nature Switzerland AG 2020
Philip C. Spinella (ed.), Damage Control Resuscitation. https://doi.org/10.1007/978-3-030-20820-2_7



7. Hemostatic Resuscitation



Andrew P. Cap1, Jennifer M. Gurney2, and Michael A. Meledeo3


(1)
Medical Corps, U.S. Army, U.S. Army Institute of Surgical Research, JBSA-Fort Sam Houston, TX, USA

(2)
Joint Trauma System, US Army Institute of Surgical Research, JBSA-Fort Sam Houston, TX, USA

(3)
Coagulation and Blood Research, United States Army Institute of Surgical Research, JBSA-Fort Sam Houston, TX, USA

 



 


Keywords

Hemorrhage · Blood failure · Damage control · Resuscitation · Hemostatic · Whole blood · LTOWB · Platelets · Component therapy · Blood transfusion


Introduction


Evidence from the battlefield and the civilian prehospital environments is clear: hemorrhage is the leading cause of preventable death in trauma. The goal of the trauma medical community at large has been to reduce the number of these preventable deaths to zero [1]. Saving the life of an exsanguinating patient requires achieving two objectives: early hemorrhage control and appropriate hemostatic resuscitation. The more rapidly these are accomplished, the greater the chances of survival. There is a distinct reciprocity between hemorrhage control and appropriate resuscitation: hemorrhage causes a quantitative and qualitative failure of hemostatic function and, more broadly, of the global homeostatic function of blood and the endothelium that contains it. To understand the pathophysiology of hemorrhagic shock and how this drives therapeutic imperatives, the blood-endothelium unit should be thought of as an organ system and the shock state as organ failure. The rapid onset of blood organ failure leads quickly to failure of other, dependent organs and to death within minutes to hours, depending on the rate of hemorrhage.


The critical role of blood in supporting other organs has long been appreciated; however, the degree to which the dysfunction of the blood-endothelial organ contributes to hemorrhagic death, and the time course over which this develops, has only recently been acknowledged. Basic research combined with a focused empiricism approach to analyzing clinical data from military and civilian trauma systems has led to the development of “damage control resuscitation” (DCR) [2]. The concept of damage control has been embraced in the military’s battlefield continuum of care; it describes a paradigm of early recognition of hemorrhage and therapies to control it, facilitating the movement of casualties to higher levels of capability. The DCR bundle in the military’s battlefield trauma system includes prehospital hemostasis with tourniquets or hemostatic dressings; rapid surgical control of bleeding; avoidance of crystalloids; and hemostatic resuscitation, that is, a whole blood-based resuscitation with either whole blood or blood components in a 1:1:1 ratio that recapitulates whole blood [3, 4]. Hemostatic resuscitation is central to the DCR bundle of care.


Hemostatic resuscitation stands in stark contrast to prior resuscitation strategies that prioritized maintenance of circulating volume (generally with crystalloids) and restoration of oxygen delivery (with red blood cells) but neglected delivery of hemostatic products [5]. Hemostatic resuscitation has profound implications not only for treatment of the individual patient but also for the design and support of entire trauma and blood banking systems. The application of DCR principles, with hemostatic resuscitation as its core, in both military and civilian settings, has reduced trauma mortality and is revolutionizing trauma care. This chapter will discuss the physiologic principles underpinning DCR; the history of its empiric, clinical evolution; and the importance of whole blood and blood component transfusion strategies, in addition to factor concentrates and adjuncts to hemostatic resuscitation. The chapter also reviews potential treatment strategies for the future.


Coagulopathy Is a Manifestation of Blood Failure: The Need for Hemostatic Resuscitation


The term blood failure refers to the physiologic consequences of untreated hemorrhage (see Chap. 3 for a larger discussion of blood failure). Severe injury with resultant hemorrhage results in failure of oxygen delivery by blood (a quantitative deficiency), leading to accumulation of oxygen debt and a cascade of events driven by cellular hypoxia and metabolic failure (quantitative and qualitative deficiencies in blood function). Endotheliopathy develops within the first 30 min of hemorrhage-induced hypoperfusion and is characterized by release of tissue plasminogen activator (tPA) with activation of fibrinolysis, as well as loss of endothelial glycocalyx and associated dysfunction of endothelial regulation of permeability and interactions with the coagulation and immune systems [6]. From a historical standpoint, coagulopathy, as it relates to hemorrhage, was described and studied in both the Korean and Vietnam wars [7, 8]. Coagulopathy is now known to develop in parallel with endothelial dysfunction. The exposure of tissue factor on damaged tissue activates thrombin generation, which is dramatically amplified on the surfaces of activated platelets, leukocytes, and endothelial cells, as well as on microvesicles derived from these and other cells [9–11]. Activation of fibrinolysis is coupled with consumption of fibrinogen by the burst of thrombin activity, leading to clot formation but also to increases in circulating fibrin monomer and fibrin degradation products [12, 13]. Oxygen debt also directly alters coagulation function through oxidative stress effects on fibrinogen [14]. The combination of these factors can interfere with normal fibrin polymerization and the formation of stable clots. The net result is a quantitative loss of clotting capacity as well as a qualitative defect, both of which are manifest in traditional clotting time-based coagulation assays (e.g., prolongation of the prothrombin time, PT) and in viscoelastic assays of clotting. Traditional coagulation tests like the prothrombin time are also affected by the association of factors V and VIII with activated phospholipid surfaces, where they serve as anchors for the assembly of the tenase and prothrombinase complexes, thus reducing their levels in the circulating blood that is sampled for clinical testing. In some respects, this aspect of measured coagulopathy may be considered an artifact, though it does reflect the tremendous mobilization of resources to generate thrombin in vivo. Though plasma anticoagulant pathways that regulate thrombin generation, such as the protein C and S systems, are activated by elevated thrombin generation, the prolongations in clotting times observed in hemorrhagic shock patients occur despite both elevated thrombin generation potential and evidence of prior and ongoing thrombin generation (e.g., elevated levels of thrombin-antithrombin complexes), as well as rapid fibrinogen consumption and degradation [15, 16].


In light of this, the early coagulopathy of trauma could be seen as a failure to regulate exuberant thrombin generation and fibrinolysis, leading to a fundamentally consumptive coagulopathy, rather than an anticoagulated state in which thrombin generation is inadequate to the hemostatic challenge [17–20]. In addition, platelets, the vital cellular effectors of hemostasis, are rapidly activated (in part by exposure to elevated thrombin levels) and consumed early in response to injury but then develop qualitative functional defects, such as loss of aggregation function, through a combination of metabolic failure and inhibitory signaling, which magnify the effects of coagulopathy in the plasma [21, 22]. Decreases in platelet number and function are correlated with increased trauma mortality [23, 24]. During hemorrhage, the combination of the loss of blood cells and the movement of interstitial fluid into the vascular space results in hemodilution. Hemodilution is exacerbated by the loss of endothelial barrier function and decreased Starling forces and amplifies functional defects in coagulation and platelet function. Loss of red blood cell mass reduces the buffering capacity of blood, exacerbating lactic acidosis, and leads to loss of platelet margination to the edges of the blood flow stream where they can attach to damaged tissue and initiate hemostasis [25, 26]. Red cells provide the bulk of clot mass and contribute to blood viscosity; their loss in hemorrhage is thus a central feature of coagulopathic bleeding. The downward spiral of blood failure encompasses all of the sub-systems of the hemo-vascular organ, including loss of vasomotor regulation, beginning with systemic vasoconstriction due to adrenergic hyperactivity [27] followed by vasoplegia, and immunopathology resulting in systemic inflammation and dysregulated innate and adaptive immunity [28–31]. The loss of hemostatic homeostasis in catastrophic hemorrhage described above reflects first and foremost the quantitative loss of whole blood, not just depletion of any single component. That central fact underlies the resultant pathophysiology and implies a treatment approach: lost organ function must be replaced. Hemorrhage control cannot be successful without restoration of oxygen delivery and hemostatic function. Severe blood loss must be treated with blood transfusion. Indeed, hemostatic resuscitation can be thought of as the core organizing principle of DCR. Application of these concepts has required a major change in clinical practice and the organization of trauma systems. The evolution of this transformation is described below.


The History of Damage Control and the Coaptation with Hemostatic Resuscitation in Trauma Care


Damage control resuscitation represents the convergence of concepts and interventions that control bleeding and treat blood failure. Principles such as early mechanical hemorrhage control and hemostatic resuscitation are the pillars of DCR. The term “damage control” has its roots in the US Navy, where it refers to refocusing the efforts of a damaged ship’s crew on fire control and leak containment. The translation of this concept to the massively hemorrhaging patient, who will succumb to their injuries without rapid intervention, is germane, especially when considering the military continuum of battlefield care. In the Navy, damage control comprises measures to keep a severely damaged ship afloat with temporary salvage techniques so that it might survive to arrive at a port for formal repairs. It is a process that requires quick decision-making and often painful trade-offs in stabilizing a devastating situation and curtailing losses. The Damage Control Handbook, published in 1945 by the Bureau of Naval Personnel, describes the rapid salvage approaches for damaged ships: “If the ship does not sink within a very few minutes after damage, she probably will survive for several hours.” The parallels in trauma management, especially when it comes to interventions for hemorrhage, are readily apparent. Naval damage control has four goals: extinguish the fire, stop the flooding, repair machinery, and provide care to wounded personnel. Applied to the care of traumatically wounded patients, these goals become: stop the bleeding and minimize contamination, temporize nonlethal injury, stabilize the patient’s metabolic disturbances, and then later perform definitive repairs. Hemostatic resuscitation is essential for bleeding control and minimizing metabolic disturbances; it is compulsory for effective DCR.


This principle of rapid salvage and stabilization of a bleeding patient instead of proceeding directly to definitive repair was described in 1983 by Stone et al. [32]. In this foreshadowing of the modern DCR approach, patients underwent abbreviated laparotomy for hemorrhage control to avoid additional bleeding from coagulopathy and the development of blood failure, although the term “blood failure” had not yet been coined. Ten years later, the concept of damage control surgery was defined by Rotondo and Schwab as initial control of hemorrhage and contamination followed by intraperitoneal packing and rapid closure, allowing for resuscitation to normal physiology in the intensive care unit and subsequent definitive re-exploration [33]. This practice was widely adopted and is now considered the gold standard in the care of significantly injured patients. The initial description of the damage control concept did not include hemostatic resuscitation; instead it emphasized high-volume infusion of crystalloids and red blood cell concentrates, with minimal and late use of plasma and platelets; whole blood was rarely used. Despite the recognition as early as 1982 that crystalloids caused hemodilution and acidosis and contributed to hypothermia (if not warmed prior to infusion), leading to worsened coagulopathy and a “bloody vicious cycle” [34, 35], hemostatic resuscitation as the treatment for trauma-induced blood failure took an additional two decades to be fully rediscovered.


While surgical damage control was being refined and the damage control concept expanded to ICU care and other surgical disciplines, resuscitation strategies were slower to change. Resuscitation has classically been defined as an intervention designed to expand the intravascular space and restore oxygen delivery to vital organs. However, as alluded to above and as will be described further in this chapter, the convergence of damage control surgery and hemostatic resuscitation has led resuscitation to evolve into an intervention not only for oxygen-carrying capacity but also for hemostasis and other aspects of blood function, including maintenance of endothelial structure and function. Hemostatic resuscitation treats hemorrhagic shock and blood failure and is a key component of damage control. Both mechanical (surgical) hemorrhage control and hemostatic resuscitation are necessary for DCR; one without the other is insufficient for the hemorrhaging patient. Over the last two decades, these concepts have simultaneously evolved and coapted; hemostatic resuscitation was somewhat of a late addition to the DCR strategy, despite many of the concepts having been employed since World War II. As blood failure became better elucidated, hemostatic resuscitation, which includes plasma, platelets, cryoprecipitate, and whole blood transfusion, complemented the damage control strategy and led to the integrated concept of DCR.


Damage Control and Hemostatic Resuscitation in Combat Casualty Care


Much of the current understanding of damage control resuscitation and hemostatic resuscitation has come from the recent military experiences in Iraq and Afghanistan. Damage control concepts have long been used by the military across the spectrum of care, and damage control should be viewed more as a strategy than a specific procedure. In the modern US military battlefield system of care, which evolved from roughly 2003 to 2018, the concept of damage control starts with the point-of-injury medic rapidly controlling hemorrhage with tourniquets and hemostatic dressings. Rapid evacuation to far-forward surgical care, where abbreviated laparotomy and additional hemorrhage and contamination control procedures are performed, is the next stage in the damage control spectrum of care. After hemostatic resuscitation with blood products and rewarming, the patient is evacuated to a higher level of care for definitive management. Important elements in the modern damage control strategy include early use of whole blood, minimal crystalloid infusion to avoid hemodilution, and active rewarming measures. These, combined with moving surgical and resuscitative capabilities closer to the point of injury, encompass a “bundle of care” which mitigates the effects of massive hemorrhage. The development of this system – which now includes prehospital blood transfusion – required a paradigm shift in the understanding of the goals of resuscitation and the importance of treating early blood failure with hemostatic resuscitation.


Interestingly, resuscitation of combat casualties with whole blood and plasma was practiced from the beginning of the twentieth century up until the Vietnam War. Blood and plasma were transfused to wounded soldiers promptly after injury, and mobile blood banks were used to deliver whole blood to the forward line of battle [36, 37]. Ironically, at the end of the Vietnam War, there was a movement away from whole blood toward crystalloid and blood component resuscitation. This was a reductionist attempt to provide a goal-directed strategy that replaced circulating and interstitial volume while sequentially treating identified physiologic deficiencies. Crystalloids were given to increase volume, red blood cell concentrates to replace oxygen-carrying capacity, plasma to replace clotting factors, and so on. Given the rapid physiologic decompensation of patients suffering traumatic hemorrhage, it might have been anticipated that such a sequential, goal-directed strategy would be difficult to implement effectively, especially in austere military settings. It was a scientific, intellectually elegant approach to resuscitation, rooted in the urge to deconstruct the pathophysiology and measure precisely before treating; interestingly, it was widely adopted without direct comparison to the antecedent whole blood approach or any comparative study.


Relearning Lessons of the Past


In both military and civilian environments, there have been significant investments to advance the understanding of the physiology of hemorrhage and the development of mitigating strategies for hemorrhage control. Devices, medications, operative strategies, and attempts at optimization of transfusion practices have been aimed at improving hemorrhage control in order to avoid the frequently deadly lethal triad (acidosis, coagulopathy, and hypothermia) associated with hemorrhage. However, in a remote setting or far-forward battlefield with limited access to equipment and supplies, successful resuscitation depends on a care provider’s knowledge and recognition of hemorrhage physiology rather than on technology and advanced materiel. It is critical that this knowledge be codified and passed on to avoid the cycle of relearning lost lessons.


Over the last two decades, there has been a paradigm shift in resuscitation strategies regarding both what is given and when it is given. The trauma community has learned that rapid transfusion of either whole blood or a combination of blood components that recapitulates the oxygen-carrying and hemostatic function of whole blood decreases death from hemorrhage. Additionally, the community has learned that resuscitation should begin as soon as the need is identified, at the site of injury if possible. Transfusion at the point of injury (POI), at a remove from the manpower and logistical support of a hospital, is called remote damage control resuscitation (RDCR) and is now recognized as the lifesaving intervention with the most potential to decrease preventable deaths from severe hemorrhage early after injury [38, 39].


Much of what has been learned about RDCR comes from theaters of war, where the most concentrated source of hemorrhaging patients is found. While the terms damage control resuscitation, hemostatic resuscitation, and remote damage control resuscitation are part of the “resuscitation lexicon” that has emerged during the recent conflicts in the Middle East, their concepts date back to World War I. Transfusion practice evolution during military conflicts has demonstrated that the battlefield is often a source of advancement and innovation in medicine, fueled by intense and concentrated patient experiences as well as the national impetus to improve patient outcomes. The history of battlefield medicine has given great insight into what has worked and what has not. Unfortunately, lessons learned from the past have had to be relearned during the recent conflicts. One of the most obvious of these is the employment of tourniquets for extremity hemorrhage control, but these relearned lessons also include principles associated with transfusion therapies dating back to World War I, such as “The indications for blood transfusion are based on the fact that transfused blood is the best substitute for blood lost in acute hemorrhage,” from the 1918 article The transfusion of whole blood: a suggestion for its more frequent employment in war surgery by Dr. LB Robertson [40]. This century-old article states that a seriously bleeding patient needs whole blood – what was lost must be replaced. While this seems apparent and even simplistic, resuscitation strategies employed since 1918 have varied significantly, incorporating usage of balanced salt solutions, colloidal volume expanders, blood component therapy, and finally, again, whole blood – despite the fact that Robertson and colleagues turned to blood because of the failure of crystalloid- and colloid-based resuscitation. Many of the lessons learned from conflict over the years have not been effectively disseminated in peacetime to maintain continuity of best practices.


Resuscitation for the past several decades thus has consisted of crystalloid solutions (lactated Ringer’s or normal saline) and packed red cells (if available), a strategy that remains pervasive in locations with insufficient blood products or supply chain deficiencies. These products temporarily restore perfusion pressure and provide some oxygen delivery but forego support for hemostasis and aggravate endotheliopathy and reperfusion injury. Hemostatic resuscitation, on the other hand, incorporating plasma and platelets while minimizing crystalloids, offers significant benefits beyond restoring perfusion. Even beyond the coagulation factors in plasma, blood products promote homeostasis, which is critical to preventing exacerbation of other aspects of blood failure [6, 41]. As important as procoagulant factors are, anticoagulants such as antithrombin, protein C, and protein S control excess thrombin generation remote from sites of injury and maintain hemostatic and homeostatic balance [42, 43]. Rapid replacement of what has been lost in hemorrhage (all the elements of whole blood) assists with early reversal of shock, hemostatic dysfunction, and endotheliopathy (including the associated capillary leak and inflammation).


Physiologic Requisite for Hemostatic Resuscitation


The advocacy for interstitial resuscitation puts the cart before the horse: replacing what was lost from the interstitium is irrelevant if blood failure is not adequately treated. Indeed, interstitial resuscitation as a primary resuscitation approach proved harmful, and even Shires and colleagues sought to correct the misperception that crystalloid use could substitute for blood [44]. Nevertheless, the return to hemostatic resuscitation with whole blood or balanced components evolved only in response to the high casualty volume and the logistical challenges of maintaining blood supplies experienced by the US military in Iraq and Afghanistan, more than three decades after the end of the Vietnam War. Improved data capture and enthusiasm for outcomes-based research enabled rapid dissemination of new practices and widespread adoption of modern military blood transfusion strategies.


How did these paradigm shifts occur? As US military casualties began to mount in 2004, blood supplies reaching trauma hospitals and forward surgical teams were found to be inadequate for the management of severely injured patients. Supplying fresh frozen plasma (FFP) proved difficult due to challenges in cold chain management and high bag breakage rates. Platelet units were completely unavailable. Resuscitation with red blood cells and crystalloid alone led to high rates of exsanguination, and there were shortages of red cell units. Physicians turned to whole blood collected from walking blood banks to supplement the inadequate component therapy. The experience with fresh whole blood transfusion proved revelatory. Outcomes were visibly better [45]. The Armed Services Blood Program, concerned about the risk of transfusion-transmitted disease from using untested blood collected from walking blood banks, responded to the need for an expanded component supply by moving apheresis platelet collection teams into theater and by increasing RBC and plasma shipments [46, 47]. Clinicians attempted to reproduce the results they had seen with fresh whole blood by incorporating plasma and platelets early in the resuscitation of bleeding patients. When supplies of these components were exhausted, they switched back to whole blood. Aware of the unique circumstances they found themselves in, these remarkable clinicians recorded in great detail the interventions they applied and the outcomes they observed. Data from these early studies, conducted between 2003 and 2007, inspired similar efforts in civilian populations which confirmed and extended the battlefield findings. Importantly, these early data collection efforts led to the creation of the DoD Trauma Registry, which has grown into the largest combat trauma registry in history and has provided data for many important studies.


While many analyses of combat trauma data have been published, several have proven to be extremely important for the subsequent development of DCR hemostatic strategies. The first of these studies, and by far the most frequently cited paper on resuscitation from the Iraq War experience, was the 2007 study published by Borgman and Spinella [48]. In this seminal work, the authors described how an increasing ratio of plasma to red cell units was associated with dramatically reduced risk of death in combat trauma patients. As the ratio of plasma to red cells increased from 1:8 to 1:1, mortality dropped from 65% to 19%. This paper gave rise to the “1:1” plasma to red cell ratio concept. Perkins and Cap extended these findings with an analysis of the impact of adding apheresis platelets to hemorrhage resuscitation. They found that adding platelets in a ratio of ≥1:8 (i.e., about one apheresis unit for every 6 units of red cells) was associated with the highest survival (95%) compared with patients transfused the lowest ratio of platelets (64% survival) [49]. Spinella and colleagues showed that the best results were obtained when fresh whole blood was included in resuscitation, even when compared to component-based therapy that included platelets [45]. Multiple studies of whole blood use in Iraq and Afghanistan have confirmed that whole blood is associated with outcomes at least as good, if not better, than component-based therapy [50, 51].


Ultimately, damage control resuscitation (DCR) was understood to include the comprehensive treatment package of early hemostatic resuscitation with blood product transfusion, immediate arrest of ongoing hemorrhage (even if the therapy is not definitive), avoidance of crystalloids and colloids, maintenance of normothermia, use of hemostatic adjuncts, and maintenance of physiologic stability, all to thwart the early coagulopathy of trauma and to decrease the likelihood of blood failure.


As hemostatic resuscitation began to take shape, the strategy of initial crystalloid resuscitation followed by serial augmentation with red cells, plasma, and lastly platelets was abandoned and is no longer considered optimal care [52–57]. While it remains unclear whether the detrimental effects of crystalloid are secondary to dilution of clotting factors and platelets, injury to the endothelium, or another primary effect of these acidotic, potentially pro-inflammatory fluids, it has been shown that even small volumes (approximately 1.5 liters) of crystalloid are deleterious. Both crystalloid- and colloid-based resuscitation may ultimately result in a decline in oxygen delivery, exacerbating acidosis and coagulopathy and thereby increasing blood loss, which compounds the challenge of surgical hemorrhage control in addition to the other derangements in physiology. In hemostatic resuscitation, only low volumes of crystalloids and colloids are used, from the prehospital setting through the entire resuscitation, including intraoperative management and the immediate postoperative period.


Thus, data emerging from the large numbers of casualties treated in Iraq and Afghanistan supported the use of hemostatic resuscitation consisting of whole blood or blood component products (packed red cells, plasma, and platelets) administered in ratios that mimicked whole blood, an approach with better efficacy in treating the coagulopathy of trauma. Similar results were observed in a large, multicenter observational study of civilian trauma patients [58]. The Prospective, Observational, Multicenter, Major Trauma Transfusion (PROMMTT) study was a comparative efficacy investigation in ten trauma centers that demonstrated how early transfusion of higher plasma and platelet ratios (versus red cells) was associated with decreased mortality during the initial 6 h after admission [59]. PROMMTT demonstrated the challenges of survival bias in studies evaluating an exsanguinating patient cohort, the importance of coordinating efforts necessary to transfuse the optimal ratio of plasma and platelets within minutes of hospital arrival, and, most importantly, that suboptimal transfusion ratios are associated with early death. PROMMTT provided critical evidence that helped inform the design of the follow-on randomized trial which has fueled the evolution of transfusion practices in the hemorrhaging trauma patient [59–63].


Hemostatic resuscitation has been shown to improve outcomes when surgical hemorrhage control is necessary; additionally, mounting evidence suggests that it can improve rates of successful nonoperative management in Grade IV and Grade V blunt liver injuries [64, 65]. In a retrospective analysis of more than 1400 blunt liver injuries before and after implementation of hemostatic resuscitation at a Level 1 trauma center, increased success rates of nonoperative management were observed; additionally, a significant improvement in survival was achieved [65].


As military resuscitation practice evolved over the course of the conflicts in Iraq and Afghanistan, hemostatic resuscitation approaches were translated to the prehospital environment, with medics at the point of injury and helicopter evacuation crews administering transfusions. From a physiologic standpoint, this transition had obvious appeal since it offered the possibility of reducing shock dose and preventing coagulopathy. Observational studies of both the US and British military experience as well as US civilian trauma system experience were indeed promising [66–69]. The most detailed such study, by Shackelford and colleagues of the US Joint Trauma System, observed a striking reduction in mortality among combat casualties transfused within 30–40 min of injury [70]. In addition, the “golden hour” decision by Secretary of Defense Robert Gates to increase the number of helicopter evacuation platforms in Afghanistan available to transport casualties to surgical care within 1 h, much applauded for its association with a reduction in combat mortality, was found to have had its beneficial effect primarily through the early delivery of blood transfusion to wounded personnel [71]. This experience supported the expansion of hemostatic resuscitation as part of an overall DCR approach to settings remote from hospital capabilities and the coining of the term “remote damage control resuscitation” or RDCR.


Since hemostatic resuscitation principles state that plasma should be the primary volume resuscitation fluid, in order both to reduce endothelial dysfunction and to restore lost coagulation factors, the Department of Defense funded two randomized studies of plasma-based prehospital resuscitation to evaluate whether moving the hemostatic resuscitation approach out of the hospital would improve trauma outcomes, as suggested by the multiple observational studies discussed above [2, 56–58, 72, 73]. The COMBAT trial was a single-center study comparing plasma (2 units administered by paramedics) to normal saline in ground ambulance evacuation [74]. This study did not find a reduction in trauma mortality with early plasma transfusion, but evacuation times were so short (<20 min for both arms) that many patients could not receive the intervention before arrival at the hospital (only 32% received both units of plasma during transport). The study was halted early for futility. The PAMPer trial was a multicenter study that compared plasma (2 units administered by flight crew) to standard of care (generally crystalloids, though red cell units could be included) in helicopter evacuation of trauma patients. This study found a substantial survival advantage for early plasma transfusion: a one-third reduction in 30-day mortality (22% vs. 32%). The difference in findings between the two studies may be due to many factors, and cross-study comparisons are hazardous; however, the mortality difference may reflect the longer evacuation times (approximately 40 min) in PAMPer compared to COMBAT [69]. Neither study identified any disadvantages to beginning transfusion support in the prehospital environment, and in both, most patients receiving early transfusion went on to require further blood transfusion support, indicating that triage algorithms could be successfully implemented by prehospital providers. Overall, the weight of currently available clinical evidence, as well as our current understanding of hemorrhage, coagulopathy, and blood failure, supports the implementation of RDCR, or early hemostatic resuscitation, in both civilian and military settings, particularly when transportation time to fully capable trauma hospitals exceeds 20 min.


What to Transfuse and When


When whole blood is not available, hemostatic resuscitation prioritizes platelets first, followed by alternating red cell and plasma units, maintaining a 1:1:1 ratio to best mimic whole blood; the choice of which product to deliver first ultimately depends on the condition and needs of the patient. For patients requiring transfusion, early delivery of plasma and platelets is associated with improved survival within the first 6 h. Additionally, maintaining the 1:1:1 ratio of platelets, plasma, and red cells has been shown to improve outcomes, including reduced mortality and cessation of anatomic bleeding. This hemostatic damage control resuscitation is currently the standard operating procedure for massive transfusion within the military and at many civilian centers [75].
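
To make the ratio logic concrete, the following minimal sketch (Python, purely illustrative; the function name and the simple “issue whichever product lags” rule are our assumptions, not a published protocol) shows one way a massive transfusion workflow might track unit counts against the 1:1:1 target described above.

```python
from collections import Counter

def next_unit(issued: Counter) -> str:
    """Suggest the next product to issue during a 1:1:1 massive
    transfusion: platelets are prioritized first (per the text), then
    whichever component lags furthest behind the balanced ratio."""
    if issued["platelets"] == 0:
        return "platelets"
    return min(("red cells", "plasma", "platelets"), key=lambda p: issued[p])

# Example: simulate the first six units of an activation
issued = Counter()
for _ in range(6):
    product = next_unit(issued)
    issued[product] += 1
    print(product)
# -> platelets, red cells, plasma, red cells, plasma, platelets
#    (2:2:2, i.e., the 1:1:1 ratio is maintained after six units)
```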


In addition to what is being transfused, the timing of transfusion is critical. Many advances in trauma and critical care emphasize that the more expeditious the intervention, the more efficacious the therapy. Well-understood early interventions that improve treatment effect include antibiotics in sepsis, time to neurosurgical intervention for extra-axial traumatic hemorrhage, time to tourniquet placement for extremity hemorrhage, time to intervention in stroke, time to revascularization in myocardial infarction, and time to hemostatic transfusion in hemorrhage. Time is critical, and while it seems an obvious statement, hemorrhaging patients die quickly; therefore, minutes matter. Time to hemostatic transfusion and time to hemorrhage control are the difference between life and death for a patient with severe bleeding. Given that most deaths from hemorrhage occur in the prehospital environment, employing strategies to mitigate the effects of hemorrhage and improve hemorrhage control in the far-forward combat environment will have the highest impact on mortality. Additionally, strategies used to control bleeding in the military population can be extrapolated to civilian practice and should have a large impact on preventable death from trauma.


Clearly, then, there are several parameters to consider when developing transfusion strategies for DCR, and these become even more critical in the RDCR setting. The product or products transfused should recapitulate, to the extent possible, the functionality of whole blood. Oxygen delivery and hemostasis must both be accomplished. Products like whole blood, plasma, and platelets should be optimized primarily for their ability to support hemorrhage control. Speed and ease of use are vitally important, particularly as the staff available to administer transfusions becomes constrained, as in the prehospital environment. During initial resuscitation, complexity should be minimized wherever possible, through the use of broadly compatible products and minimization of testing, to reduce the risk of errors and improve speed.


Tools of the Trade: All Roads Lead to Whole Blood


Early in the recent wars in Iraq and Afghanistan, whole blood was utilized in US military operations primarily by forward-deployed teams that were equipped with only a limited supply of packed red blood cells; whole blood transfusion during this period was driven by necessity rather than clinical indication [51, 76]. At the combat support hospitals, the highest level of care on the battlefield, whole blood was initially used when apheresis platelets were unavailable. Over time, through both focused empiricism and investigations of comparative efficacy which demonstrated improved survival with whole blood, battlefield hospitals began using whole blood not just when components were unavailable but because of its clinical superiority [45, 51, 77].


While component therapy is a vast improvement over crystalloid and colloid for hemostatic resuscitation, the method has deficiencies that must be addressed. With multiple components transfused come multiple doses of anticoagulant; whole blood is superior in this regard since anticoagulant-induced dilution is minimized with a single product. In reconstituted whole blood made from 1:1:1 blood components, hematocrit and factor levels are lower than in an equivalent unit of whole blood [78, 79]. Additionally, whole blood contains platelets, providing hemostatic function superior to component therapy in the variety of situations in which platelets cannot be supplied. Logistically, it is much easier to collect, transport, store, and transfuse a single product that meets the essential needs of a bleeding patient.


Because of the benefits to patient care and the logistical simplicity, there continues to be enthusiasm for whole blood use in both military and civilian settings: it is being considered, studied, and reestablished as the optimal therapy for hemorrhage. Cold-stored, low titer group O whole blood was introduced into Iraq in November 2016. In 2017, 311 units of cold-stored LTOWB were transfused, and it was the preferred resuscitation product when compared to component therapy. Based on the usage of, and demand for, LTOWB, the authors concluded that it is not only feasible but has logistical advantages and will likely emerge as the preferred transfusion product for far-forward damage control resuscitation [80].


Combat casualties requiring massive transfusion have a mortality rate of up to 33% and stand to receive the largest benefit from whole blood transfusion. In a large retrospective review comparing patients who received whole blood without platelet transfusion to those who received balanced component resuscitation (including platelet transfusion), those who received whole blood had higher survival at both 24 h and 30 days. The use of fresh whole blood was associated with a 13% increase in 30-day survival, and the volume of fresh whole blood transfused was independently associated with improved 30-day survival [45].


Whole blood can refer specifically to two types of products. The first is fresh whole blood (FWB), drawn on an emergency basis and transfused within a limited window post-collection (typically 24 h). This has the drawback of omitting formal pathogen screening, which increases the risk of transfusion-transmitted disease (TTD) [81]. This risk can be partially mitigated through the use of point-of-care rapid screening tests, though such testing can be impractical under the most austere conditions of combat casualty care. Blood group typing and matching can be performed with point-of-care tests, or FWB can be drawn only from pre-screened group O donors with low anti-A and anti-B titers (low titer O whole blood, LTOWB). LTOWB red cells will be compatible with recipients of other blood groups, and the risk of transfusing incompatible plasma is minimized by selecting donors with low titers against the A and B blood group antigens. If refrigerated (1–6 °C) within the first 8 h after collection, whole blood can be stored for up to 21 days in CPD anticoagulant or 35 days in CPDA-1 anticoagulant. If whole blood is collected where it can be tested for TTDs, it can be provided as a standard refrigerated and fully tested product (cold whole blood, CWB). However, stored whole blood suffers from the same “storage lesion” previously described for blood components. Over time, stored whole blood red cells undergo shape change and lose function, platelets bind fibrinogen and release their intracellular contents (depleting functionality), and waste products accumulate in the plasma, increasing acidity. Despite this, cold-stored blood still supports hemostasis and provides platelets in many scenarios in which they would otherwise be unavailable. Indeed, storing platelets at cold temperatures slows the rate of functional decline observed in the storage lesion of the room temperature-stored standard-of-care platelet product.
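
As a concrete illustration of the storage limits just described, the short sketch below (Python; the function and constant names are ours, and this is a teaching aid rather than blood bank software) computes a whole blood unit’s expiration from its collection time and anticoagulant.

```python
from datetime import datetime, timedelta

# Shelf lives as stated above; cold storage (1-6 °C) must begin within
# 8 h of collection for these limits to apply.
COLD_SHELF_LIFE_DAYS = {"CPD": 21, "CPDA-1": 35}
FWB_WINDOW_HOURS = 24  # fresh whole blood held without refrigerated storage

def whole_blood_expiration(collected: datetime, anticoagulant: str | None) -> datetime:
    """Expiration of a whole blood unit: 24 h if kept as FWB (no cold
    storage), otherwise the cold-storage limit for its anticoagulant."""
    if anticoagulant is None:
        return collected + timedelta(hours=FWB_WINDOW_HOURS)
    return collected + timedelta(days=COLD_SHELF_LIFE_DAYS[anticoagulant])

collected = datetime(2019, 6, 1, 8, 0)
print(whole_blood_expiration(collected, None))      # 2019-06-02 08:00 (FWB)
print(whole_blood_expiration(collected, "CPDA-1"))  # 2019-07-06 08:00
```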


Given that the vast majority of combat deaths occur in the prehospital environment, prior to reaching a surgical capability, these casualties are the ones who will most benefit from blood far forward. Whole blood is the logical choice for a nearly ideal resuscitative product in the far-forward environment, given that it has oxygen-carrying capacity, coagulation factors, and platelets in the proportions in which they are lost during exsanguination. It is logistically easier to carry and transfuse one unit of whole blood than multiple units of components. In the current theaters of operation, the blood transfusion capability continues to mature both at the point of injury and in the en route care environment. In 2013, the Norwegian Special Operations Forces instituted an RDCR protocol which included far-forward collection and transfusion of whole blood. A similar tactical DCR (TDCR) protocol for transfusing low-titer group O whole blood at the POI was adopted by the US Army Ranger Regiment: the Ranger O Low titer (ROLO) program. Currently, US Special Operations Forces carry low titer group O whole blood on select missions [82–84]. Transfusion far forward is an essential capability that saves the lives of combat casualties.


In far-forward or prolonged field care conditions with life-threatening hemorrhage, where hemostatic resuscitation is most critical, it becomes particularly apparent that whole blood is the superior option with respect to simplicity of logistics, usage, and outcomes. Carrying all blood components (RBCs, plasma, platelets) is all but impossible for the military medic, and even most medical transports cannot support the multiple temperature storage modalities required for proper maintenance of individual components. Additionally, both collection and delivery of a single product reduce risk, including the crossmatching risk reduction achieved through use of low-titer O whole blood as mentioned above – a benefit in prolonged field care, at Role 2 facilities, and even in humanitarian care where the recipient’s type is unknown. However, the definition of what constitutes “low titer” for anti-A and anti-B is still under some debate, with the maximum set to <256 by the US Armed Forces until stronger evidence emerges to re-evaluate this threshold, which was established in World War II.


Leukoreduction has been recommended to reduce the immunomodulatory side effects of whole blood transfusion. Remy et al. showed that there was a distinct loss of platelet function even with “platelet-sparing” leukocyte filtration, an effect that must be considered in the cost-benefit analysis of whether or not to use leukoreduction [85]. The US military does not currently leukoreduce whole blood, though approximately 50% of civilian centers do so [86, 87].


Another consideration in the implementation of an LTOWB program for hemostatic resuscitation is how to manage the risk of alloimmunization to the D or other antigens in patients receiving uncross-matched blood. It is generally accepted that alloimmunization to the D antigen represents the greatest risk, as it is the most immunogenic antigen on red blood cells. In female patients of child-bearing potential, development of an anti-D antibody could lead to hemolytic disease of the fetus and newborn (HDFN), though only about 20% of D-negative recipients of D-positive red blood cells or whole blood develop antibodies. While a simple solution to this problem would appear to be available – transfusion of only D-negative LTOWB to females of child-bearing potential – the reality is that D-negative potential donors make up only 7% of the population and that D-positive group O whole blood is generally the only product available in sufficient quantities to resuscitate patients. Thus, decisions regarding what products to offer to which populations should be made based on a local risk assessment. Transfusion of D-positive LTOWB to females of unknown blood type can be justified by the imperative to preserve the patient’s life when weighed against the relatively low risk of causing harm in a theoretical future pregnancy. Furthermore, HDFN is treatable and does not automatically doom all future pregnancies. Finally, it is important to realize that the limitations on availability of D-negative LTOWB are similar to those for D-negative RBC units and that most of the emergency release blood available is D-positive [88].
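
The risk assessment described above can be made concrete with back-of-envelope arithmetic. In the sketch below, the ~20% anti-D seroconversion rate comes from the text, while the assumed 15% D-negative prevalence among recipients is an illustrative population figure we supply, not a number from this chapter.

```python
# Chance that a single emergency D-positive LTOWB transfusion to a female
# recipient of unknown type produces anti-D sensitization:
p_recipient_d_neg = 0.15   # assumed D-negative prevalence (illustrative, not from text)
p_seroconversion = 0.20    # per the text: ~20% of D-negative recipients form anti-D

p_sensitization = p_recipient_d_neg * p_seroconversion
print(f"P(sensitization) ~ {p_sensitization:.1%}")  # ~ 3.0%
# Any HDFN risk additionally requires a later D-positive pregnancy, so the
# downstream risk is lower still, consistent with the local risk assessment
# argument above.
```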


The tangible benefits from both logistical and patient care perspectives make whole blood a superior option to component therapy following hemorrhage, especially in massive transfusion cases and in the prehospital setting.


Tools of the Trade: Component Blood Products


Red Cells


Red cells (erythrocytes) are the largest volumetric cellular fraction of blood, performing the critical functions of delivering oxygen to tissues, supplying critical enzymatic functions, and buffering the blood [89]. Their contribution to hemostasis consists primarily in providing the bulk of clot mass and in pushing platelets to the edges of the blood flow stream, facilitating their interaction with damaged vessel walls in a process known as margination [26]. In the microvasculature, red cells contribute significantly to buffering the acidosis generated by hypoperfusion. Since coagulation enzyme activity declines as pH falls, red cells play a crucial role in maintaining the activity of the coagulation system and reducing capillary bleeding. In addition, hypoxia triggers release of tPA from endothelial cells, activating fibrinolysis. Red cell delivery of oxygen to the vascular bed can mitigate this process, which otherwise contributes significantly to the development of acute traumatic coagulopathy [90, 91]. Thus, red cell transfusion is critical to recovery of oxygen deficit and hemostatic function.


As stated above, red cells have historically been among the first products delivered in resuscitation, often at a higher ratio than plasma or platelets. They remain very common in transfusion, partly because of their support for oxygen delivery but also likely because they are easier to maintain in blood banking practice [92]. Red cells are isolated from whole blood via centrifugation and transferred into a preservative solution which, by Food and Drug Administration regulations, allows them to be maintained at 1–6 °C for up to 42 days. However, multiple studies have indicated that red cells undergo a “storage lesion” over time; as red cells remain in storage prior to transfusion, they begin to shed microvesicles, lose membrane integrity, exhibit diminished oxygen-carrying capacity, and suffer altered morphology [93, 94]. Transfusion of aged red cells may increase the likelihood of poor outcomes in trauma patients who require large volumes of RBC transfusion [95–98].


Red cells remain an important part of the balanced resuscitation prescribed by DCR, but they must be used with platelets and plasma to achieve primary and secondary hemostasis.


Plasma


The need for plasma in hemostatic resuscitation should be self-evident; plasma contains all of the necessary enzymes and substrates for producing a clot, factors which are rapidly depleted in trauma due to consumption, dilution (from autoresuscitation or crystalloid usage), and/or continued hemorrhage. Restoration of what has been lost in plasma is mandatory for continued hemostasis. As noted above, early use of plasma, even prehospital, has been shown to reduce mortality in severely injured trauma patients [48, 67].


Plasma can be collected via centrifugation of whole blood or obtained through apheresis, and there are several options for storing it. For maximum retention of enzymatic function, plasma can be frozen immediately (within 8 h of collection) at temperatures below −18 °C. Alternatively, often for convenience and logistical purposes, plasma is isolated from whole blood within 24 h of collection and then frozen, resulting in some diminished activity of labile factors (particularly factors V and VIII) but overall preservation of fibrinogen, the primary substrate for clot formation [99]. Frozen plasma can be kept for a year before expiration, but it requires sufficient time for thawing (30–40 min using conventional techniques), a substantial consideration in emergency scenarios. Alternatively, plasma can be thawed ahead of need and stored refrigerated for up to 5 days (thawed plasma), or it can be stored as a refrigerated product and never frozen (liquid plasma). Liquid plasma can be stored for 26 days if collected in CPD anticoagulant or for 40 days if collected in CPDA-1. All thawed or liquid plasmas are deficient to varying degrees in labile factors like FV and FVIII, but their overall ability to support hemostasis in emergency settings appears to be adequate [100, 101]. The convenience of omitting the thawing step can mean the difference between timely plasma transfusion and a temporally unbalanced resuscitation that appears to be associated with suboptimal outcomes [57, 102].
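
The storage options above can be summarized in a small lookup table; the sketch below (Python; the structure and names are ours, an illustrative summary rather than regulatory guidance) captures the shelf lives and the practical “ready to transfuse” distinction that drives the thawing-delay argument.

```python
# Plasma storage options as described above. ready=True means no thaw
# step (30-40 min by conventional techniques) delays transfusion.
PLASMA_OPTIONS = {
    "frozen plasma (FFP or 24-h plasma)": {"storage": "<= -18 °C", "shelf_life": "1 year",  "ready": False},
    "thawed plasma":                      {"storage": "1-6 °C",    "shelf_life": "5 days",  "ready": True},
    "liquid plasma (CPD)":                {"storage": "1-6 °C",    "shelf_life": "26 days", "ready": True},
    "liquid plasma (CPDA-1)":             {"storage": "1-6 °C",    "shelf_life": "40 days", "ready": True},
}

# Products suitable for immediate emergency release (no thawing delay):
print([name for name, p in PLASMA_OPTIONS.items() if p["ready"]])
```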


Plasma can also be dehydrated through one of several lyophilization or spray-drying processes, resulting in a relatively stable powder of plasma proteins that can be rehydrated on demand [103]. This allows for easier transport, as cold chain requirements are reduced and no freezer is required for storage, and rehydration is much faster than thawing an equivalent volume of frozen plasma. While dried plasmas are available in some countries, no dried plasma product is yet approved by the US Food and Drug Administration. Usage within the United States has been restricted to an investigational new drug application of the French lyophilized plasma in military special operations forces. The comparative efficacy of freeze-dried plasma (FDP) versus other plasmas is still being studied, especially with regard to its utility at point-of-injury care [104–107]. See Chap. 8, Dried Plasma, for more information.


Plasma from group AB donors has long been considered universal due to its lack of anti-A or anti-B antibodies. Since only about 4% of US and European populations are AB, this plasma is in short supply. It has emerged that Group A plasma can be safely transfused to recipients of any group, even when anti-B titers are unknown [108]. Group A plasma is being widely adopted as an emergency release product in many trauma systems including the US military.


Platelets


Platelets, long neglected in resuscitation strategies both pragmatically and in the literature, deserve special mention for their critical function of rapidly initiating coagulation and hemostasis at the site of wounding. The vital role of platelets in hemostasis has long been recognized; why these key elements were so often omitted as an imperative component of hemorrhage resuscitation after the transition to component therapy is therefore hard to understand. The reason appears to be more logistical than biological: platelets are problematic from a supply standpoint. Once collected (e.g., by platelet apheresis in volumes of 200–300 ml from a single donor), they are typically stored at room temperature (approximately 22 °C) with gentle agitation. This alone presents problems for austere and extreme environments with limited power and unregulated temperatures, and these settings (e.g., theaters of war, high altitudes, polar stations, or space flight) are also associated with higher risks to life and limb, where hemostatic blood products would be most valuable on scene. Beyond the storage requirements, the shelf life of platelets is the most restrictive of the blood products: regulations limit platelets to a 5–7-day post-collection expiration, primarily because storage at room temperature gives ample opportunity for what would have been inconsequential contamination at collection to become a major problem after 5–7 days of bacterial growth. This restriction in particular makes platelet usage outside of large trauma centers extremely limited.


Platelets, like red cells, suffer from a “storage lesion” over time, although in platelets it develops more rapidly, exacerbated by room temperature storage, at which metabolism proceeds far faster than the refrigeration of red cells allows. In vitro aggregation function declines rapidly and is minimal after 72 h [109]. Mitochondrial exhaustion is apparent, and waste products are abundant [110]. Clinical outcomes are also affected [111].


Recognizing the importance of platelets in balanced hemostatic resuscitation, several avenues have been investigated to extend shelf life and improve function. In an effort to reduce binding of fibrinogen in the suspending plasma by platelet αIIbβ3 receptors, a variety of additive solutions have been used to dilute the fibrinogen and supply nutrients to the platelets during storage [112]. These have shown moderate success in improving function over time, but there remains room for improvement in limiting bacterial growth.


To overcome the contamination issues and limit biochemical activity during storage, the obvious solution is to store platelets under refrigeration, as with whole blood. This idea has once again been brought to the forefront of transfusion research after decades of being dismissed by the blood banking community due to studies in the late 1960s and early 1970s that demonstrated reduced recovery and survival of transfused platelets that had been stored at refrigerated temperatures [113]. Recently, that paradigm for viability has been questioned, as studies have shown that the room temperature-stored platelets that freely circulate and boost the recovery and survival counts are largely non-functional in hemostasis [109]. In fact, the likely explanation for the diminished recovery of refrigeration-stored platelets is that they are, in fact, migrating to sites of injury and performing their intended function; this has been demonstrated in animal studies showing these cold-stored platelets (in whole blood) localizing in thrombi on damaged endothelium [114, 115]. These in vitro and animal studies have led to human testing; a randomized controlled trial performed in Norway evaluated cold-stored platelets versus standard room temperature-stored platelets in cardiac surgery patients and found that cold platelet use was associated with reduced post-operative blood loss. Overall, cold-stored platelets (CSP) have been compared to room temperature (RT)-stored platelets across the following parameters and been found to be generally superior: aggregation to single or multiple agonists, adhesion to collagen under flow including reversal of antiplatelet drug effect, spreading on fibrinogen-coated surfaces, clot strength, clot retraction, clot architecture, thrombin generation, thromboelastography/thromboelastometry, mitochondrial function, resistance to activation of apoptosis, maintenance of membrane integrity and granule content, response to regulatory stimuli, preservation of RNA, secretion of inflammatory mediators, risk of bacterial growth, and in vivo hemostasis in both animal models and human patients including those undergoing surgery and those with hypoproliferative thrombocytopenia due to chemotherapy or other bone marrow failure states [116–118]. These results have been replicated from the early 1970s through the present (2018), in laboratory and clinical settings, using multiple variations of CSP (platelet-rich plasma concentrate pools, buffy coat pools, apheresis units collected on multiple platforms, units stored in plasma or platelet additive solutions, gamma-irradiated or pathogen-reduced units) in the United States, Norway, Sweden, Australia, Germany, Korea, and China. In short, the superior hemostatic function and bacterial safety of CSP are well-established. The US Department of Defense has used CSP stored for up to 14 days in the hemostatic resuscitation of combat casualties in Afghanistan and Iraq, and the US FDA has granted a variance for the use of CSP in the treatment of bleeding patients [119]. CSP offers a way to expand access to hemostatic resuscitation safe from bacterial contamination for a broad range of patients previously without access to platelet transfusions.


As previously mentioned, perhaps the simplest way to incorporate platelets into transfusion is whole blood, which is already stored refrigerated and contains platelets, plasma, and red cells in one package. As CSP and whole blood become more broadly available, they will transform hemostatic resuscitation in the far-forward setting.


Tools of the Trade: The Role of Laboratory Testing, Factor Concentrates, and Tranexamic Acid


Goal-directed therapies have been used for decades. For example, acute traumatic coagulopathy has been identified by some as an increase in prothrombin time (PT), and many efforts to reverse this coagulopathy have focused on restoring PT to normal. In fact, resuscitation in the pre-DCR era used crystalloid and red cells first to establish tissue perfusion, followed by plasma and platelets as guided by PT and platelet counts to correct objectively identified coagulation deficits. As we have seen, this approach led to late use of plasma and platelets and suboptimal resuscitation of bleeding patients. PT was recognized as an inadequate diagnostic [120], and more robust methods have gained ground in recent years: viscoelastic tests of coagulation such as thromboelastography (TEG) and rotational thromboelastometry (ROTEM) analyze an ex vivo blood sample across a variety of parameters, providing additional therapeutic targets [121–123]. Varying the combinations of reagents in these assays can isolate specific coagulation-related problems. Point-of-care (POC) devices have also been introduced to provide limited information to guide treatment at the scene or en route [124].
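

To make the idea of goal-directed therapy concrete, the sketch below maps three commonly reported TEG parameters (R time, maximum amplitude, and LY30) to candidate interventions. The cutoffs and product pairings are hypothetical placeholders chosen for illustration, not a validated clinical algorithm; published protocols vary, and any real implementation must follow institution- and device-specific guidance.

```python
# Illustrative sketch only: maps example TEG parameters to candidate
# interventions. All thresholds are hypothetical placeholders, not a
# validated clinical algorithm.
from dataclasses import dataclass


@dataclass
class TegResult:
    r_time_min: float  # reaction (R) time: reflects clotting factor activity
    ma_mm: float       # maximum amplitude: reflects platelet/fibrin clot strength
    ly30_pct: float    # % clot lysis 30 min after MA: reflects fibrinolysis


def suggest_products(teg: TegResult) -> list[str]:
    """Return candidate hemostatic interventions for clinician review."""
    suggestions = []
    if teg.r_time_min > 9.0:   # hypothetical cutoff: prolonged R time
        suggestions.append("plasma (clotting factor replacement)")
    if teg.ma_mm < 55.0:       # hypothetical cutoff: weak clot
        suggestions.append("platelets and/or a fibrinogen source (e.g., cryoprecipitate)")
    if teg.ly30_pct > 3.0:     # hypothetical cutoff: hyperfibrinolysis
        suggestions.append("antifibrinolytic (e.g., tranexamic acid)")
    return suggestions or ["no intervention triggered by these parameters"]


print(suggest_products(TegResult(r_time_min=11.2, ma_mm=48.0, ly30_pct=4.5)))
```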


Viscoelastic testing to facilitate early diagnosis of clotting aberrancies and to guide goal-directed therapy has been suggested to be superior to empiric ratio-based component therapy, and factor concentrate adjuncts to resuscitation have received attention over the last decade. Recombinant activated factor VII initially showed promise for the treatment of trauma-induced coagulopathy [125, 126]; however, larger retrospective and prospective studies did not demonstrate a mortality benefit [127–129].


Studies have suggested that early fibrinogen supplementation may improve outcomes in traumatic hemorrhage. Cryoprecipitate from plasma contains fibrinogen, factor XIII, factor VIII, vWF, and fibronectin, and it is commonly used for resuscitation in cases where early fibrinogen and factor replacement will provide the most benefit. Similarly, where pharmaceutical anticoagulant reversal is required, prothrombin complex concentrates can restore thrombin generation, typically guided by the prothrombin time and international normalized ratio. Identification of hyperfibrinolysis as a major bleeding problem following ischemia and plasminogen activator release has supported the use of tranexamic acid (TXA) as an early adjunct (within 3 h of injury) in patients identified as at risk for bleeding complications, stabilizing fibrin networks against exuberant plasmin-mediated breakdown. Some groups have suggested using viscoelastic tests to limit TXA to those manifesting evidence of fibrinolysis.


The use of factor concentrates and TXA guided by viscoelastic testing, while intellectually attractive, has not been adequately studied. One single-center randomized study compared this approach to the use of plasma in the resuscitation of blunt trauma patients [102]. Although the authors found an advantage to the use of concentrates, this was largely driven by the delay in treatment in the plasma arm imposed by the need to thaw plasma; that delay could be obviated by the use of thawed or liquid plasma, or indeed whole blood. In addition, the study did not include penetrating trauma patients, who may experience brisk bleeding and rapid decompensation that limits the utility of a testing-intensive resuscitation strategy. There is also little consensus on viscoelastic test thresholds for determining the use of factor concentrates or antifibrinolytics like TXA [130]. Finally, viscoelastic tests like TEG and ROTEM are not practical for prehospital use. Although a single-center RCT recently demonstrated a survival benefit from TEG-directed therapy, further study of this promising goal-directed approach is required before it can be broadly implemented [123].


Overall, empiric use of TXA in bleeding trauma patients is well-supported by the literature, though this represents off-label use in the United States. The CRASH-2 study randomized over 20,000 patients to either TXA or placebo and found a 9% reduction in relative risk of all-cause mortality and a 15% reduction in relative risk of hemorrhage mortality in patients receiving TXA [131]. In this study, TXA was given without viscoelastic testing guidance in a dose of 1 g over 10 min followed by 1 g over 8 h. TXA reduced mortality if given within 3 h of injury but was associated with higher mortality when given more than 3 h after injury. Current clinical guidelines suggest using TXA as given in CRASH-2, within 3 h of injury.
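

As a worked example, the sketch below restates the CRASH-2 dosing arithmetic described above (1 g over 10 min, then 1 g over 8 h, initiated within 3 h of injury) as infusion rates. It simply encodes figures reported by the trial and is not clinical decision software.

```python
# Worked arithmetic for the CRASH-2 TXA regimen described above:
# 1 g loading dose over 10 min, then 1 g maintenance infusion over 8 h,
# initiated within 3 h of injury.

def crash2_txa_plan(hours_since_injury: float) -> str:
    if hours_since_injury > 3.0:
        # CRASH-2 observed higher mortality when TXA was given beyond 3 h.
        return "TXA not indicated: more than 3 h have elapsed since injury"
    loading_rate_mg_min = 1000 / 10            # 1 g over 10 min = 100 mg/min
    maintenance_rate_mg_min = 1000 / (8 * 60)  # 1 g over 8 h ~ 2.1 mg/min
    return (f"Load 1 g over 10 min ({loading_rate_mg_min:.0f} mg/min), "
            f"then 1 g over 8 h ({maintenance_rate_mg_min:.2f} mg/min)")


print(crash2_txa_plan(1.5))  # within the 3 h window
print(crash2_txa_plan(4.0))  # outside the window
```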


Currently, there are no high-quality data to support either the empiric or viscoelastic testing-based use of fibrinogen concentrate, prothrombin complex concentrates, or recombinant human activated factor VII (rhFVIIa) outside of the setting of a clinical trial. In the United States, fibrinogen concentrate is approved for the treatment of congenital hypofibrinogenemia. Prothrombin complex concentrates containing factors II, VII, IX, and X such as Kcentra are licensed for the reversal of vitamin K antagonists, and rhFVIIa is approved for the treatment of patients with hemophilia who have inhibitors to FVIII. In addition to the complexity and time required to reconstitute multiple vials in the acute setting, and the considerable cost of these factors, the thrombotic risk of off-label use in the absence of high-quality clinical data supporting safety or efficacy in unselected trauma patients argues for caution. Further study of these products in bleeding trauma patients is needed.


A frequently overlooked hemostatic adjunct is calcium. Hypocalcemia is present in a majority of trauma patients requiring urgent resuscitation, due in part to the calcium-chelating effects of intracellular phosphates and other substances released from damaged cells. Transfusion of citrated blood causes further calcium sequestration and can produce clinically significant hypocalcemia [132]. Hypocalcemia can cause not only cardiac arrhythmias but also dysfunctional coagulation and vasoplegia. Infusion of calcium early in resuscitation (e.g., one gram of calcium IV/IO as either 30 ml of 10% calcium gluconate or 10 ml of 10% calcium chloride) can mitigate these problems, supporting not only coagulation function but also cardiac output and vascular tone.
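

The rough interchangeability of the two formulations above follows from their elemental calcium content. The sketch below works through that arithmetic, assuming approximate textbook elemental fractions (~9.3% for calcium gluconate, ~27% for calcium chloride).

```python
# Why 30 ml of 10% calcium gluconate and 10 ml of 10% calcium chloride are
# roughly interchangeable: both deliver a similar mass of elemental calcium.
# Elemental fractions below are approximate textbook values.

def elemental_calcium_mg(volume_ml: float, percent: float, elemental_fraction: float) -> float:
    salt_mg = volume_ml * percent * 10  # a 10% solution holds 100 mg of salt per ml
    return salt_mg * elemental_fraction


gluconate_mg = elemental_calcium_mg(30, 10, 0.093)  # ~9.3% elemental calcium
chloride_mg = elemental_calcium_mg(10, 10, 0.27)    # ~27% elemental calcium

print(f"30 ml of 10% calcium gluconate ~ {gluconate_mg:.0f} mg elemental Ca")
print(f"10 ml of 10% calcium chloride  ~ {chloride_mg:.0f} mg elemental Ca")
```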


Mitigation of Transfusion Hazards


While evidence suggests that blood and blood products should be given early following trauma, increased usage, especially in emergency scenarios, raises the likelihood of transfusion-related complications. Transfusion-related acute lung injury is a concern with the use of plasma, though this risk has been significantly mitigated by using plasma from male donors, never-pregnant female donors, or female donors documented to lack anti-HLA antibodies [133]. Over-transfusion, or transfusion-associated circulatory overload, has also been reported, and thus transfusions should be carefully monitored and documented [134].


Potentially lethal hemolytic transfusion reactions can be mitigated through the use of low-titer group O whole blood, group O red cells, and group AB or A plasma. Safety concerns associated with on-scene collection and transfusion (as has become possible in military practice) must be addressed through rigorous training in donor selection, rehearsal of collection procedures emphasizing competence in blood typing and rapid infectious disease testing, and the development of donor screening programs and rigorous record keeping. The potentially serious hazards of prehospital blood collection and transfusion are significantly diminished by using a pre-screened, blood group-identified donor pool [135].
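

The logic behind these product choices can be expressed as a simple compatibility lookup: group O red cells carry neither A nor B antigen, and group AB plasma carries neither anti-A nor anti-B, so each is broadly compatible when the recipient's type is unknown. The sketch below encodes standard ABO rules for illustration only, under deliberately simplified assumptions (ABO alone, ignoring RhD, antibody titers, and crossmatching); it is a teaching aid, not transfusion-service software.

```python
# Simplified ABO compatibility lookup illustrating why group O red cells and
# group AB plasma serve as emergency-release "universal" products.
# ABO only: RhD, antibody titers, and crossmatching are deliberately omitted.
# Note: group A plasma is often substituted for scarce AB plasma in practice
# (see text); this strict-rules table does not capture that allowance.
from typing import Optional

RBC_DONORS = {     # recipient ABO type -> acceptable red cell donor types
    "O": {"O"},
    "A": {"A", "O"},
    "B": {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

PLASMA_DONORS = {  # recipient ABO type -> acceptable plasma donor types
    "O": {"O", "A", "B", "AB"},
    "A": {"A", "AB"},
    "B": {"B", "AB"},
    "AB": {"AB"},
}


def emergency_release(recipient_type: Optional[str]) -> dict:
    """Fall back to universal donor products when the recipient's type is unknown."""
    if recipient_type is None:
        return {"red_cells": ["O"], "plasma": ["AB"]}
    return {"red_cells": sorted(RBC_DONORS[recipient_type]),
            "plasma": sorted(PLASMA_DONORS[recipient_type])}


print(emergency_release(None))  # unknown type -> group O red cells, group AB plasma
print(emergency_release("B"))
```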


Even when blood is collected in advance, screening can be a major expenditure of both time and money, reducing the supply and availability of product in remote locations. Innovations in pathogen reduction technology have been proposed to provide a rapid means of reducing the transfusion-transmitted disease risks associated with fresh whole blood, and these products and methods have been deployed in locations suffering virulent outbreaks, including Ebola [136]. Photochemical inactivation of pathogens is the current approach, with the latest products using photosensitizers and ultraviolet light to damage pathogen nucleic acids. These technologies may also reduce the very small but real risk of transfusion-associated graft-versus-host disease by inactivating lymphocytes transfused from donor to recipient [137]. These pathogen reduction technologies are undergoing regulatory evaluation in the United States.


Conclusion


The preponderance of the available evidence indicates that hemostatic resuscitation is a core element of the DCR bundle of care. DCR comprises early mechanical hemorrhage control, avoidance of crystalloid, and hemostatic resuscitation: the holistic approach to treating blood failure by replacing the functionality of whole blood lost to hemorrhage. DCR and hemostatic resuscitation reduce trauma mortality compared to resuscitation strategies that do not address both the restoration of perfusion and the restoration of hemostasis in a timely manner. Emerging data from military and civilian experience demonstrate that translation of the DCR approach into the prehospital setting, as remote damage control resuscitation (RDCR), extends the benefits of DCR and further reduces trauma mortality.


Significant challenges remain to the broad implementation of a “blood far-forward” paradigm, including the financial and logistical burdens of providing whole blood or components in the prehospital environment. Training prehospital personnel in hemostatic resuscitation procedures and transfusion is difficult and requires a substantial investment in skills maintenance. Training not only military personnel but also civilians in whole blood collection, establishment of emergency donor panels, and documentation of emergency transfusion is a major undertaking, but one that could prove lifesaving in civilian or military mass casualty events where the local blood supply is exhausted and resupply from other regions has not occurred.


Research challenges include the need to identify better ways to store blood products in order to preserve their shelf life and function. Ultimately, these challenges must be overcome to make progress toward the goal of zero preventable deaths, a goal that military experience in elite units suggests is close to achievable.
