History of Fluid Resuscitation for Bleeding



© Springer Nature Switzerland AG 2020
Philip C. Spinella (ed.), Damage Control Resuscitation. https://doi.org/10.1007/978-3-030-20820-2_1



1. The History of Fluid Resuscitation for Bleeding



Patrick Thompson1   and Geir Strandenes2


(1)
Paramedic, Cape Town, South Africa

(2)
Department of Immunology and Transfusion Medicine, Haukeland University Hospital, Bergen, Norway

 



 



Keywords

History · Damage control resuscitation · Hemostatic resuscitation · Remote damage control resuscitation (RDCR)


Introduction


Damage control resuscitation (DCR) is a bundle of care, first described by Holcomb et al., aimed at reducing death from hemorrhage in patients with severe traumatic bleeding. DCR principles include compressible hemorrhage control; hypotensive resuscitation; rapid surgical control of bleeding; avoidance of the overuse of crystalloids and colloids; prevention or correction of acidosis, hypothermia, and hypocalcaemia; and hemostatic resuscitation (blood-based resuscitation) [1]. Remote damage control resuscitation (RDCR) is defined as the prehospital application of DCR concepts. The term RDCR was first published by Gerhardt and has been disseminated by the THOR Network [2, 3].


The number and severity of wounded in the wars in Afghanistan and Iraq, coupled with the collection of clinical data, inspired renewed thinking about the optimal methods to improve outcomes for casualties with traumatic hemorrhagic shock. Motivation for reassessing the standard resuscitative approach to severe bleeding came from retrospective studies supporting the earlier use of blood products, including whole blood [4–7], and from data by Eastridge indicating that the majority of casualties succumb to their wounds before reaching any medical facility with an advanced resuscitation capability, with the overwhelming majority of these patients (>90%) dying from hemorrhage [8]. Advanced life-saving interventions performed in this pre-medical treatment facility (MTF) phase of care can improve outcomes by delivering to the surgeon a casualty with survivable injuries [9, 10].


The history of DCR and RDCR begins well before the terms themselves were coined; the concepts behind their principles stretch far back into the past. This chapter provides an outline of this history, limited to the fluid resuscitation aspect of DCR/RDCR.


1600s


The history of fluid resuscitation starts with the discovery of the circulatory system. Until this point, no intervention on the circulatory system was conceivable, as no one had yet conceived of the blood as being in “circulation”; it was incorrectly assumed that blood was produced in the liver and consumed in the periphery.


In 1628, William Harvey, an English physician educated in Italy at the University of Padua as a student of Hieronymus Fabricius, and later at the University of Cambridge, published Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus, translated as “An Anatomical Exercise on the Motion of the Heart and Blood in Living Beings” and commonly called De Motu Cordis (On the Motion of the Heart and Blood). This was the first complete, well-researched description of the circulatory system, including both the pulmonary and the systemic circulation. The concept contradicted Galen and the accepted understanding of the age. Harvey calculated cardiac output and demonstrated that the liver could not possibly produce the volume of blood required, as had previously been thought. This bold insight set the stage for new ideas about the treatment of hemorrhage.


Harvey’s description of the circulatory system was rapidly accepted, and it was not long before interventions via the circulatory system were envisioned. The first intravenous (IV) injections were administered by Christopher Wren and Robert Boyle in 1656 in Oxford: an animal bladder was attached to a goose quill, and wine, ale, and opiates were injected into dogs. A mixture of opium and alcohol produced the first IV anesthesia with full recovery; the concept was not carried into clinical practice, and an early chance for pain-free surgery was lost.


Richard Lower conducted research on the cardiopulmonary system and was the first to describe the change in blood after exposure to air via the lungs. In 1666, Lower reported the first blood transfusion. More specifically, Lower revealed that transfusion could be used as a life-saving treatment for exsanguination: he bled a dog to the point of death and then saved the animal with a whole blood transfusion from another, larger dog. In 1667, blood was first transfused from animal to man by Jean Baptiste Denis and by Lower. It must be noted that the transfusion of blood from a lamb into man was not a treatment for hemorrhage but for “madness.” After much medical and theological debate, the practice of transfusion was banned by the French and later by the Pope. Transfusion thus fell into disrepute and the practice faded, although the theory was passed on.


1700s


Just as it was important to identify and describe the circulatory system, it was equally important to identify and describe the condition of hemorrhagic shock; this has proven particularly difficult owing to the complexity of the pathology. In 1731, the French surgeon Henri Francois Le Dran, in a publication titled Observation de Chirurgie, described the collapse of vital functions ending in death after a patient was struck by a missile. He called it secousse, which translates from the French as “shock” [11].


1800s


In 1817, Dr John Henry Leacock showed in cat and dog transfusions that blood was species-specific and argued for human-to-human transfusion.

The consequences of haemorrhages where the functions are not dangerously affected, do not of course, require transfusion, since other remedies will suffice. But when the danger is imminent, and the common means are ineffectual, as when a parturient woman trembles on the brink of the grave from uterine haemorrhage, or when a soldier is at the point of death from loss of blood, what reason can be alleged for not having recourse to this last hope, and for not attempting to recruit the exhausted frame and turn the ebbing tide of life?


This quote carries a clear message of the urgency of resuscitation after severe hemorrhage.


In 1818, James Blundell performed the first human-to-human transfusion. Blundell had postulated that transfusion could be used to treat postpartum hemorrhage and researched transfusion with animals. In 1829, Blundell published the first successful resuscitation of a woman from postpartum hemorrhage in The Lancet. He performed ten transfusions in the next 10 years. Blundell also improved the technique and equipment for transfusion using a syringe to conduct vein-to-vein transfusions.


Blundell noted that vein-to-vein transfusions were impractical because of clotting, and that removal of air was essential. Attaching the donor’s artery to the recipient’s vein had proven successful in Lower’s experiments but required skill and time. To resolve this problem, Prevost and Dumas suggested the use of defibrinated blood in 1821: the blood was allowed to clot, usually while being stirred, the clots were removed, and the remaining “defibrinated” fluid could then be used. Others sought an anticoagulant; J. Neudorfer recommended sodium bicarbonate as an anticoagulant in 1860, and Dr Braxton Hicks attempted a solution of sodium phosphate but was unsuccessful [12].


In 1849, C.H.F. Routh reviewed all published blood transfusions to that date in an article entitled “Remarks, statistical and general, on transfusion of blood,” published in the Medical Times. He was able to find only 48 recorded cases of transfusion, of which 18 had a fatal outcome. This gave a mortality of approximately 1 in 3, reported as being “rather less than that of hernia, or about the same as the average amputation” [12].


In 1865, Louis Pasteur recognized that bacterial and fungal contamination causes putrefaction, and in 1867, Joseph Lister introduced antiseptic methods to combat infection. As a result of these discoveries, the problem of infection in transfusion moved toward a potential solution, with sterilization of instruments and antiseptic technique beginning to be introduced.


Crystalloids and Colloids


Another important development in fluid resuscitation started in 1831. William Brooke O’Shaughnessy examined cholera patients in Edinburgh and postulated that the disease resulted in hypovolemia and electrolyte loss; he experimented on dogs with saline. In 1832, Thomas A. Latta administered salt solution to cholera victims and published details in The Lancet: “The very remarkable effects of this remedy require to be witnessed to be believed. Shortly after the commencement of the injection the pulse, which was not perceptible, gradually returns, … the whole countenance assumes a natural healthy appearance” [13].


In 1885, Sydney Ringer strove to achieve optimum electrolyte concentrations for organs, creating Ringer’s solution. In 1896, Ernest Starling described colloid osmotic pressure (Starling’s principle) and the importance of colloidal plasma proteins; this paved the way for the development of colloids.


American Civil War 1861–1865


In 1850, Samuel D. Gross gave one of the first descriptions of wound shock: “the rude unhinging of the machinery of life” [14, 15].


Two whole blood transfusion attempts were made on active duty wounded soldiers by Union surgeons and reported in the War Department’s Medical and Surgical History of the War of the Rebellion. Surgeon E. Bentley reported a successful transfusion given to Private G. P. Cross at Grosvenor Branch Hospital, Arlington, Virginia, on August 15, 1864; another was reported by Assistant Surgeon B. E. Fryer, who operated on Private J. Mott at Brown Hospital in Louisville, Kentucky, in August 1864 [16, 17].


Franco-Prussian War 1870–1871


Battlefield Transfusions


In 1865, Dr J. Roussel of Geneva first conducted a whole blood transfusion using direct arm-to-arm transfusion with a device he had developed called the “transfuseur,” for the treatment of a patient suffering from hemorrhage. The apparatus he used was described in the Gazette des hopitaux in 1867. Roussel later lamented that the device and procedure were not more widely utilized during the Franco-Prussian War, although they did see some use.


In 1867, Roussel claimed 16 successful whole blood transfusions out of 35 performed for the treatment of a variety of conditions. In 1882, in Paris, he reported on a total of 60 whole blood transfusions performed since 1865 in Switzerland, Austria, Russia, Belgium, England, and France. Roussel’s transfuseur apparatus was subsequently officially adopted for use by the French Army and apparently used in times of war.


Developments were also made in the equipment needed to conduct whole blood transfusions. Blundell used syringes made specially for him for the vein-to-vein transfusion process; he later developed two new devices, the “impellor” and later the “gravitator.” Many other devices were invented and tried. In 1873, Dr. J.H. Aveling used a device of his own invention for vein-to-vein whole blood transfusion, consisting of two cannulas joined by a bulb pump and a one-way valve to ensure the correct direction of flow; he described it as small enough to be carried in a pocket. In 1872, Aveling had attended a lady, aged 21 years, “in extremis” from postpartum hemorrhage. She received 60 drachms of blood from her coachman and apparently soon recovered, certainly enough to reportedly be able to remark that she was dying! Dr. Aveling added in his report: “the mental improvement of the patient was not as marked and rapid as I anticipated, but this was perhaps due to the quantity of brandy she had taken” [12].


In the United States, between 1873 and 1880, milk from cows and goats was attempted as a blood substitute. T.G. Thomas and J.S. Prout supported this treatment because of the problems with blood transfusion, namely its “tendency to coagulation.” By 1878, J.H. Britton, writing in the New York Medical Record, predicted that transfusion using milk would entirely supersede transfusions of blood [12].


The Spanish-American War 1898


The first descriptions of wound shock as something separate from the injury itself came from the American Civil War, and it was during the Spanish-American War of 1898 that wound shock was first associated with sepsis; wound shock, however, was still seen as distinct from hemorrhage [18].


The Anglo-Boer War 1899–1902


In 1900, during the Anglo-Boer War, British surgeons used strychnine and saline to treat shock. Porter described the treatment: “I wanted to pump in strychnine as before, but Cheyne was playing about with 3 or 4 drop doses. The man was very bad and looked like dying so I got 10 drops and gave it. Cheyne was astonished and said it was a very big dose, but I said the patient wanted it. Then Cheyne thought he would try transfusion, and put one and half pints of salt water into a vein” [19].


In 1900, the US Surgeon General recommended that patients in a state of shock be given normal salt solution rectally and subcutaneously along with 1/60 grain of strychnine, covered with blankets, and kept warm [20].


1900s


Physiology: Blood Groups


In 1900, Karl Landsteiner, experimenting with mixing whole blood from different people, found that some samples agglutinated, some lysed, and some were unaffected. In 1901, he found that this effect was due to red blood cells coming into contact with incompatible blood serum antibodies. He labeled the blood groups according to agglutination A, B, and C; C was later renamed O. Landsteiner also found that whole blood transfusion between persons of the same blood group did not lead to the destruction of blood cells, whereas this occurred between persons of different blood groups [21]. A fourth main blood type, AB, was found by A. Decastrello and A. Sturli.


Transfusion: Avoiding Transfusion Reactions


In 1907, Ludvig Hektoen recommended blood crossmatching, the mixing of donor and recipient blood to determine compatibility. Ruben Ottenberg performed the first “crossmatched” and typed whole blood transfusion, and also recognized blood type O as the universal donor.


In 1908, the French surgeon Alexis Carrel devised a way to prevent blood clotting. His method involved joining an artery of the donor directly to a vein of the recipient with surgical sutures; this was a complex procedure available only to skilled surgeons.


In 1913, Dr. Edward Lindeman revolutionized blood transfusion at Bellevue Hospital in New York by using syringes and cannulas to transfuse whole blood instead of directly connecting the donor’s and recipient’s blood vessels [22]. In 1914, the first transfusion using citrated whole blood was performed by Professor L. Agote. In 1915, Richard Lewisohn used sodium citrate as an anticoagulant, transforming the transfusion procedure from direct to indirect with the capability of storage, and Richard Weil demonstrated the feasibility of refrigerated storage of such anticoagulated blood. In 1916, Peyton Rous and J.R. Turner Jr. found that adding dextrose to the citrate extended the storage time to 4 weeks.


In 1916, W. Bayliss, a professor of general physiology at University College London, contributed a lecture to the Physiological Society; his abstract was published in the Journal of Physiology. The abstract detailed that bled animals receiving salt solutions showed only a transitory recovery, whereas the effect was sustained when 5% gelatin or gum acacia was added. Interestingly, gum acacia contains a moderate amount of calcium and magnesium salts, which are cofactors in hemostasis [23].


WWI 1914–1918


In 1915, Oswald Hope Robertson traveled to Europe as a medical student and performed the first whole blood transfusion of the war at a volunteer hospital in Paris. After his graduation later that year, he worked with P. Rous at the Rockefeller Institute. In 1917, Robertson joined the Harvard Medical Unit with Roger Lee at Base Hospital No. 5 from Boston; Lee had earlier sent Robertson to work with Rous at the Rockefeller Institute. Tasked with investigating the treatment of shock, Robertson initiated direct transfusions and wrote to Rous with an idea for larger-scale collection and storage. In 1917, he used only type “O” universal donors, as suggested by Lee, and the donors were tested for disease. He collected blood via venipuncture into glass bottles with anticoagulant, cooled it in ice chests, stored it for up to 28 days, and moved the blood to where it would be needed. He personally administered blood to the wounded under fire and was awarded the Distinguished Service Order for bravery. Robertson also taught the techniques to other instructors responsible for transfusion and resuscitation training. In 1918, O.H. Robertson published his findings in the British Medical Journal [24].


In 1915–1916, Captain Ernest Cowell and Captain John Fraser began measuring soldiers’ blood pressures and recorded that in wounded men with classic symptoms of shock the average SBP was 90 mmHg; they labeled this primary shock. A second group showed no signs of shock initially, but the BP later dropped to 70–90 mmHg; this was called secondary shock. If the BP continued to decline to 50–60 mmHg or below, the men died.


In 1916, Captain L. Bruce Robertson of Toronto, who had recently trained with Lindeman in New York, used direct whole blood transfusions in the field with no blood typing or crossmatching. He published “The transfusion of whole blood: a suggestion for its more frequent employment in war surgery” in the British Medical Journal, stating: “the additional blood often carries the patients over a critical period and assists his forces to rally to withstand further surgical procedures.” Robertson published his experiences of resuscitation transfusions in 1917 in the British Medical Journal, and in 1918, in the Annals of Surgery, he described 36 cases of transfusion, including 3 fatal hemolytic transfusion reactions [24].


In 1917, after the Medical Research Council Shock Committee meeting, Bayliss recommended 5% gum acacia in a 3% sodium bicarbonate solution; this proved difficult to manufacture, and after further testing it was agreed to place 6% gum acacia in a 0.9% saline solution. Reports circulated that gum acacia and Ringer’s solution were capable of saving lives at the front. In 1918, Colonel Elliott and Captain Walker reported that gum-saline succeeded if infused on arrival at the Casualty Clearing Station, but if treatment was delayed for more than 8 hours, a blood transfusion was better.


In 1917, the Investigation Committee on Surgical Shock and Allied Conditions of the Medical Research Council was formed, with Starling as its first chair, followed by Bayliss. The committee was established to examine the treatment of shock and requested an update on the use of whole blood from Captain Oswald Hope Robertson. Both cold-stored and warm whole blood were transfused to casualties in WWI.


In 1917, Bayliss traveled to France and met Captain Fraser and Captain Walter B. Cannon of the USAMC, Higginson Professor of Physiology at the Harvard Medical School. Cannon conducted autopsies to test the theory that wound shock was caused by blood pooling in the great veins of the abdomen and found this to be untrue. He began investigations of blood plasma with a Van Slyke blood gas analyzer and was able to show a correlation between blood pressure in wound shock and acidosis: the lower the BP, the greater the acidity of the plasma.


On August 17, 1917, at its first meeting, the MRC Special Investigation Committee on Surgical Shock and Allied Conditions published the first definition of wound shock: “a condition of circulatory failure due to deficient entry of blood into the heart.”


The Medical Research Council Shock Committee urgently sought the cause of shock and a potential treatment. Cannon was convinced that high acid levels in the blood caused wound shock and that an alkali treatment was needed. H.H. Dale disagreed and suggested a more complex pathology: “namely, that substances with similar activity (to histamine) absorbed from wounds involving injury to tissues, in conjunction with hemorrhage, exposure to cold, and so forth, could well determine the onset of shock.” Dale argued that the treatment of shock should include whole blood transfusion [25].


In 1918, Cannon was named Director of Surgical Research at the Medical Laboratory at Dijon, where he trained resuscitation teams in the physiology and resuscitation of shock with a strong emphasis on hypothermia management, which he had learned working on the front line with Cowell and Fraser. Cannon requested and received the assistance of O.H. Robertson in his research. In 1918, the US Army Medical Department adopted whole blood transfusion with citrated blood to combat shock in the American Expeditionary Forces.


Geoffrey Keynes developed “field durable” equipment that enabled whole blood transfusions to be carried out in the field, outside established medical facilities. In the field, the only way to transfuse casualties was directly from another soldier, and Keynes’ equipment enabled regulation of the flow of blood between donor and patient.


Post-WW1


In November 1918, the Royal Army Medical Corps convened a conference of surgeons and pathologists in Boulogne to evaluate treatments for shock and hemorrhage. The final conclusion was that whole blood was probably superior but that colloids warranted further investigation; reactions to gum acacia were also reported.


After the war, the MRC Shock Committee also independently reviewed the evidence from the war and declared “that in all cases of hemorrhage with shock, transfusion of unaltered whole blood or citrated blood is the best treatment yet available” [26, 27].


Major W. Richard Ohler stated after the war: “hemorrhage is the important single factor in shock and the amount of hemorrhage defines the amount of shock; when, therefore, the need is for oxygen carrying corpuscles, no other intravenous solution will serve the purpose.”


In 1921, Percy Lane Oliver, Secretary of the Camberwell Division of the British Red Cross, established the first emergency donor panel, some 20 donors ready to give blood at short notice in London hospitals; Oliver called it the British Red Cross Blood Transfusion Service. In 1922, the service was used 13 times; word spread, and by 1925 it was used 428 times. Sir Geoffrey Keynes was appointed medical adviser to the organization. Similar systems were adopted in other countries, France, Germany, Austria, Belgium, Australia, and Japan being among the first. At the first Congress of the International Society of Blood Transfusion, held in Rome in 1935, it was declared: “It is to the Red Cross in London that the honor is due to having been the first, in 1921, to solve the problem of blood donation by organizing a transfusion service available at all hours, and able to send to any place a donor of guaranteed health, whose blood has been duly verified.” In 1937, Bernard Fantus of the Cook County Hospital in Chicago established the first US civilian blood bank, in which whole blood was collected in bottles and stored in a refrigerator for up to 10 days [28].


In 1932, Alexis F. Hartmann and M.J.C. Senn suggested a 1/6 molar sodium lactate solution to replace part of the sodium chloride in Ringer’s solution; they showed that the lactate was metabolized in the liver, making sodium available to combine with available anions. The use of the solution allowed the amount of chloride to be reduced, limiting hyperchloremic acidosis [29].


In 1929, Professor Vladimir Shamov of Kharkiv, USSR, reported the experimental use of cadaveric blood transfusion and the absence of toxicity. In 1930, the Russian surgeon Sergei Yudin, familiar with the work of Shamov, transfused his first patient, stating: “My first experience was with the case of a young engineer who slashed both of his wrists in a suicidal attempt. He was brought to our hospital pulseless and with slow, jerky respiration. Transfusion with 420 cc. of blood taken from the cadaver of a man, aged 60, who had been killed in an automobile accident just six hours before, promptly revived him” [30]. Later that year, in September, Yudin reported on his first seven cadaveric transfusions at the fourth Congress of Ukrainian Surgeons at Kharkiv. By 1932, Yudin had reported 100 transfusions with cadaveric blood kept for 3 weeks, and in 1937 he reported over 1,000 uses of cadaveric blood in The Lancet [28].


Spanish Civil War 1936–1939


By 1936, Frederic Duran-Jorda had created a transfusion service in Barcelona to meet the growing demand for blood transfusions; later that year, Norman Bethune visited the facility and then set up a similar service based in Madrid called the Servicio canadiense de transfusión de sangre. In 1914, Bethune had suspended his medical studies and joined the Canadian Army’s No. 2 Field Ambulance to serve as a stretcher-bearer in France; he was wounded by shrapnel and, after recovering, returned to Toronto to complete his medical degree. Based on his experience in WWI, he organized a mobile transfusion service, stating: “Why bring the bleeding men back to the hospital when the blood should travel forward to them?” During the Spanish Civil War, 28,900 donors donated 9000 liters of whole blood. Donors were X-rayed for TB, and their blood was tested for syphilis and malaria. Six donations of whole blood were mixed and filtered, placed in 300 ml glass jars, and stored at 2 °C for up to 15 days. With the advent of blood fractionation, plasma could be separated from whole blood, and it was used for the first time in this war to treat the battle wounded. In 1938, Duran-Jorda fled to the United Kingdom and worked with Dr. Janet Vaughan at the Royal Postgraduate Medical School at Hammersmith Hospital to create a system of national blood banks in London.


Pre-WWII


In 1934, Alfred Blalock proposed four categories of shock: hypovolemic, vasogenic (septic), cardiogenic, and neurogenic. Hypovolemic shock, the most common type, results from loss of circulating blood volume due to loss of whole blood (hemorrhagic shock), plasma, interstitial fluid, or a combination [31].


In 1938, the Medical Research Council established four blood depots in London. Later that autumn, the War Office also created the British Army Blood Transfusion Service and the initial Army Blood Service Depot (ABSD) in Bristol under the control of Dr Lionel Whitby. The service also set up a plasma-drying facility that produced 1200–1400 units a week.


WWII 1939–1945


Transfusion: UK Army Blood Transfusion Service


In 1938, Brigadier Lionel Whitby was appointed Director of an autonomous UK Army Blood Transfusion Service (ABTS). Unlike in WWI, where blood was obtained from fellow soldiers, the plan changed to central civilian collection followed by a distribution network. The service was organized on three levels: (1) the Army Blood Service Depot (ABSD), producing all wet and dried products, crystalloids, grouping sera, blood collection and administration equipment, and training; (2) Base Transfusion Units, chiefly concerned with distribution in each theater of operations; and (3) Field Transfusion Units, which worked in forward areas.


Plasma for Britain


In 1940, Dr Charles R. Drew, a surgeon and researcher who had developed techniques for preserving liquid plasma, supervised the “Blood for Britain” program, which delivered blood to treat those wounded during the Blitz. To encourage donation, Drew pioneered the use of refrigerated vehicles as mobile donation centers.


Research


On May 31, 1940, US Surgeon General Magee appointed Professor Walter B. Cannon of Harvard University as Chairman of the US National Research Council Committee on Shock and Transfusion. On November 3, 1941, this committee agreed “that it had been the consensus of the group that [US] Armed Forces should use whole blood in the treatment of shock wherever possible”; the results of that discussion were not made official until 2 years later, on November 17, 1943 [32].


Cannon also introduced the term “homeostasis” to describe the equilibrium maintained in the internal environment and is credited for the first proposal to cause deliberate hypotension in order to reduce internal hemorrhage until surgical control could be established [33].


Plasma: Fractionation


In 1940, Edwin Cohn, a professor of biological chemistry at Harvard Medical School, developed cold ethanol fractionation, the process of breaking down plasma into components and products. Albumin, gamma globulin, and fibrinogen were isolated and became available for clinical use. John Elliott developed the first blood container, a vacuum bottle extensively used by the Red Cross [34]. In 1941, Isodor Ravdin treated victims of the Pearl Harbor attack with Cohn’s albumin for blood loss and shock [34].


Transfusion: The United States’ Need for Whole Blood


In 1941, as US troops arrived in the United Kingdom, the United States reported that it was neither able nor prepared to supply US-donated blood to Europe or Africa.


On June 28, 1941, the first Conference on Shock was conducted by the Subcommittee on Shock, 6 months before the United States entered the war. Treatment recommendations included control of hemorrhage with early application of a tourniquet, the application of heat to reverse hypothermia, and analgesia. Regarding fluid therapy, when shock was imminent or present, blood, plasma, or albumin was to be injected as promptly as possible; in massive hemorrhage, whole blood was preferable to blood substitutes.


In 1943, pressure grew on the United States to supply whole blood during D-Day planning: the Allied planning group was shocked to be told that the U.S. would not sanction the transport of any whole blood from the United States to Great Britain; logistical problems and the efficacy of human plasma were cited as the reasons for the U.S. obduracy [35].


In March 1943, US Army Colonel Edward D. Churchill arrived for duty as Chief Surgical Consultant to the North African and Mediterranean operational theater. Churchill conducted a study on the resuscitation of shock and released a report stating that plasma was a first aid measure in support of whole blood, which was the first-line treatment for resuscitation of battlefield casualties: whole blood was the only agent that prepared casualties for surgery and decreased mortality by reducing infection, and inadequate resuscitation with whole blood resulted in organ damage. There was a widespread misconception among US military medical leadership that plasma was as effective as whole blood [36]. Churchill, incensed by the US Surgeon General’s position on blood products, briefed a New York Times reporter with the aim of publicizing the need for military blood banks [37]. In 1943, Colonel Elliott C. Cutler’s memorandum to Brigadier General Paul R. Hawley, Chief Surgeon, European Theatre of Operations, stated that “Brigadier Whitby tells me that the use of wet plasma has practically been given up, and transfusion (of whole blood) used in its stead in the British Army” [38].


Colonel Frank S. Gillespie, Liaison Officer for the United Kingdom in Washington, DC, remarked:

I have often wondered at the physiological differences between the British and American soldier. The former, when badly shocked, needs plenty of whole blood, but the American soldier, until recently, has got by with plasma. However, I seemed to observe a change of heart when I was in Normandy recently and found American surgical units borrowing 200–300 pints of blood daily from British Transfusion Units, and I’m sure they were temporarily and perhaps even permanently benefited by having some good British blood in their veins.
