A History of Neuroanesthesia



Fig. 64.1
A trephined skull from 3,500 BC demonstrates a rounded edge of the hole, indicating that healing had taken place and that the patient had survived the procedure. (From Wikipedia, accessed 3 June 12)



Surgeons believed that boring a hole in a patient’s head would increase the brain’s metabolism (that is, replace depression with happiness), heighten cranial blood flow (releasing tension, such as that caused by a hematoma within the cranium), or release evil spirits residing within the skull. Many “patients” survived, as shown by signs of healing in their skulls [8]. (Fig. 64.1) Individuals undergoing trephination for spiritual reasons reported a new and glorious understanding of themselves and the world around them. In addition, this “third eye” was said to bestow psychic abilities and an understanding of mysticism upon the patient. Some reported that trephination heightened the body’s senses.

During the Middle Ages, did the familiar haircut of monks and friars (a ring of hair around the circumference of the head) evolve from holy men who had the tops of their heads shaved in preparation for trephination? The phrase, “I need that (some unwanted object) like I need a hole in the head,” may be a throwback to trephination.

Trephination, or trepanning as some authors and cultures prefer, persisted in primitive civilizations until the early twentieth century [11]. Although widely considered pseudoscience, the practice of trephination continues in some societies. Proponents point to “recent research” on the increase in cranial compliance following trephination, with an increase in blood flow, as justifying the practice. They believe that opening the skull introduces oxygen, which provides knowledge signifying divinity. Some individuals have practiced non-emergency trephination for psychic purposes [12]. A prominent proponent of this view is Peter Halvorson, who, in 1962, drilled a hole in the front of his own skull to increase “brain blood volume” [13]. In 1998, he established the International Trepanation Advocacy Group, which maintains a website and purports to support research into the practice. Theoretical support for the benefits of self-trephination was offered by Bart Huges (or Hughes, 1934–2004), who claimed that trepanation increased “brain blood volume” and thereby enhanced cerebral metabolism in a manner similar to cerebral vasodilators such as ginkgo biloba. No published results support these claims. Huges, a Dutch librarian, published “The Mechanism of Brainbloodvolume (‘BBV’)” (also known as “Homo Sapiens Correctus”) in 1964. Based on an ancient belief, he proposed that when mankind began to walk upright, blood drained from the brain, and that trephination allowed blood to flow in and out of the brain more freely, causing permanent euphoria.

How was pain minimized for trephination? Coca leaves were widely used in Peru. Perhaps an early anesthetist chewed the leaves and spat into the wound, a technique described for centuries in South America, but one possibly ineffective because of the small amount of free cocaine in coca. Or perhaps, as in the painting ascribed to Hieronymus Bosch (1450–1516), “Extraction of the Stone of Madness”, the figure standing to the side held a jug of wine at the ready (Fig. 64.2). While Bosch was known for his fantastic imagery illustrating moral and religious concepts and narratives, in recent decades scholars have also viewed his art as reflecting then-orthodox religious beliefs.





Fig. 64.2
A surgeon makes a hole in the skull while an assistant stands by with a pitcher of wine. (Attributed to Hieronymus Bosch, 1450–1516, from “The Extraction of the Stone”, 1490, oil painting, Museo del Prado. From Wikipedia, accessed 3 June 12.)

Peter Treveris is credited with an engraving in the 1525 “Handywarke of Surgeri” by Hieronymus von Braunschweig, showing the method of trephination. Treveris also published “The Grete Herball” in two editions, in 1526 and 1529. This compendium of herbal and plant remedies was translated from the French “Le Grant Herbier”. The author describes laudanum, henbane, opium and lettuces as narcotics for pain relief. These preparations, and alcohol, may have been used during such procedures.

The use of trephination declined in the nineteenth century. In 1829, Astley Cooper, a consulting surgeon to Guy’s Hospital in London, wrote: “Trephining in concussion is now so completely abandoned that in the last 4 years I do not know that I have performed it once, while 35 years ago I would have performed it 5 or 6 times a year”. More advanced medicine replaced the practice: bleeding, purges and the application of leeches [14].




Nineteenth Century and the Discovery of Anesthesia


After the momentous demonstration by William Morton on 16 October 1846, knowledge and application of the benefits of ether, and then chloroform (James Simpson, 4 November 1847), spread quickly around the world. However, anesthesia for neurosurgical procedures was slow to be accepted, perhaps because it was recognized that the brain itself possesses no pain endings, and thus unconsciousness might not be necessary. Also, the dangers of working within the brain might mandate a responsive patient.

A notable case illustrating the dangers of anesthesia achieved local fame. In Lumberton, New Jersey, on 7 February 1887, Mary Anderson was shot in the head by her thwarted lover [15]. Two weeks later, four prominent physicians, Girdner, Spitzka, Pancoast, and Spiller, assembled in a tiny cottage and, with a telephonic probe, tried to locate and remove the bullet. Under ether anesthesia and breathing spontaneously, she deteriorated rapidly as her brain swelled, and the procedure was abandoned. She died without regaining consciousness two weeks later. Her assailant was accused, tried, convicted and executed.

During the nineteenth century, several prominent surgeons, especially neurosurgeons, made important advances in anesthetic techniques for neurosurgery.


Victor Horsley (1857–1916)


In the latter part of the nineteenth century, a few general surgeons performed neurosurgery. Chipault in France, von Bergmann and Krause in Germany, Macewen in Scotland, and Keen in the US all performed cranial and spinal procedures as well as general surgery. Victor Horsley studied neurophysiology extensively in his early career. He was also a leader in the temperance movement in the UK, having observed many inebriated head-injured patients admitted to hospital. Horsley was the first neurosurgeon appointed to the hospital in Queen Square, London, now called the National Hospital for Neurology and Neurosurgery; the Victor Horsley Department of Neurosurgery is named in his honor. The Walton Centre for Neurology & Neurosurgery NHS Trust in Liverpool, England, another leading neurosurgical hospital, dedicated its Intensive Care Unit to Sir Victor Horsley; it is called the Horsley Ward [16].

Horsley investigated the intracranial effects of chloroform, ether and morphine after experimenting on himself (Fig. 64.3). He concluded that because ether caused hypertension, excessive bleeding, postoperative vomiting and general excitement, it was not to be used in neurosurgery [17]. He considered morphine valuable because of an apparent decrease in cerebral blood flow, which made hemorrhage in the surgical field more readily controlled, although the respiratory depression caused by the opioid (an effect he had noted from self-experimentation) could also cause problems. He preferred chloroform, advising the “judicious use of chloroform” to control hemorrhage [18]. However, death during chloroform anesthesia was all too common. In 1890, the Hyderabad Commission concluded that such deaths were due to respiratory failure. The need to control the dose remained unclear, although it was realized that slightly less than 2% chloroform induced anesthesia and much less sufficed for maintenance. It was argued that such amounts might just as easily be achieved by sprinkling the liquid onto a cloth. Horsley disagreed and, together with a physical chemist, Vernon Harcourt, developed an inhaler that could control the percentage of chloroform administered.





Fig. 64.3
Victor Horsley. (From Wikimedia Commons, accessed 3 Aug 13.)

In 1901, the British Medical Association appointed a “Special Chloroform Committee”, including Waller, Sherrington, Harcourt, Buxton and Horsley, to introduce science to the art of anesthetic administration. The committee concluded that a chloroform dose exceeding 2% was unsafe because the resulting inhibition of the vagus nerve could cause cardiac arrest. They determined that several inhalers were suitable for administering accurate measures of chloroform, including those of Snow, Junker, Clover, Paul Bert, Harcourt, Roth-Drager, Waller and Collingwood [19]. Harcourt’s inhaler delivered no more than 2% chloroform. Horsley believed that administration should be reduced to <0.5% after bone removal during craniotomy, because of potentially adverse effects on intracranial dynamics [18].


William Macewen (1848–1924)


Macewen, medical superintendent of the Glasgow Fever Hospital, dealt with many deaths from respiratory obstruction due to diphtheria (Fig. 64.4). Following the observations of Desault some 100 years previously, he developed metal tubes that he inserted into the tracheas of cadavers, and then of patients. He reported the case of a Glaswegian who popped a hot potato into his mouth. A few hours later, and after some libation, the man had difficulty breathing and went to the emergency room of the Royal Infirmary. Macewen passed a metal tube into the Glaswegian’s trachea and relieved the obstruction. On 5 July 1878, he passed a tube into the larynx of a patient before inducing chloroform anesthesia for removal of an epithelioma from the base of the tongue [20]. He later developed red rubber tubes that were better tolerated by patients, especially those with diphtheria, in whom intubation was maintained for 36 hours or longer [19].





Fig. 64.4
William Macewen. (From Wikimedia commons, accessed 3 June 12.)

After an anesthetic death and much national publicity, a resolution was adopted on 7 March 1883, in Glasgow, requiring training in anesthetics for all medical students and clerks [21]. Formal anesthetic training was not adopted in the rest of Britain until 1911. In the US, although the American Board of Anesthesiology was established in 1938, formal training in anesthetics for all medical students has been accepted only slowly and is still not universal, as in many third-world countries.

Macewen became a leader in neurosurgery in the UK. He demanded that his patients be anesthetized only by residents who had been appropriately trained and certified under his supervision.


Harvey Cushing (1869–1939)


An American pioneer of neurosurgery and anesthesia, Cushing studied at Yale and Harvard Universities, and trained at Johns Hopkins Hospital with William Halsted (Fig. 64.5). During his residency with Halsted, he became interested in neurosurgery and determined to develop better diagnostic methods, becoming one of the first in American surgery to employ X-rays.





Fig. 64.5
A photograph of Harvey Cushing taken around 1900. (From Wikimedia commons, accessed 3 June 12.)

As a medical student, he was asked to stand in for the anesthetist Frank Lyman. As recounted in other chapters of this book, Cushing’s anesthetic ended in disaster: as the surgery began, the patient vomited, aspirated the vomit, and died.



“To my perfect amazement I was told it was nothing at all, that I had nothing to do with the man’s death, that he had a strangulated hernia and had been vomiting all night anyway, and that sort of thing happened frequently and I had better forget about it and go on with the medical school. I went on with the medical school but I have never forgotten about it [22].”

True to his word, Cushing never forgot this terrible event and, to the benefit of anesthesia, did something about it [23]. With Codman, Cushing developed the anesthetic record, on which anesthetists noted the patient’s heart rate and blood pressure (the latter then a new measurement in medicine) at five-minute intervals, an approach little different from that used in today’s anesthetic record. His suggestions were not immediately accepted. In 1903, a Harvard Medical School committee considered “the importance of blood pressure observation in surgical diagnosis and treatment”. The committee concluded that the skilled finger was of much greater clinical value for determining the state of the circulation than any pneumatic instrument, and suggested that Cushing’s work (and also George Crile’s work in Cleveland) be put aside.

Cushing appeared to agree:



“I am not so sure that the general use of a blood pressure apparatus in clinical work has done more than harm. Just as Floyer’s pulse watch led to two previously unknown diseases, tachycardia and bradycardia, so the sphygmomanometer has led to the uncovering of the diseases of hypertension and hypotension which have vastly added to the number of neurasthenics in the world” [24].

Cushing became professor of surgery at Harvard Medical School (and chief of surgery at the Peter Bent Brigham Hospital in Boston) in 1912. A perfectionist, he brought to neurosurgery crucial refinements in preoperative preparation and operative technique. His skill reduced the mortality rate of brain operations from 50–60% to about 10% by 1930 [25].

In 1898, inspired by Halsted’s work with local anesthesia and by observations of deaths under ether anesthesia, Cushing applied local infiltration of cocaine in his surgeries, especially brain cases but also hernia and thyroid operations [25]. He kept detailed records of his work, insisting that records similar to current anesthetic records (see above) be maintained for his patients.


Fedor Krause (1857–1937)


The father of German neurosurgery, Krause introduced operations to treat epilepsy into Germany, performing over 400 operations on epileptic patients during his career (Fig. 64.6). He is also remembered for his work in plastic surgery, and was an early practitioner of intraoperative electrostimulation of the cerebral cortex. He developed operative techniques for tumors of the brain and spinal cord.





Fig. 64.6
A photograph of Fedor Krause from around 1930. (From Medscape; source: Neurosurg Focus, copyright American Association of Neurological Surgeons.)

Krause was also concerned about the neurosurgical implications of anesthetic actions. As an assistant to Richard Volkmann, Krause noted the effects of morphine-chloroform anesthesia and was unconvinced that it offered advantages for neurosurgical procedures. He preferred chloroform alone [26], but recognized the usefulness of small doses of morphine for postoperative pain relief. He felt that increased venous bleeding offset the safety of ether, and reserved its use for patients with cardiac failure. He advocated increasing the concentration of chloroform to produce controlled hypotension and decrease bleeding during tumor extirpation. He also noted that sudden death might occur if respiration ceased during tumor surgery (respiration was mainly spontaneous, or controlled by an assistant under the drapes holding a mask to the patient’s face), and he preferred the Roth-Drager apparatus, which allowed administration of 100% oxygen. Like others, he emphasized that the brain was insensitive to pain, indicating the need for only light planes of narcotization [26]. Nevertheless, he did not advocate local anesthesia, believing that pain was not the only problem: optimum preparation of the patient for surgery demanded a positive attitude and calmness, and severe anxiety could contribute to death. He concluded that a good outcome required a rapid, aseptic technique, minimal blood loss, normothermia and general anesthesia, noting that spinal procedures were amenable to local infiltration [27].


Semmelweis and Clean Hands


The frequency of wound infection, which Horsley found to be as high as 40%, hindered the advancement of surgery and thus also of anesthesia. Ignaz Semmelweis (1818–1865) was a Hungarian physician and early pioneer of antiseptic procedures (Fig. 64.7). Described as the “savior of mothers” [6], Semmelweis discovered that hand disinfection in postpartum obstetric clinics drastically decreased the incidence of puerperal fever [28]. In mid-nineteenth-century hospitals, puerperal fever was common and often fatal, with mortality rates of 10–35%. Semmelweis worked in the Vienna General Hospital’s First Obstetric Clinic, where patients on the doctors’ wards had three times the mortality of those in the Second Obstetric Clinic, the midwives’ wards (Fig. 64.8). He noted that the doctors performed autopsies on patients dying of puerperal fever, but the midwives did not. Semmelweis postulated that washing the hands and arms with chlorinated lime solutions, especially after cadaveric dissection, would reduce infection rates.





Fig. 64.7
Ignaz Semmelweis was aged 42 when this pen sketch by Jenő Doby was made in 1860. (From Wikimedia commons, accessed 3 June 12.)





Fig. 64.8
Puerperal fever mortality rates for the first and second clinic at the Vienna General Hospital 1841–6 differed. The first clinic, run by doctors, had a higher mortality rate than did the second, run by midwives. (From Wikimedia commons, accessed 3 June 12.)

Semmelweis’ astounding results showed that hand-washing reduced infection to less than 1%, but these data conflicted with contemporary scientific and medical opinion, and the medical community rejected his ideas. The suggestion that they should wash their hands offended doctors, and Semmelweis could not offer a scientific explanation for his findings. His ideas earned widespread acceptance only years after his death, when Louis Pasteur (1822–1895) confirmed the germ theory by proving that microorganism (yeast) growth causes fermentation, and that bacteria in nutrient broths arise by biogenesis and not by spontaneous generation [29]. In 1865, Semmelweis was committed to an asylum, where he died at age 47. It is not clear whether he died of a beating or, ironically, of septicemia.


Lister, von Bergman, Antisepsis and Asepsis


A stream of discoveries provided the knowledge and means to minimize, abolish, and treat infection. In 1867, Joseph Lister (1827–1912), Professor of Surgery at the Royal Infirmary in Glasgow, killed bacteria in the air and on the skin with a carbolic acid spray, thereby giving the world antisepsis. Other antiseptic rituals, such as the use of masks, operating gowns, hats, and sterile gloves, evolved thereafter. In 1886, Ernst von Bergmann (1836–1907) added asepsis, with steam sterilization that ensured the sterility of instruments. Alexander Fleming grew penicillin from the mould Penicillium notatum in 1928, thereby beginning the antibiotic revolution. Together with Florey and Chain, who further refined penicillin, Fleming received the Nobel Prize in Physiology or Medicine in 1945.

Anesthetic improvements had made neurosurgeons bolder, but infection had held them back. Now surgery could leap forward, and with it further anesthetic developments would appear.


X-rays


Yet another breakthrough played a major role in the expansion of neuroanesthesia. During 1895, Wilhelm Röntgen (1845–1923), a German physicist, discovered a new form of electromagnetic radiation while investigating the effect of passing electrical discharges through various vacuum tubes [30]. (Fig. 64.9) He named the new rays X-rays, a designation for something unknown. Nearly two weeks after his discovery, he took the first picture using X-rays, of the hand of his wife, Anna Bertha. He later reported that at this point he determined to continue his experiments in secrecy, because he feared for his professional reputation if his observations proved incorrect. X-rays (Röntgen rays) earned him the first Nobel Prize in Physics in 1901. The resulting development of imaging underlies modern techniques such as fluoroscopy, computerized tomography (CT) scans, and magnetic resonance imaging. Röntgen’s discovery profoundly altered surgery, though in more subtle ways than anesthesia and antisepsis did. Before X-rays, doctors relied on their five senses; X-rays provided a sixth sense that facilitated more precise diagnoses.





Fig. 64.9
This photograph of Wilhelm Roentgen was taken around 1900, shortly after the discovery of X-rays. (From Wikimedia commons, accessed 3 June 12.)

Few discoveries, apart from anesthesia, spread as fast as that of X-rays. Röntgen’s original paper, “On A New Kind Of Rays” (Über eine neue Art von Strahlen), appeared on 28 December 1895 [28]. X-rays had an enormous impact on surgery. Soft organs could be visualized by introducing a “contrast material”; for the brain, this included the use of air to allow pneumoencephalography, a technique introduced in 1919 by the US neurosurgeon Walter Dandy. Anesthesiologists have enabled the increasing use of interventional neuroradiology in ever more disease states amenable to less invasive therapy, including interventions such as aneurysm coiling, vertebroplasty, kyphoplasty and carotid stenting. Such offsite venues can be challenging because they may lack OR-oriented personnel or a PACU, and may contain unfamiliar equipment.


Early Twentieth Century


Towards the end of the nineteenth century, many substances were touted as anesthetic agents. In his 1881 textbook, Artificial Anaesthesia and Anaesthetics, Henry Lyman listed 47, including benzene, acetone, and kerosene. He also included an intriguing drug, puff ball (also known as Indian bread and tuckahoe), described as a curious fungus found on the roots of fir trees in the southern US [31]. Other means of inducing insensibility included hyperventilation (instructing the patient to breathe 100 times/minute) and electricity. In this last method, an insulated forceps applied to a tooth formed the negative electrode, while the patient held a positive electrode. A current was applied, the patient became insensible, and teeth could be extracted. The surgeon directed administration of the anesthetic. Regarding the responsibilities of the anesthetist, Lyman wrote:



“Death sometimes occurs during the use of an anaesthetic. If inhaled by the decedent in private and of his own motion, was it a case of suicide, or was it an accidental death? If administered by the hand of another…was it given by a person skilled in the theory and practice of etherization or by an individual destitute of these qualifications? [31]” This question remains for legal debate today.

Anesthetic management became an integral part of neurosurgery. In the early 1900s, open-drop ether was still the technique of choice. A tourniquet was placed around the skull just above the eyes in an attempt to decrease cutaneous bleeding; clearly, it had no effect on intracranial bleeding. The development of electrocoagulation by William Bovie at Harvard made the use of explosive agents in open circuits problematic [32]. Cushing introduced the use of the Bovie, a standard method of halting bleeding today, on 1 October 1926, at the Peter Bent Brigham Hospital, to facilitate removal of a vascular myeloma involving the scalp [33]. He also introduced silver clips to occlude cerebral blood vessels and staunch bleeding.

In 1919, the Irish-born Ivan Magill (1888–1986) accepted a post as anesthetist at the Queen’s Hospital, Sidcup, a hospital established to treat facial injuries sustained by soldiers in World War I. Working with the plastic surgeon Harold Gillies, and with his colleague in anesthesia, Stanley Rowbotham (1890–1989), Magill developed numerous items of anesthetic equipment, most particularly the single-tube tracheal anesthesia technique, driven by the immense difficulties of using a mask to administer chloroform and ether to men with severe facial injuries. In 1920, Rowbotham described blind nasal intubation, particularly useful for surgery in the oral cavity. Magill and Rowbotham were instrumental in reintroducing tracheal intubation in 1921. Gradually adopted over the next two decades (it was in general use in neurosurgery by the early 1930s), it used rubber tubing rather than the flexible metal tube developed by Macewen [34]. These developments were essential to the furtherance of neuroanesthesia.

Greater understanding of intracranial dynamics developed. The Monro-Kellie doctrine, a synthesis of the work of an eighteenth-century Scottish anatomist (Alexander Monro) and a nineteenth-century Scottish physician (George Kellie), stated that the cranial cavity is a closed rigid box, and that the quantity of intracranial blood can change only through the displacement or replacement of cerebrospinal fluid [35, 36].
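In modern notation (a later formulation, not Monro’s or Kellie’s own wording), the doctrine is often summarized as a fixed-volume constraint among three compartments:

V_brain + V_blood + V_CSF = V_cranium ≈ constant

Any increase in one compartment, or an expanding mass such as a tumor or hematoma, must be offset by a decrease in another; once that compensatory capacity is exhausted, intracranial pressure rises steeply.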

Walter Cannon, an American physiologist (who coined the phrase “fight or flight”), described intracranial pressure (ICP) monitoring in 1901, as an expansion of Claude Bernard’s work on homeostasis [37]. White et al. in Boston showed in animals that various anesthetics perturbed ICP, and that carbon dioxide accumulation and lack of oxygen produced dramatic increases [38]. To minimize these concerns, Andrew Hunter advocated an 8 l/min inflow of 25% oxygen in nitrous oxide into a semi-closed anesthesia circuit [38]. His preferred opioid was heroin, given in 1 mg increments for supratentorial procedures, when reduction in brain bulk was important [39]. Some years earlier, CB Courville had published a book on the adverse effects of nitrous oxide, suggesting that cortical and lenticular damage could be attributed to administration of the gas even when hypoxia had not occurred [40]. In the 1950s–1970s, the Swedish neurosurgeon Nils Lundberg contributed to our understanding of ICP waves [41, 42]. Lundberg worked with Leksell, who developed stereotactic equipment, another important advance that called for flexibility and adjustments by the neuroanesthesiologist.

Various anesthetic techniques were advocated for neurosurgery. Willstaetter and Duisburg synthesized the rectal anesthetic tribromoethanol (Avertin) in 1923, and Butzengeiger and Eichholtz used it as the sole anesthetic agent for neurosurgical procedures in the same year. Dandy used it in 1931 to reduce intracranial pressure [43]. Leo Davidoff, professor of neurosurgery at the Albert Einstein College of Medicine in New York and a former resident of Harvey Cushing’s, considered that the effects of tribromoethanol wore off too quickly, and advised additional infiltration with local anesthetic [44]. Trichloroethylene with nitrous oxide was popular in the British Commonwealth but not in the US or other parts of the world. After DE Jackson described the anesthetic effects of cyclopropane in 1934, BB Hershenson used it for neurosurgery in low concentrations in a closed circuit, publishing his experiences in 1942 [45, 46]. The explosive potential of cyclopropane mandated draping the machine and the patient’s head with wet towels. Thiopental, synthesized by Volwiler and Tabern in 1930, was introduced into clinical practice by Waters and Lundy in 1934, and advocated for neurosurgery by Shannon and Gardner in 1946 [47]. The popularity of thiopental as a sole agent disappeared with the discovery of fluorinated anesthetics.
