Major Anesthetic Themes in the 1950s



Fig. 8.1
Of the 160 member countries, those with more anesthetists than the median in 2006 formed societies roughly one-and-a-half to three decades sooner than those with fewer than the median. (Data from the World Federation of Societies of Anaesthesiologists.)







Fig. 8.2
The rapid growth in societies that began in the 1940s continued in the 1950s with the addition of 22 new societies, 12 in Europe, 5 in Central-South America-Caribbean, and 5 in the Far East-Asia. (Data from the World Federation of Societies of Anaesthesiologists.)

As may be apparent from the preceding figures and discussion, anesthesia societies arose at different rates in different parts of the world (Fig. 8.3). Despite the devastation wreaked on Europe by World War II, anesthesia societies in Western Europe appeared in the late 1940s and early 1950s, a decade before they appeared in less affluent, more slowly reconstructed, Russian-controlled Eastern Europe. Western European countries benefited from the Marshall Plan. Societies in South America appeared at times between these extremes. Having fewer resources and fewer anesthesiologists, Central American and Caribbean island societies arose a decade after those in South America.





Fig. 8.3
The relationship seen in Fig. 8.1 partly explains the differences seen in Fig. 8.3. Larger numbers of anesthesiologists in Western Europe led to an earlier development of societies than in South America, Eastern Europe, Asia, Central America, the Caribbean islands and the Middle East. Economic advancement may add to the explanation of differences in the appearance and growth of anesthesia in different regions. (Data from the World Federation of Societies of Anaesthesiologists.)



World War II Unevenly Affected the Geographic Growth of Anesthesia


At the beginning of the War, few trained anesthetists existed in any part of the world, and most of those few were from English-speaking countries. Countries that were slow to recover from the war could not support a major development of a trained cadre of anesthetists; their goal was survival and the rebuilding of their infrastructure and industry. Furthermore, in much of the world, anesthesia had been, and continued to be, viewed as something anyone could undertake. Anyone could give anesthesia. No anesthesiologists? What’s the problem?

Thus, despite the enormous increase in the numbers of operations imposed by war, anesthesia made few advances in numbers or status in Eastern Europe, the USSR, and Asia. In the UK, physicians had long been the providers of anesthesia and a subset of UK physicians practiced only anesthesia. The US had responded to the increase in surgeries in World War II by forcing physicians into the delivery of anesthesia. Why physicians? Well, they were men, and men could be drafted. Most trained anesthetists were nurses, but most of these were women [1]. Compounding the problem, if a nurse anesthetist volunteered for service, she was not guaranteed to practice anesthesia! [That wasn’t unique to nurses. Following residency in 1958 I (EIE) was drafted into the army more than a decade after the end of World War II. Although I was the Chief of the Anesthesia and Operative Section (in charge of 1 operating room and a closet), I was also a general duty medical officer who did a lot of physical examinations. I remember finding a thyroid nodule during a routine examination one day and giving anesthesia to the patient the next day for removal of his cancer.] The spectacular increases in the numbers of US physicians forced to practice anesthesia imply something else. These were young anesthetists with the energy and ambition that comes with youth. They made their way against established but older specialists.


Drafted US Physicians Become Anesthesiologists


In the early phases of World War II, waves of US physicians trained to become anesthetists in programs lasting up to 3 months; the graduates of these courses called themselves “90-day wonders.” In 1942, (Col.) Ralph Tovell surveyed American military hospitals in England and found that insufficiently trained physicians provided anesthesia services. In the following year, this finding led 99 officers to receive 1 month or more of training in the better-prepared British hospitals. Parallel courses for the 90-day wonders were given in the US.



“Until the war, there had been only one military course in anesthesia. Stevens J. Martin, M.D., who trained in Wisconsin under Ralph M. Waters, M.D., organized the first Army course in anesthesia in July 1941 at Tilton General Hospital, Fort Dix, New Jersey. This course became the model used for the anesthesia courses developed by The Subcommittee on Anesthesia of the National Research Council….Courses began in the summer of 1942.” [1]

Because of the great need for anesthetists, many physicians pressed into service as anesthetists did not even have the limited training described above. They were self-taught. In 1944, the Army recognized a need for upgrading their skills and knowledge and



“gave four intensive courses in anesthesia taught by ‘the outstanding physician-anesthetist in the theater.’ (the military theater of operations)…A fourth cohort was an amalgam of on-the-job training and informal and formal apprenticeships by medical officers in the theater. In American units with adequately experienced medical officers, training and apprenticeship programs for local and rotating officers were established. The newly trained physician-anesthetists then returned to their own hospitals to train more medical officers…in anesthesia….As the war continued, required apprenticeships were implemented to address the increasingly inadequate number of physician-anesthetists. In fact, because of the shortage, in November 1944, it was determined that a trained replacement was required before an anesthetist could move out.” [1]

The unintended effect of the War was an enormous increase in the number of anesthesiologists dedicated to the specialty. In 1940, there were 568 ASA members, of whom 105 were ABA diplomates. By 1950, these numbers had increased to 3,393 and 706, respectively. They were to double again in the 1950s. This forced growth in the number of US physician-anesthetists presented unexpected opportunities, while other factors imposed by the war augmented the effects of those opportunities.

First, physicians pressed into administering anesthesia often discovered that they liked what they did, finding that anesthesia presented surprising pleasures and challenges such as the use of regional anesthesia (more than half of the operations were performed under regional anesthesia [1]). General anesthesia was increasingly administered through a tracheal tube, demanding the acquisition of a technical skill [1]. And the anesthesiologist assumed the role of perioperative physician, responsible for the patient before and after as well as during anesthesia.

Second, surgeons, supported by physician-anesthetists, found that they liked what they got.



“…surgeons have made or will make their first contacts with competent anesthesiologists in the armed forces and work under such improved conditions provided by them. After such an experience, it is to be seriously doubted whether many of them will be content on their return to civilian practice to retrogress to the inferior type of unsupervised technician anesthesia, where, as the law requires…the surgeons…must assume full responsibility for the anesthesia, even though fully occupied with the technical requirements of the surgery….” [2]

In Great Britain and its Commonwealth countries, however, surgeons had long been used to anesthesia being provided by physicians, but that didn’t stop them from maintaining their place at the top of the tree.

A third factor encouraged physicians forced into anesthesia to continue as anesthesiologists. The American Board of Anesthesiology credited such physicians with a year of residency, i.e., half the time mandated to complete a residency. Take just one more year of residency and become (if you passed your exams) a board-certified anesthesiologist.

Waisel noted a fourth factor supporting the growth of anesthesia as a specialty during and after World War II: wage and price controls [1]. These controls were imposed to limit inflation. Because the controls limited the wages with which they could recruit workers, corporations increased benefits instead, including health insurance, something that prompted the increased use of medical/surgical services and, ultimately, the demand for anesthesia and anesthetists. As an aside, one might note that this means of paying for health care partly underlies our present health care crisis. Outside the US, anesthetists continued to be remunerated at a fraction of the rate pertaining to surgeons; many were actually paid a moiety out of the surgeons’ fees. In GB, the National Health Service ensured that all doctors were paid the same, within the service.

To summarize, several factors contributed to what became the basis for a remarkable growth in physician-anesthetists and nurse-anesthetists. Physicians drafted into anesthesia discovered that it was more challenging and rewarding than they had anticipated. Many surgeons became supporters of such physician-anesthetists. The financial base for anesthesia increased in the US. And board certification was made easier to acquire for those with on-the-job training. One might add the reasons given by residents today: the attractions of acute care, of seeing an immediate effect of anesthetic ministrations, the everyday application of principles of physiology and pharmacology.

But I (EIE) believe there is a sixth reason, one more powerful than the above five incentives to a career in anesthesia. My conversion to anesthesia provides an illustration.



“My career in anesthesia began on a pleasant spring day in 1952 as a newly minted first year medical student who wished to make money as an anesthesia extern. After a two-month summer apprenticeship in anesthesia, I would take call for my mentor, who could rest secure at home knowing that the care of emergency patients was in my capable hands! On that first day, he showed me how to start an intravenous infusion of 0.2 % thiopental, dial a 70 % concentration of nitrous oxide, properly hold a rubber mask to the patient’s face, and watch the rebreathing bag. Then he left the room. And I was in trouble. The rebreathing bag moved less and less and finally stopped. I knew little of anesthesia, just information supplied in a couple of lectures in pharmacology. But I knew that breathing was good and not breathing was bad. In a squeaky voice I told the surgeon that the patient had stopped breathing. With great presence of mind, and instead of berating me for obvious incompetence, he asked if I wanted him to give artificial respiration. “Yes, please,” I responded, voice still high-pitched. The surgeon squeezed the chest, the rebreathing bag now moved, and the circulating nurse fetched my mentor, who noted that the rebreathing bag could be used to ventilate the patient’s lungs. I finished the day exhausted and smelling of terror. The epiphany came as I sat thinking of the day’s events. To that moment I’d dreamed of becoming a second Robert Koch, a country physician who would make great medical discoveries as a general practitioner. A wonderfully naïve dream that suddenly vanished as I thought ‘You nearly killed a patient today, and if you chose anesthesia as a career, you could do that every day. Every day you could take a patient’s life in your hands. Every day.’ To a control freak (me) that image was as seductive as seduction comes.” [3]

I’m probably not alone. Anesthesia confers enormous power. With each anesthetic, the anesthetist takes a patient’s life in their hands, exerts complete control over another human being, over the brain, breathing, the circulation, the muscles. Enormous power, and with it, enormous responsibility, an addictive and intoxicating mix.


Worldwide Training in Anesthesia


These diverse forces dramatically increased the numbers of physicians and nurses who chose anesthesia as a career in the US. Training programs increased, particularly after World War II, to accommodate this career choice (Fig. 8.4). Parallel growth occurred worldwide, but as noted above, the timing and rate of growth differed as a function of social and economic factors. South America had been spared the horrors of World War II, and in the 1950s–1960s, residencies arose in Bolivia, Brazil, Colombia, Peru, Uruguay and Venezuela, some of 2 years’ duration. Between 1948 and 1973 in France, nurse anesthesia training schools opened in most cities, training 1500 students who, on passing an examination, received a Certificat d’Aptitude aux Fonctions d’Aide Anesthésiste. As in the US, the graduating nurse anesthetists established their own institutions and competed with anesthesiologists, resulting in continuing antagonisms.





Fig. 8.4
The increasing numbers of residency programs in the US after World War II in part reflected the demand for training in anesthesia. This demand stabilized, as did the number of approved residencies, in the 1950s. The increase may also have reflected the desire by hospitals to access anesthetic services at low cost. (Data from Betcher et al. Anesthesiology 17:226–64, 1956.)

The length of formal training varied enormously worldwide, particularly soon after World War II. By the 1950s, with internship, it would be 3 years in the US. It was already 3 years in GB, including internship, and it started as 3 years in Australia and New Zealand in 1952. Given that World War II had established the US as a leader in many ways, much of the world might initially have used 3 years as a standard. The 1950s were a time of great diversity and change in the perception of the appropriate duration of training. Limited economic resources and an immediate need for trained anesthetists might favor shorter periods of training. In the 1950s in Europe, the minimum prescribed periods differed by an order of magnitude. But by the 1980s this had narrowed to between 4 and 7 years, with most European countries agreeing that 5 years were needed.

It is a cruel irony that training in anesthesia, and anesthesia societies, developed slowly in the USSR and China, the countries suffering most from World War II, countries that had lost the most human lives, whose infrastructure had been devastated, and that had access to fewer resources that might assist rebuilding. Isolated by the Cold War, devoting limited resources to strengthening military power, continuing to bleed from civil war (China) and government tyranny (in the USSR; note also China’s Cultural Revolution), these countries educated few anesthesiologists and offered training of limited and variable duration. Contrast this with growth in Japan, which had also lost substantial human life and infrastructure, but was occupied by what turned out to be a benevolent force, imposing a stability that allowed rebuilding to occur. The Japanese Society of Anesthesiologists was established in 1954, whereas the Russian Federation of Anaesthesiologists and Reanimatologists was not established until 1972, and the Chinese Society of Anesthesiologists began in 1988.


Fee-for-Service and the Association of University Anesthesiologists (AUA)


The Association of University Anesthetists, later the Association of University Anesthesiologists, perhaps the leading organization for anesthetic academicians in the US, developed in the early 1950s. Emanuel Papper played a crucial role in its birth, a role he described in a witty, self-deprecating history [4]. A stated purpose of the AUA was “…to promote and discuss research and teaching in Anesthesia.” [5] The larger truth was a bit seamier [4]. In 1950, organized anesthesia (namely the American Society of Anesthesiologists, the ASA) pressured practitioners not to accept a salary for their anesthetic services. The ASA argued that anesthesia was the practice of medicine and a hospital (the management paying the salary) could not practice medicine. The ASA appeared to adopt the Hess Report of the American Medical Association: “A physician should not dispose of his professional attainments or services to any hospital, lay body, organization, group or individual, by whatever name called, or however organized, under terms or conditions which permit exploitation of the services of the physician for the financial profit of the agency concerned.” Papper believed that the ASA and the Directors of the American Board of Anesthesiology (ABA) interpreted this section of the Hess Report to mean that salaried forms of the practice of anesthesiology were unethical. He contended that “Attentive reading of the approved Hess Report does not support this interpretation so clearly.” What a diplomat was our Papper. The concern regarding fee-for-service was peculiar to the US. In other countries in the world, salaried medical practice, in hospitals, was and continues to be an accepted practice.

The pressure applied by the ASA included the threat of loss of ASA/AMA membership and Board certification. It never was clear that the ASA would broadly impose sanctions on salaried academics, many of whom were, and are, quite happy with a salaried arrangement, unconcerned with any ethical implications. But we know of one case where pressure was supposedly brought to bear on an anesthesiologist for accepting a salaried arrangement. In the Ellensburg Daily Record, June 10, 1954, the report reads:



Physician Sues Medical Society In King County. SEATTLE (UP): Claiming he has been socially ostracized and denied membership in the King County Medical Society because he works for a salary rather than for fees, Dr. Lloyd H. Mousel has filed suit against the society and seven other medical groups. The doctor, formerly of Washington, D.C., and the Mayo Clinic, has worked at Seattle’s Swedish Hospital since 1949 as director of anesthesiology on a salary basis. This, he said, has resulted in rejection of his application to transfer his membership from the District of Columbia to the King County Medical Society.

Papper argued that “Little recognition was given then, or for that matter now, on the dangerous irrelevancy of linking ethical behavior of a physician in practice to the manner in which he or she earned a living as a matter of principle. It is how these practices are used that determines whether the method leads to inequities and to abuse of patient care rather than the process itself.”

The potential intrusions of the ASA distressed four academics. Perhaps most distressed was the ethicist Henry Beecher, joined by Papper, Austin Lamont, and Robert Dripps, anesthetic giants of the period. Over several months, these four constructed an organization that might voice the views of academia. The organization, of course, would have research and education as its stated primary purpose, but in fact, the underlying purpose would be to keep the ASA from dictating the economics of anesthetic academia.

The AUA has served academic anesthesia and the specialty well. It initially had elitist pretensions. The original by-laws mandated a membership that was not to exceed 100. Tom Hornbein and John Bonica challenged this with the argument that the membership needed to be broadly based if it were to truly represent academia. Their egalitarian view prevailed, and in 1971 the 100 member limit was removed.

Further to elitist pretensions, at the AUA meeting in 1973, the membership discussed the possibility of awarding some honor for the best research done during that year. I (EIE) was all for it. However, Papper demolished the proposal by suggesting that it was “OK, but it really was like little boys who gave each other medals and epaulets.” Immediately after the day’s meeting ended, Hornbein and I found an army-navy supply store and bought the biggest, gaudiest pair of epaulets available and attached double-sided tape to each. Before that evening’s formal black-tie (more elitist stuff) AUA banquet, we convinced the AUA President, Nick Greene, that this was the time to put Papper’s suggestion to work. At the banquet, after reading the usual announcements, Greene intoned that this evening he would reveal the First Manny Papper Awardee for Excellence in Research. “Would Dr. Papper please stand for the Award?” Papper hesitantly rose. Hornbein and I strode up from behind and clapped the epaulets onto his shoulders. Ruefully, Papper commented that he “should have kept his mouth shut.” This was the first and last Manny Papper Award.

The AUA has grown, with 613 active and 176 senior members as of 2012 (email from Annie DeVries, AUA Administrative Assistant, 15 Nov 12). Beyond its research mission, it has taken on a larger focus on educational and political matters. Despite its initial hostility to the ASA, it now enjoys a cordial and cooperative relationship with that society.

In 1978, the Federal Trade Commission required that the ASA sign a Consent Decree agreeing, “…that an anesthesiologist is free to choose whatever arrangement he prefers for compensation of his professional services.” A demand for a fee-for-service arrangement could no longer be imposed by the ASA or the ABA. Nonetheless, the nearly three decades over which fee-for-service had been the main method of compensation had, with other factors (see above), made anesthesiology an economically attractive specialty in the US, an attraction that continues to this day (Table 8.1). This attraction had several consequences. It added to the ease of recruitment into the specialty, including physicians from outside the US. It enabled academic departments to support research with time and seed money. This luxury helped make US anesthesiology a world leader in anesthesia research for much of the 1950s to the 1980s.




Table 8.1
A comparison of median annual salaries of some specialties in the USA in 2011

Specialty                      Median salary
Orthopedic Surgeon             $500,672
Gastroenterologist             $405,000
Invasive Cardiologist          $402,000
Anesthesiologist               $370,500
General Surgeon                $357,091
Obstetrician-Gynecologist      $275,152
Emergency Medicine             $267,293
Ophthalmologist                $238,200
Neurologist                    $236,500
Urologist                      $222,920
Pediatrician                   $209,873
Family Medicine                $208,861

Data from SK&A, A Cegedim Company. 2601 Main St, Suite 650, Irvine, CA 92614; 800-752-5478. U.S. Physician Compensation Trends. Revised August 2011.


New Inhaled Anesthetics and New Vaporizers for Their Delivery



The Development of Modern Inhaled Anesthetics


The anesthesia provided by ether and chloroform in 1846 and 1847 was an enormous step forward. It enabled unheard-of, unthought-of surgeries. For a century, it sufficed for most purposes that surgeons could imagine. Advances were not required until surgeons began to push boundaries for which ether and chloroform, and the technology and skills needed for anesthetic delivery, were insufficient. Surgery demanded more. The benefits of the electrocautery (the Bovie, first used by Cushing in 1926) and other electrical equipment encouraged development of new nonflammable anesthetics. Chloroform was nonflammable but too toxic. The coincidence of this need, plus the new fluorine chemistry required for development of the atomic bomb, gave rise to modern inhaled anesthetics, compounds halogenated with fluorine.

Why fluorine? Because this smallest of halogens clings with greater strength to other atoms, particularly carbon atoms. The chlorine in chloroform can be torn from its carbon mate and this underlies the toxicity of chloroform; prevent the separation of the two atoms and you prevent hepatic injury [6]. Fluorine-carbon combinations are usually less vulnerable than chlorine-carbon combinations to degradation by carbon dioxide absorbents or to metabolism, and thus are usually less subject to the sometimes-toxic consequences of degradation or metabolism. Fluorination had another virtue. It made compounds less soluble and thus more readily eliminated; patients awoke sooner after anesthesia.

The first of these modern inhaled anesthetics was fluroxene, synthesized by Julius Shukys in the late 1940s and released in 1953. I (EIE) had come into anesthesia at this time in Chicago, having suddenly decided on a career direction change from general practitioner to anesthesiologist (see above). This caused me to try to learn all that I could about anesthesia. It wasn’t hard to do because there wasn’t much to learn. I sought out the teaching venues in Chicago. The head of the University of Illinois program, Max Sadove, led one of these. Each week he presided over the morbidity and mortality sessions (now portentously called Grand Rounds), cigar in hand, calling for cases to present. It was not a formal enterprise. I remember a discussion of fluroxene, ending with Sadove’s paraphrased comment that “We’ve tried it now in humans; perhaps we should test it in dogs.” Lucky they went in that direction because it is toxic to many animals (including dogs) [6], but not to humans [7]. Fluroxene enjoyed a minor vogue and commercial success, but it caused cardiovascular stimulation [8], was irritating to breathe, and produced substantial postoperative nausea and vomiting. Its trade name was Fluromar, but it sometimes disparagingly was called Vomomar or Flurobarf.

The first truly successful modern inhaled anesthetic was halothane, synthesized in 1951 by Charles Suckling of Imperial Chemical Industries in England. He was acting on an inspired guess by his boss, John Ferguson, who had studied fluorine-containing agrochemical agents for fumigation of grain silos, noting that some of the agents knocked out weevils and beetles who recovered quite nicely [9]. After testing by James Raventos in animals, Michael Johnstone used halothane clinically in 1956 [10]. Halothane was less soluble in blood and thus allowed a more rapid awakening from anesthesia than did chloroform. However, the bigger concern was injury to the liver (hepatotoxicity). Chloroform was a classic hepatotoxin. If you gave it to an animal, it injured the liver. The more you gave, the greater the injury [11]. At least in the beginning, halothane didn’t do that in humans. Chance favored halothane. The first patient scheduled to be anesthetized with halothane had her surgery cancelled because of a slight illness that subsequently became manifest as jaundice due to hepatitis. Had she received halothane, the subsequent hepatic illness would surely have been attributed to the anesthetic. But the gods smiled on halothane, and the next patient survived uneventfully.

Michael Johnstone was not the only one to test the new anesthetic. Cedric Prys-Roberts (personal communication to EIE, 15 Nov 12) remembered that



“it was also sent to Roger Bryce-Smith in Oxford, and to George Ellis at St Bartholomews Hospital. As a medical student early in 1957 I did an anaesthetic attachment for one month, mostly with George Ellis. He used halothane in a Marrett head (vaporizer within a circle system with CO2 absorption) with spontaneous breathing after a thiopentone induction. He taught me to anaesthetize patients for retropubic prostatectomy (open abdomen, slight head down tilt) with this technique. His admonitions: 1) never squeeze the bag! (for obvious reasons); if the patient stops breathing, leave them alone so long as they are pink, and they will start breathing again when they are ready; 2) don’t take the blood pressure, as you will only be worried because it will be low; and 3) don’t turn down the halothane vaporizer setting unless you can feel an irregular pulse.”
