Major Anesthetic Themes in the 1980s



Fig. 11.1 Anesthetic societies continued to accumulate, a dozen appearing in the 1980s



In South America, in 1987, the societies in Argentina, Brazil, Paraguay and Uruguay formed the Federation of South American Societies of Anesthesiology (FASA), later joined by Bolivia, Chile, and Colombia. In the same year, the larger Latin American Confederation of Societies of Anesthesiologists (CLASA), established in 1962, became the South American Regional Section of the World Federation of Societies of Anaesthesiologists (WFSA). The Pan Arab and African Sections would follow in 1990 and 1997, respectively.

The American Society of Anesthesiologists (ASA) continued to grow in numbers and influence, its membership increasing from just under 17,000 in 1980 to 27,000 in 1990. The Society came into conflict with government and non-government agencies in the 1980s. In the late 1970s, the Federal Trade Commission (FTC) had questioned the billing practices of anesthetists, citing curtailment of competition through the use of a common guide to fee setting, the Relative Value Guide (RVG). The ASA was large enough to take on the FTC, and in 1979 successfully defended use of the RVG. Predictably, the FTC did not go away, and targeted groups and individuals for anticompetitive practices, in particular their relationships with nurse anesthetists. These and other matters kept the ASA busy for the whole decade, prompting the Society to establish a permanent Office of Government Affairs (OGA) in Washington, DC, in 1984. The decision was vindicated in 1985: Congress passed a bill giving nurse anesthetists the right to practice independently in the care of government employees, but through the efforts of the OGA, the ASA secured a Presidential veto of the legislation.

The ASA perceived that the public did not appreciate the contributions and importance of anesthesia delivered by anesthesiologists. Accordingly, in 1981 the ASA began a public education program using the media, videos and brochures, with media tours targeting major cities. A 1985 review of the program's effectiveness found that “the man in the street in this country is not the least bit concerned about anesthesiology, let alone anesthesiologists”. As a result, by 1987 the program had been substantially curtailed, targeting only hospital administrators and legislators [1]. Several other countries failed to learn from this experience, spending large sums of money on media-based public education programs (National Anaesthesia Days) in the early 2000s, only to abandon them within a few years of their commencement.

Through the 1980s, women increasingly influenced the specialty. They accepted leadership roles in the US, including 3 academic chairs in 1986 (albeit out of more than 100 programs). In 1988, Gertie Marx received the ASA Distinguished Service Award. Nevertheless, women's rate of board certification was lower, and their academic advancement appeared slower. This prompted a group of women activists to request that the ASA form an ad hoc committee on women's issues and convene a panel to discuss such issues at the Annual Meeting. The request was denied. In 1986, ASA delegate Betty Stephenson proposed a resolution that the ASA bylaws be degendered. The resolution also failed to win approval by the ASA House of Delegates, but ASA President Peter McDermott later degendered the ASA's bylaws and documents administratively. As noted in Chapter 2: “In October 1992, I (McDermott) became ASA president and issued an order degendering the language of ASA bylaws and other documents. It would have been embarrassing to have the issue debated (in the House of Delegates) by old white men.”



Subspecialty Societies


The burgeoning specialty spawned subspecialty organizations in Europe for regional anesthesia (1980), intensive care (1982), cardiothoracic anesthesia (1985), and pediatric anesthesia (1986). International societies established in the previous decade accommodated subspecialty interest in obstetrics, neuroanesthesia, and pain. The European Society of Intensive Care Medicine created the European Diploma in Intensive Care (EDIC) in 1988, and this became widely accepted.

In Canada, Australia, and New Zealand, subspecialty societies or “special interest groups” began to appear under the auspices of the major organizations, the Faculty of Anaesthetists (Australia and New Zealand) and the Canadian Anaesthesiologists Society. International subspecialty organizations for pain, obstetric anesthesia, neuroanesthesia, and regional anesthesia supported membership from all countries.

In the US, the Society for Ambulatory Anesthesia (SAMBA) was established in 1984, reflecting the increasing focus on outpatient surgery and anesthesia, and the Society for Pediatric Anesthesia (SPA) was established in 1986.


The Growth of Education


The 1980s saw further expansion and consolidation of basic and advanced education in anesthesia.

In the US, in 1980, after years of disagreement between the organizations involved in undergraduate and postgraduate medical education, the Coordinating Council on Medical Education (CCME) was disbanded. The Liaison Committee on Graduate Medical Education had been accrediting postgraduate programs since 1975, and in 1981 was renamed the Accreditation Council for Graduate Medical Education (ACGME), under the auspices of the American Medical Association (AMA), the American Hospital Association (AHA) and the American Board of Medical Specialties (ABMS). The ACGME continues to be the accrediting body for all medical internship and residency training programs.

The American Board of Anesthesiology refined the examination process, in particular determining means to diminish variability in both the questions and the assessment of oral examinations. During the 1980s, the Board initiated internal discussions regarding the introduction of recertification, and the number of Board-certified diplomates increased from 9,800 to 19,000.

The Society for Education in Anesthesia (SEA) was formed in the early 1980s with the aim of engaging consultant anesthetists in the education of medical students and residents. A founder, Philip Liu, wrote in the first newsletter in 1984 of his hope that “members of this Society will develop a genuine sense of fellowship, so that individuals with an interest and vocation in the teaching of anesthesiology will find a network of support and a resource for communication.” The Society has continued its educational role, and has never sought a political one.

In 1977, Jean Lassner established the European Academy of Anaesthesiology (EAA) in Paris, to promote meetings and the exchange of ideas between European anesthetists, with a particular emphasis on teaching methods. The Academy established the European Journal of Anaesthesiology in 1984, edited by the British anesthetist John Zorab. In the same year, the European Diploma in Anaesthesia and Intensive Care (EDA) was introduced.

The Diploma comprised a two-part, multilingual examination, with the second part taken after completing 6 years of postgraduate medical practice, including 4 years in anesthesia. This was at a time when the Anaesthesiology Section of the Union Européenne des Médecins Spécialistes (UEMS), established in 1962, was the official body under the European Union for recommending minimum training times, but without any formal power. The Diploma, although not officially recognized, provided evidence that the diplomate had met uniform educational standards. Nevertheless, it did not replace individual national accreditation, and did not provide for cross-border portability. In 1989, the EAA also created a voluntary Hospital Recognition Program, primarily for those countries that did not have their own hospital accreditation system. Although not accepted by the UEMS, the program was supported by the Anaesthesiology Section.

During the 1980s, several European countries adopted all or part of the EDA as a mandatory part of their accreditation process for specialist recognition. In 1980, the Section recommended a minimum training time of 5 years. However, during the decade, the minimum duration in most western and eastern European countries was 4 years (see Fig. 38.1). In contrast, training in Russia varied from less than one year to two years. In Paris, in 1989, a new diploma, the Diplôme d'études supérieures en anesthésie-réanimation (DESA), was created, requiring 5 years of postgraduate anesthesia training.

In 1986, anesthetists in The Netherlands, France and Belgium established the Fondation Européenne d’Enseignement en Anesthésiologie (FEEA) to provide continuing medical education (CME) in anesthesia, through the conduct of refresher courses. FEEA extended its activity to more countries as the size of the European Union grew.

Matters were further confused by the establishment of the European Society of Anaesthesiologists (ESA) in 1992, the European Board of Anaesthesiology (EBA) in 1993, and the Confederation of European National Societies of Anaesthesiology (CENSA) that had evolved from the WFSA regional section. These anesthesia-related educational bodies functioned independently, but co-operatively, and in the 2000s were brought under the joint umbrella of the ESA and EBA.

In Great Britain, while functioning within the Royal College of Surgeons, the Faculty of Anaesthetists had long provided training, accreditation and examination for Fellowship. Reflecting the earned status of the specialty, in 1988, the Faculty became an independent College of Anaesthetists. By this time the period of training was 7 years including basic, intermediate and specialised training. Because these requirements exceeded those for European qualification, few European anesthetists could practice as specialist consultants in Great Britain. Reciprocal arrangements were to be resolved during the next decade.

In other areas of the globe, anesthesia training was either being established or extended. By the end of the 1980s, all major Latin American countries had formal training programs in place with a minimum period of 3 years.

Despite the short time since the 1976 end of the Cultural Revolution, Chinese anesthetists had established the Chinese Society of Anesthesiology in 1979 and several Colleges in the succeeding decade. The first College was in Xuzhou Medical School in 1987, with three more founded in 1988, at Harbin, Hunan, and Tongji medical universities. These were undergraduate medical courses, offering major studies in anesthesia, critical care and pain management. In the 1980s, three new journals were launched, the Chinese Journal of Anesthesiology, the Foreign Medical Sciences (Anesthesiology and Resuscitation), and The Journal of Clinical Anesthesiology. Despite this, in many hospitals, anesthesia was often considered a technical rather than a clinical (i.e., professional) practice. This changed in 1989, when the Ministry of Public Health formally defined the criteria for a department of anesthesiology: “The department of anesthesiology should be a clinical department rather than a technical department. More emphasis should be showed on personnel training, instrument and equipment. The working level should be improved to meet the demands of medical development.”


Intensive (Critical) Care Training


Beginning with Bjørn Ibsen’s management of the 1952 polio epidemic in Copenhagen, anesthesiologists have been at the forefront of the development of intensive care medicine, utilizing their skills in respiratory and emergency care. Accordingly, anesthesia training programs increasingly stipulated training in intensive care. In 1982, the European Society of Intensive Care Medicine (ESICM) was founded, and 6 years later the Society created the European Diploma in Intensive Care (EDIC). At the time, the specialty was not recognized as a separate section under the UEMS, partly due to its multidisciplinary nature. This was to occur in 1999. It remains separate from the ESA.

In the US, critical care medicine had been one of several subspecialties in which experience was required as part of the formal accreditation process of the American Board of Anesthesiology (ABA). Critical care experience was distinguished by being the only subspecialty for which a minimum time was stipulated (2 months). In 1985, the ABA introduced its first subspecialty certificate, in critical care medicine.


Nurse Anesthetists


In few countries is nurse anesthesia of greater importance to anesthetic delivery than in the US. There, during the late 1970s, the intertwined relationship between the certification process for nurse anesthetists and the political role of the American Association of Nurse Anesthetists (AANA) was finally separated. After 1982, the independent Council on Certification issued a certificate (Certified Registered Nurse Anesthetist, CRNA) that no longer required membership in the association. Through the remainder of the 1980s, the Council embarked on a process to standardize and validate the large number of training programs, and to introduce an objective, standardized examination process.

In 1989, a meeting of nurse anesthetists in Switzerland, representing 11 countries, founded the International Federation of Nurse Anesthetists.


Academic Anesthesia


In Great Britain, Margaret Thatcher's conservative government curtailed funding for academic support, and anesthesia research suffered. Universities had relied largely on public funding for their operation, applying it to both teaching and research within their own jurisdiction. Following a review of the funding method in 1986, research funding was redistributed to research councils, and universities, or their research staff, now had to compete for grants. Universities became business enterprises, competing in a marketplace for research funds. Suddenly, anesthesia researchers competed against everyone else for a portion of a diminished pool. Fields of investigation with greater perceived or real benefit to the population were at a distinct advantage. Anesthesia became a victim of its own success: too few patients died or were injured by anesthesia. For anesthetists whose appointments were partly academic and partly clinical and teaching, the situation was worse, as they had neither the time nor the level of support for writing grant applications enjoyed by their full-time academic colleagues.

The Japanese Society of Anaesthesiology began publication of the Journal of Anesthesia in 1987, an English-language journal that would compete with Masui.


Isoflurane Finally Makes It


After being cleared of the reported carcinogenic risk that had thwarted its planned release in the mid 1970s, isoflurane was introduced to clinical anesthesia in 1981. It soon replaced its structural isomer, enflurane, having the benefits of faster induction and recovery, absent epileptogenicity, and only 0.2% biodegradation. Seemingly, the only drawback was a mild pungency, making it less suitable as an inhaled induction agent. The pungency ensured the persistence of halothane in pediatric practice.

Was it too good to be true? Soon a potential issue appeared. Isoflurane caused a dose-related decrease in blood pressure due to vascular dilation, rather than the myocardial depression seen with halothane. The dilation also affected coronary vessels, and a debate began about “coronary artery steal”: the diversion of blood flow away from myocardium supplied by collateral vessels unable to dilate further. Subsequent studies showed the fears to be unfounded.


Japan Tries a Reject


Ross Terrell had synthesized sevoflurane and desflurane in the 1960s. Both had been set aside as unsuitable for further study: sevoflurane because it was unstable in soda lime, and desflurane because it was difficult to make, had a low potency and a high saturated vapour pressure. Increasingly, minimal metabolism and the absence of degradation by CO2 absorbents were considered crucial properties of desirable modern inhaled anesthetics. Less was better, dictating the subsequent development of enflurane, isoflurane, and desflurane, all agents with progressively less metabolic degradation than halothane or methoxyflurane.

Despite these concerns, sevoflurane was transiently resurrected in the 1970s. Animal studies suggested that it might show promise as an “ideal” inhalation agent: minimal cardiorespiratory stimulation accompanied low pungency and rapid induction. However, in addition to the instability in soda lime, the potential for fluoride-induced renal failure (as with methoxyflurane) was raised on the basis of experiments in rats. Baxter and ICI shelved sevoflurane in the late 1970s, although human trials in the early 1980s confirmed sevoflurane's suitability from a clinical, if not a laboratory, perspective.

Anaquest resurrected desflurane in the 1980s, as part of a search for a better anesthetic for outpatient surgery. The first human subject, Jeremy Cashman, received desflurane at Guy's Hospital in 1988. He survived and went on to an academic career in anesthesia. Meanwhile, Baxter offered Anaquest the rights to develop sevoflurane for the North American market. Anaquest took an option for one year while it considered the matter.

Anaquest thus had the rights to desflurane, the option from Baxter on sevoflurane, and a difficult choice: which one to develop for the anesthesia market? Despite the difficult and expensive manufacturing process, the need for a radically new vaporiser, and its tendency to irritate the airway, desflurane had a long patent life, was more stable and less soluble than sevoflurane, and underwent negligible metabolism. It was a close decision, but sevoflurane lost.

Enter Maruishi, a Japanese pharmaceutical company operating in a less regulated environment. They bought the rights to sevoflurane from Baxter and succeeded with its commercialization in Japan in the late 1980s. Maruishi licensed the international rights to sevoflurane to Abbott Laboratories, in 1992.


Two Out of Five


The 1980s saw the release of five new muscle relaxants, only two of which survived. Vecuronium, released in 1980, was the first “designer” relaxant. Once the exact structure of curare was known, and the cause of the vagolytic and neuromuscular blocking effects of pancuronium could be defined at a molecular level, the new monoquaternary drug, vecuronium, was conceived. It had the same potency as pancuronium, one twentieth of the vagolytic action, and an effect that lasted half as long. The latter property resulted from its dual elimination via both the liver and the kidney, unlike pancuronium's dependence on renal elimination.

Not long afterwards, another new relaxant, atracurium, appeared. Its advantage over other relaxants was that its elimination occurred partly by spontaneous Hofmann degradation in plasma and tissue, as well as by ester hydrolysis and organ metabolism. Atracurium became popular worldwide. There was, however, widespread caution because the drug was known to cause histamine release, with resulting hypotension and tachycardia. Except in rare circumstances, these effects were mild and transient, and were often overstated [2].

An additional concern with atracurium arose from laudanosine, a by-product of its metabolism that, in sufficient concentrations, lowers the seizure threshold and may cause seizures. However, because Hofmann degradation limits the atracurium concentration, laudanosine was unlikely to reach concentrations capable of causing seizures; the same held for the atracurium isomer, cisatracurium.

John Savarese introduced mivacurium in 1984, as a relaxant with a short duration of action resulting from its hydrolysis by plasma cholinesterase, just like succinylcholine. But it lasted twice as long as succinylcholine and, more importantly, it also took twice as long to act. It never achieved great popularity but continues to be available in some countries. Doxacurium and pipecuronium were also released in the 1980s, but offered no advantages over vecuronium and atracurium, and failed to find a market.

The advent of neuromuscular blocking drugs with fewer side effects and kinetics allowing easier reversal encouraged their use in critical care units for the control of patients requiring mechanical ventilation. This reflected the increased use of long-term ventilation, and a mistaken view among some critical care physicians that muscle relaxants possessed analgesic and/or anxiolytic properties. (As far as the authors know, this view was not expressed in print and may just be an urban legend. But it has curious support in the 2003 finding that muscle relaxant administration can decrease the BIS value, albeit with no decrease in consciousness [3].) An unforeseen result of long-term paralysis was prolonged muscle weakness after tracheal extubation, identified in 1985 [4]. Use of smaller and intermittent doses solved the problem.


A Problem with Bupivacaine


A 1979 editorial in Anesthesiology by Albright brought attention to six cases of cardiac arrest associated with the use of the local anesthetics bupivacaine and etidocaine [5]. Several additional cases were soon reported. Many were fatal, and were associated with the use of 0.75% bupivacaine in obstetrics. In 1983, the US FDA banned the use of the 0.75% solution in obstetrics. These cases also prompted the recommendation of a test dose in epidural anesthesia, as well as incremental dosing. Studies suggested that bupivacaine affects voltage-gated sodium channels in cardiomyocytes, but the exact mechanism is yet to be defined.


Intravenous MAC


During the 1980s, three groups pursued the automatic control of total intravenous anaesthesia: Prys-Roberts' group in Bristol, the Schwilden/Schüttler group in Germany, and the Stanford group in the US. In 1980, Cedric Prys-Roberts conceived of a “MAC” for intravenous anesthetics. His group determined EC50 and EC95 values for quasi-steady-state infusions of methohexital and Althesin [6], and for the new emulsion formulation of propofol [7]. While others used the Dixon up-and-down method (which is parsimonious in terms of the number of patients required) to provide the EC50, Prys-Roberts and colleagues used probit analysis, arguing that such analysis allowed determination of both the EC50 and the EC95.

Prys-Roberts' group opted to develop closed-loop control systems knowing that they probably would not be able to control anesthesia precisely (because no single variable or parameter can accurately differentiate between consciousness and unconsciousness, let alone the more complex so-called “depth of anaesthesia” paradigm). The Germans followed a similar approach, and the Stanford group went the pharmacokinetic route. All reached similar conclusions. Prys-Roberts' group determined the pharmacokinetic and pharmacodynamic model in small children using probit analysis [8], and joined forces with an engineering group in Oxford to develop a neural network approach using parametric modelling and on-line statistical pattern recognition of the EEG during anesthesia. Prys-Roberts wrote to LJS (personal note, 16 Nov 2012): “I am as convinced now as I was then that it is no more difficult to control total intravenous anaesthesia (e.g., propofol/alfentanil/relaxant) using a manually controlled set of pumps [9] than it is to control inhalational anaesthesia by adjusting the setting of a vaporiser.”
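The distinction between the two statistical approaches can be illustrated with a short sketch. The Python example below is not the original group's code, and the concentrations and responses are hypothetical; it simply shows how a probit model fitted to quantal (all-or-none) response data from quasi-steady-state infusions yields both the EC50 and the EC95, whereas up-and-down sampling clusters observations around the midpoint and yields only the EC50.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical quantal data: steady-state target concentration (ug/mL) and
# whether the patient was unresponsive to the stimulus (1) or responded (0).
conc = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5])
unresponsive = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

def neg_log_likelihood(params, x, y):
    """Probit model: P(unresponsive) = Phi(a + b * log(concentration))."""
    a, b = params
    p = norm.cdf(a + b * np.log(x))
    p = np.clip(p, 1e-9, 1 - 1e-9)  # guard against log(0)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Maximum-likelihood fit of intercept (a) and slope (b).
fit = minimize(neg_log_likelihood, x0=np.array([0.0, 1.0]),
               args=(conc, unresponsive))
a, b = fit.x

# Invert the fitted line: Phi(a + b*log(EC_p)) = p  =>  EC_p = exp((Phi^-1(p) - a) / b)
ec50 = np.exp((norm.ppf(0.50) - a) / b)
ec95 = np.exp((norm.ppf(0.95) - a) / b)
print(f"EC50 ~ {ec50:.2f} ug/mL, EC95 ~ {ec95:.2f} ug/mL")
```

Because the fitted probit line can be inverted at any quantile, the same data set supports an estimate of the concentration covering 95% of patients, the argument Prys-Roberts and colleagues made for probit analysis over the up-and-down method.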
