Abstract
Recent advances in molecular pharmacology and neuroscience have led to a greater understanding of how anesthetic chemicals can alter the function of the nervous system. Anesthesia through intravenous agents is a clinical state in which multiple behavioral endpoints are caused by a structurally diverse group of drugs. Certain membrane proteins possess specific binding sites that interact with injectable drugs currently used for clinical anesthesia. This chapter explores what is known and touches on what is unknown regarding the clinical use and neuropharmacologic mechanism of intravenous anesthetics.
Keywords
GABA A receptors, benzodiazepines, propofol, ketamine, dexmedetomidine
Chapter Outline
History of Intravenous Anesthesia
Pharmacologic Targets of Intravenous Anesthetics in the Central Nervous System
Metabolism, Redistribution, Clearance, and Elimination
Advantages and Disadvantages of Propofol as an Induction Agent
Clinical Features of Anesthesia Maintenance With Propofol
History of Intravenous Anesthesia
The concept of delivering medication through the bloodstream can be traced to the middle of the 17th century, soon after Harvey described the function of the vascular system. Not only did Sir Christopher Wren study blood transfusions in dogs, but he also experimented with intravenous injection of an opium solution into these animals via a feather quill. In the mid-19th century, technologic advancements in needle and syringe manufacturing led to injectable morphine for analgesia, but attempts to produce general anesthesia through intravenous drugs came later. Initial attempts with agents such as diethyl ether, chloral hydrate, magnesium sulfate, barbituric acid, and ethyl alcohol were stalled by prolonged side effects and limited techniques for ventilatory support. Local anesthetics were also tested as intravenous agents for general anesthesia in the early 20th century before their primary clinical application in regional anesthesia was established. The major historical developments in intravenous agents for general anesthesia lagged behind those of the inhaled anesthetics until Lundy and Waters began using barbiturates in the 1930s.
General Anesthesia by Intravenous Agents
For the pedagogic purposes of this chapter, intravenous anesthetic is defined as a clinically available substance that when administered directly to the patient via the bloodstream can be used to induce or maintain a state of general anesthesia. Many injectable substances, such as antihistamines or antipsychotics (see Chapter 12 ), have obvious effects on the central nervous system (CNS) and can depress cognitive status and induce sleep. These drugs are potentially useful for several perioperative situations where sedation is required, but they are neither appropriate nor safe as primary agents for producing general anesthesia. Similarly, medications like the benzodiazepines can be classified as intravenous anesthetics yet are currently used primarily in the perioperative setting for premedication and sedation, not anesthesia. The intravenous opioids (see Chapter 17 ) form the backbone of modern surgical analgesia yet are not considered true intravenous anesthetics because awareness and recall can occur despite very high doses that produce deep sedation.
The concept of balanced anesthesia was originally used by Lundy to describe premedication and light sedation as adjuncts to regional anesthesia, but the term was almost universally adopted when nitrous oxide anesthetics supplemented with thiopental and d-tubocurarine grew in popularity in the middle of the 20th century. At present, it is usually accepted that the state of general anesthesia can best be described as a delicate balance of the following effects: unconsciousness, analgesia, amnesia, suppression of the stress response, and sufficient immobility. The relative importance of each of these separate components varies for each case depending on specific surgical and patient factors; anesthesiologists must tailor combinations of intravenous drugs to match these priorities.
Intravenous Anesthesia Mechanisms and Theory
Modern anesthetic techniques have transformed surgery from a traumatic and barbaric affair to an acceptable, routine, and essential part of modern medicine. Despite the technologic advances made in perioperative medicine and surgical techniques, the drugs administered by anesthesiologists to render patients unconscious continue to be used without a clear understanding of how they produce anesthesia. Fortunately, through recent advances in molecular pharmacology and neuroscience, clinicians and investigators understand better than ever before how anesthetic chemicals can alter the function of the nervous system.
The elucidation of general anesthetic mechanisms was not accessible to traditional pharmacology methods; anesthesia is a clinical state in which multiple behavioral endpoints are caused by a structurally diverse group of drugs. Nevertheless, it appears that certain membrane proteins ( Fig. 10.1 ) possess binding sites that interact with many of the currently used anesthetics ( Table 10.1 ). In general, the halogenated volatile anesthetic agents (see Chapter 11 ) exhibit less specificity for molecular targets than the intravenous agents.
| Drug | GABA A Receptors | NMDA Receptors | 2PK Receptors | Glycine Receptors | AMPA Receptors | 5-HT Receptors |
|---|---|---|---|---|---|---|
| Thiopental | ↑↑ | ↓ | ↓↓ | ↓ | | |
| Benzodiazepines | ↑↑ | ↑ * | ↓ * | ↑ * | | |
| Etomidate | ↑↑ | ↑ | ↑ | | | |
| Propofol | ↑ | ↓ | ↑↑ | ↓ | ↓ | |
| Ketamine | ↑ | ↓ | ↑ | ↑ | | |
| Dexmedetomidine (α 2 agonist) | ↑ * | ↓ * | | | | |
| Isoflurane | ↑ | ↓ | ↑↑ | ↑↑ | ↓↓ | ↑ |
* Asterisks indicate a complex relationship between drug and receptor—either evidence for direct allosteric interaction between these drugs and receptors is suspected but has yet to be found, or receptor activity and/or expression is known to be influenced through administration of the drugs. Isoflurane is also shown for comparison.
At the cellular and network levels, intravenous anesthetics alter signaling between neurons by interacting directly with a small number of ion channels. Under normal conditions, these specialized membrane proteins are activated by chemical signals or changes in the membrane environment. Upon activation, channels modify the electrical excitability of neurons by controlling the flow of ions across the cell membrane; in many cases the channel is coupled to a specific receptor that senses the initial signal (see Chapter 1 ).
The majority of intravenous anesthetics exert their primary clinical anesthetic action by enhancing inhibitory signaling via gamma-aminobutyric acid type A (GABA A ) receptors. Ketamine and dexmedetomidine are notable exceptions. From a neurophysiology perspective, unconsciousness can be considered as a disruption of the precisely timed cortical integration necessary to produce what is considered the conscious state. It is interesting to note that the unconsciousness produced by ketamine is phenotypically different from that produced by the GABA-ergic agents (e.g., propofol, thiopental, or etomidate). Although many talented scientists worked diligently throughout the 20th century to discover a unifying mechanism by which diverse chemicals cause what is loosely defined as the anesthetic state, molecular investigations into the action of individual drugs have revealed that this one true “grail” does not exist. It should rather be interpreted that the anesthetized state (and its separate components) can be arrived at by any disruption of the delicately constructed and precisely timed neuronal networks that underlie the normal awake, un-anesthetized state.
Pharmacologic Targets of Intravenous Anesthetics in the Central Nervous System
GABA A Receptors
GABA is the most abundant inhibitory neurotransmitter in the brain. GABA A receptors represent the most abundant receptor type for this ubiquitous inhibitory signaling molecule. GABA A receptors are broadly distributed in the CNS and regulate neuronal excitability. They appear to mediate unconsciousness, arguably the most recognizable phenotype associated with general anesthesia. There is also strong evidence that GABA A receptors are involved in mediating some of the other classic components of general anesthesia, including depression of spinal reflexes and amnesia. The contribution of GABA A receptors in mediating immobility and analgesia is less clear.
GABA A receptors are ligand-gated ion channels, more specifically members of the “Cys-loop” superfamily that also includes nicotinic acetylcholine, glycine, and serotonin type 3 (5-hydroxytryptamine type 3) receptors. Each of these receptors is formed as a pentameric combination of transmembrane protein subunits (see Fig. 10.1 ). This superfamily is named for the fixed loops formed in each of the subunits by a disulfide bond between two cysteine residues. In 2014, the protein crystallization of a human GABA A receptor was reported. Binding pockets for neurotransmitters are located at two or more extracellular interfaces: in the case of GABA A receptors, the endogenous ligand GABA binds between the α and β subunits. Thus far, 19 genes have been identified for GABA A receptor subunits (α1-6, β1-3, γ1-3, δ, ε, θ, π, ρ1- 3). Although millions of subunit arrangements are possible, only a subset of receptor configurations is expressed in significant amounts in the CNS. Preferred subunit combinations distribute among different brain regions and even among different subcellular domains. Each type of GABA A receptor exhibits subtly distinct biophysical and pharmacologic properties that in turn have diverse influences on synaptic transmission and synaptic integration.
Specific behavioral effects of drugs have been linked to different subunit assemblies present in different brain regions. For example, benzodiazepines are thought to interact with GABA A receptors between the α and γ subunits (see Fig. 10.1 ) and specific clinical effects have been linked to receptors containing specific α and β subunits. Propofol, etomidate, and barbiturates interact with GABA A receptors within, or proximal to, β subunits. The β subunits show less specific subcellular localization compared to α subunits, and the distribution of β subunits in mammalian brain does not share the same clear distinctions as α subunits. Therefore distinguishing specific differences between the GABA-mediated anesthetic effects of non-benzodiazepine intravenous anesthetics has been more elusive. However, research involving genetically manipulated mice has suggested that the sedation produced by etomidate can be primarily associated with activity at the β 2 subunit while unconsciousness produced by the same drug can be associated with β 3 subunits (see “ GABA A Insights from Mutagenic Studies ”). Subtle pharmacodynamic differences between intravenous anesthetics (e.g., effects on postoperative nausea and vomiting) might also be mediated by interactions with targets in addition to GABA A receptors.
Almost all general anesthetics enhance GABA A receptor–induced chloride currents by increasing receptor sensitivity to GABA, thereby inhibiting neuronal activation. Most of these drugs, at high concentrations, also directly open the channel as an agonist in the absence of GABA. Because of their specific distribution in the cerebral cortex and other brain regions, GABA A receptors not only contribute to producing anesthesia but also function in thalamic circuits necessary for sensory processing and attention, hippocampal networks involved in memory, and thalamocortical circuits underlying conscious awareness. Computational neuronal modeling studies have been important in revealing the impact of propofol and etomidate on dynamic changes in these networks.
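The phrase “increasing receptor sensitivity to GABA” can be stated quantitatively. In voltage-clamp experiments the GABA concentration–response relationship is typically fit with a Hill equation, and a positive allosteric modulator such as propofol or etomidate shifts the curve leftward (lowers the apparent EC 50 ) with little change in the maximal current; the expression below is the standard fit, not data from this chapter:

$$ I([\text{GABA}]) = I_{\max}\,\frac{[\text{GABA}]^{n_H}}{EC_{50}^{\,n_H} + [\text{GABA}]^{\,n_H}} $$

In the presence of the anesthetic, the same low concentration of synaptically released GABA therefore produces a larger chloride current and stronger inhibition.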
GABA A Insights From Mutagenic Studies
The critical role of the GABA A receptor in the pharmacodynamics of anesthetic drugs has been established through laboratory experimentation using genetic modifications of this protein. In the early 1990s, in vitro studies aimed at determining the interactions of the endogenous ligand GABA with its receptor revealed that specific amino acid substitutions in the GABA A receptor determined the sensitivity of the channel to agonists and allosteric modulators. By 1995, several functional domains of the GABA A receptor associated with binding of agonists and allosteric modulators were identified (reviewed by Smith and Olsen ). Following the discovery of the site of benzodiazepine action, investigators found that mutation of a pair of transmembrane amino acids on the α subunit of GABA A renders the receptors insensitive to inhaled anesthetics. That same year, a corresponding area on the β 3 subunit was determined to be critical for the action of etomidate. Further studies have determined this location to be essential to the actions of other intravenous agents such as propofol and pentobarbital. A different amino acid residue appears to be involved with the specific actions of propofol only ( Fig. 10.2 ).
The strategy behind these discoveries was to examine the subtle differences among the amino acid sequences in receptor isoforms known to have different sensitivities to anesthetics. Originally, receptor chimeras of unnatural subunit configurations were used, eventually giving way to specifically modified subunits created via site-directed mutagenesis. Potential residues and amino acid sequence domains that mediate anesthetic sensitivity can be anticipated by comparing data among different mutated and chimeric receptors. Typically, functional studies are carried out in vitro by transfecting the DNA encoding these subunits into living cells, which results in expression of the receptors on the cell surface. Immortalized human embryonic kidney cells and oocytes of the amphibian Xenopus laevis have been invaluable as conduits for this type of research; establishing electrical access in these cells using the conventional patch-clamp technique is relatively straightforward, and measurements of their chloride currents in response to opening of the GABA A channel are robust (see Fig. 10.2 ).
By now a great body of literature exists in which anesthetic sensitivity has been mapped to specific point mutations on GABA A receptors. Some of these point mutations, so-called silent mutations, do not affect the natural function of the receptor but only its modulation by specific anesthetics. The introduction of mutations such as these into genetically modified mice can be used to determine the importance of this receptor to the production of certain qualities of an anesthetic in vivo. In vivo experimentation on animals possessing these “knock-in” point mutations has advantages over traditional gene knockout studies. Specifically, if the receptor is unaltered except with respect to its response to exogenous anesthesia, there exists less potential for compensatory changes that could influence the results of in vivo experiments. Mouse geneticists have successfully bred mice with attenuated sensitivity to benzodiazepines as well as etomidate and propofol.
By creating a point mutation that confers anesthetic selectivity in a single specific subunit, the phenotypic effects attributable to that drug and that subunit can be dissected ( Fig. 10.3 ). For example, the substitution of a methionine (M) residue for an asparagine (N) residue in the 265th position on the β 3 subunit (β 3 N265M mutation) results in a phenotypically normal animal that is essentially immune to the hypnotic effects of propofol and etomidate while maintaining its sensitivity to inhaled anesthetics. Similarly, substituting an arginine (R) for the histidine (H) residue in the 101st position on the α 1 subunit (α 1 H101R mutation) essentially renders that GABA A receptor unresponsive to benzodiazepines. By making analogous substitutions in the other α subunits, scientists have been able to map particular behavioral effects from benzodiazepines to specific subunits. For example, the α 1 subunit appears to mediate the sedative, amnestic, and anticonvulsant actions of some benzodiazepines, whereas benzodiazepine muscle relaxation and anxiolysis are mainly mediated via α 2 and α 3 subunits.
These amino acid substitutions (in vitro and in vivo) alter the molecular environment in the anesthetic binding cavity by reducing the number of favorable interactions between the receptor and the anesthetic molecule, thus reducing the efficacy of that anesthetic drug on enhancing GABA-ergic transmission. Although this work has greatly increased insight into anesthetic mechanisms, a complete description of the biophysical interactions between specific residues and specific anesthetics remains as incomplete as understanding of the spike-coded pattern of neuronal network activity responsible for the transitions between consciousness and unconsciousness.
N-Methyl-d-Aspartate Receptors
Whereas augmentation of endogenous inhibitory chemical signaling is important to mechanisms of anesthesia, mitigation of excitatory signaling also depresses neuronal activity. Of the many excitatory chemical signals in the CNS, blockade of the N-methyl-D-aspartate (NMDA)-type glutamate receptor appears to be most relevant to mechanisms of anesthesia. There are two broad categories of excitatory synaptic receptors that use the amino acid L-glutamate as their chemical messenger: NMDA receptors and non-NMDA receptors. The latter group, which mediates fast excitatory postsynaptic currents, can be subdivided into α-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid (AMPA) receptors and kainate receptors. Neither AMPA nor kainate receptors have a clear role in anesthetic-induced unconsciousness, but the role of AMPA receptors in “off-target” effects of anesthetics is being actively explored (see “ Emerging Developments ”). NMDA receptors, by contrast, mediate excitatory postsynaptic currents of relatively prolonged duration. NMDA receptors are found presynaptically, postsynaptically, and extrasynaptically; they are important targets for xenon, nitrous oxide, and the dissociative anesthetic ketamine.
NMDA receptors are tetramers consisting of four subunits arranged circumferentially around a central ion channel pore (see Fig. 10.1 ). All NMDA receptors contain an obligatory NR1 subunit and at least one of four types of NR2 subunits (A–D). Other subunits (NR3) and numerous splice variants exist that translate to considerable variability in the kinetic and pharmacologic profile of each receptor isoform. In contrast to non-NMDA glutamate receptors, which are selective for sodium ions (Na + ), the pore of NMDA receptors permits entry of both monovalent and divalent cations (Na + and calcium ions [Ca 2+ ]) into the cell upon activation. The Ca 2+ flux is important for activating Ca 2+ -dependent processes in the postsynaptic cell such as long-term potentiation, a form of synaptic plasticity thought to play an important role in memory.
NMDA receptors are unique among ligand-gated ion channels in that their probability of opening depends not only on presynaptic release of neurotransmitter but also on the voltage across the membrane containing the receptor. Until membrane depolarization occurs, magnesium ions (Mg 2+ ) block the channel pore if agonist is present. High-frequency excitatory input causes membrane depolarization, so NMDA receptors play a significant role in CNS functions that require activity-dependent changes in cellular physiology such as learning and processing of sensory information. In the nociceptive circuitry of the spinal cord, repeated peripheral nerve stimulation results in an increase in response to subsequent stimuli through activation of NMDA receptors. This “windup” phenomenon is associated with hyperalgesia, and both knockout and knockdown of NR1 block inflammatory pain in animals. This, combined with their ability to modify opioid tolerance, makes NMDA receptor antagonists promising treatments for chronic pain.
NMDA antagonists are classified by their mechanism of action. Volatile anesthetics may exert some of their effects as competitive antagonists by displacing the coagonist glycine from NMDA receptors. The intravenous NMDA antagonists in current clinical use are primarily channel blockers that bind to the pore only in its open conformation. They are considered uncompetitive antagonists and, because their binding requires prior activation by agonist, they are termed use-dependent. Blockers such as ketamine, which remain bound after channel closure, cause prolonged disruption of the associative aspects of neuronal communication. High concentrations of ketamine cause sedation and loss of consciousness with a significant incidence of dysphoric effects. This may reflect a lack of selectivity among various NMDA isoforms or activity at other receptors (see Table 10.1 ). Noncompetitive antagonists, which bind to allosteric sites with some NMDA receptor subunit specificity, may have more specific clinical effects and a better side effect profile.
Other Molecular Targets
Other receptors, such as glycine receptors, voltage-gated Na + channels, and two-pore domain potassium (2PK) channels, deserve attention because they probably contribute to certain components of the balanced anesthetic state with intravenous anesthetics. Glycine receptors colocalize with GABA A receptors near the cell body. Propofol, etomidate, and thiopental all positively modulate glycine receptors to some degree, but ketamine does not. Glycine receptors have an inhibitory role, particularly in the lower brainstem and spinal cord. They are likely major contributors to anesthetic immobility, especially that produced by the volatiles. An investigation of spinal neurons estimated that propofol’s effects on immobility were mediated almost entirely via GABA receptors, whereas the immobility caused by sevoflurane was predominantly mediated by glycine receptors. Propofol inhibits some subtypes of voltage-gated Na + channels, which could contribute to its antiepileptic activity. In high doses, ketamine can also block Na + channel activity.
2PK channels modulate neuronal excitability through control of the transmembrane potential. There are 15 different 2PK isoforms, and functional channels are formed from homomeric or heteromeric dimers. Genetic deletion of several members of this channel family (TREK1, TREK2, TASK1, TASK3, and TRESK) in animal models reduces the immobilizing effect of intravenous and volatile general anesthetics, which suggests a contribution to their anesthetic mechanisms.
Hyperpolarization-activated cation channels (HCN channels) are important in mediating coordinated neuronal firing between the thalamus and cortex. These channels play an important role in setting the frequency of thalamocortical rhythms critical for high-order cognitive processing and are also important for controlling burst firing in the hippocampus, thalamus, and locus coeruleus. Amnesia and hypnosis are produced upon disruption of signaling in these brain areas via inhaled and intravenous anesthetics. Some anesthetics like dexmedetomidine exert their sedating effects by activating α 2 receptors in the locus coeruleus.
Novel techniques may identify a host of other targets for anesthetic drugs. For example, azipropofol is an analog that becomes covalently linked to a binding partner when exposed to light. Tadpoles given this drug had prolonged anesthesia when exposed to light and proteomic analysis of the linked molecules identified novel potential targets of propofol.
Individual Agents
Barbiturates
In 1864, some 20 years after Morton and Long independently discovered inhaled anesthesia, von Baeyer synthesized barbituric acid, which eventually led to the development of intravenous anesthesia. Barbituric acid is pharmacologically inert, but substitutions at the C5 position impart hypnotic activity. The clinical use of barbiturates as hypnotics began in 1904 with 5,5-diethyl-barbituric acid, but their adaptation to anesthesia was pioneered in the late 1920s by Bumm in Germany and Lundy in the United States. Lundy’s investigations included pentobarbital, also known as Nembutal (sodium [N] 5-ethyl-5-[1-methylbutyl]-substituted barbituric acid). Thionembutal, the sulfur-substituted analog of this compound better known as sodium thiopental, became the dominant intravenous induction agent of the next 60 years.
Thiopental was popularized by Lundy but might have been first used in a patient several months earlier in 1934 by Waters in Wisconsin. It was initially used for short procedures, and early use was limited by the lack of reliable equipment for maintaining intravenous access. Over time its role evolved from that of a sole anesthetic to a means of inducing anesthesia without requiring the patient to breathe the pungent vapors of that era (e.g., ether).
Both pentobarbital and thiopental are supplied as racemic mixtures (see Chapter 2 ), with the S(−) isomer exhibiting roughly double the potency of the R(+) form. Methylation of the ring nitrogen of methohexital creates a second chiral center and a total of four stereoisomers. Early animal work at Eli Lilly revealed that the potent β isomers had more excitatory side effects so methohexital is marketed as a racemate of the α isomers. The α-dextrorotatory form is roughly three times more potent than the α-levorotatory ( S b R h ) isomer and recent stereochemical synthesis suggested that the latter is responsible for the residual excitatory effects of commercial methohexital ( Fig. 10.4B ).
Methohexital found use in electroconvulsive therapy and pentobarbital was used as a pediatric premedication, but thiopental was the mainstay for the intravenous induction of anesthesia before the introduction of propofol. American production of thiopental ceased in 2010 and plans to import the drug from Europe stalled over the political controversy associated with thiopental’s use for lethal injection in the U.S. penal system. Like etomidate, thiopental enhances GABA A receptor function in a stereoselective manner. This is also true for other barbiturate sedatives/hypnotics such as hexobarbital and pentobarbital. Thiopental directly gates GABA A receptors at high concentrations (like propofol and etomidate). Thiopental discriminates between synaptic and extrasynaptic GABA A receptors in the hippocampus, suggesting a possible role for the δ subunit in barbiturate enhancement of this channel. The β subunit has also been implicated as a molecular site of action of barbiturates at GABA A receptors.
In addition to GABA A receptors, barbiturates modulate a variety of ligand-gated ion channels in vitro. They block the action of the excitatory neurotransmitter glutamate at AMPA and kainate receptor subtypes, but not at NMDA receptors. They also inhibit neuronal nicotinic acetylcholine receptors. The relevance of these receptors to the clinical action of barbiturates remains unclear. Several clinical effects appear to be modulated by allosteric augmentation at GABA A receptors. Transgenic mice carrying a point mutation (N265M) of the GABA A β 3 subunit are resistant to the immobilizing effects of pentobarbital. However, this mutation had no effect on pentobarbital-induced respiratory depression and only partially blunted its hypnotic and cardiovascular effects. Future studies are needed to determine whether these effects are mediated by some other GABA A receptor subtype, ionotropic glutamate receptors, or other receptors.
Barbiturates produce dose-dependent CNS depression with characteristic effects on electroencephalography (EEG), progressing from a low-frequency, high-amplitude pattern to isoelectric periods of increasing duration. The cerebral metabolic rate for oxygen (CMRO 2 ) is decreased to a widely quoted maximal extent of 55%, and both cerebral blood flow and intracranial pressure (ICP) are decreased as a result of flow-metabolism coupling. These factors led to widespread use of barbiturates for neuroprotection, although large trials and metaanalyses have failed to show substantial clinical benefit in humans. Thiopental and pentobarbital have anticonvulsant effects in the setting of prolonged high-dose administration, whereas prolonged use of methohexital leads to epileptiform discharges.
Barbiturates cause venodilation with consequent decreases in preload and cardiac output. This effect is particularly pronounced in hypovolemic patients. There is in vitro evidence for direct myocardial depression at doses much higher than those used for anesthesia. Selected effects of barbiturates are summarized in Fig. 10.4 .
Barbiturate sodium salts require a pH greater than 10 to remain in aqueous solution and precipitate readily when mixed with nonbasic solutions such as normal saline solution. They also precipitate when combined in high concentrations with certain weak bases such as the neuromuscular blockers; sodium thiopental and rocuronium bromide, for example, form a precipitate when mixed in the same intravenous line. In the plasma, barbiturates are readily protonated and become quite lipophilic, which leads to unconsciousness 30 to 60 seconds after bolus injection. Kinetic modeling of bolus administration reveals a rapid peak in effect site (CNS) concentration ( Fig. 10.5A ). The subsequent decline in concentration results primarily from redistribution to other tissues rather than metabolism and elimination (see Chapter 2 ). Repeated bolus dosing or continuous infusions lead to very prolonged recovery times as depicted by plots of the context-sensitive half-time (see Fig. 10.5C ). When given in very high doses, thiopental can even exhibit “zero-order kinetics,” wherein metabolic capacity is saturated, grossly prolonging the duration of action.
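The kinetic distinction can be made explicit with the generic textbook forms of these two elimination patterns (standard equations, not fitted thiopental parameters). With first-order elimination the rate of decline is proportional to the remaining concentration, so each half-life removes the same fraction of drug; with zero-order elimination a fixed amount is removed per unit time:

$$\text{first order: } \frac{dC}{dt} = -k_{e}C \;\;\Rightarrow\;\; C(t) = C_{0}e^{-k_{e}t}$$

$$\text{zero order (saturated metabolism): } \frac{dC}{dt} = -k_{0} \;\;\Rightarrow\;\; C(t) = C_{0} - k_{0}t$$

Under first-order kinetics, doubling a dose extends the duration of effect by only about one half-life, whereas under zero-order kinetics the duration grows roughly in proportion to the dose, which is why very large thiopental doses prolong its action so markedly.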
Thiopental is highly protein bound (75%–90%) in the plasma, primarily to albumin. Decreased albumin levels such as those seen in chronic renal or hepatic disease can markedly increase free drug levels and may warrant decreased dosing. Thiopental undergoes some desulfurization to form pentobarbital, its oxybarbiturate parent compound. This “metabolite” is likely relevant only to prolonged high-dose administration of thiopental. Both drugs are deactivated by poorly characterized pathways in the liver to form inactive carboxylic acid and alcohol derivatives. Thiopental causes some cytochrome P450 induction, but this is generally of no clinical consequence when used as an induction agent. An active porphyric crisis could be exacerbated by this mechanism, but triggering an event in a patient with latent porphyria is reported to be unusual. Even so, it is prudent to avoid even potential triggers in patients with porphyria.
Benzodiazepines
Benzodiazepines were introduced into clinical use in the 1960s and rapidly gained popularity as anxiolytics with a wider safety margin than barbiturates. Diazepam, lorazepam, and midazolam are three of the most important class members in the practice of anesthesia. Diazepam and lorazepam are lipophilic and are traditionally formulated in propylene glycol because they are not soluble in water. These formulations are associated with pain on injection and long-term infusions in the intensive care unit (ICU) can lead to glycol toxicity. The imidazole ring of midazolam allows the preparation of acidic aqueous solutions, which cause minimal pain on injection. Once at physiologic pH, midazolam assumes a more lipophilic ring conformation and can gain rapid access to the CNS ( Fig. 10.6B ).
Benzodiazepines act on a subset of GABA A receptors containing γ subunits to potentiate chloride conductance upon GABA binding. This modulatory mode of action is postulated to create a “ceiling effect” that limits CNS depression, although when combined with other drugs, benzodiazepines can lead to dangerous respiratory depression. Even when respiratory drive is preserved, midazolam increases the likelihood of airway obstruction, and caution is advised in patients with obstructive sleep apnea or advanced age. The binding pocket appears to be at the interface of the α and γ2 subunits in receptors containing α 1 , α 2 , α 3 , or α 5 subunits. Recent elegant studies of mutant GABA A subunits in transgenic mice have raised the possibility of dissecting the effects of the benzodiazepines at a molecular level (see earlier “ GABA A Insights from Mutagenic Studies ”).
Benzodiazepines have anxiolytic, sedative, hypnotic, amnestic, and anticonvulsant properties in the CNS. In animals, they may either promote or block hyperalgesia depending on the model studied. The α subunit of the GABA A receptor appears to differentially mediate the CNS effects of benzodiazepines. For example, α 2 -containing receptors play an important role in anxiolysis, anti-hyperalgesia, and centrally mediated muscle relaxation, whereas α 1 -containing subunits are important mediators of the sedative, amnestic, and anticonvulsive effects of benzodiazepines. Although benzodiazepines can be used for the induction of anesthesia, the high doses required delay emergence unless the surgery is of prolonged duration. Combining midazolam with another induction agent can offer improved hemodynamic stability without affecting emergence. Other selected effects of benzodiazepines are depicted in Fig. 10.6 .
Oral midazolam in children has an onset of action within 10 minutes and peaks in 20 to 30 minutes; its effects begin to dissipate 45 minutes after administration. This kinetic profile requires some planning to get the child to the operating room within the window of efficacy. As shown in Fig. 10.5A , intravenous midazolam does not reach peak effect site concentration until nearly 10 minutes after administration. When titrating midazolam, one must therefore be patient to avoid “stacking” the doses and oversedating the patient. The onset of intravenous diazepam is more than twice as rapid as that of midazolam, but its use is limited by an extremely long duration of action. Not only is the elimination half-life 10 times longer than that of midazolam, but about half of the parent drug is metabolized to the active compound desmethyldiazepam, which has an even longer elimination half-life. Both midazolam and diazepam are metabolized by cytochrome P450 enzyme 3A (CYP3A) family members in the liver and are subject to interactions with erythromycin and antifungal medications. Diazepam is also metabolized by CYP 1A2 and CYP 2C19.
Premedication is currently the main role of benzodiazepines in anesthesia; a majority of U.S. practitioners surveyed in the 1990s used intravenous midazolam for adults (>70%) and oral midazolam for children (80%). For patients requiring long-term mechanical ventilation, combinations of an opioid and benzodiazepine have been the traditional method of sedation, but the latest Society of Critical Care Medicine (SCCM) guidelines encourage use of nonbenzodiazepine alternatives. Although midazolam is often considered a short-acting benzodiazepine, prolonged infusions in critically ill patients lead to markedly delayed awakening, in part owing to accumulation of its active metabolite 1-hydroxymidazolam (see Fig. 10.5B and C ). For this reason, the SCCM sedation guidelines recommend limiting midazolam use to 2 to 3 days. For longer periods of sedation, the benzodiazepine of choice is lorazepam, but lorazepam is typically formulated in a glycol-based vehicle, and prolonged infusions can lead to glycol toxicity. An emerging body of literature links benzodiazepine use to delirium in critically ill patients. Because delirium increases time on mechanical ventilation, length of stay, and morbidity and mortality, there has been some movement toward other sedatives in the ICU. Benzodiazepines also have some immunomodulatory effects on monocytes and T cells in vitro. There are several studies linking outpatient benzodiazepine use to increased infection, but it is unknown whether brief perioperative use carries any risk.
An important and unique feature of the benzodiazepines compared to the other GABA-ergic sedative hypnotics is that a competitive antagonist for the reversal of benzodiazepine effects is available. As an intravenous rescue agent, flumazenil can rapidly reverse the CNS depression associated with benzodiazepine intoxication. Routine administration of flumazenil to patients presenting to the emergency department with the suspicion of overdose can lead to seizures by precipitating acute withdrawal in chronic benzodiazepine users. However, there is also some evidence that flumazenil has anticonvulsant properties, suggesting that it may have mixed or partial agonist effects on the GABA A receptor, even in the absence of benzodiazepine administration. In support of this phenomenon is the potential for high doses of flumazenil to potentiate the hypnosis of other positive GABA modulators such as propofol. Its use to augment recovery from general anesthesia is being explored with mixed results (see “ Emerging Developments ”). Although flumazenil can be lifesaving in cases of benzodiazepine overdose, its short-acting kinetic profile creates the possibility of resedation after its effects dissipate.
Etomidate
Etomidate is a rapidly acting intravenous agent that was introduced into clinical practice in the 1970s. Compared with other induction agents, it has minimal effects on the cardiovascular system. Because of its association with nausea and prolonged suppression of adrenocortical synthesis of steroids, its main clinical use is for inductions in which hemodynamic stability is essential. The R(+) isomer has much greater hypnotic effects ( Fig. 10.7 ), and it is formulated as a single enantiomer. Like propofol, etomidate interacts with GABA A receptors in a stereoselective manner. Its enhancement of GABA-mediated current is smaller on receptors containing the β 1 subunit. Of all the clinically used intravenous anesthetics, etomidate exhibits the greatest selectivity for GABA A receptors and has the fewest relevant interactions with other ion channels (see Table 10.1 ).
Following intravenous injection, etomidate is tightly bound to plasma proteins such as albumin. The uncharged drug is highly lipophilic so etomidate rapidly penetrates the blood-brain barrier; peak brain levels are achieved within 2 minutes of injection (see Fig. 10.5A ). Etomidate is metabolized in the liver by ester hydrolysis to a pharmacologically inactive metabolite.
The effects of etomidate on the CNS are similar to those of propofol and the barbiturates. Induction doses are associated with a high incidence of myoclonus, possibly via a loss of cortical inhibition during the transition from consciousness to unconsciousness. Although this myoclonic activity could be mistaken for generalized tonic-clonic seizures, etomidate has anticonvulsant activity in several experimental models. Epileptic attacks occur less frequently during etomidate anesthesia, but propofol and thiopental likely possess greater anticonvulsant effects; because etomidate suppresses seizure activity less than these agents, it remains a viable option for electroconvulsive therapy.
Etomidate causes less depression of ventilation compared with the barbiturates (see Fig. 10.7 ). Despite its favorable hemodynamic profile, it should be noted that patients with high sympathetic tone such as those with shock, intoxication, or drug withdrawal can have a precipitous drop in blood pressure even when etomidate is used to induce anesthesia.
In 1983, investigators reported increased mortality in ICU patients sedated for days with etomidate. The increased mortality was attributed to suppression of cortisol synthesis, since etomidate is a potent inhibitor of the synthetic enzyme 11β-hydroxylase in the adrenal cortex. The original retrospective study has been criticized for failing to control for the severity of illness and for the potential role played by concurrent administration of adjuncts to ICU sedation (e.g., opioids) in those patients. Randomized controlled trials in elective cardiac surgery and critically ill patients verified the adrenal suppression but did not show differences in clinical outcome. A large retrospective study of cardiac surgery patients using propensity matching also found no link between etomidate use and increased length of stay or arrhythmia. However, the same group found that etomidate carried an odds ratio for death of 2.5 in a retrospective, propensity-matched analysis of more than 7000 American Society of Anesthesiologists (ASA) class III and class IV patients undergoing noncardiac surgery. Another study of nearly 6000 patients (approximately 40% ASA class III and class IV) before and after increased etomidate use in the setting of the 2010 propofol shortage found no such association. Although the most recent metaanalysis reported no statistically significant association between etomidate use and mortality in critically ill patients, new etomidate analogs are currently under development to avoid endocrine disturbance (see “ Emerging Developments ”).
Propofol
Propofol ( Fig. 10.8 ) is the most widely used intravenous anesthetic. While it is most often used for inducing anesthesia, it is also used for procedural/ICU sedation and for total intravenous anesthesia (TIVA). Since propofol’s introduction in the late 1980s, the use of TIVA has increased dramatically. Propofol is listed as an essential medicine by the World Health Organization.
Pharmacology
Propofol (2,6-diisopropylphenol) potentiates GABA-mediated responses in neurons and directly activates GABA A receptor function. Other receptors respond to propofol in the therapeutic concentration range, but the majority of its clinical effects are likely mediated through GABA A receptors. The specific interaction between propofol and GABA A receptors is not fully characterized, but likely involves residues in the transmembrane domains on β subunits. Site-directed mutagenesis studies with recombinant GABA A receptors have contributed to precise knowledge regarding the molecular interactions between this drug and GABA A receptors. Initially, only the property of direct activation of the GABA A receptor by propofol was thought to depend on the β subunit, whereas the modulatory effects were considered to involve other subunits. There is now evidence that α, β, and γ subunits all contribute to GABA A sensitivity to propofol.
Formulation and Preparation
The necessity to formulate propofol in a lipid emulsion (because of its extremely poor water solubility) has some very important clinical implications. The original preparation of propofol used a polyethoxylated castor oil (Cremophor EL) in its formulation, but this mixture was abandoned because of concern for allergic reactions to the excipient. The most commonly available propofol formulations today involve a mixture of soybean oil, glycerol, and purified egg phospholipid to solubilize the drug. Despite initial concerns that the egg-derived lecithin might precipitate anaphylaxis in egg-allergic individuals, it appears that propofol can be safely used in egg-allergic patients. Most egg allergies are related to egg albumin (in egg whites) as opposed to lecithin (primarily in the egg yolk), and a small study revealed no hypersensitivity to propofol in egg-allergic patients undergoing skin prick testing.
Perhaps the most important concern with the lipid formulation of propofol is that it promotes rapid microbial growth. This issue received much attention after a high-impact epidemiologic investigation described unusual outbreaks of postoperative infections at seven hospitals. In 2007 the U.S. Food and Drug Administration (FDA) mandated that all propofol emulsions in the United States contain an antimicrobial agent (ethylenediaminetetraacetic acid or sodium metabisulfite). The FDA also recommended that, to minimize the potential for bacterial contamination, both the vial and prefilled syringe formulations must be used on only one patient, administration must commence immediately after the vial or syringe has been opened, and administration from a single vial or syringe must be completed within 12 hours of opening. Failure to follow these practices and usual aseptic technique for intravenous administration of medication can lead to transmission of viral infections between patients and to bacterial sepsis.
Although pain on injection was a contributing factor to changing the initial formulation of propofol, the change to the current isotonic, emulsified, soybean-based formulation has not eliminated pain on injection of propofol. Pain on intravenous injection of any medication has multiple etiologies and can be influenced by the temperature of the drug, the site of administration, the size of the vein, the speed of injection, and the rate of infusion of the carrier fluid. Some common hyperosmolar formulations of diazepam and etomidate also cause pain on injection, and this pain is often reduced with formulations that are closer to blood in osmolarity. However, the modern formulation of propofol is not hyperosmolar and has a pH in the physiologic range. It is thought that the pain on injection of propofol is related to its free aqueous concentration, because the pain on injection appears to be less when the propofol concentration is reduced in its current formulation. From a mechanistic perspective, investigations of propofol-induced injection pain have focused on the transient receptor potential (TRP) family of ion channels. Multiple investigators have shown a role for TRPA1 as a mediator of propofol-induced pain, and TRPV1 has been implicated in some studies but not in others. Many strategies to reduce pain have been studied, but a metaanalysis concluded that use of an antecubital vein rather than a hand vein and coadministration of lidocaine (either as a pretreatment or mixed with propofol) were most effective. Administration of opioids prior to propofol injection is another important way to mitigate injection pain.
Metabolism, Redistribution, Clearance, and Elimination
The rapid effects of propofol on the brain make it a frequent choice for sedation in monitored anesthesia care settings and for induction and maintenance of general anesthesia. Its rapid metabolic clearance is useful in pediatric procedures like magnetic resonance imaging, in the critical care arena for sedation during mechanical ventilation, and in neuroanesthesia to temporarily reduce cerebral metabolic rate (i.e., burst suppression or isoelectric EEG).
Propofol is metabolized via conjugation into inactive metabolites; however, its clearance (total body clearance estimated at 25 mL/kg per minute) exceeds hepatic blood flow and continues to some extent even during the anhepatic phase of liver transplantation, which argues for some extrahepatic metabolism of the drug. As with most intravenous anesthetics, the offset of the hypnotic effect after bolus administration occurs mainly through redistribution of propofol from the brain to less well-perfused sites. Target plasma concentrations for inducing unconsciousness are 2 to 2.5 µg/mL and for maintenance of anesthesia are 2 to 6 µg/mL. The elimination half-life of propofol is prolonged because of the slow mobilization from adipose tissue. Regardless, obesity should not be considered a contraindication to its use for induction or maintenance.
Obesity does not drastically prolong recovery, and dose adjustments based on lean body weight have been suggested. However, there is some controversy about this as some groups have demonstrated total body weight (TBW) as the best predictor of propofol pharmacokinetics. One of the newest pharmacokinetic models for propofol by Eleveld et al. used TBW and was superior to previous models in obese patients. Another group validated the Eleveld model in an independent group of obese patients and found that the older Schnider and Marsh models performed better when TBW was replaced with adjusted body weight (defined as ideal body weight + 0.4 × [TBW − ideal body weight]).
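Two brief worked calculations make these figures concrete; the 70-kg and 120-kg body weights are hypothetical, and the hepatic blood flow value is an approximate reference figure rather than one given in this chapter. For a 70-kg patient, the quoted total body clearance corresponds to

$$25\ \text{mL·kg}^{-1}\text{·min}^{-1} \times 70\ \text{kg} \approx 1.75\ \text{L/min},$$

which exceeds typical adult hepatic blood flow of roughly 1.2 to 1.5 L/min, consistent with the extrahepatic metabolism described above. Similarly, for a patient with a total body weight of 120 kg and an ideal body weight of 70 kg, the adjusted-body-weight formula gives

$$\text{ABW} = 70 + 0.4 \times (120 - 70) = 90\ \text{kg},$$

so the Schnider or Marsh model would be supplied 90 kg rather than 120 kg as the weight input.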
Moderate hepatic or renal impairment has little effect on the duration of clinical effect. The dose requirement for propofol is reduced in the elderly because of reduced metabolic clearance of drugs and reduced relative volume of the central compartment. Dosing is increased in pediatric populations because relative central compartment volume is larger and clearance and metabolism are increased. Sex differences for the pharmacokinetics and pharmacodynamics of propofol have been reported, and men sometimes emerge more slowly from propofol total intravenous anesthesia. The mechanisms underlying this observation may involve sex-linked polymorphisms in liver microsomal enzymes.
Advantages and Disadvantages of Propofol as an Induction Agent
Propofol has a number of advantages over other intravenous agents for induction. It depresses airway reflexes, reduces nausea, and does not induce adrenocortical suppression. Simulations of bolus kinetics indicate that it is second only to ketamine in speed of onset and offset (see Fig. 10.5A ).
The primary disadvantage of propofol is its depressive effect on the cardiovascular system. Patients who are hypovolemic, debilitated, or reliant on high sympathetic tone to maintain blood pressure require careful titration of propofol to avoid severe hypotension. Animal models of severe hemorrhagic shock suggest that the induction dose should be reduced by 80% to 90% from usual doses if given before fluid resuscitation and by 50% if given after resuscitation. Use of propofol in patients with cardiac tamponade or critical aortic stenosis can result in hemodynamic collapse.
Mixtures of low-dose ketamine infusions (10–20 µg/kg per minute) with propofol infusions (100–200 µg/kg per minute) have been used to mitigate the cardiovascular effects of both drugs, especially in pediatric anesthesia. A recent metaanalysis of this mixture in the emergency department suggested a reduced rate of respiratory side effects compared with propofol alone. A reduction in induction dose (1–1.5 mg/kg) is recommended for patients older than 65 years of age; however, there is evidence that older patients often receive greater than this recommended dose.
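To translate infusion rates like those above into pump settings, the weight-based rate is converted to a mass rate and then to a volume rate using the concentration of the preparation; the 70-kg weight and the standard 1% (10 mg/mL) propofol concentration used here are illustrative assumptions:

$$100\ \mu\text{g·kg}^{-1}\text{·min}^{-1} \times 70\ \text{kg} = 7\ \text{mg/min} = 420\ \text{mg/h} \approx 42\ \text{mL/h of a 1\% solution}.$$

The same arithmetic applied to ketamine at 10 µg/kg per minute gives 0.7 mg/min (42 mg/h), with the volume rate depending on the ketamine concentration chosen.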
Clinical Features of Anesthesia Maintenance With Propofol
Early clinical trials in the late 1980s and early 1990s consistently reported faster time to eye-opening and other recovery criteria for propofol-based TIVA anesthetics compared with “traditional” regimens of thiopental and volatile agents such as enflurane or isoflurane. Until the development of propofol, TIVA for surgeries with planned extubation was rarely performed owing to prolonged emergences. Propofol maintains a short context-sensitive half-time even after prolonged infusion (see Fig. 10.5C ). More recent studies comparing propofol-based techniques to newer inhaled agents with lower blood and tissue solubility (desflurane and sevoflurane) have either failed to show benefit or have shown only minimal benefits in speed of recovery, but there are environmental benefits to avoiding the greenhouse gas effects of volatiles. The carbon footprint of TIVA is orders of magnitude lower than that of inhaled anesthetic techniques. The updated consensus guidelines for nausea and vomiting recommend propofol-based TIVA as part of a multimodal approach for patients at high risk for nausea and vomiting. In addition to its benefits for patients at high risk for malignant hyperthermia or postoperative nausea and vomiting, propofol appears to confer improved operating conditions for endoscopic sinus surgery and decreased postoperative pain. For example, in gynecologic procedures postoperative pain and opioid administration were reduced in those receiving propofol TIVA. Similar trends were observed in two recent metaanalyses of surgeries of varying types, but more research is needed to determine if this is a clinically meaningful effect.
Anesthetic depth is an important consideration when comparing propofol-based TIVA techniques with volatile anesthesia. The concentration of expired anesthetic gases measured by modern anesthesia monitors correlates with the concentration of inhaled agent in the CNS. Because propofol blood concentrations are not easily measured in the operating room, many practitioners choosing TIVA with propofol as their maintenance anesthetic attempt to control for individual variability in drug clearance through the use of an EEG-based depth of anesthesia monitor because it has been shown to decrease anesthetic use and facilitate postoperative recovery. Favorable pharmacokinetic properties and effects on cerebral blood flow (CBF) and cerebral metabolic requirement for oxygen have made propofol popular for neuroanesthesia. In animal studies, reduction of CBF occurs even when mean arterial blood pressure is held constant, and propofol can reduce ICP even when cerebral perfusion pressure is fixed. Cerebral autoregulation also appears to be preserved in the setting of propofol anesthesia. In comparison, the volatile agents tend to increase ICP and at high doses compromise cerebral autoregulation. Although propofol and thiopental have similar effects on brain physiology, propofol might suppress apoptosis in models of ischemia in vitro (see Chapter 9 ).
The latest Society of Critical Care Medicine sedation guidelines recommend the use of propofol when rapid or frequent interruptions of sedation are required, such as for serial neurologic evaluations. Awakening times were similar after 24-, 48-, 72-, and 96-hour constant-rate infusions of propofol. Prolonged, high-dose infusions can lead to hypertriglyceridemia or a constellation of metabolic acidosis, rhabdomyolysis, renal failure, and hemodynamic instability, known as propofol infusion syndrome. These complications are not common, but patients should be monitored closely during propofol infusions lasting longer than 48 hours. Although moderate-dose infusions (>40 µg/kg per minute) can increase triglyceride levels in critically ill patients after just 3 days, low-dose propofol infusions (<33 µg/kg per minute) showed no detectable increase in triglycerides after 2 weeks of constant infusion.
It is difficult to imagine the practice of anesthesia without propofol. It is also essential outside the operating room in settings such as the ICU, pediatric imaging procedures, and endoscopy suites. Propofol has numerous advantages (see Fig. 10.8 ), but its depression of respiration makes it potentially dangerous in unskilled hands. Propofol appears to have greater relaxation effects on the pharyngeal musculature than thiopental and should be administered only by persons trained in the rescue of patients who experience deeper-than-intended sedation, including general anesthesia (and who are not involved in the conduct of the procedure).
Currently, no widely accepted method is available to anesthesia providers in the United States that allows estimates of blood concentrations of intravenous drugs being infused for maintenance of anesthesia during TIVA. This is a disadvantage of intravenous anesthetics compared with inhaled anesthetics, which are typically used in conjunction with monitors equipped with gas sampling for measurements of expired concentrations. Computer-based pharmacokinetic models of a drug’s disposition provide a convenient way to estimate these blood concentrations of intravenous agents. Target-controlled infusion devices (e.g., Diprifusor) have been used successfully in maintaining anesthesia for both inpatient and outpatient surgery. Despite the obvious advantages of these devices in predicting blood concentrations, some controversy exists regarding which pharmacokinetic model should be used. Another impediment to widespread use of these devices has been providers’ unfamiliarity with targeting a blood concentration range, as it is more common to think in terms of infusion rates rather than blood concentrations for infusions of other medications in the hospital. There have been some attempts to improve the adoption of these devices by targeting the calculated index from processed EEG devices used to estimate anesthetic “depth” rather than targeting a desired blood concentration (see “ Emerging Developments ”).
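To illustrate what such a computer-based estimate involves, the sketch below integrates a generic three-compartment model of the kind that underlies target-controlled infusion software. It is a minimal illustration only: the volumes and rate constants are arbitrary placeholders, not parameters from the Marsh, Schnider, Eleveld, or Diprifusor models, and the function names and dosing schedule are hypothetical.

```python
# Illustrative sketch of the kind of compartmental simulation that underlies
# target-controlled infusion (TCI) software. The volumes and rate constants
# below are arbitrary placeholders, NOT parameters from any published propofol
# model, and the code is for illustration rather than clinical use.

def simulate_plasma_concentration(infusion_rate_mg_per_min, duration_min,
                                  v1_liters=10.0,               # central compartment volume (placeholder)
                                  k10=0.10, k12=0.05, k21=0.03, # elimination/transfer rate constants
                                  k13=0.02, k31=0.005,          # (per minute, placeholders)
                                  dt=0.01):
    """Euler integration of a generic three-compartment mammillary model.

    `infusion_rate_mg_per_min` is a function of time (minutes) returning mg/min.
    Returns a list of (time_min, estimated_plasma_concentration_ug_per_ml).
    """
    a1 = a2 = a3 = 0.0                     # drug amount (mg) in central and peripheral compartments
    samples = []
    record_every = int(round(1.0 / dt))    # record once per simulated minute
    steps = int(round(duration_min / dt))
    for step in range(steps):
        t = step * dt
        rate_in = infusion_rate_mg_per_min(t)
        # Mass balance for each compartment (mg per minute).
        da1 = rate_in - (k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3
        da2 = k12 * a1 - k21 * a2
        da3 = k13 * a1 - k31 * a3
        a1 += da1 * dt
        a2 += da2 * dt
        a3 += da3 * dt
        if step % record_every == 0:
            # mg/L is numerically the same as µg/mL.
            samples.append((round(t), a1 / v1_liters))
    return samples


def example_schedule(t_min, weight_kg=70.0):
    """Hypothetical schedule: 2 mg/kg over the first minute, then 100 µg/kg/min."""
    if t_min < 1.0:
        return 2.0 * weight_kg             # mg/min during the 1-minute "bolus"
    return 0.1 * weight_kg                 # 100 µg/kg/min = 0.1 mg/kg/min


if __name__ == "__main__":
    for t, cp in simulate_plasma_concentration(example_schedule, duration_min=30):
        print(f"t = {t:2d} min   estimated Cp ≈ {cp:.2f} µg/mL")
```

A true target-controlled infusion device solves the inverse problem—computing the infusion rates needed to reach and hold a chosen target concentration—using a validated parameter set and, in effect-site targeting modes, an additional rate constant linking plasma to the effect site.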
Fospropofol
Circulating phosphatases metabolize the water-soluble prodrug, fospropofol, into propofol, formaldehyde, and a phosphate group ( Fig. 10.12A ). It was envisioned during the drug development process that the slower onset of drug effect associated with the metabolism of the propofol prodrug would allow safe bolus administration in the setting of procedural sedation. However, the FDA product labeling ultimately recommended that fospropofol should be administered only by persons trained in general anesthesia as unanticipated transitions to general anesthesia remain a possibility. The formaldehyde levels measured after routine fospropofol administration are indistinguishable from baseline levels. After administration of the drug, formaldehyde is rapidly metabolized to formate. Although high formate levels can lead to metabolic acidosis, there has been no reported formate toxicity from fospropofol. Other supposed advantages from this alternative formulation of propofol are decreased risk of bacteremia and decreased risk of hyperlipidemia. Fospropofol does not bind TRPA1 and has less pain on injection, although pruritus and perineal paresthesias are reported side effects of its administration.
Ketamine
Ketamine, an arylcyclohexylamine related to phencyclidine, was developed in the 1960s and approved for use in the United States in 1970. Early test subjects described a sense of disconnection from their environment, leading to the term “dissociative anesthesia.” Its combination of hypnosis and analgesia showed great promise as a complete anesthetic with minimal effects on cardiovascular function, respiratory drive, and airway reflexes. Concern over psychologic side effects such as hallucinations and emergence delirium limited its clinical use as a primary anesthetic agent, but there is an increasing body of evidence supporting the use of subanesthetic doses for treatment of acute and chronic pain (see following text).
Ketamine was originally produced as a racemic mixture, and most commercial preparations continue to be a racemic mix of the R and S enantiomers (see Chapter 2 ). S(+) ketamine, a single-enantiomer formulation available in some areas of Europe and South America, is more potent in most clinical and experimental settings ( Fig. 10.9 ).