Bedford and Leeds recommendations | Recent considerations |
---|---|
Careful risk-benefit analysis for surgery on elderly patients | Frailty assessment; consideration of the risk of postoperative cognitive dysfunction |
No routine pre- or postoperative medication | Patient-tailored pre- or postoperative medication |
Cautious choice of drugs to avoid suppression of vital centers | Neuromonitoring-guided anesthesia to generate adequate depth of anesthesia; goal-directed anesthesia and analgesia with TCI and neuromonitoring |
Usage of adjuvants such as alpha-2 agonists to reduce stress and to provide goal-directed depth of anesthesia | Usage of adjuvants such as alpha-2 agonists and ketamine to reduce POD |
Avoidance of anticholinergic side effects | Avoidance of anticholinergic side effects |
Tight fluid, electrolyte, and hemodynamic management | Tight fluid, electrolyte, and hemodynamic management |
Proper prevention and treatment of postoperative confusion | Proper prevention and timely pharmacological and non-pharmacological treatment of postoperative delirium |
Remarkably, after 60 years and over 12,000 published manuscripts on the topic, Bedford and Leeds’ recommendations remain the foundation of postoperative cognitive dysfunction prevention.
Our knowledge about this condition has grown exponentially in recent decades, and today it is well known that perioperative stress—including anesthesia—can cause adverse cerebral effects. Although major risk factors have been identified, only recently has the staggering incidence of postoperative cognitive dysfunction been fully recognized [2, 3]. The exact pathophysiology remains elusive, but landmark studies and theories from the last decade promise a more comprehensive understanding of how surgery and anesthesia impact cognitive function, both in the short and long term [4]. In Bedford’s time, volatile anesthetics were the only option for conducting general anesthesia, whereas today there is the additional possibility of using iv agents. This chapter explains the role of postoperative delirium (POD) in the perioperative setting and focuses on the role of iv agents and total intravenous anesthesia.
Spectrum of Cognitive Disorders Following Surgery
Cognitive complications arising in the context of surgery are usually classed either as direct brain insults, which have a clear causation (e.g., hypoxia, medication), or as aberrant stress responses, which tend to be multifactorial and show no clear etiology (e.g., peripheral infection) [5, 6]. These postoperative brain injuries can be subdivided into short-term and long-term cognitive disorders: short-term disorders include inadequate emergence (IE) and postoperative delirium (POD), while long-term disorders can be summarized as postoperative cognitive dysfunction (POCD) (Fig. 40.1). However, when considering the entire spectrum of these complications, the term “cognitive complication” is misleading, as the spectrum is by no means limited to cognition and may also include pronounced effects on awareness and motor skills (e.g., delirium).
Fig. 40.1
Spectrum of cognitive dysfunction after surgery. Immediately after anesthesia, inadequate emergence affects preschool children in particular. Postoperative delirium mostly occurs hours after surgical procedures. Postoperative delirium and its subsequent outcomes, such as prolonged hospital stay, higher mortality, and posttraumatic stress disorder, are a risk factor for long-term impairment. Postoperative cognitive dysfunction is characterized by cognitive impairment compared to baseline performance. At present, the natural brain-aging process and postoperative cognitive dysfunction in the elderly cannot be discriminated exactly
Inadequate emergence describes a state of altered mental status immediately following anesthesia, which can be further classified as either hyperactive or hypoactive emergence. In the literature, IE is often cited as emergence delirium, though the criteria for delirium are not always entirely fulfilled. IE is especially prevalent in the pediatric population, and aside from acute injury, its implications have not been thoroughly analyzed [7–9].
The most common type of short-term cerebral dysfunction following surgery is postoperative delirium (POD), which involves the manifestation of delirium within 5–7 days after surgery. POD has a high prevalence and is the typical clinical manifestation of “acute brain failure” after anesthesia and surgery. The occurrence of delirium 8–9 days after surgery is typically due to factors such as immobilization or consecutive pneumonia and is thus only indirectly linked to the anesthesia and surgical intervention. It nevertheless remains prudent to keep preventive measures in place during this late postoperative phase.
Cognitive impairment with effects lasting weeks to months after surgery is usually referred to as postoperative cognitive dysfunction (POCD). Whereas delirium is a clearly defined syndrome, POCD has no formal definition and requires cognitive testing before and after the operative intervention in order to assess the extent of the cognitive impact. If clinically manifest, POCD is usually classified as either mild cognitive impairment or dementia [10, 11]. The link between delirium (not only its rate, but in particular its duration) and POCD has been shown in both the postoperative and the critical care setting, including long-term cognitive decline 1 year after treatment [12–14].
Although the mechanism is poorly understood and teeming with confounders, the surgical trauma, inadequate anesthesia (e.g., too deep, with painful stimuli), and insufficient stress reduction have a profound, and sometimes permanent, impact on the brain. The analysis of POCD is critical to obtain insight into the role of acute stressors in the cognitive trajectory, as well as to find new techniques to limit or prevent long-term cognitive impairment.
Defining Delirium
Delirium is the most common clinical manifestation of brain dysfunction following surgery. Delirium is not a disease, but a syndrome, in the sense that it comprises a typical reaction of the brain to disturbances. The concept of delirium as a syndrome is necessary to understand its association to other underlying medical conditions such as sepsis, intoxication, withdrawal, and other systemic illnesses.
Both the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) [11], and the International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10) [15], define delirium as a psychiatric syndrome (Table 40.2). Recently, a comparison of the ICD-10 and DSM-IV criteria for the diagnosis of delirium revealed that the DSM-IV definition is more comprehensive than that of the ICD-10 and thus better suited to identify patients who will eventually suffer an impaired outcome [16–19]. At the moment, there is discussion as to whether the DSM-5 criteria have a diagnostic validity comparable to the previous version. The DSM-5 defines delirium using five core criteria. The primary features are inattentiveness and reduced awareness, with an additional cognitive disturbance (e.g., memory, orientation, perception). These symptoms must not be attributable to a previously known underlying illness (e.g., dementia), but rather present acutely, and with fluctuating severity, in the context of a developing medical condition. From the clinical point of view, there are hyperactive, hypoactive, and mixed forms of delirium, of which hypoactive delirium is considered particularly dangerous, as it is often overlooked in clinical practice [20–23].
Table 40.2
Gold standard of delirium definition
Diagnostic and Statistical Manual, 5th Edition | 10th revision of the International Statistical Classification of Diseases and Related Health Problems |
---|---|
Delirium, not induced by alcohol and other psychoactive drugs and not superimposed on dementia | |
(Code 596) | (F05.0) |
A–E must be fulfilled | An etiologically nonspecific organic cerebral syndrome characterized by concurrent disturbances of consciousness and attention, perception, thinking, memory, psychomotor behavior, emotion, and sleep-wake schedule. The duration is variable, and the degree of severity ranges from mild to very severe |
A. Disturbance in attention (i.e., reduced ability to direct, focus, sustain, and shift attention) and awareness (reduced orientation to the environment) | A. Clouding of consciousness, i.e., reduced clarity of awareness of the environment, with reduced ability to focus, sustain, or shift attention |
B. The disturbance develops over a short period of time (usually hours to a few days), represents an acute change from baseline attention and awareness, and tends to fluctuate in severity during the course of a day | B. Disturbance of cognition, manifest by both: (1) Impairment of immediate recall and recent memory, with relatively intact remote memory (2) Disorientation in time, place, or person |
C. An additional disturbance in cognition (e.g., memory deficit, disorientation, language, visuospatial ability, or perception) | C. At least one of the following psychomotor disturbances: (1) Rapid, unpredictable shifts from hypoactivity to hyperactivity (2) Increased reaction time (3) Increased or decreased flow of speech (4) Enhanced startle reaction |
D. The disturbances in criteria A and C are not better explained by a preexisting, established, or evolving neurocognitive disorder and do not occur in the context of a severely reduced level of arousal such as coma | D. Disturbance of sleep or the sleep-wake cycle manifests by at least one of the following: (1) Insomnia, which in severe cases may involve total sleep loss, with or without daytime drowsiness or reversal of the sleep-wake cycle (2) Nocturnal worsening of symptoms (3) Disturbing dreams and nightmares which may continue as hallucinations or illusions after awakening |
E. There is evidence from the history, physical examination, or laboratory findings that the disturbance is a direct physiological consequence of another medical condition, substance intoxication or withdrawal (i.e., due to a drug of abuse or to a medication) or exposure to a toxin or is due to multiple etiologies | E. Rapid onset and fluctuations of the symptoms over the course of the day |
F. Objective evidence from history, physical and neurological examination, or laboratory tests of an underlying cerebral or systemic disease (other than psychoactive substance related) that can be presumed to be responsible for the clinical manifestations in A–D |
The level of consciousness is critical for the DSM-5 definition, as delirium cannot be diagnosed under a severely reduced level of arousal, such as a state of coma (Table 40.3). As this limitation has the potential to overlook purely hypoactive forms of delirium, the DSM-5 criteria have often been criticized by some authors as less inclusive [24, 25]. The joint statement of the American Delirium Society and the European Delirium Association (EDA) summarizes that the “conceptualization of delirium must extend beyond what can be assessed through cognitive testing (attention) and accept that altered arousal is fundamental” [26]. This comprehensive requirement is important, not only because the diagnosis of delirium is often missed but also because failure to diagnose delirium in a timely manner severely increases the risk of mortality and impaired functional outcome for the patient.
Table 40.3
The Richmond Agitation and Sedation Scale (RASS)
+4 Combative | Violent, immediate danger to staff |
+3 Very agitated | Pulls or removes tube (tubes) or catheter (catheters), aggressive |
+2 Agitated | Frequent non-purposeful movement, fights ventilator |
+1 Restless | Anxious, apprehensive but movements not aggressive or vigorous |
0 | Alert and calm |
−1 Drowsy | Not fully alert but has sustained awakening to voice (eye opening and eye contact for 10 s or more) |
−2 Light sedation | Briefly awakens to voice (eye opening and contact for less than 10 s) |
−3 Moderate sedation | Movement or eye opening to voice (but no eye contact) |
−4 Deep sedation | No response to voice but movement or eye opening to physical stimulation |
−5 Unarousable | No response to voice or physical stimulation |
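Because delirium screening hinges on the level of arousal, the RASS in Table 40.3 can be conveniently encoded as a simple lookup. The sketch below is purely illustrative: the dictionary mirrors the table, while the helper function and its cutoff at RASS −3 reflect the common CAM-ICU convention that deeply sedated or unarousable patients cannot be screened; the function name is our own and not part of the scale.

```python
# Illustrative encoding of the Richmond Agitation and Sedation Scale (RASS),
# mirroring Table 40.3.
RASS = {
    +4: "Combative", +3: "Very agitated", +2: "Agitated", +1: "Restless",
    0: "Alert and calm",
    -1: "Drowsy", -2: "Light sedation", -3: "Moderate sedation",
    -4: "Deep sedation", -5: "Unarousable",
}

def delirium_screening_possible(rass_score: int) -> bool:
    """Deeply sedated or unarousable patients (RASS -4/-5) cannot be screened."""
    return rass_score >= -3
```

In practice, the RASS is assessed first; only if the patient is at least moderately arousable does a content-based screening tool such as the CAM-ICU follow.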
Diagnosis
The diagnosis of delirium in clinical routine was traditionally made on the basis of clinical observations, though it is now well recognized that these unstructured clinical observations are inadequate to detect such a complex disorder with sufficient diagnostic validity [27]. Although the DSM-5 criteria are the reference standard for the diagnosis of delirium, a psychiatrist or a well-trained expert is required for the proper assessment of these criteria, and thus they are not easily applicable in clinical routine. These limitations are due to the fact that the clinical presentation of delirium is heterogeneous, and especially patients presenting with the hypoactive subtype of delirium do not necessarily behave conspicuously [23]. This form of delirium, also known as silent delirium, is the most common subtype in ventilated patients [22]. Even patients with a mixed form of delirium, which is the most common subtype in non-ventilated patients, might be overlooked during hypoactive periods. Only the pure form of hyperactive delirium is easily recognizable, where the patients are agitated and arouse the attention of the staff. Hyperactive delirium comprises less than 5 % of all delirious patients [20, 22] (Fig. 40.2). In order to solve this predicament, several delirium screening tools have been developed.
Fig. 40.2
Subtypes of delirium. The distribution of delirium subtypes depends on health status. Mechanically ventilated ICU patients frequently present with the hypoactive form of delirium, which is characterized by anhedonia and decreased activity. On the peripheral ward, the mixed form of delirium, fluctuating between hypo- and hyperactivity over the course of the day, is predominant. Hyperactive delirium, despite being well known, is quite rare
There are several guidelines that recommend screening with a validated tool in critical care [28]. The “Clinical Practice Guideline for the Management of Pain, Agitation, and Delirium in Adult Patients in the Intensive Care Unit” shows the results of psychometric testing of the available tools as well as their scope of application [29, 30]. The most commonly applied scores for the diagnosis of delirium are the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) [31, 32] (Fig. 40.3), the Intensive Care Delirium Screening Checklist (ICDSC) [33], and the Nursing Delirium Screening Scale (Nu-DESC) [34] (Table 40.4). Although these screening tools were originally validated in critically ill patients [35], there are studies assessing their validity in the recovery room, suggesting that they may also be applicable in other settings [36–38]. These scores do not restrict the screening process to physicians but also allow routine staff members, such as nurses and physiotherapists, to identify POD with high diagnostic validity. This is an important aspect, as another key challenge for diagnosis is that patients are regularly transferred between different units following surgery (e.g., recovery room, ward, postanesthesia care unit). These short periods of observation by constantly shifting staff impair the ability to detect trends or acute changes in the patient’s state. Given the time period in which POD can arise, this highlights the importance of simple and universal tools for the assessment of POD throughout the entire perioperative period, allowing proper evaluation not only in the recovery room but also on the peripheral wards.
Fig. 40.3
Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) . (Used with permission from Wesley Ely. Copyright© 2013, E. Wesley Ely, MD, MPH, all rights reserved)
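The CAM-ICU shown in Fig. 40.3 follows a fixed decision rule: delirium is present when feature 1 (acute onset or fluctuating course) and feature 2 (inattention) are positive, together with either feature 3 (altered level of consciousness, RASS ≠ 0) or feature 4 (disorganized thinking). A minimal sketch of this logic follows; the function name and parameters are ours, and the error cutoffs (>2 for inattention, >1 for disorganized thinking) follow the published CAM-ICU worksheet but should be verified against the current training manual. It is an illustration of the rule, not a clinical tool.

```python
def cam_icu(rass, acute_or_fluctuating, attention_errors, thinking_errors):
    """Sketch of the CAM-ICU decision rule (illustrative, not for clinical use)."""
    if rass <= -4:                      # deep sedation/coma: not assessable
        return "unable to assess"
    feature1 = acute_or_fluctuating     # acute onset or fluctuating course
    feature2 = attention_errors > 2     # inattention (letter/picture test)
    feature3 = rass != 0                # altered level of consciousness
    feature4 = thinking_errors > 1      # disorganized thinking
    if feature1 and feature2 and (feature3 or feature4):
        return "CAM-ICU positive (delirium)"
    return "CAM-ICU negative"
```

Note how the rule makes hypoactive delirium detectable: a drowsy but arousable patient (e.g., RASS −1) with inattention already satisfies features 1–3 without any agitation.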
Table 40.4
Scores that have been evaluated in the recovery room and the peripheral ward
Recovery room | Peripheral ward |
---|---|
Nursing Delirium Screening Scale (Nu-DESC) | Nursing Delirium Screening Scale (Nu-DESC) |
Confusion Assessment Method (CAM) | Confusion Assessment Method (CAM)—short form |
Delirium Rating Scale (DRS-98) | Delirium Symptom Interview (DSI) |
Memorial Delirium Assessment Scale (MDAS) |
Staff training and validation setting remain important factors for the proper use of these diagnostic scores. Each of these tools has different requirements with respect to staff training, and studies have revealed that these scores drastically lose diagnostic power if used inadequately [39]. It is also important to account for the setting in which a score has been validated. As the current scores cannot simply be extrapolated from the intensive care unit to the recovery room, there has been a demand for evidence-based guidelines in other settings. The European Society of Anaesthesiology (ESA) has established a task force to formulate guidelines on delirium management for the postanesthesia period [40]. This guideline will provide the first recommendations specifically designed for the postoperative context.
Incidence
Identifying the incidence of delirium is an extremely challenging issue. The condition is chronically underdiagnosed, and the available data is usually specific to certain patient collectives or particular surgical procedures, so that projections vary drastically.
To exemplify the dependency on patient population, one meta-analysis assessing the incidence of delirium among elderly patients with hip fracture reported rates between 16.0 and 43.9 % [41]. Among patients undergoing cardiac surgery, reported incidences are relatively constant, lying between 45 and 50 % [42–44]. While about a quarter of major abdominal surgery patients are affected [45], elderly patients undergoing major abdominal and trauma surgery show incidences of about 40 % [46]. The highest incidences of delirium have been reported among medical critically ill patients, where almost every patient suffered from delirium during the ICU stay [47, 48].
These estimates underline the importance of an adequate and consistent diagnostic assessment.
Short- and Long-Term Consequences
With few exceptions, delirium was for decades belittled as a trivial side effect of surgery and anesthesia. During the 1990s, there was a growing amount of evidence regarding the ramifications of delirium. Although most of these early landmark studies were performed in the geriatric and critical care context, and not in the perioperative setting, these studies revealed nonetheless that delirium has severe implications [49, 50] and is by no means an inconsequential issue.
Studies revealed that delirious patients have a significantly higher risk of mortality [51] and that even the “dose of delirium” has an impact on this mortality risk [52]. Pisani and co-workers revealed that every additional day in delirium leads to a significantly increased probability of death (hazard ratio 1.1 for each day) [52]. In the critical care context, delirious patients require increased periods of mechanical ventilation, as well as an increased intensive care unit and hospital stay [53]. In 2010, Witlox and co-workers published a meta-analysis showing the association between delirium and long-term outcomes [2]. They focused on elderly patients (mean age ≥65 years), using seven studies to estimate the relationship between mortality and delirium, and found an increased long-term mortality (mean follow-up time of the studies was 22.7 months) with a hazard ratio of 1.95 for delirious patients . Additionally, they found that patients suffering from delirium were more often institutionalized (seven studies, OR 2.41) following discharge and had a higher risk of developing dementia (two studies, OR 12.52). Recently the rate of posttraumatic stress disorder following POD was also shown to be significantly increased [54].
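To illustrate the quantitative weight of this “dose of delirium,” the per-day hazard ratio reported by Pisani and co-workers can be compounded over several delirium days. The sketch below assumes the per-day effect multiplies under a proportional-hazards model; this compounding is a simplification introduced here for illustration, as the study itself reports only the per-day estimate.

```python
def cumulative_hazard_ratio(per_day_hr: float, days: int) -> float:
    """Under a proportional-hazards assumption, per-day hazard ratios
    multiply, so n delirium days scale the hazard by per_day_hr ** n."""
    return per_day_hr ** days

# With HR 1.1 per delirium day [52], five days of delirium correspond
# to a roughly 1.6-fold hazard of death under this assumption:
print(round(cumulative_hazard_ratio(1.1, 5), 2))  # 1.61
```

Even a seemingly modest per-day effect therefore accumulates quickly, which is one reason why shortening the duration of delirium, not merely preventing its onset, is a clinically meaningful target.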
Studies regarding the long-term consequences of delirium in the postoperative setting also showed an impact on mortality [2, 52, 55–57]. Furthermore, it was also found that POD has a severe detrimental influence on the long-term cognitive trajectory, although the measurements used to assess cognitive performance were heterogeneous among these studies [13, 55, 58]. This shows that postoperative delirium can signal and/or trigger the development of long-term postoperative cognitive dysfunction (s.f. 1.1) and ultimately heralds all the severe individual and socioeconomic consequences of this disorder.
Special attention must be given to the evidence gathered by the “International Study of Postoperative Cognitive Dysfunction (ISPOCD)” group, which specifically sought to assess long-term cognitive dysfunction related to surgery [3]. Among their study population of 1218 patients, the authors found high rates of early (25.8 %) and late (9.9 %) cognitive dysfunction. Cognitive test batteries were applied 1 week and 3 months after surgery, and in a follow-up of the study (over a mean of more than 8 years), an association was found between late POCD and higher mortality (OR 1.63). Interestingly, they also found an increased probability of early departure from the labor market: the odds ratio for leaving the labor market prematurely was higher for patients with early POCD (OR: 2.26), who also received social transfer payments more often [3, 59]. Although the original ISPOCD publication did not primarily account for delirium, the results nevertheless highlight that factors impairing long-term cognitive function, such as delirium, should be carefully monitored and prevented.
In summary, there is robust evidence that delirium significantly increases cognitive and noncognitive morbidity, as well as mortality, irrespective of the observed collective.
Pathophysiology of Delirium
Despite considerable research, the pathogenesis of delirium remains elusive. As several conditions can lead to delirium, it is likely that no single cause, but rather numerous distinct mechanisms, conjoin in a final common pathway that induces cerebral dysfunction.
Inflammatory Pathway
There is mounting evidence that the common pathway leading to delirium is an activation of microglia cells—cerebral immune cells with the capacity to launch local reactions—with subsequent neuroinflammation [4, 60, 61].
Peripheral inflammation can have a profound influence on the brain through the dissemination of cytokines—namely, interleukin 1β, interleukin 6, and tumor necrosis factor-α (TNF-α) [61]. These proinflammatory cytokines are released by trauma, surgery, or infection, initiating a systemic response that also activates microglia in the central nervous system (Fig. 40.4). It is important to note that this communication is not limited to humoral processes, but can also occur directly via afferent neural pathways. Microglial cells are extremely sensitive to a variety of coexisting factors, so that a previous insult can prime these cells and a relatively mild subsequent insult could trigger an exponential reaction [4].
Fig. 40.4
Model of delirium genesis. While the full pathogenesis of delirium is still not clear, some initiating pathways have been identified. Microglial cells are inhibited by the neurotransmitter acetylcholine, which keeps them in a resting state. In case of trauma, surgery, or infection, the proinflammatory cytokines TNF-α, interleukin 6, and interleukin 1β are disseminated. Via the humoral pathway, these cytokines cross the blood-brain barrier and activate the microglial cells. Proinflammatory cytokines at lower concentrations can also be sensed by afferent nerves. Activated microglia cause local inflammatory effects, such as changes in astrocyte tight junctions and influences on neuronal function. This results in changes in awareness, attention, and behavior and, in the worst case, delirium. Long-term damage can be explained by an over-activation of pre-damaged microglial cells
Willard et al. showed that a peripheral injection of lipopolysaccharide in rats could trigger both acute and chronic neuroinflammation [62]. The levels of TNF-α, which has an established role in microglial activation, rose considerably in the periphery and in the brain, and the levels in the brain remained elevated for months thereafter [63]. The effects of neuroinflammation through cytotoxic agents are not only acute but can also persist, owing to structural damage to synapses and neuronal apoptosis. The chronic inflammation in Willard’s experiment induced a time-dependent, not dose-dependent, loss of both choline acetyltransferase-positive neurons (responsible for acetylcholine synthesis) and p75-immunoreactive cells (responsible for the inhibition of apoptosis).
The cholinergic anti-inflammatory pathway, as presented by Tracey et al., shows that cholinergic inhibition of these inflammatory processes is key to limiting the extent of the reaction, thus avoiding an exaggerated response with excessive inflammation [61]. The role of this cholinergic inhibition has been well established: stimulation of the vagus nerve suppresses the inflammatory response, a vagotomy exacerbates cytokine release [64], and microglia themselves are deactivated in the presence of the parasympathetic neurotransmitter acetylcholine [65]. This cholinergic inhibition can be hampered by a variety of factors, such as medication (e.g., anticholinergic drugs, benzodiazepines), preexisting conditions (e.g., dementia, substance withdrawal), previous inflammation (prior structural damage, priming of microglia), or simply old age, predisposing the brain to delirium.
As postulated by van Gool, an additional insult to an already predisposed brain allows microglia—now unchecked by the cholinergic pathway—to become abnormally active, releasing cytokines that activate further microglia, thus entering a vicious cycle by triggering a sustained local inflammation with subsequent neurodegeneration, with further damage to cholinergic pathways [4]. This uncontrolled neuroinflammation, with neurochemical and synaptic disturbances, can explain the behavioral effects, as well as short- and long-term consequences of delirium and POCD. Additionally, this provides a plausible explanation as to the roles of many recognized risk factors, such as advanced age and the use of anticholinergic drugs, in the genesis of delirium.
Metabolic Factors
Metabolic disorders also appear to play a significant role in the pathophysiology of delirium. Aging, neurological maladies, as well as diabetes and hyperglycemia seem to predispose the development of cognitive dysfunction.
The involvement of diabetes is not surprising, as this condition is known to induce vascular, sensory, and cognitive complications. Hyperglycemia is also known to affect a wide range of structures, such as the blood-brain barrier and synaptic connections, as well as to directly increase the release of cytokines [66]. Coupled with neurotoxicity and impaired circulation, the scope of the proinflammatory properties of diabetes becomes evident. Tight perioperative glycemic control has also been shown to have protective effects against POD/POCD.
Sedatives and Neurotoxicity
Indirect effects of anesthesia, such as sedation and neurotoxicity, must also be considered in the genesis of delirium.
Sedation-related delirium, known in the ICU context to be rapidly reversible, illustrates this mechanism: every hypnotic agent, or agent with sedative side effects, interacts with GABA and NMDA receptors. Interaction with these receptors leads to an inhibition of neuronal activity, affecting attention, qualitative and quantitative consciousness, and cognition [67]. By intermittently switching a perfusion pump on and off, an acute onset and a fluctuating level of attention and/or consciousness over the day are easily produced, ultimately fulfilling all DSM-5 criteria for delirium. This form of delirium is a reaction to the termination of sedation and is thus also related to emergence delirium.
Preclinical experimental work raised growing concerns regarding cognitive and behavioral impairments due to anesthesia. Several experimental trials established a link between time in anesthesia and dose-dependent calcium dysregulation and neuroapoptosis in growing mice brains [68, 69]. This effect has also been shown for surgery, surgical stress, and neuroapoptosis [70]. The significance of these animal studies for humans is still unclear, and there are ongoing prospective clinical trials aiming to clarify this issue.
Depth of Anesthesia
The role of depth of anesthesia must also be considered in the context of delirium pathophysiology. Extended periods of deep anesthesia, as expressed by the occurrence and duration of a burst suppression pattern in EEG monitoring, are associated with postoperative delirium [71–74]. A burst suppression pattern represents a massive reduction of central activity and of the neuronal metabolic rate. Deep anesthesia may disturb neuronal homeostasis, with POD as a detrimental complication. Studies published by Monk et al. showed that cumulative time in burst suppression in noncardiac surgery patients significantly increased mortality within a 1-year period [75]. These results advocate that EEG monitoring of anesthesia depth should be employed routinely, especially when dealing with more vulnerable populations such as infants and elderly patients [76].
Avoiding Anesthesia-Related Risks and Preventing Delirium
Assessing the Risk for Delirium in the Perioperative Setting
The individual risk of POD is determined by predisposing and precipitating risk factors, as suggested in a risk model established in the late 1990s [50].
Predisposing factors are generally preexisting conditions that place the patient at an increased risk for the development of delirium. There are numerous predisposing risk factors that have been identified in the surgical context, including cognitive impairment, diabetes, anemia, history of stroke, previous delirium, as well as advanced age (Fig. 40.5).
Fig. 40.5
Predisposing risk factors for POD. Several predisposing risk factors have been identified in association with POD. Immobility, sensory deficits, diabetes mellitus and malnutrition, frailty, atrial fibrillation, and polymedication belong to these risk factors, as do alcohol and/or benzodiazepine use disorders. Early identification of individuals at risk for POD enables focused special care and treatment
Precipitating factors are triggers for delirium in a specific treatment framework, developing in the context of the medical treatment. Though these may be modifiable under certain circumstances, they are not always avoidable. Precipitating factors include, for example, the use of drugs with anticholinergic activity, burst suppression rate and duration under anesthesia, a prolonged period of fluid fasting, as well as poorly managed postoperative pain (Fig. 40.6).
Fig. 40.6
Precipitating factors for POD. These are factors the patient encounters during medical treatment. Whenever possible, precipitating factors should be minimized or avoided. Reduction of precipitating factors contributes to effective POD prevention
A special consideration must be given to advanced age, as it is, from a quantitative point of view, an important and frequently reported risk factor [2, 3, 17, 37, 50, 77, 78]. However, current evidence suggests that chronological age should not be viewed strictly as a risk factor, but rather as a surrogate marker for comorbidity, multimorbidity, and a loss of functional reserve [79, 80]. Undoubtedly, both comorbidity and functional impairment are more often present with advanced age, but these factors are surely not limited to elderly patients. There is a high heterogeneity among the older population, which requires a detailed assessment in order to properly estimate the overall risk [81]. The reduction of the “functional cognitive reserve” is one of the most important factors to be considered in elderly patients. This means that fewer or less severe precipitating factors might suffice to induce delirium in these patients.
This increased vulnerability is not exclusively related to comorbidities like dementia (which can indeed occur at any age), but rather to the loss of physiological functions. Functional status includes several domains, such as cognition, sensory function, mobility, and nutrition [49]. "Frailty" indicates a severely impaired functional status that is not limited to one organ system but rather denotes a systemic condition. It is critical to understand that not all elderly patients are frail, but only a fraction (typically between 5 and 30% in the general population) [82]. In the in-hospital surgical setting, it is estimated that about half of elderly patients suffer from frailty (Fig. 40.7) [78, 80].
Fig. 40.7
Predisposing risk factor age as a surrogate for frailty. Considering the wide range of physiological status in the elderly, many seniors are healthy and participate in all aspects of social life. Others suffer from comorbidities and polymedication; their usual activities of daily living are impaired, using a mobile phone or driving is too challenging for them, and they cannot manage their nutrition or finances. These elderly are frail and need more medical support. Frailty is a known risk factor for POD and POCD
Finally, the role of age as a predisposing condition is explained by the higher likelihood of an accumulation of age-related risk factors linked to comorbidity and functional impairment. Therefore, a detailed functional assessment in elderly patients is of utmost importance. This includes functional tests focusing on mobility and coordination, such as the "timed up and go" test; cognitive screening with validated tools, such as the Mini-Mental State Examination (MMSE); and a detailed medical history accounting for malnutrition, sensory impairments such as vision and hearing loss, as well as psychiatric disorders and comorbidities (Table 40.5) [83].
Table 40.5
Possible frailty assessment tools
Functional domain | Test examples |
---|---|
Cognitive impairment, dementia | • Mini-Mental State Examination (MMSE) • Clock completion test (Watson) |
Mobility and risk of falls | • "Timed up and go" test |
Handgrip strength as surrogate for general muscle strength | • Dynamometer |
Hand-eye coordination, fine motor skills, and motor processing | • Grooved pegboard • Trail making test A & B |
Activity of daily living | • Barthel Index • Instrumental Activities of Daily Living (Lawton and Brody) |
In clinical routine, it would be desirable to use validated tools that predict the actual risk of developing POD for each individual patient. Several risk prediction models have been developed in different contexts. A recent systematic review and meta-analysis identified 37 risk prediction models for POD, of which only seven had been either internally or externally validated [84].
While such models might indeed be more suitable for risk evaluation than individual clinical judgment [85], there is currently no risk prediction model that adequately covers all patients. Until a comprehensive risk model becomes available, risk should continue to be assessed on an individual basis in the clinical setting (Fig. 40.8).
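Most published POD risk prediction models of this kind are, at their core, logistic regression scores over a handful of binary predictors. As a purely illustrative sketch, the coefficients, intercept, and predictor names below are invented for demonstration and are not taken from any validated model:

```python
import math

# Hypothetical logistic POD risk score -- coefficients and predictors are
# invented for illustration only, not from any validated clinical model.
COEFS = {"age_over_70": 1.1, "frailty": 1.4,
         "prior_delirium": 1.8, "hearing_impairment": 0.6}
INTERCEPT = -3.5

def pod_risk(features):
    """Map binary risk factors (0/1) to a probability via the logistic function."""
    z = INTERCEPT + sum(COEFS[name] for name, present in features.items() if present)
    return 1.0 / (1.0 + math.exp(-z))

low = pod_risk({"age_over_70": 0, "frailty": 0,
                "prior_delirium": 0, "hearing_impairment": 0})
high = pod_risk({"age_over_70": 1, "frailty": 1,
                 "prior_delirium": 1, "hearing_impairment": 1})
print(round(low, 3), round(high, 3))  # → 0.029 0.802
```

The structure mirrors the validation problem discussed above: such a score is only as good as the cohort its coefficients were fitted on, which is why external validation matters.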
Fig. 40.8
POD prevention. To reduce harm and costs, POD prevention should be a focus in all patients, and especially in those identified as being at risk of POD. In the surgical context, the actions can be divided into pre-, intra-, and postoperative measures. POD prevention is multi-professional: from the ward nurse attending to sensory aids and fluids, to the surgeon choosing the adequate procedure, to the anesthesiological management of blood pressure, analgesia, and anesthetic agents, every link in the medical chain needs to be aware of POD prevention
Anesthesia and Delirium
Postoperative delirium was first attributed to anesthesia rather than to the surgical procedure itself. In their article from 1955, Bedford and Leeds attributed the risk of a patient developing delirium solely to the general anesthesia [1]. In the later phase of POD research, it became evident that it is rather the inflammatory stress induced by the surgical procedure that causes POD. Nevertheless, anesthesia and anesthesia-related factors are important aspects that can either place patients at risk or exert protective effects. The most important factors in this context are the type of anesthesia, hemodynamic management, neuromonitoring, and postoperative pain management [71, 86–88].
EEG Monitoring and Delirium
Monitoring brain activity via electroencephalography (EEG) during the administration of phenobarbital was first reported by Berger in 1931, who described systematic changes comparable to those associated with sleep stages [89]. Even though EEG recording is the most feasible approach for tracking brain states under general anesthesia, it is still not part of routine practice in anesthesiology. Instead of a thorough analysis, a single number is derived from frontal EEG recordings, intended to represent the level of consciousness [90–92]. When compared with the non-EEG-based standard of monitoring, where depth of anesthesia is judged from changes in heart rate, blood pressure, and muscle tone, this simplistic approach has been shown to be ineffective in reducing the incidence of intraoperative awareness [93]. Furthermore, these indices are less reliable in pediatric populations, since they have been developed exclusively from cohorts of adult patients [94].
Nevertheless, the widely used EEG-based indices for monitoring depth of anesthesia assume that the same index value defines the same level of consciousness for all anesthetics and for patients of all ages. Since different anesthetics are known to interact with different molecular targets to induce changes in neuronal circuits, it is important to develop methods for more detailed analysis of the raw EEG data. Slower EEG oscillations are generally assumed to indicate a more profound state of general anesthesia. Ketamine and nitrous oxide, however, commonly induce faster EEG oscillations and, therefore, generally produce increased EEG-derived indices [95, 96]. These elevated EEG indices are misleading with regard to the level of consciousness, frequently leading clinicians to doubt the EEG index reading. In contrast, dexmedetomidine can produce profound slowing of EEG oscillations, leading to low EEG indices, although patients can still easily be aroused [97].
Nonetheless, it is important to note that such index-based EEG neuromonitoring has been shown to decrease the risk of developing POD in several large randomized studies [71, 98, 99], which found that depth of anesthesia is one of the main risk factors contributing to the incidence of POD and POCD. In these three large randomized trials, elderly patients undergoing elective surgery were included and randomized either to a "BIS-guided" group or a "BIS-blinded" group. In the "BIS-guided" group, the anesthetist was allowed to use the BIS data to guide anesthesia, whereas in the "BIS-blinded" group, the BIS monitor was covered and patients received routine care during anesthesia. All studies reported a decrease in the incidence of POD, as excessive depth of anesthesia could be avoided by processed EEG guidance. Furthermore, the frequency and duration of burst suppression during anesthesia correlated significantly with the POD incidence [71, 100]. These data clearly suggest the potential of EEG as a monitoring tool in anesthesia.
Furthermore, BIS monitoring used in combination with TCI systems during anesthesia leads to a significant reduction of administered hypnotics and opioids [101]. These results underline the potential of EEG monitoring as a primary parameter to define unconscious states during anesthesia. In a similar study using TCI systems for propofol anesthesia, additional monitoring using auditory evoked potentials led to less patient movement and better sedation [102]. These results further underline the potential of neuromonitoring in anesthesia.
Neuromonitoring alone is not inferior to a combination of TCI with neuromonitoring. TCI systems may be particularly useful for inexperienced personnel (e.g., non-anesthetist sedation providers) or in settings where neuromonitoring is not available or not implemented. TCI algorithms can automate dosing, thus reducing the blood pressure swings caused by bolus-wise administration of anesthetics.
The term burst suppression describes an electroencephalographic (EEG) pattern consisting of a continuous alternation between high-voltage slow waves and depressed electrographic activity. It is observed in various conditions such as coma, cerebral anoxia, cerebral trauma, drug intoxication, encephalopathy, hypothermia, and deep anesthesia. The presence of an ongoing oscillation in subcortical structures (hippocampal neurons) during cortical isoelectric periods has been noted [103]. Importantly, it has been shown that a propofol-induced burst suppression state is associated with cortical hyperexcitability and that the bursts are triggered by subliminal stimuli reaching the hyperexcitable cortex [104]. Bursts triggered by propofol anesthesia can be asynchronous across the cortex and may even occur in a limited cortical region while other areas maintain ongoing continuous activity, indicating that different cortical and subcortical circuits express different sensitivities to high doses of anesthetics [105, 106].
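Quantitatively, burst suppression is often summarized as a burst suppression ratio (BSR): the fraction of time the EEG amplitude stays below a threshold for longer than a minimum duration. A minimal sketch of that computation on a synthetic trace is shown below; the 5 µV threshold and 0.5 s minimum duration are illustrative choices, not clinically validated parameters:

```python
import numpy as np

def burst_suppression_ratio(eeg, fs, threshold_uv=5.0, min_suppression_s=0.5):
    """Fraction of the trace spent in suppression: amplitude below the
    threshold for at least min_suppression_s. Threshold and minimum
    duration are illustrative, not validated clinical values."""
    suppressed = np.abs(eeg) < threshold_uv
    min_len = int(min_suppression_s * fs)
    total = run = 0
    for is_supp in suppressed:
        if is_supp:
            run += 1
        else:
            if run >= min_len:  # count only sufficiently long runs
                total += run
            run = 0
    if run >= min_len:  # trailing run at end of trace
        total += run
    return total / len(eeg)

# Synthetic example: 2 s of high-amplitude "burst" activity (in µV)
# followed by 2 s of near-isoelectric suppression.
fs = 250
rng = np.random.default_rng(0)
burst = 50.0 * rng.standard_normal(2 * fs)
supp = 1.0 * rng.standard_normal(2 * fs)
eeg = np.concatenate([burst, supp])
bsr = burst_suppression_ratio(eeg, fs)
print(round(bsr, 2))  # ≈ 0.5, since half the trace is suppressed
```

Commercial monitors compute BSR in essentially this spirit, though with proprietary filtering and artifact rejection on top.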
During anesthesia, awareness is suppressed by hypnotics, whereas arousal is mainly attenuated by analgesics. There is currently no single EEG-derived parameter that can effectively define the level of arousability. EEG activity during anesthesia is influenced by many factors, the primary ones being age and the choice of anesthetics and analgesics used. If EEG monitoring were further developed to account for such influencing parameters, such as age and the anesthetics used, in the index calculation, it could well become an ideal tool for monitoring the level of consciousness.
Age-Related Changes
It is of interest to note that elderly patients are more likely to experience burst suppression during anesthesia [107]. Age-related EEG changes during anesthesia with propofol and sevoflurane concern EEG power and coherence. EEG power is defined as a function of EEG wave amplitudes and frequencies, whereas coherence can be seen as a frequency-dependent correlation, or a measure of synchrony, between two signals at the same frequency (e.g., the alpha-band) in different regions of the brain. Purdon and colleagues [107] examined 155 patients, aged 18–90 years, receiving either propofol or sevoflurane anesthesia. For both anesthetics, they found a marked reduction in EEG signal power across all frequency bands (α = 8–12 Hz, β = 13–35 Hz, θ = 4–7 Hz, δ < 4 Hz, γ > 35 Hz) with increasing age. The effect was most pronounced in the alpha-frequency band, where they found a loss of coherence as well as a lower peak coherent frequency in elderly patients compared to younger patients. They proposed that the age-related EEG power reduction might be caused by a decline in synaptic density, changes in dendritic dynamics, or a decline in neurotransmitter synthesis within the cortex. The frontal alpha-band changes are thought to be mediated through the frontal GABA (gamma-aminobutyric acid) thalamocortical circuits [108], so these age-related changes might reflect a functional alteration in the GABA-dependent fronto-thalamocortical circuits. In the propofol group, these changes were more pronounced, which can be related to the different underlying molecular mechanisms of the two drugs: while propofol acts primarily as an agonist at the GABA receptor, sevoflurane and other inhalational anesthetics additionally inhibit NMDA receptors [109].
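The two quantities discussed above, spectral power and coherence, are standard signal-processing measures. As a minimal sketch on synthetic data (a shared 10 Hz alpha oscillation plus independent noise per channel; not real EEG), they can be estimated with Welch's method:

```python
import numpy as np
from scipy.signal import welch, coherence

# Synthetic two-channel "EEG": shared 10 Hz alpha rhythm + independent noise.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
alpha = np.sin(2 * np.pi * 10 * t)
ch1 = alpha + 0.5 * rng.standard_normal(t.size)
ch2 = alpha + 0.5 * rng.standard_normal(t.size)

# Power spectral density of one channel (Welch's method, 2 s segments)
f, pxx = welch(ch1, fs=fs, nperseg=2 * fs)
peak_freq = f[np.argmax(pxx)]  # dominant frequency; lands in the alpha band

# Coherence between the two channels, evaluated at the 10 Hz alpha peak
f2, cxy = coherence(ch1, ch2, fs=fs, nperseg=2 * fs)
alpha_coherence = float(cxy[np.argmin(np.abs(f2 - 10))])
print(peak_freq, round(alpha_coherence, 2))
```

Because the 10 Hz component is shared between channels, coherence at that frequency is close to 1, which is the sense in which Purdon and colleagues report a "loss of coherence" with age: the shared alpha drive weakens relative to channel-specific noise.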
Propofol-Related Changes
Propofol binds postsynaptically to GABA receptors, hyperpolarizing postsynaptic neurons and thus leading to inhibition [109]. EEG signatures of propofol-induced loss of consciousness show an increase in low-frequency power, a loss of coherent occipital alpha-oscillations, and the appearance of coherent frontal alpha-oscillations, which are reversed upon regaining consciousness [110]. Additionally, there is a disruption of frontoparietal feedback connectivity, which also recovers upon regaining consciousness (Fig. 40.9) [111].
Fig. 40.9
EEG neuromonitoring during propofol anesthesia. Intraoperative frontal EEG recording with slow oscillations. Upper screen: the raw EEG shows theta-delta activity during deep sedation with propofol. Lower screen right: spectrogram during propofol-induced unconsciousness with an increase of power in the low-frequency alpha-band and mainly in the theta and delta bands. Lower screen left: EEG-derived index D1 or 51
Dexmedetomidine-Related Changes
Dexmedetomidine is an alpha-2-adrenoceptor agonist that gives rise to similarly slow oscillations and spindle-like activity during sedation [112]. In contrast to propofol, dexmedetomidine is clinically known to induce a sedation state comparable to non-rapid eye movement sleep, in which patients can easily be aroused by verbal or tactile stimuli. Both anesthetics are associated with slow/delta oscillations during induced unconsciousness, although the amplitude of the slow wave oscillations is much larger under propofol anesthesia. Similar to sleep spindles, dexmedetomidine-induced unconsciousness triggers spindles with maximum power and coherence at 13 Hz, in contrast to propofol, where the peak frequency is at 11 Hz [97]. The authors propose that propofol enables a deeper state of unconsciousness, as seen in the large-amplitude slow wave oscillations, whereas dexmedetomidine places patients into a lighter, more rousable state of sedation.
Ketamine-Related Changes
Ketamine acts primarily via inhibition of NMDA receptors, inducing a "dissociative anesthesia" [109]. This difference in molecular interaction can be seen in EEG analysis, where a reduction in alpha-power as well as an increase in gamma-power can be noted. Despite the molecular and neurophysiological differences from the other major classes of anesthetics, frontoparietal feedback connectivity was gradually diminished during induction of ketamine anesthesia and was inhibited after loss of consciousness [113].
Target-controlled infusion (TCI) systems have been developed for intravenous drugs: a set of pharmacokinetic parameters is selected for computer simulation of an infusion scheme. The selected model is incorporated into the infusion pump, where it is used to predict the drug concentration in the plasma and at the drug target site. This makes it possible to calculate the concentration of anesthetic needed to reach an unconscious state. Depth of anesthesia, or the unconscious state during anesthesia, may be defined as the probability of nonresponse to stimulation [114] and is, therefore, dependent on the intensity of the stimulus as well as the nature of the response. Anesthesia may well induce unresponsiveness and amnesia, but the extent to which it causes unconsciousness remains uncertain [109].
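The prediction step inside a TCI pump is a multi-compartment pharmacokinetic simulation. The sketch below integrates a generic three-compartment model with an effect-site compartment for a constant infusion; the rate constants and central volume are invented stand-ins for illustration, not a validated clinical model such as Marsh or Schnider, and must not be used for dosing:

```python
# Illustrative three-compartment rate constants (per minute) and central
# volume -- hypothetical values, NOT a validated clinical model.
k10, k12, k13 = 0.12, 0.11, 0.04   # elimination and central->peripheral
k21, k31, ke0 = 0.06, 0.003, 0.26  # peripheral->central, effect-site rate
v1 = 16.0                          # central compartment volume (L), hypothetical

def simulate(infusion_mg_per_min, minutes, dt=0.01):
    """Euler integration of drug amounts in the central (a1) and two
    peripheral compartments (a2, a3), plus effect-site concentration ce."""
    a1 = a2 = a3 = ce = 0.0
    for _ in range(int(minutes / dt)):
        c1 = a1 / v1  # plasma concentration (mg/L)
        da1 = infusion_mg_per_min - (k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3
        da2 = k12 * a1 - k21 * a2
        da3 = k13 * a1 - k31 * a3
        dce = ke0 * (c1 - ce)  # effect site equilibrates toward plasma
        a1 += da1 * dt
        a2 += da2 * dt
        a3 += da3 * dt
        ce += dce * dt
    return a1 / v1, ce

cp, ce = simulate(infusion_mg_per_min=10.0, minutes=10.0)
print(cp > ce)  # during onset the effect site lags the plasma → True
```

A real TCI pump runs this prediction in reverse: given a target plasma or effect-site concentration, it solves for the infusion profile that reaches and maintains it.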
Therefore, it may be more reliable in the future to focus on parameters that are directly related to the level of consciousness, such as the EEG, in order to monitor depth of anesthesia/unconscious state. In order to ensure more reliable results, it is important to use EEG signatures that are specific to age and to the anesthetic of choice.
To measure “arousability,” which is the balance between noxious stimulation and nociceptive suppression by analgesics, it is necessary to analyze responses evoked by a strong painful stimulus. Since analgesics primarily act at a subcortical level, it seems appropriate to assess electrophysiological reflexes at the subcortical/spinal level to achieve this goal.
Target-Controlled Infusions and Delirium
Target-Controlled Infusions
In the last decades, the availability of short-acting drugs with a high degree of performance prediction, such as propofol [115], allowed for the development of novel approaches to anesthesia. Aside from total intravenous anesthesia (TIVA) , these advancements allowed for the development of target-controlled infusion (TCI) systems , which employ multi-compartment pharmacokinetic models to predict anesthetic doses [116, 117], as described in Chaps. 6 and 8.
Since the first computer-based infusions were developed in the 1980s [118, 119], increasingly more accurate and reliable models have expanded the prospects of TCI, and today there are regimens available for the drug delivery of several substances, such as sedatives, analgesics, antiarrhythmics, antibiotics, and chemotherapeutics (see Chap. 8).
Anesthesia is attained by a balanced combination of hypnotics and analgesics, where the ideal dosage is generally estimated using vegetative signs as surrogate markers, such as heart rate and blood pressure fluctuations. Monitoring the depth of anesthesia in this fashion is challenging and has several limitations, so considerable skill is needed to properly recognize and interpret stress signals across a broad patient collective. While an insufficient depth of anesthesia increases the risk of awareness and subsequent complications [93, 120], excessive anesthesia, expressed as burst suppression patterns in the EEG, is associated with POD [72, 100]. Therefore, inadequate dosing of these substances on either side of the target range can have severe effects on patient outcome [40]. There are promising new monitoring approaches, such as the noxious stimulation response index (NSRI) [121] and the surgical pleth index (SPI) [122, 123], which may be helpful in future anesthetic assessments.
By properly defining target doses, TCI offers an elegant solution to this predicament, with the potential to decrease complication rates and improve patient outcome. One major limitation of the most commonly used TCI algorithms is that they were designed for the use of one opioid (mostly remifentanil) and one hypnotic (mostly propofol). Usually, the target plasma concentrations do not allow an automatic correction for co-analgesics and co-hypnotics, so these have to be adjusted manually. Especially for patients at high risk for delirium, these co-substances might play an important role by blocking proinflammatory pathways, thus providing beneficial effects. This can explain the protective effect of iv agents (e.g., ketamine, dexmedetomidine) and also their sparing effect on hypnotics that carry a considerable risk of burst suppression (e.g., propofol).
When using a TCI model in a patient at significant risk for POD, the additional use of EEG-based monitoring is recommended in order to avoid excessive anesthesia; to better titrate analgesia, avoiding both pain and opioid overdosing with its anticholinergic side effects; and to allow for manual adjustments of the target concentrations, if necessary.
Background Information: Pediatrics
Children have a particularly high risk of experiencing inadequate emergence (IE) after surgery [7, 9]. IE can be further divided into pediatric emergence delirium (paedED) and emergence agitation (EA). EA, which occurs more frequently than paedED [40], is a behavioral disturbance presenting as excessive motor activity caused by discomfort, pain, or anxiety [124].
In the case of paedED, however, all the DSM-5 criteria for delirium are entirely fulfilled. This complication usually affects preschool children following anesthesia with sevoflurane and is associated with consecutive maladaptive behavioral changes [125]. Thus, prevention is essential.
The employment of TCI in pediatrics is promising, but remains limited [126, 127]. Intravenous anesthesia has been shown to have beneficial effects on children, including a decrease in the rate of paedED, and postoperative nausea and vomiting (PONV) [126]. Reduced rates of paedED have also been shown under premedication with midazolam [128–130] or alpha-2 agonists [131–134]. Intraoperative application of propofol and ketamine and adequate perioperative pain management also provide protection [135, 136]. The development of proper TCI models for pediatric patients, however, has been hindered due to essential differences in pharmacokinetics and pharmacodynamics, as well as limitations on anesthesia monitoring in this patient collective [126, 127, 137].
TCI Substances and Delirium
Propofol
Propofol is a highly lipid-soluble substance that readily permeates biomembranes such as the blood-brain barrier, so that the onset of anesthetic effect is determined essentially by circulation time. This property also allows for rapid redistribution to the periphery, so that patients recover quickly from anesthesia.