Simulation in General Surgery



Fig. 23.1
Surgery residents and attendings practicing laparoscopic colectomy on a cadaver model




Physical Part-Task Trainers


Popular part-task trainers used in general surgery include a variety of models that allow training in chest tube placement, cricothyroidotomy, central line placement (subclavian, jugular, and femoral), arterial line placement, thoracentesis and paracentesis, ultrasound techniques, and several biopsy techniques. Examples of these products include the Focused Abdominal Sonography in Trauma (FAST) Exam Real Time Ultrasound Training Model (Blue Phantom, Redmond, WA, USA), Arterial Puncture Wrist (Limbs & Things, Bristol, UK), IOUSFAN (Kyoto Kagaku, Kyoto, Japan), SimMan (Laerdal, Wappingers Falls, NY, USA), TraumaMan (see Fig. 23.2), CentraLineMan, and FemoraLineMan (Simulab Corporation, Seattle, WA, USA). Home-made models are also commonly used to address skills for which no commercial product is available. Examples include, but are not limited to, a laparotomy model that uses foam, bubble wrap, plastic wrap, and various fabrics to inexpensively replicate the peritoneal contents and abdominal wall [6], which has been adapted in the ACS/APDS surgical skills curriculum [7]; an abscess model using mock purulent material injected into a chicken breast [8]; and a laparoscopic common bile duct exploration model using vesical catheters [9].



Fig. 23.2
TraumaMan (Simulab Corporation, Seattle, WA, USA) allows trainees to practice cricothyroidotomy, chest tube insertion, pericardiocentesis, needle decompression, percutaneous tracheostomy, diagnostic peritoneal lavage, and IV cutdown (Photo from authors’ collection)

For training in laparoscopy, the most commonly used and widely available simulators are realistic part-task trainers, also known as benchtop, video, box, or pelvic trainers. These trainers were developed to provide inexpensive and reproducible training in laparoscopy and generally include a confined space (box) that resembles the abdominal cavity, an imaging system (video camera, light source, and monitor), access ports, and laparoscopic equipment (see Fig. 23.3). The first such task trainer was developed for the Yale Top Gun laparoscopic skills and suturing course and included three tasks: the rope pass, cup drop, and triangle transfer drills [3]. Several other models and tasks have been developed subsequently, with variable market penetration. The Southwestern stations were an expansion of the Top Gun tasks and included a suturing task with foam and a task for placing numbered and lettered blocks on a checkerboard [10]. Laparoscopic models for simulation include the Fundamentals of Laparoscopic Surgery (FLS) Laparoscopic Trainer Box (Venture Technologies, North Billerica, MA, USA), Portable Laparoscopic Task Trainers (Ethicon, Somerville, NJ, USA), Helago Laparoscopic Trainer (Limbs & Things, Bristol, UK), and the Minimally Invasive Training System (3-Dmed, Franklin, OH, USA).



Fig. 23.3
Surgery resident practices laparoscopic suturing on a box trainer

The Fundamentals of Laparoscopic Surgery (FLS) program deserves special mention. This program was developed under the auspices of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) and the American College of Surgeons (ACS) and was based on the prior work of McGill University researchers [11–14]. FLS is an Internet-based program designed to provide and verify the fundamental skills and knowledge necessary for effective and safe laparoscopic surgery and includes knowledge and skills components. It includes modules on preoperative, intraoperative, and postoperative considerations during basic laparoscopic procedures and potential complications, as well as manual skill practice on five tasks: peg transfer, precision cutting, placement of a ligating loop, and suturing using extracorporeal and intracorporeal knot tying [15] (see Fig. 23.4). This program has undergone rigorous validation [16] and is currently available to all general surgery residency programs in the USA through an industry-supported grant [17]. Importantly, general surgery residents are now required to obtain FLS certification to be eligible to take the qualifying examination for the American Board of Surgery [16]. This is the first inclusion of simulation as a component of board certification in general surgery.



Fig. 23.4
Screenshot of intracorporeal knot tying on a box trainer


Virtual Reality Part-Task Trainers


Compared with realistic simulators, virtual reality (VR) simulators offer different advantages to the learner. These systems are configurable (different levels of difficulty), allow for multiple anatomic variations to simulate pathology and aberrant anatomy [3], and enable repetitive practice of procedures at minimal cost (i.e., the same task/procedure can be performed an infinite number of times without the need for supplies or disposables). Additionally, VR simulators do not require the presence of an instructor as they often provide built-in tutorials and multiple metrics that can be used for learner performance assessment and feedback. Their disadvantages include high acquisition and maintenance costs, the need for periodic software and hardware updates, suboptimal realism, and the potential for learning bad habits in the absence of an instructor to give feedback.

The first and best validated VR simulator, the Minimally Invasive Surgical Trainer-Virtual Reality (MIST-VR; Mentice, Göteborg, Sweden), is currently available as part of LapSim. Popular VR systems include the LapMentor (Simbionix, Cleveland, OH, USA), CAE LaparoscopyVR (CAE Healthcare, Montreal, Canada), and LapSim (Surgical Science, Göteborg, Sweden) and offer a variety of laparoscopic procedures that extend beyond general surgery. Virtual reality simulators also exist for training in flexible endoscopy including upper endoscopy, colonoscopy, and bronchoscopy and are addressed in other chapters. Examples include the CAE EndoscopyVR Simulator (CAE Healthcare, Montreal, Canada), the GI Mentor (Simbionix, Cleveland, OH, USA), the Surgical Science colonoscopy simulator (Surgical Science, Göteborg, Sweden), and the Endo TS-1 (Olympus Keymed, Southend, UK). Basic endoscopic skills as well as biopsy, polypectomy, and bleeding control techniques can be practiced on these devices. At the time of writing this chapter, SAGES is developing the Fundamentals of Endoscopic Surgery (FES), which is a VR-based program similar to FLS that aims to teach and assess the endoscopic skills of surgery residents. The authors anticipate that FES, like FLS, will eventually become an integral part of the general surgery resident curriculum.

Other VR systems available for training in general surgery involve endovascular techniques and procedures. Systems such as the Procedicus VIST (Mentice, Göteborg, Sweden), ANGIO Mentor (Simbionix, Cleveland, OH, USA), and SimSuite (Medical Simulation Corporation, Denver, CO, USA) provide the opportunity to practice endovascular procedures and a means of skill assessment. Moreover, some of these systems allow for the import of actual patient imaging data that can then be used for practice of a planned intervention before its actual performance on the patient. Evidence suggests that patient-specific practice may be superior to generic practice when using these simulators [18].

Other VR simulators that deserve mention include the CAE VIMEDIX ultrasound simulator (CAE Healthcare, Montreal, Canada), which, besides providing an excellent platform for training in echocardiography, also offers FAST modules that are very useful for training surgery residents to recognize intra-abdominal injuries in trauma patients. In addition, laparoscopic ultrasound-compatible models (e.g., IOUSFAN, Kyoto Kagaku, Kyoto, Japan) are being developed, which can be combined with liver biopsy and radio-frequency ablation (RFA) techniques.

With the recent popularity of robotic surgery, especially in disciplines such as urology and gynecology, new VR simulators have emerged, such as the Mimic dV-Trainer (Mimic Technologies Inc, Seattle, WA, USA), which is available to users of the da Vinci robotic system. Other home-made training systems for robotic surgery exist as well, using FLS-type tasks [19]. Multidisciplinary efforts in surgery are currently ongoing to create the Fundamentals of Robotic Surgery (FRS). Hybrid simulators, such as the ProMIS (CAE Healthcare, Montreal, Canada), which combine realistic instruments and imaging with a virtual reality interface and metrics (motion tracking), are also popular.

Finally, while the majority of available simulators focus on laparoscopy, the increasing dominance of laparoscopic procedures over their open counterparts has reduced trainees' exposure to open operations and created a need for simulation-based training in open procedures. To address this need, open-surgery VR platforms have been developed, such as the SurgSim Trainers (SimQuest LLC, Silver Spring, MD, USA), the Phantom Desktop (SensAble Technologies, Wilmington, MA, USA), and the CyberGlove II (Meta Motion, San Francisco, CA, USA). Initial experience indicates that the creation of VR systems for open surgery is considerably more difficult than for laparoscopic procedures [5].


Evidence in Support of Simulation in General Surgery


In a landmark 2000 study, Scott and colleagues [10] demonstrated that simulator-acquired skill successfully transferred to the operating room. In this randomized controlled study, junior general surgery residents trained on basic laparoscopic tasks using the five UT-Southwestern video trainer stations, and their performance during laparoscopic cholecystectomy was compared with that of a control group. Simulator-trained residents performed better in the OR than controls, demonstrating the value of simulator training for the acquisition of laparoscopic skill [10]. In 2002, another landmark randomized, double-blinded study demonstrated transferability of laparoscopic skill acquired on the MIST-VR simulator to the operating room. Virtual reality-trained residents were 29% faster and six times less likely to commit an error than non-VR-trained residents during gallbladder dissection. Additionally, the non-VR-trained residents were nine times more likely to transiently fail to make progress [20]. Since that time, several additional good-quality studies have demonstrated the value of training on surgical simulators [21, 22].

A systematic review of ten randomized controlled trials [23] confirmed that simulator-acquired surgical skill transfers to the operating room but also recommended additional, better-quality studies. More recently, another randomized controlled trial demonstrated that junior surgery residents who trained to proficiency on the FLS tasks performed better in the operating room than controls [24]. Importantly, after only 2.5 h of supervised practice and 5 h of individual practice, the FLS-trained first- and second-year residents performed in the OR at the level of third- and fourth-year residents as measured in a prior study [25]. Several other studies have demonstrated the value of available endoscopy and angiography simulators [26–29]. While the evidence documenting the impact of simulator training on clinical performance is adequate, the majority of published studies report T2 translational outcomes (impact of training on learner performance), and T3 outcomes (impact of training on patient outcomes) are sparse. A group from Northwestern University demonstrated that residents who received internal jugular and subclavian line training on simulators inserted central lines in the medical intensive care unit with significantly fewer needle passes, catheter adjustments, and arterial punctures and with higher success rates than traditionally trained residents (historical controls) [30]. They then conducted a before/after observational study of catheter-related bloodstream infections in the medical intensive care unit over 32 months and found an 85% reduction in these infections after the simulation-trained residents entered the unit (0.50 infections per 1,000 catheter-days) compared with both the same unit before the intervention (3.20 infections per 1,000 catheter-days, P = 0.001) and a comparison intensive care unit in the same hospital throughout the study period (5.03 infections per 1,000 catheter-days, P = 0.001) [31].
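For readers who want to verify the reported effect size, the relative reduction can be recomputed directly from the quoted infection rates; the short sketch below does exactly that, using only the numbers cited above (the helper function name is ours, chosen for illustration).

```python
# Relative reduction in catheter-related bloodstream infections,
# computed from the rates quoted above (infections per 1,000 catheter-days).
rate_before = 3.20      # same ICU before the simulation-based intervention
rate_after = 0.50       # same ICU after simulation-trained residents arrived
rate_comparison = 5.03  # comparison ICU during the study period

def relative_reduction(baseline: float, observed: float) -> float:
    """Fractional reduction of the observed rate relative to a baseline rate."""
    return 1.0 - observed / baseline

# ~0.84 vs. the pre-intervention rate (reported as 85% in the study),
# and ~0.90 vs. the comparison ICU.
print(f"{relative_reduction(rate_before, rate_after):.0%}")
print(f"{relative_reduction(rate_comparison, rate_after):.0%}")
```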

While the previous studies were not done with surgery residents, the procedural task assessed clearly has relevance to surgery. In a recently published, more surgery-specific randomized controlled trial, residents who trained to expert levels on an inguinal hernia simulator demonstrated improved operative performance and operative time, as well as better patient outcomes in terms of intraoperative and postoperative complications after laparoscopic inguinal hernia repair, compared with a control group [32]. It should be noted, however, that the transfer of skill acquired on surgical simulators to the clinical environment is not complete. Several studies have demonstrated that, when a proficiency-based simulator training paradigm is used, novices can achieve expert-derived performance criteria on the simulator, yet their performance still lags behind that of experts in the OR [33–35]. This phenomenon is likely multifactorial and requires further investigation [35]. A comprehensive review of all the studies in support of simulation-based training is beyond the scope of this chapter, and readers are referred to excellent review articles [23, 36, 37]. Suffice it to say that the evidence for simulation in general surgery is mounting and, as such, its use has expanded.



Performance Assessment Using Simulators


Besides serving as effective training tools, simulators make it possible to objectively assess learner performance. A variety of assessment tools and simulator performance metrics are currently used in general surgery, and efforts are ongoing to refine them. The metrics most often used in surgical simulation are task duration and performance errors. These metrics provide robust and relevant information and have been incorporated into the FLS program, in which learners practice repetitively until they reach a level of efficient and error-free performance as defined by task-specific metrics of time and errors. Nevertheless, while these traditional metrics have stood the test of time and are easy to obtain, concerns exist that they may not be the ideal, or the only, metrics for performance assessment on simulators. The rationale is that they provide no insight into the effort the individual had to invest to achieve a specific level of performance or into whether learning has been completed [38, 39].

Several research groups have therefore suggested that additional performance metrics be used. Limb kinematics (i.e., trajectory, velocity, and acceleration) have probably attracted the most attention and have been shown to distinguish performers of variable skill in several studies [40, 41]. Such metrics can be obtained on physical simulators using specialized recording systems, such as the Imperial College Surgical Assessment Device (ICSAD), which uses an electromagnetic tracking system for motion tracking [40], or the ProMIS simulator (CAE Healthcare, Montreal, Canada), which tracks motion at the instrument tips [41]; they are also readily available on virtual reality simulators. Unfortunately, there is limited evidence about the importance of such metrics for learning. In a prior prospective study, 60% of novices who trained to proficiency on a basic laparoscopic task achieved motion-metric goals more easily than time goals, indicating that the incorporation of motion metrics into training goals had limited effectiveness for skill acquisition [42]. In a very recent randomized controlled trial, the authors demonstrated that the use of motion metrics as performance goals, alone or in combination with time goals, did not lead to improved transfer of skill to the OR compared with time goals alone (Stefanidis et al., Does the Incorporation of Motion Metrics Into the Existing FLS Metrics Lead to Improved Skill Acquisition on Simulators? publication pending).
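To make these kinematic metrics concrete, the sketch below computes path length, mean velocity, and mean absolute acceleration from a time series of tracked instrument-tip positions. The sampling rate, units, and array layout are assumptions made for illustration; they do not correspond to the output format of ICSAD, ProMIS, or any particular tracking system.

```python
import numpy as np

def motion_metrics(positions: np.ndarray, sample_rate_hz: float = 50.0) -> dict:
    """Basic kinematic metrics from tracked instrument-tip positions.

    positions: (N, 3) array of x, y, z coordinates in millimetres sampled at a
    fixed rate (an assumed layout, for illustration only).
    """
    dt = 1.0 / sample_rate_hz
    steps = np.diff(positions, axis=0)             # displacement per sample
    step_lengths = np.linalg.norm(steps, axis=1)   # mm travelled per sample
    velocity = step_lengths / dt                   # mm/s
    acceleration = np.diff(velocity) / dt          # mm/s^2
    return {
        "path_length_mm": float(step_lengths.sum()),
        "mean_velocity_mm_s": float(velocity.mean()),
        "mean_abs_acceleration_mm_s2": float(np.abs(acceleration).mean()),
    }

# Example with 10 s of synthetic tip motion sampled at 50 Hz.
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(scale=0.5, size=(500, 3)), axis=0)
print(motion_metrics(track))
```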

Beyond time, errors, and kinematics, additional metrics have been proposed for simulator performance assessment with promising results. These either reflect the effort the learner had to invest to achieve a given level of performance (such as the NASA-TLX workload assessment tool) [43], rely on the measurement of distinct expert characteristics (such as eye tracking) [44], or use secondary-task measures that reflect multitasking ability [38, 45]. In a study by Stefanidis et al., training novices to automaticity, using secondary-task performance as a training goal in addition to time and errors, led to better transfer of simulator-acquired skill to the OR compared with traditional proficiency-based training alone [46]. Such metrics may prove important for augmenting skill acquisition on surgical simulators and minimizing the incomplete transfer of skill to the OR environment.
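As a concrete example of a workload metric, the sketch below computes an overall NASA-TLX score as the weighted average of its six subscale ratings. The ratings and weights shown are invented for illustration; in an actual administration the weights come from the instrument's 15 pairwise comparisons between subscales.

```python
# Overall NASA-TLX workload as a weighted average of six subscale ratings (0-100).
# The ratings and weights below are invented for illustration; in practice each
# weight (0-5) is derived from 15 pairwise comparisons and the weights sum to 15.
ratings = {
    "mental_demand": 70,
    "physical_demand": 40,
    "temporal_demand": 55,
    "performance": 30,   # on this subscale, lower ratings mean better perceived performance
    "effort": 65,
    "frustration": 45,
}
weights = {
    "mental_demand": 4,
    "physical_demand": 1,
    "temporal_demand": 3,
    "performance": 2,
    "effort": 4,
    "frustration": 1,
}
assert sum(weights.values()) == 15  # property of the pairwise-comparison procedure

overall_workload = sum(ratings[k] * weights[k] for k in ratings) / 15
print(f"Weighted NASA-TLX workload: {overall_workload:.1f} / 100")
```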

Besides using these metrics, surgical performance can also be reliably assessed by an experienced observer. In fact, this type of assessment may be preferable for some skills, as it yields qualitative information on learner performance that can then be delivered to the learner as summative and formative feedback [47, 48]. Furthermore, these instruments are versatile, as they can often be used for similar tasks. Observational assessment is frequently criticized for relying on subjective ratings, unclear operational definitions of performance, and ambiguity in responding [5]; it is therefore imperative that the reliability and validity of such instruments be established before they are used for evaluation [47]. Observer ratings are typically provided on global rating scales, visual analog scales, checklists, or a combination of these. Current evidence suggests that, when completed by experts, global rating scales are superior to checklists for the evaluation of technical skills [49, 50].

Some authors suggest that checklists and visual analog scales should not be included in technical skills assessment, as they fail to enhance the effectiveness of performance assessment compared with global rating scales alone [47, 49]. On the other hand, checklists may provide important, more specific information for formative feedback on learner performance that could augment learning. Validated rating scales for technical skill assessment have been developed for open, laparoscopic, and endoscopic skills. The objective structured assessment of technical skill (OSATS) [48], the global operative assessment of laparoscopic skills (GOALS) [47], and the global assessment of gastrointestinal endoscopic skills (GAGES) [51] have been demonstrated to be valid and reliable measures of performance and have been used widely in the literature for this purpose. However, the exact relationship between observer ratings and other, more objective performance metrics is not well understood and requires further study.
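One simple way such a relationship could be examined is a rank correlation between observer rating totals and a simulator metric; the sketch below illustrates this with invented data for eight trainees. The analysis choice and the numbers are assumptions for illustration, not results from any of the studies cited above.

```python
import numpy as np
from scipy.stats import spearmanr

# Invented data for eight trainees: total global-rating score (higher = better)
# and simulator task time in seconds (lower = better).
global_ratings = np.array([12, 18, 25, 22, 30, 15, 27, 20])
task_time_s = np.array([310, 265, 205, 180, 150, 290, 170, 230])

rho, p_value = spearmanr(global_ratings, task_time_s)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")  # a strong negative rho is expected here
```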


Skills Curriculum Development


Despite the evidence supporting simulation training in general surgery, the mere availability of a simulator does not guarantee its educational effectiveness. The most important ingredient is the curriculum. Indeed, most experts in the field advise that simulators be acquired or developed based on the objectives of the curriculum [52, 53]. Early experience with simulators in general surgery supports this notion: in the absence of a structured curriculum, most simulators, no matter how expensive or sophisticated, ended up collecting dust. Curriculum development starts with a needs assessment and gap analysis, followed by selection of objectives and instructional methods and by ongoing evaluation of the curriculum's effectiveness, with optimization based on accumulated experience [54]. It is also imperative to assess, at the outset, the resources required for successful implementation. Besides associated costs, equipment (including simulators), and supplies, the need for supervising faculty and/or other personnel should not be overlooked. Importantly, in the case of residents, this translates into identifying protected time and implementing external motivators for participation (for both residents and teaching faculty). In the authors' experience, the latter two factors have been the most challenging [55].

Several ingredients contribute to a successful skills curriculum; foremost among them is deliberate practice for the purpose of effectively improving specific aspects of an individual's performance [56]. According to Duvivier et al. [57], the characteristics of deliberate practice in medical education include (a) repetitive performance of intended cognitive or psychomotor skills, (b) rigorous skills assessment, (c) provision of specific performance feedback, and (d) ongoing effort to improve one's performance. The same authors suggest that the personal skills needed to successfully develop clinical skills include planning (organizing work in a structured way), concentration/dedication (long attention span), repetition/revision (a strong tendency to practice), and study style/self-reflection (a tendency to self-regulate learning). The applicability and value of deliberate practice for surgical skill training has been demonstrated for surgical tasks on a virtual reality simulator [58]. Important ingredients of deliberate practice include the internal and external motivation of learners. Internal motivation is the most important driving force for learning but is unique to each trainee and difficult to modify externally. Nevertheless, several external motivators can help improve learning on simulators. These may include practice time protected from other training responsibilities, healthy competition among trainees with performance goals and recognition of the best performers, rewards for excellent performance, and requiring the achievement of specific performance scores before the trainee is permitted to work in the clinical environment [59]. Mandatory participation of surgery residents in simulation training is critical to a curriculum, as resident participation has been shown to range from only 7 to 14% in voluntary curricula [55, 60].


Feedback


Performance feedback helps learners improve, but the timing of its administration is also important. Several studies [61–65] have shown that external feedback, compared with no feedback, leads to improved skill acquisition and retention, independent of the practiced task. Furthermore, the provision of summative feedback (at the end of performance) has been demonstrated to be superior to concurrent feedback (during practice trials) [61, 66]; the latter may in fact inhibit performance if excessive [61]. The benefit of video tutorials for surgical skill acquisition has also been well documented. CD-ROM tutorials have been shown to effectively transfer cognitive information for motor skill learning [67]. Viewing video tutorials has been shown to augment laparoscopic performance on simulators [68] and to hasten the attainment of proficiency [61], especially when the tutorials are provided before and, as needed, during training [69]. Importantly, computer-based video instruction has been shown to be as effective as expert feedback for the retention of simulator-acquired laparoscopic suturing and knot-tying skills [62].


Training Intervals


Practice distribution can also be optimized to enhance learning. It has been established that practice can initiate neural processes that continue to evolve many hours after practice has ended [70]. With respect to surgical skills, this has been supported in a randomized controlled trial that showed microvascular anastomosis skills to be better retained and transferred when taught in a distributed manner rather than in a massed manner (all training provided in one session) [71]. A large meta-analysis noted that simple tasks are better acquired with shorter inter-training intervals, while more complex tasks require longer intervals for optimal learning [72]. For the creation of an end-to-side vascular anastomosis, no difference in performance was demonstrated 4 months after training whether the skill was acquired in weekly or monthly training sessions [73].


Proficiency-Based Curricula


Proficiency-based curricula deserve special mention. Such curricula set training goals for learners that are derived from expert performance. Unlike traditional training paradigms that define training duration by time or number of repetitions, proficiency-based curricula tailor training to individual needs and lead to more homogeneous skill acquisition. The superiority of proficiency-based training over these traditional training methods is well supported in the literature [5, 33, 53, 74, 75]. With known goals, trainees can compare their performance to these targets for immediate feedback. This promotes deliberate practice, enhances motivation, and improves skill acquisition [69, 76]. In a randomized trial by Madan and colleagues, residents who trained with performance goals outperformed residents who practiced for the same amount of time without goals on eight simulated laparoscopic tasks [77]. In another study, Gauger and colleagues [78] demonstrated that the use of specific performance targets and feedback improved the ability of novice surgeons to attain high levels of proficiency in both simulated tasks and actual operative performance. They also demonstrated that when learners are left to determine their own proficiency targets and practice needs, less practice occurs and the level of accomplishment is lower [78]. Furthermore, the establishment of performance goals on simulators has been shown to increase resident attendance and participation in training programs [76], and proficiency-based training leads to improved retention of simulator-acquired skill [34].
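A minimal sketch of how an expert-derived proficiency target might be set and checked is shown below. The specific rule (mean expert time plus two standard deviations, met on two consecutive repetitions) and all the numbers are assumptions chosen for illustration; they are not the criteria used by FLS or by any particular published curriculum.

```python
import statistics

def proficiency_target(expert_times_s: list[float]) -> float:
    """Expert-derived time benchmark: mean + 2 SD (an illustrative rule only)."""
    return statistics.mean(expert_times_s) + 2 * statistics.stdev(expert_times_s)

def reached_proficiency(trainee_times_s: list[float], target_s: float,
                        consecutive: int = 2) -> bool:
    """True once the trainee meets the target on `consecutive` repetitions in a row."""
    streak = 0
    for t in trainee_times_s:
        streak = streak + 1 if t <= target_s else 0
        if streak >= consecutive:
            return True
    return False

expert_times = [52.0, 48.0, 55.0, 50.0, 47.0]          # invented expert trial times (s)
trainee_times = [120.0, 95.0, 80.0, 62.0, 56.0, 54.0]  # invented trainee trial times (s)

target = proficiency_target(expert_times)
print(f"Target: {target:.1f} s; proficiency reached: {reached_proficiency(trainee_times, target)}")
```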

Examples of simulator-based curricula widely used in general surgery include the American College of Surgeons/Association of Program Directors in Surgery (ACS/APDS) National Surgical Skills Curriculum (Table 23.1), which addresses the needs of general surgery residents. This proficiency-based skills curriculum is web-based and readily accessible to all. It includes a variety of simulations to achieve the desired learning objectives and was introduced in three phases. Phase I includes 20 modules that address basic surgical skills, and Phase II includes 15 advanced-procedure modules. Each of these modules includes objectives, assumptions, suggested readings, descriptions of the steps for specific skills and of common errors, an expert performance video, recommendations for guided practice, and information on station setup and use. Many modules also include tools for the verification of proficiency to assess the readiness of individual residents for the operating room. A faculty guidebook provides information on supplies, station design, vendors, products, and laboratory setup, as well as recommended teaching times. Phase III includes ten modules that address team-based skills, including scenarios in the OR, surgical ICU, and other settings. A faculty guidebook for team training is provided, and each module includes case information, patient data, faculty and resident information, and debriefing and assessment tools. Important concepts in team training are addressed, such as the development of expert team members versus expert teams, communication (critical language, closed loop), leadership, coping with stress, decision-making, and situational awareness [79, 80].


Table 23.1
ACS/APDS National Surgical Skills Curriculum

Phase 1: Basic/core skills and tasks

Advanced laparoscopy skills

Advanced tissue handling: flaps, skin grafts

Airway management

Asepsis and instrument identification

Basic laparoscopy skills

Bone fixation and casting

Central line insertion and arterial lines

Chest tube and thoracentesis

Colonoscopy

Hand-sewn gastrointestinal anastomosis

Inguinal anatomy

Knot tying

Laparotomy opening and closure

Stapled gastrointestinal anastomosis

Surgical biopsy

Suturing

Tissue handling, dissection, and wound closure

Upper endoscopy

Urethral and suprapubic catheterization

Vascular anastomosis

Phase 2: Advanced procedures

Gastric resection and peptic ulcer disease

Laparoscopic appendectomy

Laparoscopic inguinal hernia repair

Laparoscopic right colon resection

Laparoscopic sigmoid resection

Laparoscopic Nissen fundoplication

Laparoscopic ventral hernia repair

Laparoscopic ventral/incisional hernia repair

Laparoscopic/open bile duct exploration

Laparoscopic/open cholecystectomy

Laparoscopic/open splenectomy

Open inguinal/femoral hernia repair

Open right colon resection

Parathyroidectomy/thyroidectomy

Sentinel node biopsy and axillary lymph node dissection

Phase 3: Team-based skills

Laparoscopic crisis

Laparoscopic troubleshooting

Latex allergy anaphylaxis

Patient handoff

Postoperative hypotension

Postoperative MI (cardiogenic shock)

Postoperative pulmonary embolus

Preoperative briefing

Retained sponge on postoperative chest X-ray

Trauma team training

Another example is the Advanced Trauma Operative Management (ATOM) course. The ATOM course is grounded in social cognitive theory and is designed to increase students' knowledge, self-efficacy, and surgical skill in managing 12 penetrating injuries in a porcine model [81–83]. Developed in 1998, the ATOM program came under the auspices of the American College of Surgeons Committee on Trauma in 2008. The course consists of pre-course reading materials, a pre- and post-course examination, and a 1-day on-site curriculum of lectures and simulations. The pre-course materials include a textbook and a CD-ROM that demonstrate the surgical repair of penetrating injuries to the abdomen and chest. Students are expected to manage the injuries completely, which includes identifying them, developing management plans, and performing surgical repair. During the 1-day session, six 30-min lectures on the repair of penetrating injuries to individual organs are presented in the morning. In the afternoon, the students participate in a simulation session of penetrating trauma scenarios in a porcine model. Students are evaluated on their knowledge, self-efficacy, and psychomotor ability [81, 82]. Table 23.2 contains an abbreviated skills curriculum outline employed at the authors' institution for weekly 90-min training sessions. Performance criteria for PGY-I and PGY-II residents, as well as the curriculum structure for PGY-II residents, are included in Table 23.2.