Simulation for Licensure and Certification



Fig. 12.1
MSR, the Israel Center for Medical Simulation



From its outset, MSR established a strategic alliance and close collaboration with the National Institute for Testing and Evaluation (NITE) in Israel, an institute that specializes in measurement and provides psychometric services to all Israeli universities and higher-education colleges [37]. Expert psychometricians from NITE work routinely at MSR alongside the regulatory professional bodies, which assign content experts for their exams, and MSR’s professional staff, who contribute simulation expertise. Together they form a team spanning all three domains of expertise. This team collaboratively develops and implements various simulation-based testing programs at MSR to serve the needs identified by Israel’s healthcare system regulators.

In the following section of this chapter, we describe the evolution of MSR’s national SBA programs conducted in collaboration with NITE and with Israel’s health professional regulatory bodies. We also highlight the lessons learned and special insights that surfaced during the development and implementation of these SBA programs.


The Israeli Board Examination in Anesthesiology


As in other medical professions, Israeli anesthesiology residents take a written mid-residency board examination and an oral board exam at the end of their five and a half years of residency training. Acknowledging the possible benefits of SBA and the lack of a structured performance evaluation component, the Israeli Board of Anesthesiology Examination Committee decided in 2002 to explore the potential of adding an OSCE component to the board examination process. The initial decision was that the SBA would complement the existing board examination process and would be a task-driven test in which relevant tasks are incorporated into realistic and relevant simulated clinical scenarios.

As the first high-stakes test developed at MSR, this examination has evolved gradually since its initiation, reflecting a dynamic development process and ongoing efforts to improve various aspects of the SBA. A few examples illustrate this dynamic approach on the part of the test development committee. The realism of scenarios was improved by introducing a “standardized nurse” into the stations and by using more advanced simulation technology. The scoring method was upgraded over the years (e.g., critical “red flag” items that an examinee had to perform in order to pass were removed, and the checklist format was modified to improve raters’ ability to score examinees’ performance). A two-stage scenario model was developed and adopted, in which a simulation-based scenario is followed by a “debriefing,” an oral examination used to assess the examinee’s understanding of the previously performed scenario and his or her ability to interpret laboratory results and tests. Finally, in terms of the SBA’s role in certification, passing the simulation-based test became a prerequisite for applying to the oral board examination, although recently, for logistic reasons, the SBA stations have become part of the oral board exam.

Major components of the SBA include an orientation day for examinees held at MSR a few weeks before the examination. During this day, the examinees familiarize themselves with the test format and the technological environment (simulators, OR equipment, etc.). Another major component is the examiners’ orientation and retraining (“refresher”) before each test. Table 12.1 describes the actual examination format, and Table 12.2 presents an example of a checklist used during the examination.



Table 12.1
Anesthesiology board examination format




Table 12.2
An example of an evaluation checklist for the assessment of axillary block performance (only part of the checklist is presented)

Nineteen examination cycles have been held since 2002, with 25–35 examinees per cycle. The board examination in anesthesiology achieved good psychometric characteristics, with satisfactory inter-rater agreement, good intra-case reliability, and good construct validity and face validity [30, 38, 39]. In addition to the satisfactory psychometric qualities, the process of incorporating SBA into the board examination paradigm has had several important implications. Analysis of frequent errors in the test yielded important feedback to the training programs, highlighting areas of skill deficiency (e.g., identifying and managing technical faults in the anesthesia machine) [40], with the hope that it would drive educational improvements in residency. The effort to maintain high psychometric standards in the SBA inspired the test committee to aspire to the same standards in the oral exam. Hence, a more structured oral exam was developed, and an obligatory “train the rater” workshop was conducted to improve raters’ oral examination skills.


Paramedics Certification Exam


Paramedic training in Israel includes a 1-year course followed by an accreditation process comprising evaluation by the training program supervisor, a written examination, and a simulation-based exam at MSR. The SBA includes four stations with scenarios in four major professional areas: trauma management, cardiology, pediatrics, and respiratory emergencies. Various relevant simulators are used, and two of the test stations include an actor (SP) in a hybrid format that combines communication with the SP and clinical performance on a mannequin simulator. Since team management is one of the core characteristics of the paramedic profession, it had to become an integral part of the assessment in the simulation-based test (Fig. 12.2). Examinees therefore perform as team leaders in all stations, assisted by two more junior paramedics who are still in the paramedic course and are therefore not test subjects (in fact, this experience also serves as part of the junior paramedics’ orientation for their future certification exam). The score in each station is composed of yes/no checklist items (70%; 20–30 items per station), holistic parameters (20%) assessed on a 1–6 scale (time management, team leadership, etc.), and one general holistic assessment (10%) on a 1–10 scale. The SPs in the two hybrid stations also score examinees’ performance; the final score per station is a weighted average of the SP score (10%) and the professional rater score (90%).
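The weighting scheme above can be sketched as a short computation. This is an illustrative sketch only, not MSR’s actual scoring software; the function names and the rescaling of the 1–6 and 1–10 scales to percentages are assumptions for the example.

```python
def station_score(checklist_items, holistic_1to6, general_1to10):
    """Combine yes/no checklist items (70%), holistic 1-6 parameters (20%),
    and one general 1-10 holistic assessment (10%) into a 0-100 score."""
    checklist = 100.0 * sum(checklist_items) / len(checklist_items)
    # Rescale each 1-6 rating to 0-100, then average (assumed rescaling).
    holistic = 100.0 * sum((h - 1) / 5.0 for h in holistic_1to6) / len(holistic_1to6)
    general = 100.0 * (general_1to10 - 1) / 9.0
    return 0.70 * checklist + 0.20 * holistic + 0.10 * general

def hybrid_station_final(rater_score, sp_score):
    """In the two hybrid stations, the final score is a weighted average of
    the professional rater score (90%) and the SP score (10%)."""
    return 0.90 * rater_score + 0.10 * sp_score
```

For example, a flawless performance (all checklist items done, all holistic ratings at the maximum) yields a station score of 100, and a hybrid station with a rater score of 80 and an SP score of 60 yields a final score of 78.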



Fig. 12.2
Team Management Training at MSR, the Israel Center for Medical Simulation

The paramedic certification test takes place 4–5 times a year, with about 25 examinees per day. A month before each test, an orientation is held in which examinees practice at MSR on scenarios similar to those used in the actual test and receive feedback based on a “test-like” scoring form. The raters (two per station) also participate in a mandatory “train the rater” workshop. Several psychometric features of this test are routinely measured: inter-rater agreement varies from 75 to 95%, and face validity, as reflected in participants’ feedback, is also very high.


National Registration Exam for Nurse Specialists


In Israel, to become a certified nurse specialist in any of the 16 defined nursing professions (intensive care, pediatric intensive care, psychiatry, oncology, etc.), one must complete a yearlong specialty course. In 2008, recognizing the need to add performance measures to the registration process, the nursing authority in Israel’s Ministry of Health decided to collaborate with MSR and NITE to develop a simulation-based test to replace the written multiple-choice certification test. Currently, 16 different tests are developed annually for the various nursing specialties, requiring teams of nurse specialists to work closely with the simulation experts and psychometricians on the test content in each profession.

All exams have a common format that includes 11 stations with various combinations of the following station types:

(a)

High-fidelity simulation—measuring clinical competence in a scenario using high-fidelity mannequin simulators.

 

(b)

SP stations—measuring clinical competence, including communication with patients (Fig. 12.3).



Fig. 12.3
OSCE for Advanced Nursing Accreditation at MSR, the Israel Center for Medical Simulation

 

(c)

Debrief stations—following the SP station, the examinee is debriefed on the scenario and his or her performance and decision-making, using questions such as: What was your diagnosis? What facts regarding the patient led you to that diagnosis? Why did you choose a specific treatment?

 

(d)

Video-/PPT-based case analysis stations—written open-ended items, all relating to a specific case presented either in video or in a PowerPoint presentation.

 

(e)

Short computerized multiple choice test.

 

The raters in this examination are nurse specialists in the respective fields. All raters participate in an obligatory “train the rater” workshop before each test. The National Registration Exam for Nurse Specialists has been running for 3 years, with 650–1,000 examinees per year across 13–16 different nursing professions. Unfortunately, the small number of examinees in each profession makes it difficult to compute psychometric parameters. However, in three professions the number of examinees is relatively high: intensive care (about 160 examinees per year), midwifery (60–80 per year), and primary care in the community (40–50 per year). In these professions, internal consistency ranged from 0.6 to 0.8. In addition, the inter-rater disagreement rate in all tests was less than 5% (unpublished data), indicating satisfactory reliability. Long-term predictive validity research is currently being conducted to measure the correlation between test scores and supervisor and peer evaluations in the workplace.
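An internal-consistency figure in the 0.6–0.8 range, as reported above, is typically a coefficient such as Cronbach’s alpha computed over examinees’ per-station scores. The chapter does not specify the statistic used, so the following is only a minimal sketch of the standard alpha formula, with hypothetical data.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix: one row per examinee,
    one column per station (or item)."""
    k = len(scores[0])   # number of stations
    def var(xs):         # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When all stations rank examinees identically (perfectly correlated columns), alpha reaches 1.0; weakly related stations pull it toward 0, which is why the small cohorts in most nursing professions make a stable estimate hard to obtain.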


The “MOR” Assessment Center: Selection of Candidates to Medical Schools


Medical school admissions traditionally rely heavily on cognitive variables, with noncognitive measures assessed through interviews only. In recognition of the unsatisfactory reliability and validity of traditional interviews, medical schools are increasingly exploring alternative approaches that can provide improved measures of candidates’ personal and interpersonal qualities.

In 2004, the Tel Aviv University Sackler School of Medicine appointed MSR and NITE to join forces with its admission committee in order to develop and implement an original assessment system for the selection of its candidates, focused exclusively on their noncognitive attributes.

The MOR assessment center that was developed included three main assessment tools:

1.

A biographical questionnaire

 

2.

An ethical judgment and decision-making questionnaire

 

3.

A set of OSCE-like behavioral stations

 

For a full description of the questionnaires and the original behavioral stations structure, see Ziv et al. [18].

The raters of candidates’ attributes in the behavioral stations are faculty members (doctors, nurses, psychologists, social workers) as well as SPs. They score candidates’ behaviors on a standard structured scoring form that includes four general parameters (each divided into 2–6 scored items): interpersonal communication skills, ability to handle stress, initiative and responsibility, and maturity and self-awareness. All raters are trained in 1-day mandatory train the rater workshops.
