Simulation-based integrated clinical skills sessions have great potential for use in medical curricula, and integration is central to simulation efficacy. The aim of this study was to obtain medical students' perceptions of the effectiveness of integrated clinical skills sessions using different simulation adjuncts and to identify the challenges and obstacles encountered in implementing such sessions. A study was conducted to obtain anonymous feedback from male (n = 156) and female (n = 179) medical students in years 2 and 3 during the 2014–2015 academic session at Alfaisal University about their perceptions of the effectiveness of integrated clinical skills sessions, the use of simulation adjuncts, and the obstacles encountered in the effective implementation of such sessions. The response rate was 93.4%. Factor analysis showed the data to be valid and reliable. Cronbach’s α-values for effectiveness of sessions, use of simulation adjuncts, and obstacles encountered were 0.97, 0.95, and 0.95, respectively. We conclude that students perceived positively the effectiveness of integrated clinical skills sessions as well as the use of simulation adjuncts, especially standardized patients (SPs). They suggested overcoming the obstacles and limitations of simulation, and they highly valued the role of the facilitators in achieving effective sessions.
- basic clinical skills
- simulation adjuncts
- curriculum integration
- integrated clinical skills sessions
- obstacles toward implementation
The term “clinical skills” refers to “any action, performed by a healthcare worker involved in daily patient care, which impacts clinical outcome in a measurable way,” and includes cognitive, nontechnical, and technical skills (1, 20). The teaching of clinical skills is effectively the practical application of basic science subjects, supporting the assertion that “practical is the practice of theory” (24).
Early introduction of clinical skills in the preclinical phase has a positive impact, especially where integration is the backbone of the curriculum, and has been found to be beneficial by facilitating the integration of clinical and basic science knowledge (12). It increases students’ confidence, improves their performance, makes them “feel” like doctors, and helps to best prepare them for the clerkship phase (6, 14). However, training in basic clinical skills such as communication, history taking, physical examination, interpretation of laboratory data, and procedural skills has been reported to be inadequate by many students even after graduation (13, 21).
The most important resource for delivering the best clinical skills education in an undergraduate medical curriculum is simulation. “Simulation” is a training and feedback method in which learners practice tasks and processes in lifelike circumstances using different technologies, such as models or virtual reality, with feedback from observers, peers, actor-patients, and video cameras to assist improvement in skills (7). These technologies range from simple screen-based demonstrations to partial task trainer devices to full-environment simulations using high-fidelity simulators (21). Technology-based simulation has been reported to enhance knowledge, skills, and attitudes among medical students (5).
There are different types of simulation modalities, also called simulation adjuncts, used in clinical education and training. Several forms of simulation adjuncts are described in the literature (4, 11, 15, 16, 18, 26), such as standardized patients (SPs), task-specific simulators (manikins), animal models, human cadavers, written scenarios, computer-based clinical simulation, audio simulations, video-based simulations, 3-D static models, and virtual reality simulation. SPs, or patient actors, are people carefully trained in interview, communication, physical examination, and feedback techniques. They act for an examinee or trainee after immersing themselves into the script of a patient, including symptoms and past medical history, family situation, and even physical examination findings, whereas a simulator is a training device mimicking reality where the complexity of events can be controlled (3).
Simulators range from low to high fidelity. As defined by Miller (18) in 1987, “The term ‘fidelity’ is used to designate how true to life the teaching/evaluating experience must be to accomplish its objectives.” Fidelity does not necessarily reflect the level of technology: we can have a high-tech/low-fidelity simulation.
Curriculum integration is critical to the success and effectiveness of simulation-based health care education (19). In most medical curricula, clinical skills are taught during the advanced or clerkship phase (1, 20). Clinical teachers often complain that senior students are not prepared for clinical teaching.
A medical school needs a plan for implementing integrated clinical skills sessions into its curriculum. Implementation of such a curriculum is not free of challenges or obstacles (19). Integration of simulation into the curriculum has remained a major challenge (24, 25) and is most successful when simulation is made part of the curriculum rather than an additional component (10, 17).
However, few data are available regarding students' perceptions of the effectiveness of integrated clinical skills sessions using different simulation adjuncts (2) and the challenges of integrating simulation and simulation adjuncts into the medical curriculum (9). Recently, we introduced an integrated model of basic clinical skills training at our institution, starting with junior students (24, 25). The aim of this study was to obtain students' perceptions of the effectiveness of integrated clinical skills sessions, the use of simulation adjuncts, and the obstacles faced in the effective implementation of simulation sessions during the integrated clinical skills courses offered in years 2 and 3 of the MBBS program. The study also aimed to identify sex differences in student perceptions.
METHODS AND ANALYSIS
This quasi-experimental study was conducted at the College of Medicine, Alfaisal University, Riyadh, Saudi Arabia, during the 2014–2015 academic session. The study included undergraduate medical students in years 2 (semester 4) and 3 (semester 6). A total of 335 students were eligible to participate, comprising male (n = 156; year 2 = 89 and year 3 = 67) and female (n = 179; year 2 = 94 and year 3 = 85) students.
At Alfaisal University (AU), College of Medicine (CoM), Professional (Clinical) Skills courses are taught in an integrated fashion with ongoing blocks in two successive phases as follows:
1. Professional Skills I (Communication Skills) (PRO 115): Semester I
2. Professional Skills II (Introduction to Basic Clinical Skills) (PRO 234): Semester III
3. Professional Skills III, IV, and V (Integrated Clinical Sessions): Semesters IV, V, and VI
Considering the importance of early introduction of clinical skills teaching in the preclerkship phase, we have designed an innovative clinical skills curriculum for the better delivery of the clinical skills objectives.
The Alfaisal Model for Integrated Clinical Skills Courses
At CoM, AU, we have adopted a hybrid, problem-based-learning curriculum taught in three successive phases over a period of 5 yr (24, 25). The integrated clinical skills sessions are offered simultaneously to male and female medical students in semester 4 (year 2) and semesters 5 and 6 (year 3). The courses run for 16–18 wk parallel to the system blocks of the semester 4, 5, and 6 curricula (Table 1). Training involves history taking, symptomatology recognition, systemic physical examination, common diagnostic methods, and the acquisition and deployment of the procedural skills related to the system blocks. The following simulation adjuncts are used during integrated clinical skills sessions:
1. Standardized patients (SPs).
2. Written scenarios or screen-based simulators (laboratory reports, images, etc.).
3. Partial task trainers (simple simulators), e.g., intramuscular (IM) injection model, male and female catheterization models, endotracheal tube placement, etc.
4. Low-fidelity simulators (simple mechanical); examples include Harvey for auscultation.
5. High-fidelity simulators (complex mechanical); examples include a full-body METIman used for peripheral pulses and respiratory sounds.
6. Computer-based simulators.
7. Audio simulations (heart and lung sound simulators).
8. Video-based simulators (videos).
9. Three-dimensional or static models (anatomy models).
Students’ feedback was obtained through an anonymous electronic survey.
Students were asked to respond to items using a five-point Likert scale from 1 to 5: for appendices 1 and 3, 1 = strongly agree and 5 = strongly disagree; for appendix 2, 1 = very good and 5 = very weak. The survey forms addressed the effectiveness of integrated clinical skills sessions, the use of different simulation adjuncts, and the obstacles/challenges faced during implementation of integrated clinical skills sessions.
Ethics approval of the study was obtained from the Institutional Review Board of the College of Medicine, Alfaisal University.
The data were analyzed using IBM SPSS version 22.0 and AMOS version 21. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity were used to determine whether the data were suitable for factor analysis.
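As an illustrative sketch only (the study itself used SPSS and AMOS), both suitability checks can be computed directly from an n-respondents × p-items score matrix; the function names below are ours, not part of the study.

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Bartlett's test of sphericity: does the correlation matrix differ from identity?"""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    # chi-square statistic based on the determinant of the correlation matrix
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

def kmo(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    R_inv = np.linalg.inv(R)
    # partial correlations come from the scaled inverse of the correlation matrix
    scale = np.sqrt(np.outer(np.diag(R_inv), np.diag(R_inv)))
    partial = -R_inv / scale
    off = ~np.eye(R.shape[0], dtype=bool)   # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)
```

A KMO value above roughly 0.6 together with a significant Bartlett P value is the usual threshold for proceeding with factor analysis; the KMO of 0.956 reported below comfortably exceeds this.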
RESULTS
Medical students responded positively to the survey, with a 93.4% response rate. The baseline characteristics are given in Table 2.
Validity of the Data
Exploratory factor analysis.
The results of the exploratory factor analysis are given in Table 3. All items with a loading of <0.4 were removed from the table. The rotation method was varimax with Kaiser normalization. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.956, Bartlett’s test of sphericity gave P < 0.001, and the total variance explained was 67.362%.
Confirmatory factor analysis.
Confirmatory factor analysis was then performed using AMOS statistical software. The results of the analysis are reported below:
Root mean square residual: 0.035
Comparative fit index: 0.956
Normed fit index: 0.905
Non-normed fit index or Tucker-Lewis Index: 0.947
Goodness-of-fit index: 0.845
Relative fit index: 0.883
Incremental fit index: 0.957
Root mean square error of approximation: 0.057
The above values show adequate validity. The representative scree plot showing all three domains is given as Fig. 1.
The reliability is given in Table 4. The overall Cronbach’s α-coefficient is 0.955, which is indicative of a high internal consistency.
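For readers wishing to reproduce such a reliability estimate, Cronbach's α can be computed from the item variances and the variance of the summed scale; this is a generic sketch, not the study's SPSS output.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

α approaches 1 as the items covary strongly; values above 0.9, as reported here, indicate high internal consistency.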
The individual and mean frequency distributions for different items are given in Table 5. Students’ average self-ratings were 3.92 ± 0.05 in the effectiveness domain, 3.81 ± 0.05 in the simulation adjuncts domain, and 3.91 ± 0.05 in the obstacles domain. In the effectiveness domain, the items with the highest self-ratings were facilitator supervision (4.06 ± 0.05) and the ability to perform physical examination (3.99 ± 0.05), and the items with the lowest self-ratings were interpretation of clinical procedures and imaging (3.74 ± 0.06) and achievement of the intended learning outcomes (3.86 ± 0.06).
In the simulation adjuncts domain, the items with the highest self-rating were standardized patients (3.92 ± 0.05), simple simulators (3.89 ± 0.05), and 3D/Static models (3.89 ± 0.05), and items with the lowest self-rating were computer- (3.71 ± 0.06) and video-based simulators (3.73 ± 0.06).
In the obstacles domain, the items with the highest self-ratings were cooperation of standardized patients (4.03 ± 0.05) and preparation of the facilitators (4.00 ± 0.05), and the items with the lowest self-ratings were the number of simulators (3.81 ± 0.06) and students’ motivation to attend sessions (3.84 ± 0.06). The same results are shown in Fig. 2.
Pearson’s Correlation (r)
Correlations were sought between different parameters, which showed the following:
A positive correlation between effectiveness and barriers of simulation (r = 0.752, P < 0.001).
A positive correlation between effectiveness and modalities of simulation (r = 0.586, P < 0.001).
A positive correlation between modalities and barriers of simulation (r = 0.644, P < 0.001).
A positive correlation between year and grade point average (GPA) of the students (r = 0.172, P = 0.01).
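A minimal sketch of how such correlations are obtained; the per-student domain means below are invented for illustration and are not the study data.

```python
import numpy as np
from scipy.stats import pearsonr

# hypothetical per-student domain means (illustrative only, not the study data)
effectiveness = np.array([3.5, 4.0, 4.2, 3.8, 4.5, 3.2, 4.1, 3.9])
barriers = np.array([3.4, 4.1, 4.0, 3.6, 4.4, 3.1, 4.2, 3.8])

# Pearson's r and its two-sided P value
r, p = pearsonr(effectiveness, barriers)
```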
Effect of Demographics on Students’ Ratings in All of the Domains
The Wilcoxon Mann-Whitney U-test was performed to determine whether second- and third-year students differed in any of the three domains. Year 3 students scored higher than year 2 students in the modalities domain (P = 0.02); there was no significant difference in the effectiveness (P = 0.10) or obstacles (P = 0.21) domains.
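The year-group comparison can be sketched as follows; the ratings are simulated stand-ins, not the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# hypothetical Likert ratings for the modalities domain (not the study data)
year2 = rng.integers(2, 5, size=80)  # ratings of 2-4
year3 = rng.integers(3, 6, size=70)  # ratings of 3-5, shifted upward

# one-sided Mann-Whitney U-test: are year 3 ratings stochastically greater?
u, p = mannwhitneyu(year3, year2, alternative="greater")
```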
An independent-samples t-test was used to compare the sexes across the three domains (means ± SD).
Male students (M = 3.83, SD = 0.79) rated the effectiveness of simulation lower than their female colleagues (M = 4.06, SD = 0.67) (P = 0.03).
There was no significant difference between the ratings of males (M = 3.74, SD = 0.70) and females (M = 3.93, SD = 0.73) in the modalities domain (P = 0.07).
There was no significant difference between the ratings of males (M = 3.85, SD = 0.74) and females (M = 4.04, SD = 0.66) in the barriers domain (P = 0.06).
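The sex comparison follows the standard independent-samples t-test; the simulated ratings below merely mirror the reported group means and SDs and are not the study data.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
# simulated ratings mirroring the reported group means/SDs (not the study data)
male = rng.normal(3.83, 0.79, size=156)
female = rng.normal(4.06, 0.67, size=179)

# two-sided independent-samples t-test; pass equal_var=False for Welch's variant
t, p = ttest_ind(male, female)
```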
One-way analysis of variance (ANOVA) was used to determine whether students at different academic levels differed in their self-ratings across the three domains. Academic performance affected the perceived effectiveness of the course (F = 7.488, P = 0.001), the ratings of the different modalities (F = 6.404, P = 0.002), and the ratings of barriers (F = 4.205, P = 0.016). Furthermore, the Bonferroni post hoc test showed the following:
1. In the effectiveness domain:
Students with good GPAs gave higher ratings (4.08 ± 0.06) than students with excellent GPAs (3.81 ± 0.07; P = 0.037).
Students with excellent GPAs gave higher ratings (3.81 ± 0.07) than students with poor GPAs (2.96 ± 0.82; P = 0.043).
Students with good GPAs gave higher ratings (4.08 ± 0.06) than students with poor GPAs (2.96 ± 0.82; P = 0.004).
2. In the modalities domain:
Students with excellent GPAs gave higher ratings (3.81 ± 0.08) than students with poor GPAs (2.62 ± 0.68; P = 0.002).
Students with good GPAs gave higher ratings (3.85 ± 0.65) than students with poor GPAs (2.62 ± 0.68; P = 0.001).
3. In the barriers domain:
Students with excellent GPAs gave higher ratings (3.88 ± 0.07) than students with poor GPAs (3.00 ± 0.82; P = 0.031).
Students with good GPAs gave higher ratings (3.98 ± 0.06) than students with poor GPAs (3.00 ± 0.82; P = 0.014).
The same is presented in Fig. 3.
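The omnibus test and Bonferroni post hoc comparisons can be sketched as follows; the group sizes and ratings are invented for illustration, and the Bonferroni adjustment simply multiplies each pairwise P value by the number of comparisons.

```python
import itertools
import numpy as np
from scipy.stats import f_oneway, ttest_ind

rng = np.random.default_rng(1)
# hypothetical effectiveness ratings per GPA band (illustrative, not the study data)
groups = {
    "poor": rng.normal(3.0, 0.8, size=10),
    "good": rng.normal(4.1, 0.6, size=150),
    "excellent": rng.normal(3.8, 0.7, size=120),
}

# omnibus one-way ANOVA across the three GPA bands
F, p = f_oneway(*groups.values())

# Bonferroni post hoc: multiply each pairwise P value by the number of comparisons
pairs = list(itertools.combinations(groups, 2))
adjusted = {}
for a, b in pairs:
    _, p_raw = ttest_ind(groups[a], groups[b])
    adjusted[(a, b)] = min(p_raw * len(pairs), 1.0)
```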
Analysis of Qualitative Responses
Students’ open-ended responses were analyzed using the directed content analysis approach (Table 8).
DISCUSSION
Our study suggests that students perceived positively the effectiveness of integrated clinical skills sessions as well as the use of simulation adjuncts, especially SPs. They acknowledged the need to overcome the obstacles and limitations of simulation. They also valued highly the role of the facilitators in achieving effective sessions, while expecting them to be well prepared.
The use of simulation in the undergraduate medical curriculum is evolving rapidly, including for competency assessment in multiple domains (23). It is also well accepted by health care educators worldwide as a way to improve hands-on experience by enhancing the performance of medical professionals (26). Numerous studies report positive results in terms of increased knowledge, skills, and attitudes toward simulation-based clinical skills sessions, especially in training for resuscitation, airway management, etc., during the clerkship phase (2). A literature review in 1995 concluded that most medical students were deficient in interviewing, history taking, and systemic examination skills (1, 24). In 1998, a Group on Educational Affairs plenary of the Association of American Medical Colleges also discussed the clinical skills deficiencies of medical students (14, 24). Because of the central role of students in current educational systems, it is essential to measure and obtain their input on the use of simulation. As demonstrated in the results above, this self-administered survey is a valid and reliable tool for assessing three domains: perceived effectiveness, simulation adjuncts, and obstacles. After the perceived effectiveness of several components of integrated clinical skills sessions was assessed, this study examined which simulation adjuncts were most appreciated by preclinical students. This is essential for directing the available resources appropriately and also gives insight for the future development of a clinical simulation center. Additionally, this study addressed the limitations of, and obstacles to, the implementation of integrated clinical skills sessions from the students’ point of view.
In our study, students have shown positive perception (any mean rating >3) of the effectiveness of the integrated clinical skills sessions as well as toward the use of all simulation adjuncts they encountered. This agrees with the previous studies (19, 25) and demonstrates the acceptance of students of integrated clinical skills sessions and the different simulation-based learning modalities, even at such early stages, i.e., the 2nd and 3rd yr of medical school. In particular, students find these comprehensively organized, integrated clinical skills sessions helpful for them in achieving their intended learning outcomes.
Students rated standardized patients (SPs), simple simulators, and static models the highest, whereas computer- and video-based simulators were rated the lowest among all the methods of simulation provided. This indicates that students engage differently with these modalities. It could also reflect the objectives of our courses, most of which deal with learning basic clinical skills, e.g., history taking and physical examinations, which are best learned with SPs and partial task trainers. The use of simulation adjuncts such as standardized patients and manikins during integrated clinical skills sessions has been found to bridge the gap between basic and clinical medical sciences (25). In a study by Giesbrecht et al. (4a), SPs were perceived to be most useful for learning and practicing basic clinical skills, particularly interviewing, communication, clinical assessment and interventions, physical examinations, and the objective structured clinical examination (OSCE), and for feedback. A significant improvement in the performance of students using SP feedback was reported by Park et al. (22).
The greatest success of our program is our ability to integrate our clinical skills sessions with the relevant organ system blocks. The challenges in integrating simulation and simulation adjuncts into the medical curriculum have received little attention (9). Some of the barriers identified in planning and implementing comprehensive, integrated, simulation-based curricula are faculty time to evaluate the curriculum and incorporate clinical skills sessions, acceptance and support from the administration, committed facilitators to supervise the sessions, competition for time in the curriculum, and scheduling (19).
Eder-Van Hook (7) has provided suggestions for overcoming obstacles:
1. Enlisting the faculty, clerkship directors, and learners in recognizing the importance of the simulation components, which will aid in surmounting scheduling or time allotment obstacles.
2. Faculty support in the form of developing scenario templates, providing technical assistance, and programming cases.
3. Faculty development.
4. Clearly defined intended learning outcomes or objectives.
5. Cooperation of curriculum planners and the curriculum planning committee.
6. Cooperation of course directors.
Generally speaking, in our study students reported the obstacles in the implementation of clinical skills sessions and acknowledged the need to overcome these limitations. This is demonstrated by the students valuing highly the role of the facilitators in achieving effective sessions while at the same time requiring them to be well prepared. Standardized patients are another area of continuous development that needs close supervision. Students considered SPs the most effective simulation adjunct; nevertheless, they acknowledged that SPs should be well trained to cooperate with them. Interestingly, the students were least concerned about the number of simulators available to achieve effective sessions, despite the fact that our simulator-to-student ratio ranges from 1:6 to 1:12. This indicates that large numbers of students can be accommodated with effectively organized, well-timed sessions, as in our case. Additionally, we believe that sequential demonstration and practice on a few simulators allows students to learn from each other’s mistakes.
The positive correlation between effectiveness and modalities of simulation implies that the success of simulation aspects goes with the appropriateness of simulation modalities and the quality of the simulators.
Similar to what Joseph et al. (13) found, female students seem to find simulation more effective for their learning and skills development than their male colleagues do. Yet when comparing the two sexes’ perspectives on simulation adjuncts and on the barriers to reaching their learning goals, we find no significant difference. Of note, at our college male and female students take their sessions separately. As an educational center, we have always recognized the need to overcome this through strict standardization between males and females while at the same time adapting to the cultural needs of our students. These results indicate that, even with such standardization, which successfully made artificial simulators and standardized patients equally accessible to females and males, females have a different perspective on simulation than males. This may point to a psychosocial variability that requires further investigation before any appropriate sex/gender-specific standardization is attempted.
We also addressed the prospect of standardization through simulation for students at different levels of academic performance. Because of the low number of poor-performance students (GPA <2.5) enrolled in the study, we focused on the difference between average (GPA >2.5 to <3.5) and high-achieving students (GPA >3.5). When addressing simulation adjuncts and the obstacles to implementation, we again noted that the effect of standardization did not differ significantly in the students’ perspectives, regardless of academic level. However, when examining the overall perceived effectiveness of simulation sessions, “average” students rated the effectiveness higher than “high-achieving” students. Noting that the cumulative GPA of preclinical students is based mostly on their performance in theoretical subjects, this may shed light on differences in performance among different types of learners, possibly indicating that students who prefer a more hands-on approach (represented here by simulation) as an effective learning tool may perform less well in theoretical exams. Further analysis and research should examine such a hypothesis.
In conclusion, students:
- showed a positive perception of the effectiveness of integrated clinical skills sessions as well as of the use of simulation adjuncts;
- suggested overcoming the obstacles encountered and the limitations of simulation;
- valued highly the role of the facilitators in achieving effective sessions; and
- considered SPs the most effective simulation adjunct.
Preclinical students valued clinically oriented simulation the least, even though it is favorable for them.
The author declares no conflicts of interest, financial or otherwise.
M.Z. conception and design of research; M.Z. performed experiments; M.Z. analyzed data; M.Z. interpreted results of experiments; M.Z. prepared figures; M.Z. drafted manuscript; M.Z. edited and revised manuscript; M.Z. approved final version of manuscript.
- Copyright © 2016 the American Physiological Society