Learning clinical skills presents a novel experience for undergraduate students, particularly when preparing for skill assessment. Compared with the thousands of hours of practice believed necessary for the development of motor skill expertise (1), these students have very limited exposure time. Furthermore, effective clinicians combine procedural skill expertise with the flexibility to implement these skills in real-world environments in which clients and situations can change. Therefore, unlike the learning of theory (such as solving a physics problem), rote learning and repetition may not maximize learning outcomes for the development of new clinical skills.
A substantial body of research shows that conception of learning influences learning outcomes in university students, and much of this research has been conducted in undergraduate science cohorts (3, 5, 6, 9, 11). For instance, when first-year physics students approached learning with a focus on understanding (a deep conception and approach), their learning outcomes were superior to those of students who intended merely to complete the task requirements and associated learning with memorization and reproduction (a surface conception and approach) (4). Yet it is unclear whether conception of learning influences outcomes for clinical skill development. Conception of learning varies with the academic task (7), and it could be argued that, unlike theory-focused learning (e.g., physics), a deep conception and approach may not lead to superior learning of hands-on clinical skills. Arguably, a conception of learning that emphasizes memorization and procedural skill reproduction (characteristics of surface learning) may be ideal for undergraduate students preparing for clinical skill assessment, particularly given the limited preparation time available in the context of a 10- to 13-wk unit of study. Students tend to identify for themselves what counts for assessment (or at least what they perceive counts) and orient their study efforts accordingly (2).
This pilot study reports the level of clinical skill mastery in undergraduate exercise science students who were learning and being assessed on clinical skills for the first time. In this unit, we provided the students with a variety of learning experiences (delivery modes: videos, face-to-face tutorials, printed handouts, and lectures) from which to prepare for a skills examination that assessed their accuracy of implementing clinical techniques with real human subjects. The purpose of this investigation was to examine whether student performance in the skills assessment was related to their conception of learning, which was inferred from responses to a voluntary survey. To gain some insight into student approaches to the learning of clinical skills and to inform future teaching and learning approaches, we also sought to examine student engagement with the learning material and the possible interaction between clinical skill mastery and conceptions of learning.
This research was conducted in a first-year undergraduate course (EXSS 1032 Fundamentals of Exercise Science). This unit of study is delivered in semester 2 of year 1 and is the first in which students enrolled in Exercise and Sport Science and Bachelor of Health Science degrees learn practical skills essential to their professional development. All students were individually assessed on their performance of these skills. For the purposes of our investigation, performance was assessed on both 1) procedural accuracy and 2) mastery.
A total of 289 undergraduate students were enrolled in the unit of study. All students learned clinical skills via interactive laboratory classes and self-directed learning with bespoke instructional videos and supporting text, which were available via an online resource (WebCT). Interactive laboratory classes (n = 22–24 students) were conducted for 2 h weekly for a total of 6 wk and were supervised by a tutor, who was available to answer questions and provide feedback on clinical skills. All tutors held or were completing a Masters or PhD degree in exercise science and had at least 6 yr of experience with the skills being assessed. All tutors acted as the assessors for the subsequent clinical skills examination. Students were also able to book the laboratory outside of these times to practice the skills. Before providing their consent, participants were informed of the study protocol and that involvement was voluntary.
Students undertook an examination of their clinical skills in the eighth or ninth wk of the semester, giving them ∼2 mo between starting the unit and completing the assessment. During the examination, all students were assessed on three tasks, which comprised blood pressure measurement and two of the following: resistance exercise instruction, heart rate (HR) measurement/electrocardiography (ECG), Monark cycle ergometry, respiratory gas (Douglas bag) collection, or respiratory gas sampling (Douglas bag analysis). The latter two were administered by balanced design, such that each skill was completed by ∼55 students. All skills are considered fundamental to the profession and therefore of similar complexity.
Students received two results for their assessment: 1) a score (out of 15) for their procedural skill [procedural skills score (PSS)] and 2) categorization as “Mastered” or “Failed.” Procedural scores were composed of task-specific performance indicators (see the Supplemental Material).1 For example, for Monark cycle ergometry, points were obtained for correct determination of force, seat height, instruction given to the participant to begin pedaling at zero force, and continual monitoring of force/pedaling rate; for blood pressure, students received marks for correct stethoscope handling, cuff placement, inflation pressure, and cuff deflation speed. Students were deemed to have Mastered the assessment if all of the Mastery criteria boxes (see the Supplemental Material) were ticked for each of the three tasks, indicating accurate performance of all tasks on first attempt. That is, aside from the procedure being followed appropriately, we evaluated whether the measurement/outcome obtained was accurate. Namely, for Mastery, the student's reported blood pressure, HR, or Douglas bag gas collection/analysis measurement had to agree with that measured by the assessor, the student's Monark cycle ergometer instruction needed to ensure that the client maintained cycling at a correct power output, and the student's resistance exercise instruction was required to be performed in such a way that the client was safely receiving the correct movement/feedback and load. Our rationale for a focus on Mastery was to separate learners who mimicked skill from those who could implement a skill. We highlighted to students that merely following a procedure would not necessarily produce an accurate result, in this case, a clinical measurement. All assessors had strict written instructions to inform their assessment of both PSS and Mastery. For Mastery categorization, the measurement outcome needed to comply with that of the trained assessor (see the Supplemental Material).
Self-reported interactions with learning material and conception of learning.
Students were administered an electronic questionnaire (SurveyMonkey, http://www.surveymonkey.com), which they could voluntarily complete within the 7 days preceding the practical examination. Questionnaire items 1–4 were broader questions pertaining to purported activities undertaken for assessment preparation, perceived level of confidence, and perception of the role of laboratory classes. These were designed to inform our future delivery of laboratory classes but were not included in the present analysis because they did not directly relate to student conceptions of learning. Three items on the questionnaire (items 5–7) assessed student conceptions of learning using a five-point Likert-type scale (strongly disagree to strongly agree; Table 1), which was converted to a numerical result (scores of 1–5, respectively), and were based on the work of Ramsden (4) and communication with the university's Institute for Learning and Teaching, including an authority in this field (Prof. Keith Trigwell). From these three items, a total conception of learning score (CLS; out of 15) was calculated for each student. Disagreement with item 5 (“I can master a skill through videos and written instructions without the need for feedback from my tutor”) and item 7 (“Learning a new skill is more about me practicing the skill than exploring my personal understanding of the skill”) and agreement with item 6 (“I feel that understanding how a technique works rather than practicing it is a good way for me to master a skill”) contributed to a higher CLS, which we inferred as consistent with a deep conception of learning. For the subanalysis by learning style, students who gave a neutral response (a score of 3) to any of items 5–7 were removed from the analysis; individuals who answered two or more of these questions as “agree/strongly agree” or “disagree/strongly disagree” in the same direction were then categorized as “deep” or “surface” learners, respectively.
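As a concrete illustration, the scoring and subgrouping rules described above can be sketched in a few lines of code. This is our own sketch (the function names are hypothetical, not part of the study's materials); the logic follows the text: items 5 and 7 are reverse-scored, item 6 is scored directly, and neutral responders are excluded from the subanalysis.

```python
def cls_score(item5, item6, item7):
    """Total conception of learning score (CLS, out of 15).

    Each item is a 1-5 Likert response (strongly disagree .. strongly agree).
    Items 5 and 7 are reverse-scored, because disagreement with them is
    consistent with a deep conception; item 6 is scored directly.
    """
    reverse = lambda score: 6 - score
    return reverse(item5) + item6 + reverse(item7)


def learner_category(item5, item6, item7):
    """Deep/surface subgroup rule for the subanalysis.

    Any neutral response (3) excludes the student; otherwise the student
    is classed by the direction in which two or more items were answered.
    """
    if 3 in (item5, item6, item7):
        return None  # removed from the subanalysis
    # "Deep-direction" answers: disagree with items 5/7, agree with item 6
    deep_answers = sum([item5 <= 2, item6 >= 4, item7 <= 2])
    return "deep" if deep_answers >= 2 else "surface"
```

For example, a student answering strongly disagree (1), strongly agree (5), and strongly disagree (1) to items 5–7 would score the maximum CLS of 15 and be classed as a deep learner.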
To investigate general student engagement with the learning material, we also obtained electronic records of the number of views of supporting video material preceding the practical examination. To examine whether differences in Mastery were related to learning material engagement, we calculated the total number of video viewings and time of viewing on WebCT for a subgroup of respondents: those with the 10 highest and 10 lowest CLSs from both the Mastered and Failed groups.
Calculations and statistics.
All data are reported in the text and displayed in the figures as means ± SD. One-tailed t-tests were used to compare the prevalence of Mastery and PSSs between deep and surface learners. Statistical significance was accepted at P < 0.05. Calculations were performed using Excel (Microsoft Office 2007).
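For readers wishing to reproduce the group comparisons without spreadsheet software, a minimal sketch of the equal-variance (pooled) two-sample t statistic, the default behind Excel's equal-variance t-test, might look as follows (the sample data in the test are illustrative only, not the study's):

```python
from math import sqrt
from statistics import mean, variance


def pooled_t(sample_a, sample_b):
    """Two-sample t statistic with pooled variance.

    A one-tailed P value is obtained by comparing this statistic against
    the t distribution with len(a) + len(b) - 2 degrees of freedom.
    """
    na, nb = len(sample_a), len(sample_b)
    pooled_var = ((na - 1) * variance(sample_a)
                  + (nb - 1) * variance(sample_b)) / (na + nb - 2)
    return (mean(sample_a) - mean(sample_b)) / sqrt(pooled_var * (1 / na + 1 / nb))
```

With SciPy available, `scipy.stats.ttest_ind` returns the same statistic along with a two-tailed P value, which can be halved for a one-tailed test.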
Of the 281 students who completed the practical examination, 147 students received a Mastery result (52% of the cohort).
CLS and PSS.
A total of 168 students completed the voluntary questionnaire (60% of the cohort). There were no significant differences in PSSs for those students who did and did not complete the questionnaire (11.96 ± 1.82 vs. 11.83 ± 2.47, P = 0.331). Similarly, there were no significant differences in the proportion Mastered for those students who did (54%) and did not (50%) complete the questionnaire (P = 0.273; Table 2).
For the cohort who completed the questionnaire, students who achieved a Mastery result had a significantly higher CLS than those who Failed (9.25 ± 1.58 vs. 8.71 ± 1.54, respectively, P = 0.013; Fig. 1). PSS was also significantly higher in those who Mastered the skill (12.64 ± 1.56 vs. 11.17 ± 1.80, respectively, P < 0.001; Fig. 1).
On the basis of questionnaire data, 28 of the 168 students were deemed as surface learners and 19 students were deemed as deep learners. PSSs were higher in deep versus surface learners, although this did not reach statistical significance (12.28 ± 1.93 vs. 11.61 ± 1.73, respectively, P = 0.120; Table 3). The percentage of students who received a Mastery result was also higher in deep (56%) versus surface (39%) learners, although this did not reach statistical significance (P = 0.147; Table 3).
Among students who Mastered the skill, those with a deep conception of learning recorded significantly fewer video hits than those with a surface conception (24.8 ± 11.0 vs. 34.4 ± 13.3, respectively, P = 0.048; Table 4) and spent significantly less time viewing the videos (81.40 ± 40.84 vs. 138.40 ± 74.07 min, respectively, P = 0.024; Table 4).
The aim of this study was to report the level of performance (as both procedural reproduction and mastery of implementation) in a clinical skills assessment in undergraduate exercise science students who were learning and being assessed on these skills for the first time. Our primary goal was to investigate whether conception of learning was related to performance in a clinical skills assessment. The results show that Mastery (i.e., the ability to achieve an accurate outcome for all examined skills on first attempt) was associated with a higher CLS on our bespoke survey. We contend that a higher CLS is consistent with a deeper conception of learning. Furthermore, when we directly compared cohorts of students identified as having a bias toward a deep or surface conception, there was a higher percentage of Mastery and a higher PSS in students whose survey responses were consistent with a deeper conception of learning, although this did not reach statistical significance. A deep conception was also associated with significantly less time engaging with the electronically delivered learning material.
The original theory that conception of learning is related to learning outcomes was developed in undergraduate science students who were learning theoretical knowledge and academic problem solving (7). In professional practice-based undergraduate degrees, there is an additional emphasis on the learning and assessment of practical/clinical skills, and there is some evidence to suggest there may be less interaction between conception of learning and learning outcomes in this environment (8, 10). We sought to identify conception of learning with our questionnaire and see if it related to performance in a skills assessment, and the results of our pilot study provide early evidence to support this. We highlight that our categorization of deep versus surface conception of learning via CLS was made on the basis of a bespoke survey and acknowledge that the test lacks psychometric data. The survey questions focused on replication versus the development of understanding when approaching skill learning, and we contend that it provided useful information to dichotomize students for the purpose of this study. The three questions upon which the CLS was derived were based on terms commonly used to describe surface versus deep learners (7) and were formulated in consultation with an expert in the field from the university's Institute for Teaching and Learning (Prof. Keith Trigwell). Furthermore, we only used the extremes of responses to characterize deep and surface learners, i.e., those who consistently reported a strong bias toward answers consistent with a surface or deep approach across the questions.
The majority of students completed the voluntary questionnaire, and we appear to have a representative sample in that there was no difference in PSS or the proportion who Mastered between those who did and did not complete the questionnaire. The high PSS (∼12/15) yet high fail rate (∼50%) of the cohort highlights the relative ease with which undergraduate students learn and replicate the basic procedure for novel clinical skills but the difficulty of implementing these skills accurately with real human subjects. On the basis of the responses to this questionnaire, we were able to identify a difference in skill performance by purported conception of learning. Specifically, a Mastery result was significantly more likely in students who had a higher CLS, which was consistent with a deep conception of learning. This outcome builds on those seen previously in undergraduate students learning science concepts (3, 5, 6, 9, 11). Although not statistically significant, we observed a similar trend of greater clinical skill mastery in those students who were classified exclusively as deep (56%) versus surface (39%) learners. These findings indicate that there may be a relationship between conception of learning and outcomes in clinical skill acquisition. The inability to statistically confirm this may reflect a lack of power in the present study. However, the nature of the assessment item used in this study may also have affected this result. Our assessment comprised fundamental clinical skills, which may have reduced the capacity for the task to differentiate performance between those with deep and surface conceptions of learning due to the lack of more advanced or difficult task demands. Findings of a weaker relationship between assessment outcomes and conceptions of learning have been demonstrated in a medical student cohort being assessed on basic competencies (8).
A lack of a relationship between strategies for learning and performance in clinical rotations has also been demonstrated in more advanced medical students, potentially due to the significant time restrictions and chaotic nature of the clinical environment preventing utilization of deep learning strategies (10). While our study assessed conceptions of learning rather than learning approaches, it is conceivable that more challenging assessment tasks that allow students to demonstrate high-level learning may have better differentiated performance between students with surface and deep conceptions of learning.
As conceptions of learning are not fixed, the present results may suggest that teachers should actively encourage a deeper approach to the learning of practical skills, despite the possible assumption that mimicry is appropriate. In addition to enhancing student learning outcomes, this could also reduce the teacher contact hours required for repeat assessments. That is, we experienced a high rate of failure during this assessment task, which resulted in students having to repeat the skills until superior performance was demonstrated, ultimately increasing teacher-student contact time. On the basis of our observation of less engagement with the supporting material in those with a deep conception of learning who Mastered the assessment, we speculate that these students used their face-to-face time more efficiently. Perhaps these students used the video/supporting material as a learning refresher, whereas those adopting a surface approach may have relied heavily on the videos as a teaching tool and made less effective use of face-to-face time. More research is warranted to best inform approaches for teaching clinical skills in the future.
We acknowledge that our findings are from a limited sample size and based on a relatively narrow pool of clinical skills, which may reduce the generalizability of our results to other clinical teaching and learning contexts. The categorization of the approach to learning (CLS) was also derived from a small number of self-reported questions based on a bespoke survey. More detailed assessment of conception of learning in the future may help to clarify this issue. However, our findings should serve as a stimulus for future research examining the interaction between learning approach and clinical skills development for undergraduate students. This potentially includes issues such as how to implement teaching and assessment to best promote a deep approach to clinical skill learning and the interaction between conception of learning and the long-term retention of clinical skills.
No conflicts of interest, financial or otherwise, are declared by the author(s).
Author contributions: N.A.J., V.H.C., and K.B.R. conception and design of research; N.A.J. and K.B.R. performed experiments; N.A.J. and K.B.R. analyzed data; N.A.J., V.H.C., and K.B.R. interpreted results of experiments; N.A.J. and K.B.R. prepared figures; N.A.J., V.H.C., and K.B.R. drafted manuscript; N.A.J., V.H.C., and K.B.R. edited and revised manuscript; N.A.J., V.H.C., and K.B.R. approved final version of manuscript.
The authors thank Prof. Keith Trigwell for the assistance with the study design.
1 Supplemental Material for this article is available at the Advances in Physiology Education website.
Copyright © 2013 The American Physiological Society