Evidence shows that the factors contributing to success in physiology education for allied health students at universities include not only high school achievement and background but also factors such as the confidence of their teachers and the quality of their learning experience, justifying intensive and continued surveying of students’ perceptions of their learning experience. Here we report data covering a 3-yr period in a physiology subject that was redesigned for blended and online presentation. Consistent with previous reports, we show that when we undertook a blended mode of delivery, students achieved better grades than under traditional modes of teaching; however, the subsequent removal of didactic teaching from this subject resulted in lower grades overall. Students had very strong positive attitudes to weekly quizzes (80% positive approval) but reported ambivalent attitudes to online self-directed learning (61% negative perception), even though they had 2-h weekly facilitated workshops. Overwhelmingly, students who undertook the subject in a self-directed online learning mode requested more face-to-face teaching (70% of comments). From these data, we suggest that there is a quantifiable benefit to didactic teaching in the blended teaching mode that is not reproduced in online self-directed learning, even when face-to-face guided enquiry-based learning is embedded in the subject.
- online delivery
- live lecture
- blended learning
- online education
- human bioscience
- online pedagogy
- nursing education
- human bioscience problem
- formative quizzing
The higher education setting is continuously evolving into a technology-rich environment. How the ever-increasing variety of online learning tools is used to balance the expectations of the incoming student population against the push from university management to work smarter and more cost-effectively must be considered. It is well documented that universities are under greater pressure than ever before to cut budgets while providing greater student flexibility and increasing enrollments (7, 13, 20). This has in part led many institutions to identify and pursue strategies for increasing the use of digital learning platforms to deliver learning content to students.
The Sloan Consortium provides a numerical definition of blended learning, stating that a course with between 30 and 79% of its content online can be considered blended. It uses the term “web enhanced” to describe courses where online content contributes <30%, whereas a course with >80% of its content online is considered “online” (2). Kwak et al. (10) agree that online learning is where all of the learning resources are delivered through an information technology platform, with limited face-to-face contact.
The reasons for the shift from a traditional face-to-face model to a blended learning model are varied. Incorporating information technology into a curriculum is particularly useful for students who are unable to attend face-to-face classes because of distance from campus or other commitments such as full-time employment (12). Various studies (11, 17, 21) have discussed the needs of the ever-changing student body, noting that current students come to university digitally literate, preferring to access learning materials when and where it suits them and at a pace appropriate to their learning style (11).
Teaching strategies that rely on “self-directed” learning assume that students are both self-motivated and well organized in their study habits. Online learning, in which students are required to work in a self-paced manner through sets of prepared learning activities such as assigned reading, mini-lectures, and problem sets, can therefore be considered a self-directed learning design. A recent meta-analysis of the available empirical data on online and blended learning, as defined previously (8, 16), suggests that although blended learning programs produce significantly better outcomes in terms of student performance in subjects, online subjects fare no better than traditional didactic learning modes.
The shift toward asynchronous presentation and online teaching resources in place of face-to-face teaching disengages students (3). The science anxiety that many allied health students experience adds to the disengagement that they feel toward their tertiary studies (5, 9, 14). Teaching physiology to students of allied health degrees has well-established challenges (15). A recent literature review (15) identified the attributes that contribute to success in physiology: high school success, as measured by ATAR (Australian Tertiary Admission Rank) or high school grade point average, a science background, and a low degree of science anxiety. The most intensively studied area of allied health physiology education is nursing education. Unfortunately, entrance requirements for nursing are low, with low ATARs and no required science background, meaning that many students in this cohort are at known risk of failing physiology (4, 9).
Another important factor linked to the success of nursing students in physiology is the perceived quality of the teaching. In this context, factors like the amount of content, teaching style, and degree of confidence that instructors have in the course content are important factors that influence student satisfaction (15).
Previously, we published (19) a description of a subject redesign process: the development and transition of a first-year physiology subject, targeting a broad range of allied health students including nursing, public health, and physiotherapy, from a traditional didactic design (3 h/wk of lecture and 1 h of instructor-led tutorial) to a design that emphasized team-based and active learning. This redesigned subject (delivered in 2012 and 2013) not only transformed the subject to student-centered learning but also matched the “blended” subject design as defined above: 30% of the learning objectives were available to students only via self-directed learning modules in an online format, and there were weekly online preworkshop quizzes. In that paper we reported significant improvement in student mastery of the learning outcomes, as reflected by a higher proportion of students achieving “A” and “B” grades.
In 2014, in response to institutional priorities, the subject underwent another redevelopment, with the removal of all lectures and didactic teaching; instead, learning activities to support mastery of the learning objectives were provided solely through online content. The 2-h weekly workshop activities (face-to-face, enquiry-based learning approach) were maintained as implemented in 2012 and 2013.
The perceived quality of the learning experience becomes an important consideration in the subject design.
Thus the aims of the study were the following:
- to assess the impact of online learning resources on the students’ perceptions of their learning experiences;
- to evaluate the impact of the changed pedagogy on the students’ learning outcomes;
- to identify the teaching and learning resources utilized in the subject that the students found most beneficial to their learning and to track the durability of these resources over time.
MATERIALS AND METHODS
This comparative cross-sectional study was carried out at La Trobe University’s regional campuses (Bendigo, Albury/Wodonga, Mildura, and Shepparton) in the years 2012–2014. The study was approved by the La Trobe University Human Ethics Committee (2012: FHEC12-171; 2013: FHEC13-071; and 2014: FHEC14-084).
Only students who were enrolled in first-year, first-semester physiology at four regional campuses of La Trobe University were surveyed. Students at all four campuses were invited to participate in this survey.
Context of study.
Within our institution, first-year physiology is a common subject that is mandatory for all allied health students. It is offered in semester 1 to ~2,000 students at the five campuses of La Trobe University (1 major metropolitan campus and 4 regional/rural campuses). Approximately one-third of the cohort comes from the regional campuses.
In 2012, this first-year physiology subject became a blended learning subject consisting of traditional face-to-face lectures and workshops covering “core” physiology concepts, with “extension” material covered independently, outside scheduled class times, through guided online activities, as described previously (19). Workshop activities allowed students to consolidate and apply newly attained knowledge. The team-based, guided enquiry design of the workshop activities encouraged peer-to-peer learning of the subject content. The peer-to-peer interaction and the nature of the guided enquiry activity promoted good learning outcomes, particularly among the most at-risk students, with academically underprepared students benefiting from the guidance, support, and example of their more credentialed colleagues (13). These workshops were initially very guided and conceptual, progressing into more clinically relevant or real-life scenarios. Core material was assessed during scheduled workshops using a collaborative testing method: 50 multiple-choice questions (MCQs) were first tested individually under exam conditions, followed by team submission of the same 50 MCQs. Extension material was assessed under exam conditions at the end of the semester.
In 2012, compulsory pre-workshop online quizzes were introduced as a study tool to provide extra support for students undertaking the core physiology subject, as described previously (Table 1) (19). In 2013, weekly pre-workshop quizzes via Mastering A & P (Pearson) were incorporated into the curriculum to facilitate the transition into the workshop component of the subject and to ensure that students engaged with the online resources. The quiz was linked to a 2% workshop participation grade, implemented to encourage students to remain engaged throughout the semester. These online quizzes included a combination of MCQs, diagram-labeling and true/false questions, and a range of multimedia tools produced by the publisher.
In 2014, all lectures were cancelled in the subject, and teaching staff developed a package of learning resources intended to deliver the subject content online and/or via self-directed learning. These resources included short screen-capture presentations recorded by academics that addressed individual learning outcomes, recommended reading from the prescribed textbook, links to appropriate YouTube videos, and/or animated resources provided by the publisher of the textbook. Weekly pre-workshop quizzes via Mastering A & P were retained. The learning management system (LMS) was organized by physiological system and constructively aligned, such that each resource was directly tied to the learning objective it was intended to address. Neither PowerPoint slides nor lecture notes were available; rather, instructors wrote “worksheet” activities, problem sheets that students filled in as they worked through the online resources, intended as a replacement for lecture notes and to promote student engagement as they progressed through the self-directed learning tasks. As with every previous iteration of the subject, a student subject learning guide was produced that listed all of the intended learning outcomes by topic and contained all of the guided enquiry activities to be completed in the workshop.
Survey and data analysis.
Each year, students studying the physiology subject were asked to complete a survey about the learning environment. The survey tools described in Table 2 consisted of questions on the demographics of the survey participants, items asking students to rank the value of the learning resources provided, and a number of Likert items (5-point scale) intended to elicit students’ perceptions of their learning experience in the subject. Two specific attitudes were assessed by the Likert items: student perception of blended and online learning and student perception of the value of the weekly online quizzes.
Summative scales for each of the attitudes assessed by the Likert items were calculated according to the methods of Trochim (22) and Desselle (6). Each item was assigned a maximum value of 5: responses that strongly agreed with the statement were graded 5/5, whereas strongly disagree was assigned a value of 1/5. Inverse statements (i.e., “I did not like. . .”) were assigned inverted values (strongly disagree = 5, strongly agree = 1). For each attitude investigated, the values of the responses were summed, giving a single attitude score (for 8 items, maximum score = 40; for 3 items, maximum score = 15). Summative scores are expressed as a percentage of the maximum score to facilitate comparisons between years. Average responses for individual Likert items are expressed as the mean response ± SD. The surveys provided opportunities for the students to provide expanded answers, and a thematic analysis of their responses was then conducted.
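The summative scoring just described can be sketched in a few lines of Python. This is an illustrative sketch only (the study's scoring was performed on survey data, not with this code), and the example responses are invented:

```python
# Summative Likert scoring: strongly agree = 5 ... strongly disagree = 1,
# with inverse statements ("I did not like...") scored in reverse.

def item_value(response, inverse=False):
    """Map a 1-5 Likert response to its scored value (inverse items reversed)."""
    return 6 - response if inverse else response

def summative_score(responses, inverse_flags):
    """Sum the item values; express as a percentage of the maximum (5 per item)."""
    total = sum(item_value(r, inv) for r, inv in zip(responses, inverse_flags))
    return 100 * total / (5 * len(responses))

# One student's answers to a hypothetical 3-item scale (item 2 is an inverse item):
print(round(summative_score([5, 1, 4], [False, True, False]), 1))  # -> 93.3
```

Expressing each score as a percentage of the maximum (40 for 8 items, 15 for 3 items) is what makes the 2012 three-item scale comparable with the 2013–2014 eight-item scale.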
Thematic analysis was also conducted on the institutional student feedback reports on the question, “Which 2 or 3 specific aspects of the subject have contributed most to your learning?” This analysis was conducted to measure the students’ perception of the quality of the learning resources.
Statistical tests were performed in Microsoft Excel 2013, using the Analysis ToolPak add-in. Single-factor ANOVA compared student ATARs and marks for first-year physiology. When statistical significance was found (P < 0.001), post hoc t-tests with Bonferroni correction were performed between the comparison years (2012 vs. 2013, 2012 vs. 2014, and 2013 vs. 2014).
Data are presented as means ± SD.
RESULTS
For all years of the survey, the male/female ratio was disproportionately female, which reflects the enrollment. In every year of the survey, the majority of respondents entered university directly from high school, with the remainder entering via alternative entry pathways. Consistent with the majority of students entering tertiary studies directly from secondary school, the majority of respondents were <20 yr of age.
In 2013, the proportion of respondents from the main campus (campus A) was underrepresented (51% compared with 70–71% for 2012 and 2014, respectively), whereas one satellite campus was substantially overrepresented (campus D). Campus B was underrepresented in both 2012 and 2013.
Subjects’ final results.
The distribution of student grades for this first-year physiology subject (2011–2014) is shown in Fig. 1. The 2011 and 2012 data have been reported previously (19) and are included here to aid in assessing the impact of the change of andragogy toward blended learning. As reported previously, redesigning the subject to promote active, team-based learning substantially improved student outcomes, with a far larger proportion of students earning “A” and “B” grades. ANOVA of the subject grades between 2012 and 2014 shows a statistically significant difference in the final grade [F(2, 1,354) = 70.4, P < 0.001]. Post hoc testing showed no significant difference in results between 2012 and 2013 [t(885) = 1.39, P = 0.16] but found both of these years to differ significantly from 2014 [2012: t(906) = 10.9, P < 0.001; 2013: t(915) = 9.31, P < 0.001]. The mean grade was 71.9 ± 9.9% in 2012 and 71 ± 10.5% in 2013, falling to 64 ± 10.7% in 2014.
Comparison of entry ranking (ATAR) showed no statistical difference among the 2011–2014 student cohorts [F(3,1922) = 0.87, P = 0.46].
Careful inspection of the distribution of final results (Fig. 1) reveals that the distribution of grades seen in 2014 more closely resembles that observed in 2011 (traditional didactic andragogy) than those seen in 2012–2013 (team-based learning). The transition to blended learning in 2012 and 2013 shifted the entire cohort’s marks toward a greater proportion of students gaining “A” and “B” grades. In contrast, the cohort results shifted back toward the lower “C” and “D” grades in 2014, when the lectures were removed from the subject. Comparing the proportion of students who earned either an “A” or “B” in each of the years between 2012 and 2014 with the proportion who earned a “C” or “D” (the AB/CD ratio) shows that, in 2012, for every student who earned a “C” or “D”, 1.4 students earned either an “A” or “B”. Similarly, in 2013, the AB/CD ratio was 1.3. In 2014, this ratio fell to 0.48; in other words, for every two students who earned either a “C” or “D” in 2014, roughly one student earned an “A” or “B”. This is comparable with the 2011 AB/CD ratio (0.35).
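The AB/CD ratio arithmetic can be made concrete with a short sketch. The grade counts below are hypothetical, chosen only to reproduce a ratio near the reported 2014 value, not the study's actual grade counts:

```python
# AB/CD ratio: number of students earning "A" or "B" divided by the number
# earning "C" or "D". These grade counts are hypothetical.

def ab_cd_ratio(grade_counts):
    """grade_counts: dict mapping letter grade to number of students."""
    ab = grade_counts.get("A", 0) + grade_counts.get("B", 0)
    cd = grade_counts.get("C", 0) + grade_counts.get("D", 0)
    return ab / cd

# A ratio of 0.48 means roughly one "A"/"B" for every two "C"/"D" grades:
print(ab_cd_ratio({"A": 40, "B": 56, "C": 110, "D": 90}))  # -> 0.48
```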
The substantial alteration to the subject was entirely in the presentation of learning content (teaching), which went from 26 h of lecture and 26 h of workshop and the majority of content delivered in lectures (some online) to no face-to-face lectures (all content delivered online) and 24 h of small class workshops. It is reasoned that the substantial fall in subject grade averages is related to the altered teaching approach.
Student perceptions of learning support and tools.
In 2012 and 2013, face-to-face lectures were a key part of the subject presentation. In both years, staff identified five key teaching and learning resources available to students: 1) lectures, 2) workshops, 3) asynchronous online discussion, 4) weekly pre-workshop quizzes, and 5) private studies. On the survey, students were asked to rank the value of each of these resources. A single score for each resource was determined by calculating the weighted average response: a top ranking was assigned a value of 5 and a bottom ranking a value of 1, the percentage of respondents at each rank was multiplied by the assigned value, and these weighted averages were summed, giving a score out of 5 for each resource. The value of each learning resource is shown in Fig. 2A. It is notable that the two teaching/learning resources involving face-to-face interaction between instructor and students (lectures and workshops) were consistently seen as the most valuable. Forty-three percent of respondents listed lectures as their most important learning resource in 2012, and 44% did so in 2013. Similarly, 60% (2012) and 47% (2013) listed workshops as their most important learning resource. The final rankings placed lectures and workshops as nearly equally highly valued in each of 2012 and 2013 (lectures: 4.2 out of 5 in 2012, 4.1 in 2013; workshops: 4.1, 4.2), with little distinction between weekly quizzes and private studies (quizzes: 3.4, 2.9; private studies: 3.8, 2.9). Asynchronous online discussion was the least valued resource (2.1, 1.5).
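The weighted-average ranking score described above can be sketched as follows; the respondent fractions in the example are invented for illustration:

```python
# Weighted-average ranking score: the fraction of respondents placing a
# resource at each rank is multiplied by that rank's value (top rank = 5 ...
# bottom rank = 1) and the products are summed, giving a score out of 5.

def ranking_score(fractions_by_rank):
    """fractions_by_rank: fraction of respondents at ranks 1 (top) .. 5 (bottom)."""
    values = [5, 4, 3, 2, 1]  # value assigned to each rank, top to bottom
    return sum(f * v for f, v in zip(fractions_by_rank, values))

# e.g. 43% ranked a resource first, 30% second, 15% third, 8% fourth, 4% fifth:
print(round(ranking_score([0.43, 0.30, 0.15, 0.08, 0.04]), 2))  # -> 4.0
```

A resource ranked first by every respondent would score 5.0; one ranked last by everyone would score 1.0.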
In 2014, academic staff involved in the subject identified 10 different teaching/learning resources to assist students in mastering the subject content (Fig. 2B). On the 2014 survey, students were asked to rank (in order from 1 to 5) the top 5 (of 10) activities that contributed most to their learning experience. The rankings of each learning activity are summarized in Fig. 2B. Students ranked the workshops (face-to-face) as their most important learning experience (2.8/5). The next most valued learning resources, in order, were the weekly pre-workshop quiz (2.4), the recommended readings (1.7), and the worksheets (1.6). Surprisingly, the academic presentations (1.3), the student study guide (which contained a list of all the learning objectives and the workshop activities) (1.0), and the publisher’s animation package (1.0) ranked only as the fourth- to fifth-most valued resources. The remaining learning resources scored <1, with the asynchronous discussion being the least-valued resource, scoring 0.08/5.
The popularity of each of the learning resources was also assessed by simply counting the number of times a resource was listed in the students’ top 5 learning resources. The order of popularity was pre-workshop quizzes (75% of students included them in the top 5), workshops (74%), recommended readings (58%), worksheets (57%), private studies (46%), academic presentations (43%), publisher’s animations (35%), student study guide/external sources (33%), and asynchronous forums (4%).
Students’ attitude toward pre-workshop quizzes.
Student responses on each of the surveys to the questions relating to attitudes toward the pre-workshop quizzes were scored as follows. Each Likert item was on a five-point scale (1 = strongly disagree, 5 = strongly agree). For each item, a group score out of 5 was calculated as the weighted average of the student responses, i.e., the proportion of students selecting each response level was multiplied by its value (×1 for “strongly disagree” through ×5 for “strongly agree”) and summed across levels. Items expressing a negative attitude toward the pre-workshop quizzes were scored in reverse order (5 = “strongly disagree” and 1 = “strongly agree”). The item scores were then summed to give a single “student attitude toward pre-workshop quiz” score, with a higher score representing a more positive attitude toward the quizzes. In 2012 the maximum positive attitude score was 15, whereas in 2013–2014 the maximum was 40.
Figure 3 demonstrates the overall positive attitude students have toward weekly online quizzes. These quizzes are a hurdle requirement (i.e., they are compulsory but of no ultimate grade value). The response to individual Likert items can be seen in Fig. 3. Likert scale scores of 12/15 (78%, 2012), 32/40 (79%, 2013), and 31/40 (78%, 2014) were recorded for these items, suggesting that students have an overwhelmingly positive attitude toward the weekly quizzes.
Student usage of learning resources in 2014.
The majority of students self-reported viewing the academic presentations and reading the prescribed text most of the time. Completion of worksheets and attendance at interactive tutorials were reported only some of the time, whereas the majority of students reported never attending the synchronous online tutorials (Table 8).
Attitude toward blended learning.
In 2014, we surveyed students’ attitudes toward the blended learning model. We asked eight questions relating to perceptions of the blended learning experience, and the weighted average response to each item is summarized in Fig. 4. Of a possible maximum of 40 marks on the Likert scale for negative perception of the learning experience, students reported a score of 62%, suggesting that students were generally ambivalent toward the learning experience in the subject that year.
When students were asked about the “amount” of learning content (resources) available online, 60% of students reported that the resources provided were adequate for their purposes, one-third said there was too much content, and 5% said there was too little content.
On the 2014 survey, we gave the students a number of opportunities to suggest qualitatively ways to improve the subject or provide additional comments about the subject design.
There were 437 responses, from which several themes emerged (Table 9). Seventy percent of the responses requested either lectures or more face-to-face teaching. Only 12 of the 437 responses indicated that the students felt adequately supported in their learning.
To assess the possibility that the poor perception of the online subject was due to the quality and presentation of the resources provided, we also conducted thematic analysis of responses to the question, “Which 2 or 3 specific aspects of the subject have contributed most to your learning?” (Table 10). Students clearly identified the resources prepared and provided by the instructors as beneficial to their learning, listing the pre-workshop quizzes, workshops, online learning resources, and worksheets as positive contributors.
Time spent on subject.
In semester 2 of 2014, we conducted an additional survey asking students to report the number of hours per week that they spent on the first-semester core first-year physiology subject and compared this workload with that of other core first-year, first-semester subjects. The number of hours students reported spending on the subject over the entire semester (including “swot vac,” a private study period before exams) averaged 90.6 h, compared with an average of 89.6 h for other first-year core subjects (Fig. 5, A and B). The university student workload policy (1) suggests that 150 h be spent on each of these subjects (including class time). Twenty-nine percent of the students reported spending only 1–3 h/wk on the core physiology subject, which is effectively equivalent to class time only (Fig. 5A). There is no evidence that time spent on first-year physiology distracted students from study in other first-year subjects (Fig. 5, A and B).
DISCUSSION
As we have reported previously, changing the subject learning design from a purely didactic teaching model to a blended teaching and learning model in 2012 significantly improved student outcomes in this subject, as evidenced by the higher proportion of “A” and “B” grades that students earned relative to previous years (19). This observation was sustained in the second year (2013) of this teaching model and is consistent with the reported benefits of “blended learning” over traditional didactic techniques (16). However, Means et al. (16) also noted that online models of education provided no benefit over traditional didactic teaching. In 2014, in response to institutional pressures and priorities, all face-to-face lectures were cancelled in the subject described here. Although the 2-h weekly workshops were retained, these remained team-based guided enquiry activities, not teacher-led classes. In essence, no face-to-face teaching occurred in the subject; students either engaged with online learning resources provided through the LMS or participated in self-directed learning activities in class. Analysis of student performance without any direct face-to-face teaching (2014) demonstrates that the proportion of students with “A” and “B” grades fell compared with the blended approach used in 2012 and 2013. Although pass rates remained high in 2014, the distribution of grades more closely resembles the 2011 student performance.
Our previous paper (19) indicates that as much as 40% of the assessment in the subject results from team activities during the semester, among them weekly participation in workshops (20%) and two team tests (2 × 10%). In addition, during the semester, students complete two individual assessments that are also worth 10% each. We have previously indicated that “C” and “D” grades in the subject represent effective participation and engagement in the subject and team assessments with only minimal individual success in achieving the intended learning outcomes, whereas “A” and “B” grades demonstrate higher levels of individual mastery of the subject content. Thus the high proportion of students achieving “A” and “B” grades in 2012–2013 indicates that the blended learning model encouraged greater individual student success in the subject. This conclusion was supported by structural equation modeling (18), which suggested that effective team-based learning improved individual marks on the final exam.
In contrast to the 2012 and 2013 outcomes, eliminating face-to-face teaching in 2014 has seemingly negated the observed gains in student outcomes. The high proportion of students who scored “C” and “D” grades in 2014 indicates that students relied more heavily on the team marks to pass without demonstrating individual mastery of the content (freeloading from other students).
Caution should immediately be exercised regarding our interpretation of the results. First, in each of 2012 and 2013, the specific learning objectives of the subject were divided into two components, the “core learning objectives” (CLO; 70% of the subject’s learning objectives) and “extension learning objectives” (ELO; 30%), with the CLOs taught and assessed in face-to-face classrooms, whereas ELO content was accessed exclusively online and assessed in the final exam. The elimination of all face-to-face lectures in 2014 made the division between CLO and ELO redundant; thus all learning objectives were assessed as they were covered during the semester, and the final exam was summative of all the subject’s learning objectives. The difference in performance between 2012–2013 and 2014 may therefore be explained by the absence of the CLO/ELO division rather than by a direct effect of the change in pedagogy.
Students’ perceptions of their learning experiences.
McVicar et al. (15) identified students’ perceptions of their own learning experiences as correlated with their success in human biosciences. Factors that influence student perception include the quality of the teaching in the subject, the sense of being overwhelmed by the content, and the extent to which students feel supported by the teaching staff. In this context, it is important for teaching staff to identify the factors in the subject design that students value in their learning environment. Our institutional student feedback surveys have indicated generally high satisfaction among students with the subject design. Notably, however, in 2014 student appreciation for the subject design, teaching, and learning support fell by 20% (data not shown).
In 2012 and 2013, we asked students to rank the learning activities in the subject by importance. Five distinct learning resources were identified: lectures, workshops, asynchronous discussions, weekly quizzes, and private studies. In these years, students consistently ranked the lectures and workshops as their most important learning resources. However, in 2014, students did not find the online recorded academic presentations to be an adequate substitute for live lectures. Students’ dissatisfaction with the absence of lectures was further reinforced in the expanded-answer survey items, with 70% of students asking for more face-to-face time when asked to suggest ways that teaching staff could offer more support and 57% of respondents asking for more contact time with teachers when asked how the learning experience could be improved. A clear theme emerging from the 2014 survey was that students felt the lack of face-to-face interaction with teaching staff, and in particular the lack of face-to-face lectures, negatively impacted their learning experience. Notwithstanding this dissatisfaction, students reported that the resources provided by the teaching staff were significant contributors to their learning, suggesting that they found the resources accessible, helpful, and constructive. This clearly identifies the lack of lectures, rather than the quality of the online resources, as the source of the students’ dissatisfaction.
As part of an effort to encourage student engagement and provide increased contact time between teaching staff and students, synchronous online tutorials were offered during the semester, using proprietary virtual classroom software (Collaborate; Blackboard, Washington, DC). Despite the vocal demand for more “lectures” and “face-to-face” time, students did not see the synchronous online class, in which they could interact with an academic, as a substitute for the in-class environment. This may relate to access and availability issues, as the synchronous (Collaborate) classes required good internet availability and the capacity to attend a class during evening hours, when family or work commitments may have interfered.
Our data also indicate a general lack of student engagement. Evidence of the lack of student engagement in the subject in 2014 (blended mode) was seen when we asked students to report the number of hours that they dedicated to the subject over the semester. Students reported spending an average of 90 h during the semester on study of the subject. This compares unfavorably with the university student workload policy (1), which lists the expected time spent on this subject as 150 h. What is striking is that nearly 30% of the students self-reported spending no more than 3 h/wk on the subject, indicating that the only time they focused on the subject was during the workshops and the online quiz. Thus, our data show that, although students can be induced to attend class by assigning a grade value to attendance, they do not replace the lost contact time with more time engaging with the material online.
Since 2012 we have been surveying student attitudes toward compulsory pre-workshop quizzes. These quizzes were introduced as an engagement tool. The rationale was that students had to demonstrate some level of preparedness to be able to actively and meaningfully engage in the team learning task in workshops. From the outset, we were concerned that students would react negatively to a compulsory task that did not itself attract a grade mark but whose noncompletion would result in forfeiture of marks. We were also interested in students’ perceptions of being required to complete the task before the workshop activity. Since 2013 we have used a proprietary software package (Mastering A & P, Pearson) provided by the textbook publishers and linked closely to the textbook itself. In all 3 yr of the survey, students reported a strong positive attitude toward the weekly quizzes, with the Likert scale value being ~78% of the maximum possible positive attitude rating. Individual questions on the scale and extended-answer comments reveal that students found that the weekly quizzes helped them stay up to date with their studies and were a good revision tool for the weekly learning objectives. They also reported that they did not find the weekly quizzes excessively difficult or of little value. Finally, students were required to achieve a benchmark grade on the quiz (75%) to qualify for their workshop grade; students reported that this benchmark was not difficult to achieve.
Expanded answers to the survey support the Likert items, with students reporting that the quizzes encouraged constant revision of the course material. Interestingly, students also reported that allowing multiple attempts at the quiz was a disincentive to actually revise the material, because an answer that was too difficult to reason out could be found by simple random guessing until the correct answer was achieved.
Importantly, the positive attitude toward the weekly quizzes acts as a control for the 2014 survey of the student learning experience. Although the student attitude toward the online learning experience, in the absence of didactic teaching, was ambivalent (at best), these same students reported strong positive attitudes toward the quizzes, consistent with the previous years’ results. Thus the negative perception of the learning experience in the subject cannot be dismissed as a cohort effect, with students being harsher judges than in previous years.
Disruptive technologies are intruding into every aspect of life and changing the way we do things. As tertiary education students become more “digitally native,” the assumption is that they increasingly want their content available when, how, and where they want it; that is, the content that we as educators provide is like any other content available on the Web and should be delivered in that form. There is an inherent logical error in this thinking: entertainment content is sought out by the consumer, who will watch what they find interesting and entertaining, and the amount of time they will spend with content from the Internet is proportional to their incentive to engage with it. It is a mistake to conflate educational content with entertainment content. In physiology, it has long been known that we must cater to the needs of students who can be described as “science averse” and reluctant learners. Thus we are asking students to consume, through online resources, content that they would not voluntarily seek out. Furthermore, unlike with entertainment content, we expect our students to absorb the content and be successful on tests, on assessments, and in problem solving after watching this media, none of which they would normally do with their entertainment from the Internet.
Similarly, there is a risk of “throwing the baby out with the bathwater” in the drive to eliminate didactic teaching from the curriculum. Scheduled and structured lectures tell students that there is a time, place, and environment where they need to be. In effect, the structured timetable provides a scaffold around which students can plan their activities. Ultimately, we are training future health care professionals; these are people who will need to attend their workplace at a particular location and specific time and interact with real people in the work environment. Universities in Australia want their graduates to be “work ready” but insist on providing an environment that devolves the responsibility to be present onto the student.
Finally, the advantage of blended learning is that it provides an avenue to increase student engagement in learning. The success of the blended program is likely the result of increased student engagement. However, as has been reported previously, online learning provides no similar benefit. Blended content should augment the learning activities in the subject, not substitute for them, and our study demonstrates that, when done for its own sake, it can have negative consequences for student success.
A poor attitude toward online learning is not a reflection of general dissatisfaction with the subject: students perceived the online quizzes as a positive learning experience, and these results were consistent across all years of the survey.
No conflicts of interest, financial or otherwise, are declared by the authors.
J.P., T.M.-A., N.W., D.H., and J.A.R. conception and design of research; J.P. and J.A.R. analyzed data; J.P. and J.A.R. interpreted results of experiments; J.P. and J.A.R. drafted manuscript; J.P., T.M.-A., N.W., D.H., and J.A.R. edited and revised manuscript; J.P., T.M.-A., N.W., D.H., and J.A.R. approved final version of manuscript; T.M.-A., N.W., D.H., and J.A.R. performed experiments; N.W. and J.A.R. prepared figures.
Present address of J. A. Rathner: Department of Health and Medical Sciences, Faculty of Health, Art, and Design, Swinburne University of Technology, P. O. Box 218, Hawthorn, Victoria, 3122, Australia (e-mail: firstname.lastname@example.org).
- Copyright © 2017 the American Physiological Society