Learning via online activities (e-learning) was introduced to complement existing face-to-face teaching and to encourage more effective student preparation and more informed participation in an undergraduate physiology laboratory-based course. Active learning was encouraged by hypothesis formation and predictions prior to classes, with opportunities for students to amend their e-learning submissions after classes. Automatic or tutor feedback was provided on student submissions. Evaluation of the course was conducted via student questionnaires, individual student interviews, and analysis of student marks in examinations and in the e-learning component. Student feedback on the entire subject in the university-wide quality of teaching survey was very high by University of Melbourne standards and most encouraging for the first implementation of such a curriculum modification. Results from further detailed surveys of student interactions and engagement, and correlation analysis between student responses, also strongly supported the effectiveness of the course. There were no significant differences between examination marks in the new course with e-learning and those in the previous year without e-learning. However, there was a significant correlation between the assessment of students' e-learning work and their final examination marks. Correlation analysis between various survey responses helped interpret the results, strengthened the case for e-learning, and suggested future improvements for student use of e-learning. This mode of e-learning, used to support face-to-face learning activities in the laboratory, can be adapted for other disciplines and may assist students in developing a greater appreciation of, and a deeper approach to, learning from their practical class experiences.
- online learning
- blended learning
- active learning
e-Learning, using web-based activities, is used extensively in distance education courses, in which it is claimed that learning outcomes comparable to campus-based courses can be achieved (1, 21, 38, 41). Such e-learning is also used in on-campus courses, with a blended approach to teaching and learning, to support "traditional" classes such as lectures, tutorials, and/or laboratory-based classes (12, 16, 28, 32). In this context, e-learning has been used in a variety of ways, including case-based learning, quizzes, synchronous and asynchronous online discussions, virtual seminars, interactive lecture notes, a frequently asked questions facility, and online revision support (4, 11, 14, 16, 32). However, Hedberg (18) cited an unpublished study by Alexander showing that most students see e-learning as an educational resource, often only as background support rather than as a mainstream function, with the next largest group seeing its use as supporting discussion forums. It has also been suggested that there is a need to revisit e-learning and go beyond the standard offerings of learning management systems to provide opportunities for more interactivity, to engage students, and to develop students' higher-level learning skills (18).
Nevertheless, there have been examples of e-learning used in blended approaches to teaching and learning that encouraged interactive and deeper learning styles, and these have yielded varied outcomes. e-Learning was introduced in a Physiotherapy course to complement lectures and laboratory classes by providing students with video clips of patients with neurological disorders, followed by questions testing their observational and analytical skills. This proved beneficial for students' preparation toward clinical placements by providing opportunities for formative assessment in context (12). Synchronous and asynchronous online discussions, as a learning forum, led to diverse student participation (4, 11). While online discussions encouraged some students who did not usually contribute in face-to-face sessions to do so in a less threatening environment, the discussions were unfortunately dominated by a minority of participants (4). Participation rates were further complicated by students' confidence in and attitudes toward the technology (17). The success of online discussions has been shown to depend mostly on course design, facilitation style, and group dynamics (11). Furthermore, Ellis and Calvo (14) showed that students' approaches to the discussions influenced their performance not only in the discussions but also in their exam and in the subject overall. Students with a deep approach to the discussions and a deep understanding of the relationship between the discussions and possible learning outcomes tended to approach the discussions in more meaningful ways, to hold more positive perceptions of this learning tool, and to perform better than those students with a surface approach to the discussions (14). These findings are in accordance with the deep and surface approaches to learning previously described by Marton and Saljo (27).
Similar findings were also made in another blended approach to teaching and learning, where case-based online learning was used to support face-to-face teaching (16). Interestingly, this study also found a statistically significant relationship between students' conceptions of case-based learning and their approaches to both the face-to-face and online learning (16). These investigators based their methodologies on the relational student learning research of Marton and Saljo (27) and other similar studies. They found that students who held a cohesive conception of case-based learning displayed a deeper approach to the learning and also performed better in the subject than those with a fragmented conception (16). However, in a further study with psychology students, this relationship between cohesive conception and exam performance applied only to online discussions and not to face-to-face discussions (15). Thus, all of this research suggests that implementation of online learning in a blended learning approach should aim to promote a deep approach to learning and a profound appreciation of the possible learning outcomes from e-learning. Furthermore, a wide variety of e-learning interactions should be considered in developing student-centered activities, with scaffolding to assist student assessment as they progress toward their final learning outcomes in a course (18). New representations of knowledge may also be appropriate to improve learning (7). Overall, information and communication technologies may enable new forms of learning to take place, but it is the use of appropriate pedagogy and the engagement of students with the process that determine whether there are improved learning outcomes as a result (23).
Active learning has often been shown to enhance student learning and performance (25, 29, 43). Hypothesis testing and predictions have been widely used as learning tools to promote active learning (3, 6, 10, 35, 39, 40). Several studies have demonstrated the effectiveness of hypothesis testing and/or predictions as learning tools (5, 8, 26, 30, 31, 42). Of particular relevance to this study was research that was also conducted in a physiology laboratory course (30, 31). In these studies, students who were required to verbalize predictions of their results to an instructor prior to conducting the experiment subsequently performed significantly better in posttests than those who did not complete any predictions or did not verbalize their predictions to an instructor beforehand (30, 31). Hence, these results indicate that this practical class teaching was effective only when students verbalized their predictions to an instructor prior to conducting their experiment. However, these investigators suggested that the students who did not have to verbalize their predictions to an instructor prior to the experiment may not have taken these predictions seriously or may not have approached the learning task as intended, thereby resulting in poorer performance (30, 31). Perhaps the most important finding from the studies of Modell et al. was that students were more likely to correct their preexisting misconceptions if they had committed to a prediction and then found it to be erroneous when they interpreted their experimental data, compared with those students who performed the experiment without predictions and continued to hold their preexisting beliefs despite experimental evidence to the contrary (30). The use of predictions as a learning tool has also been shown to be more effective when students are required to provide explanations with their predictions (5).
Furthermore, students can improve in their predictive ability with practice and appropriate timely feedback (19, 20, 36).
e-Learning was implemented in a Physiology laboratory-based subject for students undertaking their second year of a Science degree. This e-learning aimed to support and extend the face-to-face class activities that were already available in this subject and had previously sometimes been supported by paper-based tasks. The goal of e-learning was, first, to encourage students to undertake the prereading specified before practical classes and to have them test their knowledge and understanding, generally in scenario-based activities. e-Learning was also designed to encourage individual student-centered active learning that better prepared each student for the following face-to-face class activities. Thus, e-learning gave students an opportunity to start considering and trying to understand important concepts in their own time and at their own pace prior to the class. This could then lead to a more efficient use of limited class time. We hoped that a better prepared student would contribute more effectively to peer-to-peer interactions and to discussions with their demonstrators and that this could, in turn, lead to students developing a deeper approach to learning about physiology. Thus, e-learning was just one component of our blended approach to teaching and learning.
The e-Learning website allowed students to view the work due in the current week and plan for what was a further week ahead, as shown in Fig. 1.
Hypothesis testing and predictions were the main e-learning approaches used to promote active learning about the underlying physiology before classes. Prior to each class, this e-learning usually required students to complete tasks in which they predicted the outcomes of a potential experimental investigation. Tasks also encouraged them to consider the underlying mechanisms or concepts supporting their predictions by providing appropriate explanations. Following the practical classes, students interpreted their results, reviewed their submitted predictions, made changes when appropriate, and then resubmitted their e-learning work as their interpretation of their results. An example of such an activity is shown in Fig. 2, where students were asked before the exercise practical class to predict and explain the anticipatory responses seen in the cardiovascular system; then, after the practical class, they were asked to reconsider their predictions and additionally to consider the consequences for the skeletal muscles involved in the exercise. As shown in Fig. 3, students were asked to review their prepractical predictions about the transporters involved in the acid-base class experiment and also their prior explanations of the consequences for important variables.
To encourage students to take their predictions and e-learning work seriously, e-learning activities were assessed. Tutors also provided feedback, often incorporated within students' own text submissions, on students' final answers and explanations for students' later reflection. The total assessment for the pre- and postpractical e-learning work contributed 10% of the subject assessment.
This study aimed to investigate student use of e-learning in this laboratory-based course and to evaluate the learning outcomes that may have been enhanced by the active learning facilitated by the e-learning component of the course. The evaluation was conducted via student questionnaires, individual student interviews, and analysis of students' marks in examinations and assignments.
A detailed evaluation questionnaire was distributed to all students within the second-year Science Physiology laboratory-based course in which the e-learning component was trialed. In addition, the usual university-wide "quality of teaching" (QOT) student feedback survey was distributed to students. This latter survey is a requirement for each subject taught at the university. Student enrollments in the subject for 2005 and 2004 were 31 and 47 students, respectively.
The detailed evaluation questionnaire contained rating-type questions with a 5-point Likert scale in which 5 was usually "strongly agree," 3 was "neither agree nor disagree," and 1 was "strongly disagree." It also contained open-ended questions that required students' comments. The questionnaire aimed to investigate views on the general subject format, the usability of the e-learning website, and the learning outcomes from both the online learning component of the subject and the research-based assignment. A total of 29 of 31 students fully completed the questionnaire, a response rate of 94%.
There were several approaches to the analysis of the completed questionnaires. Students' comments for each open-ended question were transcribed, and a thematic analysis was undertaken to allow the determination of the number of students who made similar comments under each theme. For rating-type questions, the percentage of students who selected that rating in addition to the mean student rating was determined. Further analysis of rating-type questions involved testing for correlations between any of the key questions set for the evaluation of e-learning.
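The per-question summary described above (the percentage of students selecting each rating plus the mean rating) can be sketched in a few lines of Python. The ratings below are invented for illustration only; they are not the study's survey data.

```python
from collections import Counter

def summarize_likert(responses):
    """Summarize one 5-point Likert question.

    Returns the percentage of respondents selecting each rating (1-5)
    and the mean rating, as computed for the rating-type questions.
    """
    n = len(responses)
    counts = Counter(responses)
    percentages = {rating: 100.0 * counts.get(rating, 0) / n
                   for rating in range(1, 6)}
    mean_rating = sum(responses) / n
    return percentages, mean_rating

# Hypothetical ratings from 29 respondents for one question
ratings = [5, 4, 4, 3, 5, 4, 2, 3, 4, 5, 4, 3, 4, 5, 4,
           3, 4, 2, 5, 4, 4, 3, 5, 4, 4, 3, 4, 5, 4]
pct, mean = summarize_likert(ratings)
# "Agreement" is then the combined percentage for ratings 4 and 5
agreement = pct[4] + pct[5]
```

A question's "percentage agreement," as reported in the Results, combines the "agree" and "strongly agree" responses in this way.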
The rating-type questionnaire data were entered into Microsoft Excel spreadsheets for initial statistical analysis. Relevant data sets were pasted from Microsoft Excel into SPSS for further correlational analysis, and the relevant correlations were then pasted back into Microsoft Excel and reorganized into appropriate tables for incorporation into the article.
Means and SDs have been given to indicate the variability of the data, even though the data may not be normally distributed. In rating surveys with Likert scales, there are many coincident points because of the limited set of responses, so scattergram plots have not been presented, as is the practice in other correlation studies in educational research.
Rank correlations, Kendall's (τ) and Spearman's (ρ) correlation coefficients, were chosen as they are most appropriate for analyzing Likert-type survey data, which have some ordinal nature but have ranges that may neither be fixed nor viewed similarly by the respondents to surveys (9). Although Pearson's (r) product moment correlations are often used in reports with Likert scales (and they were calculated in our study for comparison), they are influenced by the choice of scale and require that one of the variables be normally distributed, so Pearson's analysis is considered inappropriate for comparing such studies. Educational researchers commonly use 3-, 5-, 9-, and even 25-point scales, which will give quite different correlation results (9). Correlations were analyzed with all three methods, although only τ values are reported in this article, as τ provides a distribution-free test of the dependence between two variables, which is appropriate for our study. ρ values are satisfactory for testing the null hypothesis of independence between variables, but their interpretation is more difficult when the null hypothesis is refuted. (It is interesting to note in our data that Kendall's analysis showed one significant correlation where Pearson's did not, and on one occasion Pearson's showed a significant correlation not supported by Kendall's analysis.)
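To illustrate the rank-based statistic underlying this choice, the following is a minimal sketch of Kendall's τ in its simplest (tau-a) form, which compares concordant and discordant pairs of observations. Statistical packages such as SPSS report tau-b, which additionally corrects for tied ratings (common with Likert data); the paired ratings here are invented for illustration, not the study's data.

```python
def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs over all pairs.

    A pair (i, j) is concordant if x and y rank the two respondents in
    the same order, discordant if in opposite orders. Tied pairs are
    ignored here; tau-b (as reported by SPSS) corrects for ties.
    """
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
            # s == 0: tied pair, ignored by tau-a
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical paired 5-point ratings for two survey questions
x = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
y = [4, 4, 3, 4, 2, 5, 3, 3, 4, 5]
tau = kendall_tau_a(x, y)
```

Because τ depends only on the ordering of responses, it is unaffected by whether a 3-, 5-, or 25-point scale was used, which is why it was preferred over Pearson's r here.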
Volunteers were requested from the entire class to participate in individual student interviews. An effort was made to obtain at least one volunteer from each group of students who worked together for the assignment work. Six students volunteered for the interviews. These volunteers represented all assignment groups except for two.
All interviews were conducted by A. M. Dantas, who was also the convenor of the subject. While there were some obvious disadvantages to having a convenor of the course conduct the interviews and therefore not be independent, one major advantage was that this convenor was immersed in and familiar with all of the details of the course. The interviews were conducted at the end of the semester to avoid any possible conflict of interest with the student assessment.
Interviews were audio recorded and transcribed later. The aims of the interview were to investigate students' past experience with online learning prior to this subject, details of students' views on the clarity and navigation of the website, any problems they may have encountered, and the learning outcomes that they thought had been achieved by e-learning and the assignment. Additionally, the interview aimed to determine the students' feelings toward the group work in the assignment and the performance of their group.
Quality of Teaching Survey
As shown in Table 1, this laboratory-based subject rated well in the standard university-wide QOT survey that evaluated the entire subject. Students gave ratings of 4.1–4.4 for all of the questions on this questionnaire (the mean value was 4.2, compared with an average of 3.8 across the university, and was an improvement on the previous year's 3.9); 87% of the students agreed that they had a clear idea of what was expected of them in this subject and that the subject was well taught and intellectually stimulating. The same percentage of students agreed that they received helpful feedback on how they were going in this subject. The average rating of 4.2 received for this question was well above the university average score of 3.5. Similarly, the average ratings of 4.2 and 4.4 for the questions "There was effective use of computer-based teaching materials in this subject" and "Web-based materials for this subject were helpful," respectively, were considerably higher than the university average of 3.5 for both questions and a major improvement in scores from the previous year. These two questions almost exclusively reflect the additional contribution of e-learning, as no other web-based material was used in the course and only one other computer-based teaching material was used throughout the course (a virtual experiment on the practical laboratory computers).
Evaluation of e-Learning
Table 2 shows students' responses to questions investigating the usability of the e-learning website. Overall, positive results were obtained with all questions rating at 3.7 or above. More than 75% of the students agreed that the e-learning website was easy to navigate, had a logical structure, provided clear instructions of what they had to do, and provided clear due dates and times for submission of their work. A slightly lower proportion, 67% of the students, agreed that the e-learning website was user friendly.
As shown in Table 3, the questions evaluating student use of and learning outcomes from e-learning did not rate so well overall. While 87% of the students agreed that the entire subject was intellectually stimulating in the QOT survey (Table 1), only 51% of the students agreed that work on e-learning, as opposed to the entire subject, was intellectually stimulating (Table 3). Only 55% of the students agreed that the e-learning work was pitched at an appropriate level for them, and an even lower proportion, 38% of the students, agreed that e-learning motivated them to learn. Moreover, only 49% of the students agreed that they put a lot of effort into completing their e-learning work. Interestingly, 79% of the students agreed that if they were asked to make predictions, they thought about them carefully and seriously tried to make those predictions. This question rated the best in this section of the survey, with an average rating of 3.8. Nevertheless, only 47% of the students agreed that the prepractical work, which usually consisted of the predictions, encouraged them to consider the theory during the practical classes. Approximately half of the students (55%) agreed that e-learning improved their understanding of the practical classes and the underlying theory, and 58% of the students agreed that the generic skills of critical thinking and analysis were developed via e-learning.
Interesting findings were made when possible correlations between students' responses to the different questions were investigated, and these affect the interpretation of the student responses reported above. There was a positive correlation between the appropriateness of the pitch and the intellectual stimulation of e-learning, as shown in Table 4 (τ = 0.36, P = 0.03). Students who agreed that e-learning was pitched at an appropriate level for them also found e-learning to be intellectually stimulating. As shown in Table 5, there was no correlation between the relative number of e-learning tasks completed and the motivation to learn resulting from e-learning. However, Table 5 also shows several positive correlations. Positive correlations were found between the motivation to learn resulting from e-learning and each of the following: the intellectual stimulation of e-learning, students' effort in completing the e-learning work, students' seriousness about the predictions, and the encouragement of the prepractical work to consider theory during the practical classes. Students who agreed that e-learning motivated them to learn also agreed that their critical thinking and analytical skills were developed by e-learning (τ = 0.452, P < 0.01).
Students who agreed that e-learning work was intellectually stimulating, that they put a lot of effort into completing the e-learning work, that they thought about the predictions carefully and seriously, or that the prepractical work encouraged them to consider theory during the practical classes also agreed that e-learning motivated them to learn.
As shown in Table 6, there was a positive correlation between the ratings for the seriousness of the predictions and the encouragement of the prepractical work to consider theory during practical classes (τ = 0.33, P = 0.04). Thus, students who agreed that they thought about the predictions carefully and seriously also agreed that the prepractical work, which usually consisted of predictions, encouraged them to consider theory during the practical classes. This is in accordance with one of the aims of the use of predictions in e-learning.
The six individual student interviews concurred with the results from the evaluation questionnaire but also provided details of where and how any problems may have been encountered and importantly how future improvements could be made with e-learning. Half of the students interviewed had minimal or no previous experience with online learning. In general, students thought that the navigation around the website was good, but some felt that further instructions could have been provided, especially on how to perform some of the required tasks. One student in particular admitted to being “computer-phobic and computer-illiterate” and would have liked a lot more instructions for even the simple tasks. Interestingly this same student stated that she could not complete the activities directly online but instead printed out the activities and completed them in a paper-based format before copying her answers back onto the e-learning website.
Students felt that more alignment should exist between the e-learning tasks and the practical classes. They sometimes had difficulty seeing the relevance of the e-learning activities to their understanding of the practical classes. Accordingly, students thought that e-learning generally helped their understanding of the practical classes and the underlying theory, but not to the same extent for all of the class topics. They all commented that e-learning generally required them to think and that answers to the questions asked could not simply be found in textbooks.
Students had very positive comments regarding the group work in the research-based assignment. All but one student felt that their group functioned well and that group members all contributed to the group's work. In fact, the student who felt that his group did not function well stated that a dominant and overpowering person was responsible for the group being dysfunctional.
Positive comments were also made regarding the drafting and feedback process for the scientific report. All of the students felt that the process made a substantial difference to their appreciation and understanding of scientific report writing. They were more confident that they would be able to write a scientific report on their own and that they would be more critical of research papers when reading them in the future, provided that the papers were pitched at their level of understanding.
Overall subject marks, and the corresponding examination components, were compared between 2005, when e-learning was introduced, and 2004, without the use of e-learning, but no significant differences were detected.
However, a positive correlation was found between the students' marks allocated for the e-learning component and the examination component for the new subject (τ = 0.400, P < 0.01), as shown in Fig. 4.
This study aimed to evaluate an active learning component supported by e-learning for the first time in our Science laboratory-based subject using a blended approach to teaching and learning. Overall, as shown in Table 1, student feedback on this entire subject in the university-wide QOT survey was very high by university standards and most encouraging for the first implementation of such a curriculum modification. It also represents a significant improvement on the mean score for the previous year.
As shown in Table 1, the majority of students felt that they had a clear idea of the expectations from this subject, that the subject was well taught and intellectually stimulating, that they received helpful feedback on how they were going in this subject, and that they were overall satisfied with the quality of the learning experience in this subject. Although e-learning was not the only teaching and learning activity conducted in this subject, one can assume that, as one of the learning tools, it did contribute to the high feedback scores received overall for this subject. This assumption is reinforced by the very high ratings of 4.2 and 4.4 (especially considering the university average rating of 3.5) attained for the effective use of computer-based teaching materials and the helpfulness of web-based materials for this subject, respectively. These two ratings almost exclusively reflect the contribution of e-learning, as no other web-based material was used in this subject and only one other computer-based teaching material, in the form of a virtual experiment, was used throughout the course. These scores are also a considerable improvement on the previous year's scores of 3.5 and 2.9 in these areas. e-Learning was most probably also an important factor influencing the very high rating of 4.2 (especially considering the university average rating of 3.5) achieved for the helpful feedback that students received in this subject. e-Learning provided an easy medium for feedback to be provided by demonstrators, as well as automatically for some tasks. Feedback was given via e-learning on both the pre- and postpractical tasks completed. The other written feedback provided by demonstrators was through the paper-based practical report submissions and through drafts of a scientific report submitted throughout the semester.
Another important positive aspect of the QOT responses was the high percentage of students (87%) who agreed that they had a clear idea of what was expected of them in this subject (Table 1). Such clarity of expectations is a factor that could determine students' approaches to learning, especially in a blended approach to teaching, as shown by Ellis et al. (16). These investigators demonstrated that students who had a cohesive notion of the teaching and learning tool and a profound appreciation of the possible learning outcomes from this learning tool displayed a deeper approach to the learning and also performed better in the subject than those with a fragmented conception (16). These clear expectations were reinforced by the positive feedback on the usability of the e-learning website. The majority of students felt that the website was easy to navigate, was user friendly, had a logical structure, and provided clear instructions on what needed to be done, although the instructions could have been enhanced according to the student interviews. These results are very positive, especially considering that this was the first trial of the e-learning website in this subject.
In the detailed evaluation questionnaire, the questions evaluating student use of and the learning outcomes from e-learning did not rate so well overall (Table 3). However, some explanations for these lower ratings can be deduced from the correlations found between some of the responses to the questions asked. Interestingly, a positive correlation was demonstrated between the appropriateness of the pitch and the intellectual stimulation of e-learning (Table 4). Thus, students who agreed that e-learning was pitched at an appropriate level for them also found e-learning to be intellectually stimulating. Unfortunately, the question does not allow us to determine whether the pitch was too low or too high for the remaining students. Hence, more detailed research needs to be conducted on the appropriateness of the pitch of e-learning for students undertaking this second-year Physiology laboratory-based subject, and the e-learning tasks will need to be reviewed accordingly.
A particularly low and disappointing rating of 3.1 was achieved for the statement "e-Learning motivated you to learn," and only 38% of the students agreed with that statement. There was no correlation between that statement and the relative number of e-learning tasks completed (Table 5). However, the motivation to learn resulting from e-learning correlated positively with the intellectual stimulation of e-learning, students' effort in completing the e-learning work, students' seriousness about the predictions, and the encouragement of the prepractical work to consider theory during the practical classes (Table 5). Thus, students who agreed that e-learning work was intellectually stimulating, that they put a lot of effort into completing the e-learning work, that they thought about the predictions carefully and seriously, or that the prepractical work encouraged them to consider theory during the practical classes also agreed that e-learning motivated them to learn. These results suggest that the motivation to learn from e-learning depended on students' engagement with e-learning and not on the number of e-learning tasks they completed. A complicating factor might also be students' confidence with using computers and web-based activities. A factor that has been shown to contribute to students' engagement in online learning is the motivation provided by the online learning itself. Lack of motivation has been found to be the major reason for student dropouts in online courses (22). Motivation has also been shown to be the only significant factor influencing achievement in online learning (37). In a course with a blended approach to teaching and learning, as in the present study, intrinsic motivation was found to play a more important role in learning than extrinsic motivation (13).
These investigators found that internal motivation, such as the will to learn, the desire to solve a problem, the will to understand the course content, and internally rewarded learning, is the critical factor that determines learning in a blended course. Students with extrinsic motivation such as grades and time were more prone to lose overall motivation (13). Thus, these findings must be considered when attempting to enhance e-learning to promote student motivation and engagement with the work.
As mentioned previously, the use of hypothesis testing and predictions as learning tools in e-learning aimed to promote active learning and a deeper approach to learning. In addition, several studies have previously demonstrated the effectiveness of hypothesis testing and/or predictions as learning tools (5, 8, 26, 30, 31, 42). A large percentage (79%) of the students agreed that if they were asked to make predictions, they thought about them carefully and seriously tried to make those predictions (Table 3). Nevertheless, only 47% of the students agreed that the prepractical work, which usually consisted of predictions, encouraged them to consider the theory during the practical classes. Paradoxically, given these percentages, there was a positive correlation between the ratings for the seriousness of the predictions and the encouragement of the prepractical work to consider theory during practical classes. The low student numbers investigated in this study may have contributed to these findings. As for the low percentage of students who agreed that the prepractical work encouraged them to consider theory during practical classes, the lack of student engagement and motivation to learn may have been contributing factors, as noted above.
We also compared examination results between the present course and the previous year's course without e-learning, but no significant differences were detected. This is not surprising given the small numbers involved in the course and the many factors that might affect such comparisons; as discussed previously for curriculum innovations, such comparisons have not yielded fruitful results (34).
However, we did find a positive and significant correlation between the assessment of students' performance in the e-learning component and their performance in the final examination. Although we cannot attribute this to any particular aspect of the e-learning, it does indicate that students who worked steadily and performed well on the e-learning throughout the semester developed a better understanding of the subject. As discussed above, the student surveys suggested that the motivation to learn from e-learning depended on the students' engagement with e-learning. However, the anonymity of the surveys does not allow us to determine whether those students who were more engaged with e-learning, and therefore more motivated to learn from it, did indeed perform better in their e-learning work and achieve better learning outcomes. Students' attitudes, and therefore engagement, toward e-learning may be affected by both their learning styles and their computer literacy, as illustrated by the student who preferred to write their answer elsewhere and then paste it into the submission page. It is likely that many present, and more future, students may prefer electronic communications as “digital natives,” as termed by Prensky (33), become more prevalent in our societies. However, a recent study reported that everyday immersion in technology does not guarantee digital competence, that age may not be a good indicator of the digital native, and that their prevalence may be lower than anticipated (2). Digitally experienced students may expect e-learning to be provided for them to engage in learning tasks effectively. There is also evidence that college students are more likely to seek help when it is available electronically rather than through traditional avenues (24).
However, it has also been reported, in an extensive survey of university students and staff, that a majority did not want courses to be delivered solely online and had a range of preferences for their learning (2). This dislike of completely online courses may reflect the absence of the face-to-face activities that would be possible with a blended approach to teaching. However, these observations could also reflect a lack of appropriate pedagogy and interactivity in the educational materials of those online courses.
We obtained positive outcomes from our first implementation of an active learning component supported by e-learning in a science laboratory-based subject, blending online activities with the strengths of face-to-face class activities to encourage student learning. However, the e-learning activities need to be reviewed and enhanced, in light of the evaluation results, with a focus on improving student engagement and their appreciation of the value of e-learning to their development. For example, new representations of knowledge (7) may be appropriate to cater for our diverse group of students with different learning styles. We may also need to consider a wider variety of e-learning activities, with scaffolding to assist students' self-assessment of their progress toward more clearly articulated final learning outcomes for the course, along the lines suggested by Hedberg (18). Our goals are to promote better student engagement and motivation so that students develop deeper learning practices that may further improve their learning outcomes. Our blend of e-learning and face-to-face teaching in the laboratory can be adapted for other disciplines and could assist students to develop a greater appreciation of the value of the learning available in their practical class experiences. However, it is important to be mindful of the finding in Berger's survey (2) that the biggest barrier to incorporating information technology in learning and teaching, in the views of both students and faculty members, was that “Instructors don't know how to implement it.”
Student preparation for practical classes can be encouraged with paper-based activities, as we did prior to implementing e-learning, but providing timely feedback and administering the activities so that students could reflect, revise, and resubmit their work was more difficult. e-Learning allows student work to be accessed by assessors, who can provide feedback as soon as the work is submitted online. In contrast, paper-based tasks need to be physically submitted, distributed to appropriate assessors, and then physically returned to students, a process that could take several days to weeks. e-Learning has the additional advantage of automatic feedback, which can be progressively revealed as students complete stages of the work. Unfortunately, it was not possible to compare student performance with that on the prior written tasks, as the e-learning tasks differed in their expression and in their inclusion of interactive and visual elements; the novelty effect of a new approach to learning may also have confounded any comparison.
In conclusion, the most important characteristic of the approach, whether e-learning or paper-based activities, is to have students prepared and able to engage more knowledgeably in the face-to-face activities of practical classes. It is the pedagogy and quality of the preparatory task that are paramount, not the mode of delivery. However, e-learning has some advantages over paper-based activities in the range of activities, engagement, guidance, and feedback that it enables for students whenever and wherever they are connected to the internet. Laboratory activities are increasingly difficult to fund, so institutions need students to be prepared to make the most of these limited opportunities and to learn more effectively, particularly the skills of scientific investigators.
In our future evaluation of e-learning, we plan to code student surveys so that the investigators are not aware of student identities but independent research assistants can correlate performance in components of the examination, and in other assessment in the course, with survey responses. In this way, we may be able to determine which students are positively engaged with the e-learning process and other aspects of the course and whether this engagement results in improved learning outcomes. We would also like to investigate whether other factors influence students' involvement with e-learning and whether this information can be used to further enhance students' learning experiences and learning outcomes.
There are also several future directions to enhance e-learning experiences that we plan to investigate but that were not possible in this first implementation because of the lack of integration of the interactive e-learning system with the university's learning management system. We could strengthen teamwork online and out of class with asynchronous communications and also use peer review to develop students' abilities to critically appraise the work of others and then be more self-critical of their own submissions.
We also intend to use e-learning to help students be better informed in our other second-year practical class, which is taken by students in the semester before the one reported here. A major learning activity involves regular face-to-face workshops to discuss issues involved in experimental project design. e-Learning will support this research-led learning approach, where students prepare for face-to-face work in small groups as well as learn to communicate their ideas to the whole class.
This project was supported by a grant from the University of Melbourne's Courseware Design and Development Program and the Department of Physiology.
We acknowledge the particularly significant support of the programmers, Gordon Yau and Patrick Fong, together with the graphics artist Judy McCombe and the project manager Matthew Riddle. Finally, without the continued strong support of the Head of the Department, Stephen Harrap, innovations such as these would not have been possible.
- © 2008 American Physiological Society