The Integrative Themes in Physiology (ITIP) project was a National Science Foundation-funded collaboration between the American Physiological Society (APS) and the Human Anatomy and Physiology Society (HAPS). The project goal was to create instructional resources that emphasized active learning in undergraduate anatomy and physiology classrooms. The resources (activity modules and professional development) addressed two factors thought to be limiting science education reform: instructors' knowledge of how to implement active learning instruction and time to design innovative curricula. Volunteer instructors with a strong interest in using active learning in their classrooms were recruited to use the ITIP modules and provide ease-of-use feedback and student assessment data. As the study unfolded, instructor attrition was higher than had been anticipated, with 17 of 36 instructors withdrawing. More surprisingly, instructors remaining with the project failed to use the modules and reported specific obstacles that precluded module use, including lack of support from academic leadership, unplanned class size increases and heavy teaching loads, a union strike, insufficient time to develop a mindset for change, inadequate technology/funding, an adverse human subjects ruling, incompatibility of modules with instructors' established content and expectations, and personal factors. Despite the lack of module use and obstacles, 8 of 19 site testers began independently to introduce new active learning instruction into their classrooms. In the larger picture, however, it is important to note that only 8 of the initial 36 volunteers (22%) actually ended up changing their instruction to include opportunities for student active learning. These findings underscore the difficulty of implementing instructional change in college classrooms.
- active learning
- curriculum development
- faculty development
- science education reform
- instructional change obstacles
The contemporary national science education reform movement calls for college instructors to move away from traditional didactic lectures that emphasize memorization of disconnected facts toward instruction that provides students with opportunities to actively engage with content material and the process of scientific inquiry (1, 2, 5, 12, 13, 16, 17). Although the national education reform calls have increased awareness among college faculty of the need to change the way they teach, increased awareness has not translated into significant instructional change in the majority of undergraduate classrooms. Many college instructors still rely heavily, if not exclusively, on the didactic lecture as their primary strategy for teaching (11, 14, 22). Instructors who do decide to change how they teach must step out of the comfortable role of a lecturer disseminating content to students and instead learn to guide students as they take responsibility for asking questions, engaging in logical reasoning and problem solving, and discussing scientific concepts and processes with their peers.
Instructional change of this type is understandably a challenge for many college faculty members. Instructors who were themselves taught with the traditional lecture method may not know how to actively engage or manage students effectively during interactive classroom sessions. Furthermore, instructors may have little time amid their many responsibilities to look critically at their course content and restructure their syllabus to emphasize concepts and the process of inquiry. The Integrative Themes in Physiology (ITIP) project was designed to address the issues of instructional know-how and lack of time by providing activity modules and professional development workshops that would support instructors in making the transition from traditional lectures to interactive teaching and learning.
ITIP was funded by the National Science Foundation (NSF) Course Curriculum and Laboratory Improvement, Educational Materials Development Program (15a). Project participants came from two professional societies with a vested interest in improving the quality of undergraduate teaching and learning: the American Physiological Society (APS) and the Human Anatomy and Physiology Society (HAPS). While most members of the APS are employed by research universities or medical schools and have a significant commitment to bench and clinical research in addition to teaching responsibilities, a large percentage of HAPS members are primarily responsible for educating undergraduate students in physiology, anatomy, or combined anatomy and physiology at 2- and 4-yr colleges.
The instructional resource development approach used by ITIP is supported by reform documents that urge the creation and implementation of inquiry-based curricula (13, 15) as well as by researchers who advocate the development of “educative curriculum materials” designed to enhance teacher understanding of inquiry teaching and student learning (e.g., Ref. 3). Although empirical data that document the efficacy of this approach on instructional change and student learning are scarce in the context of higher education, particularly in the life sciences, there is some evidence-based support for the instructional resource approach in precollege contexts. For example, Smith et al. (21) found that middle school biology teachers were only successful in changing their classroom instruction when they were provided with curriculum units that included instructional commentary that guided the implementation of structured activities. Using three treatment groups (faculty development workshops only, curriculum units only, or workshops and curriculum units) and a classroom observation rubric for scoring changed instructional practice, the researchers found that the teachers who used the curriculum units (with or without attending the workshops) were more likely to change their instructional strategies than the teachers who did not receive the units. Moreover, the researchers found that attendance at workshops (two half-day sessions) did not significantly impact or enhance instructional changes.
Based on these findings, the initial focus of the ITIP project was to produce ready-to-use activity modules with instructional commentary. Volunteer faculty members who did not contribute to activity module development were asked to integrate ITIP modules into their existing courses and to determine the impact of module use on student understanding of physiology concepts and on student attitudes and preferences related to the teaching and learning of physiology. To increase the potential for effective use of the modules, volunteers participated in faculty development sessions and content-based workshops related to the project.
The ITIP project was planned to occur in five stages. Stage I was dedicated to the development of instructional resources and assessment materials. Stage II focused on recruiting instructors who were interested in site testing the ITIP modules and enthusiastic about broadening their classroom instructional strategies to include instruction for active learning. During stage III, the volunteer instructors received startup packages containing assessment materials and instructions for obtaining the modules. In stage IV, the volunteer site testers were asked to use and assess three modules over the course of a semester. In stage V, the results of the assessment were to be evaluated and then used for module revision. The planned timeline was, however, modified in response to an unexpected outcome: by the end of stage IV, no ITIP modules had been site tested. The project focus was then revised to determine why the volunteer instructors had not used or assessed the modules.
The protocol for the ITIP project was approved by the Institutional Review Board (IRB) at the University of Texas (Austin, TX) and conducted in compliance with the board's established requirements for projects involving human subjects. Site testers were responsible for obtaining approval from their home institutions if it was required for participation.
Instructional Resource Development
A resource group (listed in the acknowledgements) was responsible for creating the activity modules and faculty development sessions. The five modules consisted of classroom and laboratory activities, formative assessment that emphasized conceptual understanding, and instructional commentary with suggestions for implementation. Modules were designed around the theme of “Gradients and Conductance” and addressed osmotic, chemical, electrical, thermal, and electrochemical gradients as well as conductance and resistance relationships within key physiological systems (cardiovascular, pulmonary, renal, nervous, and thermoregulatory systems). The modules were disseminated through download links from the ITIP home page on the APS website (http://www.the-aps.org/education/itip).
Four of the activity modules contained both faculty and student versions. Instructors' versions contained background content information, learning objectives, classroom activity descriptions, assessment tools, and suggestions for engaging students in active learning. The modules were constructed so that faculty members could selectively integrate the modules into their existing courses. Our intent was that inquiry instruction could thus be introduced in a stepwise fashion, without the need for faculty members to completely overhaul their existing courses. The modules were meant to be used in parallel with traditional textbooks, laboratory manuals, and test banks. Student versions of the ITIP modules included instructions for students to follow to complete the learning activities.
Professional development sessions at the HAPS 2000 conference included two interactive plenary sessions and six workshops focused on active learning strategies for presenting pressure-volume-flow relationships. Two of the workshops included demonstrations of ITIP activity modules in use.
The assessment plan included tools to assess changes in faculty instructional strategies and in students' physiology understanding and attitudes toward instruction. Faculty members were asked to report on their institution and teaching context as well as on their student demographic profile. They also completed surveys about their experience using active learning strategies, their teaching philosophy, how they defined active learning, and concerns they had about implementing instruction for active learning. Faculty comments and questions documented on a project listserv served as an additional assessment data source.
Student assessment materials included a survey on student backgrounds and a pre/postcourse questionnaire to determine changes in students' attitudes toward anatomy and physiology teaching and learning. The questionnaire was exploratory in nature and adapted from a variety of resources used by the University of Texas Center for Teaching Effectiveness. The 1996 HAPS Competency Examination (www.hapsweb.org) with 87 multiple-choice questions was distributed to determine changes in students' physiology content understanding.
Recruitment of site testers took place after professional development plenary sessions at the 2000 HAPS Conference. Thirty-six instructors with a variety of backgrounds were purposively sampled (8) from a volunteer pool of 55 instructors. Although the site test instructors were selected to approximate the range of institutional affiliations of HAPS attendees at the conference and to represent both genders, the sampling method also intentionally maximized the range of experience in implementing instruction for active learning and the range of years of teaching experience. Volunteers' self-described experience using active learning ranged from “no experience” to “extensive experience.”
All instructor volunteers were affiliated with HAPS and taught physiology or anatomy and physiology to undergraduate students at institutions ranging from associate's colleges, tribal colleges, baccalaureate, and master's colleges to doctorate-granting and specialized institutions (6).
Startup packages were mailed to instructors during the summer of 2000. Packages contained instructions for downloading modules and subscribing to an ITIP listserv, a CD supplement to one of the modules, human subjects consent forms, assessment materials, and information on the project protocol. A small selection of relevant research literature with information on active and cooperative learning and formative assessment was also included in the packets.
Project Extension: Interviews and Analysis
When it became apparent that the module site testing was not being accomplished, we received an extension from the NSF to determine why instructors were unable to fulfill their commitment to site test the ITIP modules.
Three central questions shaped the design of the project extension: Why did the instructors decide not to site test the ITIP modules? What kinds of data would help us understand their decisions? What techniques would allow us to collect those data? Because we wanted to understand the instructors' perceptions and decision-making processes related to nonuse of the modules, but did not want to control or influence the information they provided in any way, qualitative open-response interviews (8) were conducted to learn about each instructor's thinking. Instructors were given the option of an individual telephone interview or responding to a set of e-mail questions. Interviews and e-mails began with a direct exploratory question: “Why aren't you using the ITIP modules?” As instructors spoke, their responses (the primary interview data) were tape recorded verbatim. In addition, notes were taken during the telephone interviews to create a running outline that allowed the interviewer to respond using the instructors' choice of words and to identify questions that needed further probing and clarification.
All interviews were transcribed, and the transcripts became the primary data used in this qualitative study. Next, interview data were systematically analyzed by a process of “unitizing” (8), which involves breaking the interview data into sections containing common themes. Once unitized, data sections were labeled using codes that clearly identified each section's theme. Finally, a list of emerging themes was compiled by listing codes ascribed to sections. The data-analysis process was assisted by three anonymous science education researchers not associated with the ITIP project.
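For readers unfamiliar with the mechanics of this compilation step, the toy sketch below illustrates how codes ascribed to unitized interview sections can be tallied into a list of emerging themes. The segment texts, code labels, and counts are hypothetical illustrations, not data drawn from the actual ITIP transcripts:

```python
from collections import Counter

# Hypothetical unitized interview segments, each already labeled with a
# thematic code during the coding pass (illustrative examples only).
coded_units = [
    ("Our class size doubled this fall ...", "institutional_obstacle"),
    ("The modules went into too much depth ...", "module_fit"),
    ("I could not spare class time for the surveys ...", "assessment_burden"),
    ("The staff strike disrupted everything ...", "institutional_obstacle"),
    ("The terminology differed from what I use ...", "module_fit"),
]

# Compile the emerging-theme list by tallying the codes ascribed to the
# unitized sections, listing the most frequently occurring themes first.
theme_counts = Counter(code for _, code in coded_units)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} unit(s)")
```

In the actual study this tallying was done by hand and cross-checked by researchers outside the project; the sketch only makes the unitize-code-compile sequence concrete.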
Instructor Attrition and Characterization
Attrition of volunteer instructors from the study began within 2 wk of the instructor recruitment at the May 2000 HAPS conference. Shortly after the conference ended, 9 of the 36 instructors (25%) indicated that after reconsidering their fall workloads, they decided not to participate. In late August, the remaining 27 instructors received startup instructions and materials. Within the first 2 wk of the autumn 2000 semester, when module site testing was to begin, another eight instructors withdrew from the project, citing new academic responsibilities and lack of time to continue in the project. Ultimately, only 19 of the original 36 volunteers were left to site test and assess activity modules.
Attrition was much higher than we had anticipated, approaching 50%, and was similar across most institutional types (Table 1). The two exceptions were 100% attrition in the tribal college category (1 site tester) and 0% attrition in the specialized college category (1 site tester). While instructor attrition did not change the institutional representation or diversity in instructor gender or years of teaching experience, attrition did significantly diminish the number of volunteers who had previous experience facilitating active learning in their classrooms. After attrition, only two of the site testers self-reported that they had “extensive” knowledge and experience implementing instruction for active learning, whereas the remainder reported having no or only limited knowledge. We have no data to explain this but can speculate that perhaps instructors who were already using active learning strategies felt less need to test the ITIP activity modules because they had already developed their own activities.
After attrition, the core group of 19 faculty site test instructors was characterized as follows:
- All taught either physiology or anatomy and physiology courses.
- 16 of the instructors had doctoral degrees, and the remainder had master's degrees.
- All 19 instructors had been educated primarily in the sciences, and only 2 instructors had taken a formal course in education.
- At the time of the study, 14 of the instructors believed that they received encouragement from their department to participate in the ITIP project as a means of improving the quality of undergraduate education. The other 5 instructors reported receiving no departmental encouragement and having “reservations” about committing time to project participation.
- 13 instructors reported having tenure or a form of continuing status at their institutions.
- 11 instructors were women; 8 instructors were men.
- Instructors represented a broad range of age groups: 1 instructor was 25–34 yr old; 8 instructors were 35–50 yr old; 9 instructors were 51–60 yr old; and 1 instructor was older than 60 yr.
- 6 of the instructors conducted research (3–30% of their time) in addition to teaching and administration. Research topics varied from traditional bench physiology to research on science teaching and learning.
- On average, an instructor's self-reported work week was 71 (±13) h.
- All instructors reported spending 60–100% of their time engaged in classroom and laboratory teaching.
- 3 of the instructors were department chairs at the time of the study.
- Instructors taught classes of variable sizes. Not surprisingly, class sizes were smallest at the associate's and master's colleges (10–60 students) and largest at the doctorate-granting and specialized colleges (100–150 students).
- Instructors taught courses that were highly variable in instructional format, objectives and content coverage, and course prerequisites (examples are summarized in Table 2).
- Instructors provided education to a diverse student population that included women, underrepresented minorities, and nontraditional, part-time, and older students as well as students with a wide range of grade point averages.
Failure of Module Site Testing
Early in the site testing phase of the project, it became clear that despite consistent urging through the listserv, personal telephone calls, and e-mails, instructors simply were not using the ITIP modules. Although a few instructors indicated that they had read through one or more of the modules, ultimately none of the modules were site tested. This outcome was surprising: despite the strong interest in active learning that instructors had expressed at recruitment, when it came time to site test the modules, they did not follow through.
During this part of the project, we dropped the requirement for instructors to use the ITIP modules, and instructors were encouraged to develop their own strategies for incorporating active learning into their courses. Site testers received one-on-one instructional coaching and support via e-mail and phone conversations during this phase. The decision to drop the requirement to use ITIP modules was consistent with published suggestions that faculty change is best supported by a voluntary orientation rather than a mandated set of activities (18).
In addition to not site testing modules, instructors only minimally utilized the listserv set up to support dialog during the module site testing. Although the project leaders regularly initiated postings to the listserv, the site testers rarely initiated a thread of discussion and generally limited their postings to responding to questions posed by the project leaders. In conversations with individual site testers, it became clear that although faculty members valued the listserv for the information they received, there was a level of distrust and apprehension related to making posts to the listserv. In fact, one-half of the instructors indicated that they preferred to keep issues about their teaching private. Many reported similar discomfort with other professionally associated listservs on which they routinely “lurked.” For example, an associate's college instructor made the following comment:
I once participated on another listserv and people misinterpreted what I had written and I was attacked for beliefs and ideas that people read into my words.

Another associate's college instructor made the following comment specifically regarding the ITIP listserv:

I've been uncomfortable asking questions on the ITIP listserv … I didn't know people. For example, I asked a question and … silence … so I don't know if I'm asking the wrong questions or maybe the other faculty think I might be dumber than the rest about this education stuff.
Failure of Student Assessment
Although 13 of the 19 faculty members did administer and return the student assessment pretests, there was a great deal of variation in the extent to which the materials were completed, which rendered the materials unusable. The most common problem was for instructors to cite “limited class time available for assessment” and therefore to ask their students to complete only those portions of the assessment materials that the instructor judged to be important, interesting, or aligned with their course content. Given the tremendous variation among courses (Table 2), it is not surprising that the portions of the assessment materials administered were highly variable.
A less common problem was for instructors to rearrange or reword assessment questions. One instructor commented that because she wanted her students to do well on the assessment surveys, she had taken the liberty of translating the wording and setting a context that students were more apt to be familiar with. Finally, there was variation in the incentives offered for students to take the assessment materials seriously. Some instructors integrated the project assessments into their grading system so that the incentive to participate was clear to students. Other instructors awarded “bonus” points for careful participation. A few site testers who offered donuts or candy bars for survey completion reported concern about student effort when they noticed that students were “rushing sloppily through the surveys so that they could beat their classmates to the best treats.”
Instructor-Identified Obstacles to Module Use
Personal interviews conducted during the project extension revealed four important findings. First, interviews confirmed that the ITIP modules were not being used. Second, most of the ITIP instructors remained deeply committed to improving the quality of undergraduate education despite the lack of module use. Third, all of the instructors attributed nonuse of the modules to obstacles associated with their institution and/or their department, conflicts with the nature of the ITIP materials, or personal events. A summary of the obstacles reported is provided in Table 3 and elaborated on in the text that follows. It should be noted that multiple obstacles were reported by some instructors, whereas others elected not to make specific comments, and therefore the data are not quantifiable. Finally, despite the fact that none of the instructors site tested modules, 8 of the 19 instructors enthusiastically reported that they were experiencing a positive impact from participating in the ITIP project, and they offered explanations of how they were beginning to gradually introduce instruction that engaged their students in classroom learning.
Institutional and departmental obstacles.
Instructors reported five early-semester events at the level of the institution or department that halted their plans to site test the ITIP modules. In the most extreme example of an institutional obstacle, one associate's college site tester was fired. The instructor said, “It happened suddenly and was quite a shock.” He explained that being fired was related to student attrition when he announced that he would be using active learning modules in his class. He elaborated as follows:
After I made my class announcement that we would be using active learning, over half of the students dropped my class and went to another section taught by an instructor who doesn't use active learning techniques … the [other] instructor has been described by students as ‘slack' … He doesn't require much … It seems that when the students discovered that they were going to be expected to learn something in order to pass my course, they rationalized that they really didn't need to learn the stuff … all they needed to do was pass [to get the program certification] so they chose an easier and more familiar path … they bailed and went to a class where they could get the grade they wanted by doing a little bit of cramming of their notes right before the exam … they just wanted to get through the class with the least effort and time … Soon after the student attrition incident, I was called before my Division Chair and the Academic Vice President who perceived that there must be something wrong with my teaching if so many of my students were fleeing. Interestingly, I had thought that the current administration had prioritized student learning … but it turns out that their primary interest is in keeping students contented … ‘student satisfaction' they called it.

One instructor teaching at an associate's college reported immediately giving up on using the modules when the college staff went on strike. She explained that with so much uncertainty and disruption on campus, it simply was not the time to try something new. Another instructor at an associate's college attributed failure to use the modules to loss of startup materials during numerous moves en route to a new science building.
High student demand for anatomy and physiology courses and heavy teaching loads kept four instructors from participating. One department chairperson unexpectedly doubled an instructor's class size, whereas another instructor reported “a massively overworked semester” when a colleague took leave and she was assigned an overload. Two instructors from different associate's colleges indicated that the characteristically heavy teaching loads at their institutions had made the possibility of participation seem slim from the beginning, but, nonetheless, they had volunteered as site testers because they thought that participation in an educational reform study funded by NSF might be valued by the institution and their teaching loads temporarily reduced. Both instructors later explained that course loads had remained the same and they were unable to honor the ITIP commitment.
ITIP project design obstacles.
In cases where instructors did not comment on institution- or department-level obstacles, elements of the ITIP project design were cited as obstacles. One instructor indicated that he had not received the startup packet and assessment materials in time for the start of his class but then conceded, “I probably could have gone ahead with it … but I chose not to.” He explained as follows:
Trying to add modules after I'd already started wasn't feasible. It takes more planning than that … trying to do something new isn't something you can just do on the fly … but in all fairness even if the materials had been on time I'm not sure I would have continued. I'd seriously have needed to have been thinking about this during the prior year … developing sort of a mindset for change about how I might make these kinds of changes and add this kind of material to the course I teach.

Two instructors teaching at separate associate's colleges commented that they had difficulty as they tried to access or download the ITIP modules from the APS website. One instructor claimed she was not “tech savvy” and had been unable to navigate the site, whereas the other instructor explained that obtaining the modules had not been possible because her old computer would not download them or open attachments.
An instructor at a doctoral/research university reported a “major disaster” when trying to administer the student preassessment materials. His IRB had reviewed the ITIP letter of consent for project participation and told the instructor that students could not be required to participate in the assessment process. The instructor explained as follows:
Because … the IRB … was uncertain about how the data would be used and whether it might be published, they said that student participation had to be voluntary … optional. To my students, ‘optional' means they won't do it. I advertised the project in my class … and tried to convey why it's important … but I only had one person take the pre-test!

The instructor concluded that it did not seem worth continuing to participate without some measure of the impact on student learning and attitudes.
Two instructors (teaching at an associate's and master's college) commented that the materials required to build the apparatus for the module they wanted to use had been cost prohibitive. An instructor who taught a two-semester course at an associate's college reported that the sequence in which she covered topics would only allow her to site test one module because her first-semester course focused on anatomy rather than physiology. Because she would have been unable to follow the instructions to test three modules during the first semester of her course, she chose not to participate.
11 of the 19 instructors (almost 60%), from a variety of institutional types, commented that the modules simply “didn't fit” their courses or teaching style. The instructors all stated that the modules were “too much” and then offered a range of explanations for what “too much” meant. Two of the instructors explained that the modules provided “too much depth of coverage.” For example, one instructor explained that, although he did hold the students responsible for “understanding the cardiovascular system,” it was more of a “general familiarity kind of understanding” rather than an understanding of relationships between factors such as flow and resistance. He explained that the information in the modules was simply too complex and cited specific terms that he would not introduce to his students (preload, myocardial contractility, stroke volume, end-diastolic volume, afterload, epinephrine, norepinephrine, acetylcholine, sympathetic and parasympathetic nervous systems, arteriolar resistance, mean arterial pressure, and blood viscosity).
Three instructors indicated that the modules required “too much and too many new kinds of instructional skills” and that their preference was to start out with more introductory kinds of activities, for instance, the kinds of “low-level” activities that they had been accumulating in their teaching files from a variety of sources (e.g., past HAPS meetings, materials passed to them by colleagues, or articles from The American Biology Teacher). One instructor's swim analogy suggested that the modules were too instructionally sophisticated for her to use comfortably. She explained that she had hoped for a situation where she could “wade into the pool” rather than “dive in head-first.”
Four instructors commented that the reason they had not implemented modules was that the assessment was “too much,” which they defined as “problematic and burdensome.” One of these instructors explained that her students would not be taught the material covered in the assessment. She stated: “I didn't want to test my students over material we don't cover. That's not fair.” Two other instructors indicated that the assessments were too long and took too much class time to administer. Another instructor explained that she could not ask her students to do more tests; moreover, she was not sure how her students would perform because the assessment terminology differed from that used in her class.
Finally, and quite surprisingly, two instructors from different associate's colleges expressed their frustration over the expectation that using the modules required them to “do active learning.” Although we believed we had been very explicit about the active instructional techniques embedded in the modules, for these two instructors it came as a surprise that using the modules carried the expectation that students would have the opportunity to be actively engaged in learning during class time. Rather, the instructors had anticipated being able to add the modules' theme-based content material to their courses in a traditional lecture format. They explained that using class time to get students engaged meant relinquishing content material that students might need to know to do well on subsequent professional licensing examinations. Although these instructors were willing to add more content to their courses, they were not willing to cut the content in their courses because they believed doing so would jeopardize student success in future courses and on professional exams.
Two instructors expressed personal obstacles that had precluded their use of modules. One instructor said he was “totally burned out, physically and mentally exhausted, disillusioned, and cynical towards the departmental organization” and simply was no longer motivated to participate in the ITIP project. Another instructor reported that a family illness had dominated her time and thus it was not possible for her to continue in the project.
Individual Change in Instructional Strategies
Despite the fact that the ITIP modules were not site tested, interviews with faculty members who continued in the study revealed that exposure to the ITIP project had prompted 8 of the 19 ITIP volunteers to begin changing their teaching practices. These instructors reported adding their own brief classroom activities to break up lectures, rethinking and prioritizing what content to teach, and adding follow-up (formative) assessments after an activity or lecture segment to determine what students had or had not learned. Instructors perceived that these small changes in their practice had benefited their students. They mentioned that the changes in their teaching seemed to add enjoyment to classroom learning and added opportunities for students to reflect on their understanding and develop the ability to self-assess.
Instructors appreciated the opportunity to get specific and immediate feedback on their students' understanding–in the words of one instructor, to "Find out if students are really learning what I think I am teaching!" They also reported that collecting feedback on student learning during class gave them a better idea of how to adjust their instruction to improve student learning. The following comments, from five instructors, illustrate the perceived benefits:
Thanks to participation in this project, I have changed the format of my 75-minute lectures. I was pretty scared at first to try it but now I either put a 'challenge question of the day' on the board before I start or give one about half way through the lecture. Sometimes I put a lot of thought into the questions and sometimes they are spur of the moment. I try to come up with questions that students will find interesting and be able to relate to … Students really seem to like it! There's more class interaction … I don't rush the questioning … I give them time to think and talk. I actually set my watch for four minutes.
One strategy I am using that I have not used in the past is during questioning in class I allow students to go down their own paths towards a wrong answer or use the wrong strategy at arriving at an answer. Rather than me correcting them, I wait … or ask the rest of the class if this makes sense. Invariably, other students will say that this does not make sense. I then help them to figure out what in the strategy was inaccurate and then follow them down, or lead them to, a more appropriate strategy that leads to the correct answer, or one of the correct answers.
Participating in this project has convinced me of the importance of using classroom activities to break up a lecture. I know that I need to cut some content, and I am rethinking what I teach, so that I have more time for my students to complete active learning exercises.
I'm learning that if I follow up a segment of lecture with a short learning exercise and then with a small assessment, I can get a feel for the different levels of student understanding … and oh my, what a range of levels of understanding I'm uncovering! Then I can tweak and rethink the way I might want to try to teach the topic next time … or even what else I can do now to improve students' levels of understanding. It's amazing how the feedback helps me better understand where students are getting stuck and amazed at how the approach has opened my students up to wanting to interact more with me.
Despite the fact that I am a very good lecturer … the use of various assessment tools [classroom assessment techniques] has shown me that my students don't always get it from my ‘brilliant' lectures. Introducing these assessment tools has done more than anything else to show me I need to mix it up more and to be willing to try new techniques … I think there are better ways to do what I do. I cannot change everything at once nor will I ever evolve into a completely different teacher, but I am convinced of the value of having the students more involved.
The results described in this article are limited by three factors. First, the faculty members who were recruited to site test the ITIP activity modules were not intended to be a random or representative sample of all undergraduate anatomy and physiology instructors teaching at all types of academic institutions. Rather, participating faculty were volunteers from a professional society (HAPS) dedicated to excellence in education. Faculty volunteers indicated a strong interest in modifying their teaching to include instruction for active student learning. Moreover, all faculty participants, despite their affiliations with different types of academic institutions (from associate's colleges to doctoral/research universities), were primarily educators. Although some faculty members devoted a small amount (3–30%) of their professional time to scientific or educational research, their primary function was undergraduate teaching and the administrative responsibilities that complement this function. Because the sampling procedure used in this study was “purposive” and designed to maximize the discovery of context-specific events, the findings of the study are not necessarily applicable to a broader population of faculty instructors.
Second, the explanations given by site testers during the interviews regarding why they did not use the ITIP modules were retrospective. It is therefore reasonable to assume that the instructors' explanations are not as complete as they might have been if interviews had been ongoing from the start of the project. Finally, although the interview data and categories of obstacles are presented in the words of the faculty instructors, the discussion and “lessons learned” are author interpretations. Despite the attention to author biases, the interpretations and discussion are undoubtedly shaped by author subjectivities.
In Revitalizing Undergraduate Science: Why Some Things Work and Most Don't (24), Sheila Tobias suggests that reform progress in undergraduate science education has been limited because higher education researchers have been reluctant to go beyond the most prevalent reform approach (that of developing course materials and teaching enhancements) to discover the real issues responsible for failed reforms. The results of the ITIP project suggest that this claim is valid. When site testers failed to use the ITIP modules (much to our surprise and initial disbelief), the project was refocused to find out what had stopped the previously enthusiastic faculty instructors from testing the course materials. Ultimately, what we began to discover and understand from the interviews, candid dialogue with the faculty instructors, and qualitative analysis of interview data was far more significant than what we had set out to accomplish. ITIP showed us that providing ready-made activity modules (and professional development) did not translate to instructional change, and our research into why this failed to occur allowed us to document obstacles that make it difficult for faculty members to change the way they teach. In addition, the project gave us some understanding of the kinds of support faculty members need as they begin to change their classroom instruction.
The most important lessons learned from the ITIP project are highlighted below.
Lesson 1: Many Faculty Are Interested in Improving Their Teaching
Faculty instructors affiliated with HAPS responded very positively and enthusiastically to the call to get involved in a project dedicated to instructional change, with a diverse group of 55 instructors applying for 36 site test positions. The high interest in the volunteer faculty group was encouraging because it represented the first step needed to improve undergraduate science education.
Lesson 2: Lack of Instructor Time Was a Formidable Obstacle to Translating Interest to Action
An equally important but contrasting lesson learned from the early stages of the ITIP project came when nearly one-half of the interested volunteers withdrew from the project before they received the startup materials, saying that they simply did not have time to devote to the project. Assessment data support the instructors' claims: the average reported work week of faculty participants was 71 h. The data on faculty work hours and attrition lend support to the hypothesis that a shortage of faculty time is one factor that makes it difficult for faculty members to change the way they teach. The lesson learned was that, when tempered by time constraints, interest and enthusiasm translated into participation by only half of the volunteers.
Lesson 3: Providing Readily Usable Course Materials Did Not Facilitate Instructional Reform Because the Materials Did Not Integrate Easily into the Existing Courses
The results of the module site testing stage support Tobias' claim that reform efforts must go beyond materials development. In the ITIP project, providing the site testers with modules did not set in motion a linear chain of events from module adoption to improved quality of instruction to improved quality of undergraduate education. More than half of the 19 site test instructors claimed that the modules “didn't fit” with their courses, but their specific reasons for the poor fit varied.
Our interpretation of the ITIP findings suggests a number of reasons why the majority of instructors might have concluded that the modules did not fit. First, there may simply be too much variability in course content (Table 2) to create modules that can meet the needs of a wide range of instructors teaching at a wide range of institutional types. The electronic format of the modules meant that instructors could have modified them to fit their particular courses, but they did not do this.
Another explanation for why ITIP instructors thought that the modules did not fit their courses may be related to the instructors' value systems. Ball and Cohen (3) summarized evidence in existing research and concluded that curricula may not be useful for facilitating instructional change because instructors may value their professional autonomy and authoring their own course materials more than the convenience of using someone else's materials. Evidence from the ITIP study supports this explanation in that almost all of the instructors reported authoring some portion, if not all, of their own instructional materials (texts, notes, manuals, or outlines).
A third explanation could be that the ITIP instructors were not comfortable with the depth of content or variety of instructional strategies they encountered in the modules. Our findings suggest that instructional modules need to match the process faculty members go through as they try to change how they teach. The ITIP instructors said they wanted modules that dealt with familiar content material and that would allow them to approach instructional change gradually. This could be accomplished in future projects by creating a continuum of modules that begin with very short, straightforward, easy-to-implement strategies and build toward longer-duration, more complicated instructional strategies and activities. This format has long been advocated by Bonwell and Eison (4) and is a useful idea to remember for future module development projects.
The ITIP participants also asked for more instructional commentary so that they would know what to expect when using the modules. For example, including typical student responses might help them anticipate student answers. Interestingly, Ball and Cohen (3) have suggested that instructors' guides that offer insight into student thinking would go a long way toward preparing instructors as they begin to incorporate interactive teaching into their courses. Although some research on students' thinking about physiology exists (e.g., Refs. 9 and 10), there is much to be learned about student thinking that requires substantial and specific inquiry into student responses to particular tasks.
Lesson 4: Departmental and Institutional Obstacles Played a Significant Role in the Failure of the Site Test Phase of the ITIP Project
Another lesson from the ITIP project was that institution- and department-level obstacles, factors known to impede change in science education reform (14), immediately and unexpectedly put a stop to four instructors' intentions to enact reform-based instructional strategies. One of the most troubling of these constraints was the administrative decision to fire an ITIP instructor due to student attrition when the instructor announced that he would be including active learning in the course. Administrative decisions to increase class size or teaching loads also put a stop to some site testers' plans to participate in the project. Certainly, these are obstacles that lie outside of an instructor's immediate control and make it extremely difficult for instructors to change the way they teach.
Documentation of these obstacles in the ITIP project supports the perception that institutional and departmental policies impede faculty attempts to improve undergraduate education. If widespread reform progress is to be made, it is critical that a constructive process for identifying and managing institutional and departmental obstacles be initiated. For example, the ITIP findings suggest that institutions should have policies to ensure consistency in the level of academic rigor between similar courses so that innovative instructors are not penalized (e.g., fired) when students strategically choose the least demanding path to degree or program completion. In addition, institutions should define strategies to clearly differentiate student satisfaction from quality education. Finally, institutional and departmental leaders need to recognize that implementing new instructional strategies is time and labor intensive and allocate sufficient time and resources to instructors who want to learn to teach more effectively, despite the conflict with economics that drive large class sizes and heavy teaching loads. Initiatives to improve undergraduate education are not likely to be successful without the “buy in” and support of key administrators.
Lesson 5: Technological Limitations and the Cost of Supplies Can Be Obstacles to Instructional Innovation
We learned that to ensure access to electronically available modules, some instructors may require technology training for navigating websites and downloading attachments, whereas others may require access to newer computers. It is also important to note that even relatively inexpensive supplies may place a strain on the classroom budgets of some instructors.
Lesson 6: Ethical Requirements for Conducting the ITIP Project Were Complex and the Project Would Have Benefited from Communication with the IRBs of Faculty Participants' Home Institutions
A lesson learned from the ITIP assessment protocol was the potential difficulty of conducting cross-institutional educational research with the advance intent to publish the data. Despite our careful attention to human subjects research requirements, one instructor was only able to administer the ITIP assessment materials to volunteer students. In future studies, we need to specify in consent documents that student assessment data will be compiled and published only as aggregate data for the project and that, in the event that data are worthy of publication, researchers will contact IRB personnel at each of the participating institutions and verify compliance with all protocols before publication. We suggest that if national funding agencies expect multiple-site research and evidence-based improvements in educational practice, these agencies need to help investigators by providing study guidelines that will satisfy the IRBs of diverse participating institutions.
Lesson 7: ITIP Faculty Would Have Benefited from Education on Project Assessment Methods and from Being Made Partners in Designing the ITIP Assessment Protocol
We erroneously assumed that faculty members would understand the in-depth nature of the assessment necessary for cross-institutional module site testing and evaluation. However, 4 of the 19 site test instructors said that the problematic and burdensome nature of the assessment component had constrained their ability to use the modules. Moreover, much of the data returned to the assessment team were unusable because of omissions and changes made to the materials. In hindsight, it is clear that the assessment component was much more involved than the site testers had anticipated, and the written instructions given to site testers did not provide the level of explanation and guidance that the faculty members needed. The project would have benefited from a meeting with the site testers to explain the assessment rationale, ask for input on how to make the assessment manageable, and clarify the requirements so that all participating faculty members agreed with the assessment plan.
Lesson 8: Instructors Benefit from Specific Professional Development Opportunities and On-Going Support
We know, and research has repeatedly demonstrated, that students learn best when given opportunities to actively engage with the material they are learning. An important lesson learned from the ITIP project is that the nature and process of faculty learning is not so different from the nature and process of student learning. Why should we expect that faculty can learn to teach with a new approach simply by reading about it or watching someone else do it? For perhaps too long, it has been assumed that faculty instructors effortlessly learn everything that is presented to them in reform documents, information sessions, workshops, and thematic modules and that they can immediately and effectively translate all the new information into changed instructional practice. The ITIP project taught us that this assumption is not valid: faculty members must have opportunities to practice unfamiliar instructional strategies and receive constructive feedback.
Our findings suggest that an effective faculty development model would provide faculty with the following:
- Opportunities to regularly discuss questions that emerge from their own classroom teaching experiences
- A safe community for interaction with like-minded colleagues where it is understood that what is said will be treated with respect and confidentiality
- Individual instructional coaching, provided on an as-needed basis
- Opportunities to experiment with different instructional strategies and receive feedback on their efforts so that faculty members can revise and continuously improve their instruction
The faculty learning community approach described by Cox (7) appears to be a promising university model that meets the criteria identified above and that might be adaptable to professional society settings with the addition of a well-designed distance education component. Research evidence indicates that both faculty and student learning are greatly improved through application of the faculty learning community model. The ITIP project taught us that under ideal conditions, faculty participation in professional development activities should be endorsed and approved by each instructor's home department.
Was the ITIP project a failure? We do not believe it was. Although the module site testing phase of the project stalled, the project served as a valuable venue for collecting qualitative data on some of the obstacles that may be contributing to the failure of educational reform efforts in undergraduate classrooms. Furthermore, the project successfully provided insight into the nature of instructional resources (curriculum development and faculty development models) that may better support faculty learning and instructional change. The ITIP project laid the groundwork for research into the kinds of resources needed to help faculty move away from traditional faculty-centered instruction to a format that focuses on improved student learning. The project findings strongly support the need to create opportunities for faculty development and on-going support as instructors begin to implement science education reform recommendations for learner-centered instruction.
The project also helped increase the number of undergraduate faculty involved in science teaching innovation and reform. Some site testers became convinced of the efficacy and importance of using teaching strategies for active learning and wanted other faculty members to know about the changes they had made. At the 2000 and 2001 HAPS conferences, all ITIP-related talks and workshops were conducted by APS members or by HAPS members of the curriculum development team. In 2002, five of the nine ITIP-associated workshops were given by site testers who wanted to share what they had learned with colleagues. This trend persisted beyond the life of the grant, with three site testers presenting workshops at the 2003 HAPS meeting.
Furthermore, in the three years of the grant, we observed a growing interest in learning about different teaching methods among the general HAPS membership. At the 2000 HAPS meeting, where the ITIP project was introduced, the update talks and discussion of active learning elicited comments such as “This is nonsense. All learning is active” and “So show me the scientific evidence that says this active learning stuff is better than what I've been doing.” By the 2002 meeting, HAPS members as a whole were much more receptive to the idea and importance of modifying or changing their teaching to improve learning among diverse groups of students. The workshops offered by ITIP participants were well attended and engendered a considerable amount of lively discussion, and similar workshops have been given at HAPS meetings since 2002.
In addition, although the ITIP modules were not used by the site testers, they have been used by other instructors (23), and they remain freely available through the APS Archive of Teaching Resources (www.apsarchive.org), which currently averages 95,000 hits/mo. Because there is no "one size fits all" solution for instructional materials and because instructors seem more likely to use materials that they have had a role in creating, we feel that curriculum materials should be made available in a digital format that instructors can easily adapt to fit different student populations and teaching goals. The collection, review, and dissemination of appropriate materials are organizational challenges, however, and likely to be a major limiting factor. One of us (D. U. Silverthorn) is currently heading another NSF-funded project (15b) to assemble a web-based sourcebook of simple laboratory activities whose design uses lessons learned from the ITIP project. The sourcebook will be an electronic resource that interested faculty members must adapt to create their own laboratory handouts, and it includes extensive instructional commentary to facilitate successful implementation.
A final point of success to be noted is that we found that participation in the ITIP project prompted some of the site testers to become more critically reflective about their teaching and to begin to shift toward learner-centered forms of instruction (23). The following are words of two instructors who began to gradually change their instructional strategies:
Giving up control of the classroom is the most frightening thing I have done professionally, but it allowed me to see how well the students can actively ‘steer' themselves in their learning.
Participation in this project has definitely changed my conception of teaching. I used to think it was my role to lead students into a new area, clarify concepts, interpret information and ‘show the way.' Now I feel like my role is to provide opportunity, open doors, etc., but don't necessarily lead the students.
The results of the ITIP project strongly support the need to create opportunities for faculty development and on-going support if faculty members are to successfully implement science education reform recommendations.
This work was supported by National Science Foundation Grant 9952458.
The authors extend thanks to the ITIP activity module Development Group, which consisted of Daniel Lemons and Joseph Griswold (City College of New York), Robert Carroll (East Carolina University School of Medicine), Barbara Goodman (University of South Dakota School of Medicine), Joel Michael (Rush Medical College), Harold Modell (Physiology Educational Research Consortium), Penelope Hansen (Memorial University), William Cliff (Niagara College), Don Kiesel (State University of New York), and Patricia Bowne (Alverno College). The faculty members who volunteered to test the modules and served as case study participants will remain anonymous to protect their privacy, but we thank them for their participation. The project would not have been possible without the generous commitment of their time, willingness to share the details of their struggles and successes in the classroom, and dedication to improving the quality of undergraduate physiology education. We also thank the three anonymous science educators who assisted with the unitizing and theme analysis of the constraints identified by faculty instructors as well as the officers and members of HAPS and the American Physiological Society (APS) for the support of the project, especially Dr. Marsha Matyas and Melinda Lowy of the APS Education Office.
Present address of P. Thorn: Choice Insights, Science Learning in Higher Education, Phoenix, AZ (e-mail:).
Present address of M. Svinicki: Dept. of Educational Psychology, Univ. of Texas, Austin, TX 78712.
↵* D. U. Silverthorn and P. M. Thorn contributed equally to this work.
- © 2006 American Physiological Society