This is the story of why and how my courses underwent an all-encompassing redesign.
Once upon a time, early in my tenure at Heartland Community College, the nursing faculty invited the A&P instructors to lunch to discuss what was covered in the A&P courses, because nursing students kept telling them they “didn’t learn that” in A&P.
The dialog went like this: “Do you teach the autonomic nervous system?”
“Yes, we do!”
“The students say they didn’t learn that. Do you teach the cranial nerves?”
“Yes, we do!”
“The students say they didn’t learn that.”
After that meeting, I had a revelation that rocked my world: I wasn’t teaching, and the students weren’t learning!
Then the question was what to do about it: retirement or remediation? Well, shortly after my revelation the economy tanked, so retirement wasn’t an option. Remediation, on my part, was the only course of action to take. I went back and hit the books.
I found many excellent resources and used parts of all of them, but it wasn’t until I was searching for ways to assess conceptual understanding that I found the methods used for the major redesign of my courses.
When I hit the books, I read that third graders could learn to do physics. So, I reasoned, there was no reason a method developed by a physics professor and research scientist at Harvard couldn’t be used for A&P courses at Heartland. I therefore chose to redesign my courses using a combination of Just-in-Time Teaching (JiTT), Peer Instruction (PI), and Concept Questions (CQs) assessed with clickers, in the manner described by Eric Mazur.
It is very important to make expectations known. In the first week of class, students are asked to complete an anonymous, online introductory questionnaire (Mazur, 1997). This helps ensure that students’ expectations conform to what will be taking place in class. The results of this questionnaire are compiled into a handout and discussed in class. It is followed up with another questionnaire (Mazur, 1997) during the fourth week of the semester to identify whether there is anything I can do to improve the in-class experience to help their learning, and to address any expectations that are contrary to what we are doing in class. The result of using these questionnaires is an improved sense of cooperation.
The first week of the semester is also used expressly to help students get acclimated to the flow of the course and the technology used in class, with several non-graded assignments and assessments completed just for practice. Students must become familiar with the Learning Management System (LMS) and the classroom response system (CRS).
Here is how it works: students are given pre-class reading assignments, posted in the LMS, and are required to take a pre-quiz after completing each reading. The pre-quizzes serve two purposes: they check reading comprehension, and they allow students to identify and verbalize areas of confusion. This emphasizes that knowledge acquisition occurs outside the classroom, so that class time, based on their input, focuses on what students are having difficulty with.
The last question of each pre-quiz is the JiTT part: “Please tell me briefly what single point of the reading you found most difficult or confusing. If you did not find any part of it difficult or confusing, please tell me what you found most interesting.” (Mazur, 1997) Many times students tell me something they found interesting when they didn’t answer any of the questions correctly. So, they indirectly tell me they don’t know what they don’t know. In either case, their feedback determines the topics for discussion the next day.
Generally, about three topics are identified from the pre-quizzes, and CQs to be used in class are written for those topics. The following flow-chart demonstrates how it works in class. This process forces students to think through the arguments being developed and provides a way to assess their understanding of the concept.
Questions can be written to begin easy and progress to more conceptual content, such as application and prediction questions. This allows for scaffolding of knowledge. It is important to monitor discussions to keep students on task, find out how students are thinking, and identify possible sources of confusion.
The CQs are assessed with the classroom response system. Sometimes technology fails, so it is good to have a back-up plan; I keep letter cards available for such situations. The CQs are graded upon completion, not on correctness, which encourages cooperation among students. Students must be continually reminded that it is okay to get questions wrong, and that simply committing to an answer helps produce more durable learning.
Tangible benefits from the redesign include:
For most of the CQs asked throughout a semester, the percentage of correct responses after PI was greater than before PI. Students were able to convince their classmates of the correct answer. Occasionally, the percentage of correct responses following PI was lower than before PI. This was usually due to a poorly worded or ambiguous question, or to a discussion between a student who was confidently wrong and one who was correct but not confident.
Persistence after the redesign was greater than before the redesign. Before the redesign 18% of students ended up dropping the course; after the redesign only 12% of the students ended up dropping.
Students liked using the classroom response system and the student discussions. Students responded to open-ended questions on anonymous, end-of-semester surveys: “Discuss your thoughts on the use of clickers in the classroom”; “Please discuss your thoughts on the ‘convince your neighbor’ portion of the course.” Numerical values were assigned to their responses on this Likert scale: 4 = really liked; 3 = liked; 2 = disliked; 1 = really disliked. The mode/median for the responses regarding clickers was 4, and 3 for responses regarding the ‘convince your neighbor’ portion of the course. In their responses, students also raised some concerns: “my partner never did the readings, so he wasn’t a lot of help; but it did help me to try to explain things to him;” “convincing your neighbor never really helped me mainly because my neighbor was never sure.”
Intangible benefits of the redesign include:
Students are conversing in the language of the discipline and are provided with an opportunity to identify and verbalize what they don’t know. Answering the CQs is a form of forced retrieval, which leads to more durable learning. Students must formulate arguments to support their position when “convincing their neighbors.” And lastly, by listening to student discussions, instructors can identify confusing questions, misconceptions, students with clear answers, students with faulty logic/reasoning or who are confidently wrong, etc.
The following are recommendations to address issues of concern identified by students and the instructor.
- To reinforce the importance of pre-class reading assignments, give the students a hardcopy of all the reading assignments in the first week of the semester and post it to an informational page in the LMS, in addition to posting the reading assignments with the pre-quizzes.
- Explicitly tell the students that work outside of class is expected. The following chart is provided to the students so that they can visualize the general layout of the course.
- To reduce knowledge voids and the influence of confidently wrong students, encourage students to seek advice from classmates all around them rather than only those sitting next to them. If you use Learning Catalytics (LC) as a classroom response system, it can be set to run the class automatically and tell each student whom to consult. The instructor sets the parameters (e.g., three students with different answers within a certain number of seats, or, in a small class, anywhere in the room), and LC uses a sophisticated algorithm to reduce the influence of confidently wrong students. Having diverse, permanent/fixed teams and having students discuss the CQs with their teammates also addresses this issue.
- To alleviate some anxiety over this non-traditional format, students are given lecture notes. Traditional lectures aren’t given, but students get the next best thing: the lecture notes.
- To help motivate the students and to reinforce the importance of meaningful learning and moving away from rote memorization, exams should be 50% conceptual questions.
So, there you have it – the why and how of my complete course redesign. Is that the end of the story, you ask? Of course not. Teaching is an iterative process, and with anonymous, end-of-semester input from students, self-reflection, and professional development, the changes have been continual. Perhaps, in a future blog, I will tell the tale of why and how this course redesign evolved and changed over time.
References for Redesign and Remediation:
Bransford, J.D., Brown, A.L., Cocking, R.R., eds. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Broida, J. (2007). Classroom use of a classroom response system: What clickers can do for your students. Upper Saddle River, NJ: Prentice Hall.
Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco, CA: Jossey-Bass.
Bybee, R.W. (ed.) (2002). Learning science and the science of learning. Arlington, VA: NSTA Press.
Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco, CA: Pearson Addison Wesley Benjamin Cummings.
Ellis, A. B., Landis, C.R., & Meeker, K. Classroom assessment techniques: ConcepTests. http://www.flaguide.org/cat/contests/contests2.php
Fink, L. D. (2003). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.
Finkel, D.L. (2000). Teaching with your mouth shut. Portsmouth, NH: Boynton/Cook.
Herreid, C.F., ed. (2007). Start with a story: The case study method of teaching college science. Arlington, VA: NSTA Press.
Mazur, E. (1997). Peer instruction: A user’s manual. Upper Saddle River, NJ: Prentice Hall.
Michael, J. A. & Modell, H. I. (2003). Active learning in secondary and college classrooms: A working model for helping the learner to learn. Mahwah, NJ: Lawrence Erlbaum Associates.
Novak, G. M., Patterson, E. T., Gavin, A. D., & Christian, W. (1999). Just-in-Time Teaching: Blending active learning with web technology. Upper Saddle River, NJ: Prentice Hall.
Sullivan, W.M. & Rosin, M.S. (2008). A new agenda for higher education: Shaping a life of the mind for practice. San Francisco, CA: Jossey-Bass.
Woditsch, G.A. & Schmittroth, J. (1991). The thoughtful teachers guide to thinking skills. Hillsdale, NJ: Lawrence Erlbaum Associates.
After a post-doctoral fellowship at Washington University School of Medicine, Jane began her academic teaching career at Benedictine University in the graduate programs in exercise physiology. After that, Jane taught in the Physician Assistant Programs at Rosalind Franklin University and the University of Kentucky. For the past 18 years, Jane taught Anatomy and Physiology at Heartland Community College in Normal, IL, where innovative, student-centered instruction is encouraged. For the last decade, Jane employed Just-in-Time Teaching with Peer Instruction and concept questions assessed with a classroom response system. Recently, permanent, fixed teams were used in her classes, along with team-based summative assessments and in-class and post-class forced retrieval activities. Jane is a Professor Emeritus of Biology and had served as the Anatomy and Physiology course coordinator.
Jane received her B.S. from Eastern Illinois University, her M.S. from Illinois State University, and her Ph.D. from Marquette University.