
Thinking Critically About Critical Thinking

 

A few mornings ago, I was listening to a television commercial as I got ready for work.  “What is critical thinking worth?” said a very important announcer.  “A whole lot,” I thought to myself.

But what exactly is critical thinking?  A Google search brings up a dictionary definition.  Critical thinking is “the objective analysis and evaluation of an issue to form a judgement.”  The example sentence accompanying this definition is “professors often find it difficult to encourage critical thinking among their students.” WOW, took the words right out of my mouth!

Have any of you had the following conversation? “Dr. A, I studied and studied for this exam and I still got a bad grade.  I know the material, I just can’t take your tests!”  The student in question has worked hard. He or she has read the course notes over and over, an activity that has perhaps been rewarded with success in the past.  Unfortunately, re-reading notes and textbooks over and over is the most common and least successful strategy for studying (4).

In my opinion, as someone who has been teaching physiology for over 20 years, physiology is not a discipline that can be memorized.  Instead, it is a way of thinking and a discipline that has to be understood.

Over the years, my longtime teaching colleague Sue Keirstead and I found ourselves during office hours trying repeatedly to explain to students what we meant by thinking critically about physiology.  We asked the same probing questions and drew the same diagrams over and over.  Eventually we had the opportunity to formalize our approach in a workbook called Cells to Systems: Critical Thinking Exercises in Physiology (2).  We took the tough concepts students brought to office hours and crafted questions to help students work their way through them.

Students who perform well in our courses make use of the workbook and report in student evaluations that they find the exercises helpful.  But we still have students who struggle with the critical thinking exercises and the course exams.  According to comments on student evaluations, students who struggle with the exercises find the questions too open-ended.  Furthermore, many of the answers cannot be pulled directly from the textbook, or at least not in the format students expect, and they report finding this frustrating.  For example, the text may discuss renal reabsorption and renal secretion in general, and then the critical thinking exercises ask the student to synthesize all the processes occurring in the proximal tubule.  The information is the same but the organization is different.  It turns out this is a difficult process for our students to work through.

We use our critical thinking exercises as a type of formative assessment: low-stakes assignments that evaluate the learning process as it is occurring.  We also use multiple choice exams as summative assessments: high-stakes assessments that evaluate learning after it has occurred.  We use this format because our physiology course enrollment averages about 300 students, and multiple choice exams are the most efficient way to assess a class of that size.  We allow students to keep the exam questions, and we provide a key a couple of days after the exam is given.

When a student comes to see me after having “blown” an exam, I typically ask him or her to go through the exam, question by question.  I encourage them to try to identify how they were thinking when they worked through each question.  This can be a very useful diagnostic.  Ambrose and colleagues have formalized this process in a handout called an exam wrapper (1).  By analyzing their exam performance, students may discover a pattern of errors that they can address before the next exam.  Consider some of the following scenarios:

Zach discovers that he was so worried about running out of time that he did not read the questions carefully.  Some of the questions reminded him of questions from the online quizzes.  He did know the material but he wasn’t clear on what the question was asking.

This is a testing issue. Zach, of course, should slow down.  He should underline key words in the question stem or draw a diagram to make sure he is clear on what the question is asking.

Sarah discovers that she didn’t know the material as well as she thought she did, a problem that is called the illusion of knowing (3). Sarah needs to re-evaluate the way she is studying.  If Sarah is cramming right before the exam, she should spread out her studying along with her other subjects, a strategy called interleaving (3).  If she is repeatedly reading her notes, she should put her notes away, get out a blank piece of paper and write down what she remembers to get a gauge of her knowledge, a process called retrieval (3).  If she is using flash cards for vocabulary, she should write out learning objectives in her own words, a process called elaboration (3).

Terry looks over the exam and says, “I don’t know what I was thinking.  I saw something about troponin and I picked it.  This really frustrates me. I study and study and don’t get the grade I want.  I come to lecture and do all the exercises. I don’t know what else to do.”  It is a challenge to help this student.  She is not engaging in any metacognition, and I don’t claim to have any magic answers for her.  Still, I want to try.

I feel very strongly that students need to reflect on what they are learning in class, on what they read in their texts, and on the activities they perform in lab (3).  I have been working on a project in one of my physiology courses in which students take quizzes and exams as a group and discuss the answers collaboratively.  Then I have them write about what they were thinking as they approached each question individually and what they discussed in their group.  I am hoping to learn more about how students develop critical thinking skills, and I hope to share what I learn in a future blog posting.

  1. Ambrose SA, Bridges MW, DiPietro M, Lovett MC, Norman MK. How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco, CA: Jossey-Bass, 2010.
  2. Anderson LC, Keirstead SA. Cells to Systems: Critical Thinking Exercises in Physiology (3rd ed). Dubuque, IA: Kendall Hunt Press, 2011.
  3. Brown PC, Roediger HL, McDaniel MA. Make It Stick: The Science of Successful Learning. Cambridge, MA: The Belknap Press of Harvard University Press, 2014.
  4. Callender AA, McDaniel MA. The limited benefits of rereading educational text. Contemporary Educational Psychology 34: 30–41, 2009.

 

Lisa Carney Anderson, PhD is an Assistant Professor in the Department of Integrative Biology and Physiology at the University of Minnesota. She completed training in muscle physiology at the University of Minnesota. She collaborates with colleagues in the School of Nursing on clinical research projects such as the perioperative care of patients with Parkinson’s disease and the assessment of patients with spasticity. She directs a large undergraduate physiology course for pre-allied health students.  She also teaches nurse anesthesia students, dental students, and medical students.  She is the 2012 recipient of the Didactic Instructor of the Year Award from the American Association of Nurse Anesthetists.  She is a co-author of a physiology workbook, Cells to Systems: Critical Thinking Exercises in Physiology (Kendall Hunt Press). Dr. Anderson’s teaching interests include teaching with technology, encouraging active learning, and assessing student reflection.
Teaching Toolbox: Tips and Techniques for Assessing What Students Know

What has to shift to change your perspective? Thomas Kuhn coined the term paradigm shift and argued that science doesn’t progress by a linear accumulation of new knowledge; rather, a shift takes place when an anomaly subverts the normal practices, ideas, and theories of science. Students learn through interaction with their surrounding environment, mediated by prior knowledge built up from previous interactions with family, friends, teachers, and other sociocultural experiences (Falk & Adelman, 2003). Deep understanding of concepts depends on the interaction of prior experience with new information. As Kuhn stated in his 1962 book The Structure of Scientific Revolutions, “The challenge is not to uncover the unknown, but to obtain the known.”

In order to assess what students know, you need to find out what they already know. An assessment can only provide useful information if it is measuring what it is intended to measure. In the medical field, assessments are used all the time; for example, an MRI is a useful diagnostic tool for determining the extent of tissue damage, but it is not necessarily useful for establishing the overall health status of an individual. Assessing what a student knows with a multiple choice test may likewise fail to establish an overall picture of what knowledge a student possesses or how that knowledge is applied, especially if the items are not measuring what they are supposed to. Construct validity provides evidence that a test is measuring the construct it is intended to measure. How to measure construct validity is beyond the scope of this article; for more information, see the classic work by Messick (1995). Outside of the psychometrics involved in item or assessment construction, I’ll offer some quick tips and techniques I have found useful in my teaching practice. What can you do to separate real learning with deep understanding from good test-taking skills or reading ability? How can you assess what students know simply and effectively?

Instruction in a classroom environment needs to be connected with assessment rather than treated as a separate activity. Formative assessment is one way to understand student thinking: it benefits students by identifying strengths and weaknesses, and it gives instructors immediate feedback about where students are struggling so that issues can be addressed right away. By providing a learning goal at the start of a class, you give students a clear objective for the lesson and allow them to begin making connections between what they already know and new information. When designing or preparing for a class, ask yourself:

  1. What do I assume they already know?
  2. What questions can I ask that will help me confirm my assumptions?
  3. What are the most common misconceptions related to the topic?

Tips for checking students’ background knowledge

  • On a whiteboard or in a presentation, begin with one to three open-ended and/or multiple choice questions. Ask students to respond in two to three sentences, or circle a response. It’s important to let them know that the question(s) are not being graded; rather, you are looking for thoughtful answers that will help guide instructional decisions. Share the results at the start of the next class, or use a free tool like Plickers for instant feedback.
  • Short quizzes or a survey created with Qualtrics, Google Forms, or Doodle Poll can be delivered via Blackboard prior to class. Explain that you will track who responded but not how individual students responded at this point. Share the results, and their impact on course design, with students.
  • Group work. Using an image, graph, or some type of problem related to upcoming course content, have students come up with a list of observations or questions about the material. Use large sheets of paper or sticky notes for them to synthesize comments, then review the themes with the class.

Formative assessment is used to measure learning and provide feedback on a daily or weekly basis. In addition to learning goals communicated to students at the beginning of each class and warm-up activities that stimulate thinking about a concept, formative assessment can include comments on assignments, projects, or problem sets, and questions that intentionally target essential understanding rather than a general “Are there any questions?” at the end of a lesson. To add closure and summarize the class with the learning goal in mind, provide index cards or ask students to take out a piece of paper and write, in a couple of sentences, the most important points of the lesson and/or what they found most confusing so that it can be addressed in the next class. Formative assessments provide tangible evidence of what your students know and how they are thinking, and they give students insight and feedback for improving their own learning.

Summative assessment includes quizzes, tests and projects that are graded and used to measure student performance. Creating a well-designed summative assessment involves asking good questions and using rubrics. In designing an assessment that will accurately measure what students know, consider:

  1. What do you want your students to know or be able to do? This can also serve as a guiding objective for each lesson.
  2. Where in the curriculum will you address those outcomes?
  3. How will your summative assessment measure what students know?
  4. Based on those measurements, what changes can you make in the course to improve student performance?

Good questions

  • Measure what you intend for them to measure.
  • Allow students to demonstrate what they know.
  • Discriminate between students who learned what you intended and those who did not.
  • Examine what a student can do with what they learned versus what they simply remember.
  • Revisit learning goals articulated at the beginning of a topic, unit or course.
  • Come in a variety of formats, such as multiple choice, short answer, and essay.

Rubrics

  • Can be used for oral presentations, projects, or papers.
  • Evaluate teamwork.
  • Facilitate peer review.
  • Provide self-assessment to improve learning and performance.
  • Motivate students to improve their work.

Online rubric resources for educators include Rubistar, Online Instruction Rubric, and Value Rubrics.

Students do not enter your classroom as blank slates. Assessing what students know targets gaps in knowledge. By incorporating a short activity or question at the start and end of a class, you can check for potential and actual misconceptions and target instruction for deep understanding. Checks of prior background knowledge build awareness of the diversity of your students and their experiences, further informing the design and improvement of instruction for active, meaningful learning. Creating a bridge between prior knowledge and new material gives students a framework for a paradigm shift in learning and makes it clear, to them and to you, what they have learned by the end of a lesson or the end of a course.

 

References

Falk JH, Adelman LM. Investigating the Impact of Prior Knowledge and Interest on Aquarium Visitor Learning. Journal of Research in Science Teaching. 2003;40(2):163-176.

Kuhn TS. The Structure of Scientific Revolutions. Chicago: The University of Chicago Press; 1962.

Messick S. Standards of validity and the validity of standards in performance assessment. Educational Measurement: Issues and Practice. 1995;14(4):5-8.

 


Jennifer (Jen) Gatz graduated from Ithaca College in 1993 with a BSc in Exercise Science and began working as a clinical exercise physiologist in cardiac and pulmonary rehabilitation. Jen received her MS in Exercise Physiology from Adelphi University in 1999, founded the multisport endurance training company Jayasports in 2000, and expanded her practice to include corporate health and wellness for Brookhaven National Laboratory through 2012. Along the way, Jen took her clinical teaching practice and coaching experience and returned to school to complete a Master of Arts in Teaching Biology with NYS teaching certification from Stony Brook University in 2004. A veteran science teacher of 12 years at Patchogue-Medford High School in Medford, NY, Jen is currently teaching AP Biology and Independent Science Research. A lifelong learner, Jen returned to Stony Brook University in 2011 and is an advanced PhD candidate in Science Education anticipating the defense of her dissertation in the fall of 2016. Her dissertation research melds a love of physiology and science education, focusing on the connections among cognitive processes and executive functioning, their relationship to physical fitness and informal science education, and the environmental factors that determine attitudes toward and performance in science. In 2015, Jen was a recipient of a Howard Hughes Medical Institute Graduate Research Fellowship.