Category Archives: Education Research

Thinking Critically About Critical Thinking

 

A few mornings ago, I was listening to a television commercial as I got ready for work.  “What is critical thinking worth?” asked a very important-sounding announcer.  “A whole lot,” I thought to myself.

But what exactly is critical thinking?  A Google search brings up a dictionary definition.  Critical thinking is “the objective analysis and evaluation of an issue to form a judgement.”  The example sentence accompanying this definition is “professors often find it difficult to encourage critical thinking among their students.” WOW, took the words right out of my mouth!

Have any of you had the following conversation? “Dr. A, I studied and studied for this exam and I still got a bad grade.  I know the material, I just can’t take your tests!”  The student in question has worked hard. He or she has read the course notes over and over, an activity that has perhaps been rewarded with success in the past.  Unfortunately, re-reading notes and textbooks is the most common and least successful strategy for studying (4).

In my opinion, as someone who has been teaching physiology for over 20 years, physiology is not a discipline that can be memorized.  Instead, it is a way of thinking and a discipline that has to be understood.

Over the years, my longtime teaching colleague Sue Keirstead and I found ourselves during office hours trying repeatedly to explain to students what we meant by thinking critically about physiology.  We asked the same probing questions and drew the same diagrams over and over.  We had the opportunity to formalize our approach in a workbook called Cells to Systems Physiology: Critical Thinking Exercises in Physiology (2).  We took the tough concepts students brought to office hours and crafted questions to help the students work their way through these concepts.

Students who perform well in our courses make use of the workbook and report in student evaluations that they find the exercises helpful. But we still have students who struggle with the critical thinking exercises and the course exams.  According to comments from student evaluations, students who struggled with the exercises found the questions too open ended.  Furthermore, many of the answers cannot be pulled directly from the textbook, or at least not in the format students expect, and they report finding this frustrating.  For example, the text may discuss renal reabsorption and renal secretion in general terms, and then a critical thinking exercise asks the student to synthesize all the processes occurring in the proximal tubule.  The information is the same but the organization is different.  It turns out this is a difficult process for our students to work through.

We use our critical thinking exercises as a type of formative assessment, a low-stakes assignment that evaluates the learning process as it is occurring.  We also use multiple choice exams as summative assessments, high-stakes assessments that evaluate learning after it has occurred.  We use this format because our physiology course enrollment averages about 300 students and multiple choice exams are the most efficient way to assess a class of that size.  We allow students to keep the exam questions, and we provide a key a couple of days after the exam is given.

When a student comes to see me after having “blown” an exam, I typically ask them to go through the exam, question by question, and try to identify how they were thinking when they worked through each one.  This can be a very useful diagnostic.  Ambrose and colleagues have formalized this process in a handout called an exam wrapper (1).  By analyzing their exam performance, the student may discover a pattern of errors that they can address before the next exam.  Consider some of the following scenarios:

Zach discovers that he was so worried about running out of time that he did not read the questions carefully.  Some of the questions reminded him of questions from the online quizzes.  He did know the material but he wasn’t clear on what the question was asking.

This is a testing issue. Zach, of course, should slow down.  He should underline key words in the question stem or draw a diagram to make sure he is clear on what the question is asking.

Sarah discovers that she didn’t know the material as well as she thought she did, a problem called the illusion of knowing (3). Sarah needs to re-evaluate the way she is studying.  If Sarah is cramming right before the exam, she should spread her studying out over time and mix it with her other subjects, strategies called spaced practice and interleaving (3).  If she is repeatedly reading her notes, she should put the notes away, get out a blank piece of paper, and write down what she remembers to gauge her knowledge, a process called retrieval practice (3).  If she is using flash cards for vocabulary, she should write out learning objectives in her own words, a process called elaboration (3).

Terry looks over the exam and says, “I don’t know what I was thinking.  I saw something about troponin and I picked it.  This really frustrates me. I study and study and don’t get the grade I want.  I come to lecture and do all the exercises. I don’t know what else to do.” Students like Terry are a challenge: she is not engaging in any metacognition, and I don’t claim to have a magic answer for her.  But I still want to try to help.

I feel very strongly that students need to reflect on what they are learning in class, on what they read in their texts, and on the activities performed in lab (3).  I have been working on a project in one of my physiology courses in which I have students take quizzes and exams as a group and discuss the answers collaboratively.  Then I have them write about what they were thinking as they approached the question individually and what they discussed in their group.  I am hoping to learn some things about how students develop critical thinking skills.  I hope I can share what I learn in a future blog posting.

  1. Ambrose SA, Bridges MW, DiPietro M, Lovett M, Norman MK. How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco, CA: Jossey-Bass, 2010.
  2. Anderson LC, Keirstead SA. Cells to Systems: Critical Thinking Exercises in Physiology (3rd ed). Dubuque, IA: Kendall Hunt Press, 2011.
  3. Brown PC, Roediger HL, McDaniel MA. Make It Stick: The Science of Successful Learning. Cambridge, MA: The Belknap Press of Harvard University Press, 2014.
  4. Callender AA, McDaniel MA. The limited benefits of rereading educational text. Contemporary Educational Psychology 34: 30–41, 2009.

 

Lisa Carney Anderson, PhD, is an Assistant Professor in the Department of Integrative Biology and Physiology at the University of Minnesota. She completed training in muscle physiology at the University of Minnesota. She collaborates with colleagues in the School of Nursing on clinical research projects such as the perioperative care of patients with Parkinson’s disease and assessment of patients with spasticity. She directs a large undergraduate physiology course for pre-allied health students. She also teaches nurse anesthesia students, dental students, and medical students. She is the 2012 recipient of the Didactic Instructor of the Year Award from the American Association of Nurse Anesthetists. She is a co-author of a physiology workbook, Cells to Systems: Critical Thinking Exercises in Physiology (Kendall Hunt Press). Dr. Anderson’s teaching interests include teaching with technology, encouraging active learning, and assessment of student reflection.
Education Research: A Beginner’s Journey

Why does it seem so hard to do education research? I have never been afraid to take on something new – what is stopping me?  These thoughts were burning in my mind as I sat around in a circle with educators at the 2016 Experimental Biology (EB) meeting. During this session, we discussed how we move education research forward and form productive collaborations. Here are my takeaways from the meeting:

EDUCATION RESOURCES

Here are some tips to get started on education research that I learned from the “experts”.

1. Attend poster sessions on teaching at national conferences such as Experimental Biology.

2. Get familiar with published education research and design.

3. Attend the 2016 APS Institute on Teaching and Learning.

4. Reach out to seasoned education researchers who share similar interests in teaching methodologies.

5. Get engaged in an education research network, such as the APS Teaching Section – Active Learning Group.

“Doubt is not below knowledge, but above it.”
– Alain-René Lesage

As seasoned research experts discussed education research in what sounded like a foreign tongue, I began to doubt my ability to become an education researcher. However, the group quickly learned that we had a vast array of experience in the room, from aspiring new education researchers to seasoned experts. Thus, the sages in the room shared some valuable resources and tips for those of us just starting out (see side bar).

“We are all in the gutter, but some of us are looking at the stars.”
– Oscar Wilde

You may already have all the data you need to publish a research study. In my mind, education research had to involve an intervention with treatment and control groups. However, it can also be approached like a retrospective chart review. To proceed, consult with your local Institutional Review Board to see whether you will need informed consent to use existing data or whether the study qualifies for an exemption.

“Setting out is one thing: you also must know where you are going and what you can do when you get there.”
– Madeleine Sophie Barat

It became clear at our meeting that the way forward was collaboration and mentorship. A powerful approach that emerged is taking a research idea and implementing it across a number of institutions in a collaborative research project. By doing this, we would have a network of individuals to discuss optimal research design and implementation strategies and increase statistical power for the study.

At the end of my week at EB, I reflected on my experiences and realized that education researchers are a unique group – in that, we are all passionate about the development of others. Collaborating with individuals who seek the best of each other will lead to great friendships and good research.

If you are interested in joining the APS Teaching Section “Active Learning Group”, please contact Lynn Cialdella-Kam.

Resources:

Suggested Readings:

Alexander, Patricia A, Diane L Schallert, and Victoria C Hare. 1991. “Coming to terms: How researchers in learning and literacy talk about knowledge.”  Review of educational research 61 (3):315-343.

Matyas, M. L., and D. U. Silverthorn. 2015. “Harnessing the power of an online teaching community: connect, share, and collaborate.”  Adv Physiol Educ 39 (4):272-7. doi: 10.1152/advan.00093.2015.

McMillan, James H, and Sally Schumacher. 2014. Research in education: Evidence-based inquiry: Pearson Higher Ed.

Postlethwaite, T Neville. 2005. “Educational research: some basic concepts and terminology.”  Quantitative research methods in educational planning:1-5.

Savenye, Wilhelmina C, and Rhonda S Robinson. “Qualitative research issues and methods: An introduction for educational technologists.”

Schunk, Dale H, Judith R Meece, and Paul R Pintrich. 2012. Motivation in education: Theory, research, and applications: Pearson Higher Ed.


 

Lynn Cialdella Kam joined CWRU as an Assistant Professor in Nutrition in 2013. At CWRU, she is engaged in undergraduate and graduate teaching, advising, and research. Her research has focused on health complications associated with energy imbalances (e.g., obesity, disordered eating, and intense exercise training). Specifically, she is interested in understanding how alterations in dietary intake (i.e., amount, timing, and frequency of intake) and exercise training (i.e., intensity and duration) can affect the health consequences of energy imbalance such as inflammation, oxidative stress, insulin resistance, alterations in macronutrient metabolism, and menstrual dysfunction. She received her PhD in Nutrition from Oregon State University, her Masters in Exercise Physiology from The University of Texas at Austin, and her Masters in Business Administration from The University of Chicago Booth School of Business. She completed her postdoctoral research in sports nutrition at Appalachian State University and is a licensed and registered dietitian nutritionist (RDN).

Statistical Strategies to Compare Groups

A blog about statistics. How great is this?! If it’s a blog, it has to be short. My wife, however, would say that even a blog about statistics is still going to be way too long.

In physiology education, we usually want to compare the impact of something—a new instructional paradigm, say—between different groups: for example, a group that gets a traditional approach and a group that gets a new approach. Depending on the number of groups we want to compare, there are different ways to design the experiment and to analyze the data.

Two Samples: to Pair or Not to Pair?

Suppose you want to see if formative assessments over an entire semester impact learning. Clearly, your students can either have formative assessments or not. So you randomly assign your 12 students to be in one group or the other. You teach your course, give the 6 students formative assessments, and then grade your 65-point final. The question is, did formative assessments (given to the students in Group 1) impact their grade on the final? These are the grades:

        Group 1   Group 2
        47        40
        48        56
        63        65
        64        33
        62        65
        50        51
Mean    55.7      51.7

These groups are independent of each other: the observations in one group are unrelated to the observations in the other group. So we want an unpaired 2-sample test. One option is a 2-sample t test. Here, the grades in the 2 groups are similar (P = 0.54): in this fictitious experiment, formative assessments did not impact grades.
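The post doesn’t show the arithmetic, so here is a minimal sketch in Python (standard library only, not the author’s own calculation) that computes the pooled, unpaired t statistic for these fictitious grades; comparing t ≈ 0.64 against a t distribution with 10 degrees of freedom gives the two-sided P ≈ 0.54 quoted above.

```python
from statistics import mean

# Fictitious final-exam grades from the table above
group1 = [47, 48, 63, 64, 62, 50]  # received formative assessments
group2 = [40, 56, 65, 33, 65, 51]  # no formative assessments

def unpaired_t(a, b):
    """Two-sample t statistic with a pooled (equal-variance) estimate."""
    na, nb = len(a), len(b)
    ma, mb = mean(a), mean(b)
    ss_a = sum((x - ma) ** 2 for x in a)
    ss_b = sum((x - mb) ** 2 for x in b)
    pooled_var = (ss_a + ss_b) / (na + nb - 2)
    se = (pooled_var * (1 / na + 1 / nb)) ** 0.5
    return (ma - mb) / se

t = unpaired_t(group1, group2)
print(round(t, 2))  # t ≈ 0.64 on 10 df, so the two-sided P ≈ 0.54
```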

What happens if the observations in one group are related to the observations in the other group? This could happen if you gave formative assessments to each student (Treatment 1) for half of your course and then gave an exam. During the other half of your course, each student got no formative assessments (Treatment 2). For each student you randomly assign the order of the treatments so that half get Treatment 1 first, the other half get Treatment 2 first.

In this situation each subject acts as her own control—this makes the comparison of the treatments more precise—and we want a paired 2-sample test. These are the data:

Subject   Treatment 1   Treatment 2   Difference
1         49            58            9
2         47            55            8
3         52            39            –13
4         39            19            –20
5         59            58            –1
6         44            46            2
                        Mean          –2.5

Here, the grades after each treatment are similar (P = 0.62): in this fictitious experiment, formative assessments did not impact grades.
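The paired version can be sketched the same way (again an illustrative calculation, not from the post): work with each student’s within-subject difference and test whether the mean difference is distinguishable from zero. The resulting t ≈ −0.52 on 5 degrees of freedom corresponds to the two-sided P ≈ 0.62 quoted above.

```python
from statistics import mean, stdev

# Fictitious exam scores from the table above: each student under both treatments
treatment1 = [49, 47, 52, 39, 59, 44]  # formative assessments given
treatment2 = [58, 55, 39, 19, 58, 46]  # no formative assessments

# The paired analysis uses the within-student differences
diffs = [b - a for a, b in zip(treatment1, treatment2)]

n = len(diffs)
se = stdev(diffs) / n ** 0.5        # standard error of the mean difference
t_paired = mean(diffs) / se
print(round(mean(diffs), 1), round(t_paired, 2))  # -2.5 and t ≈ -0.52 on 5 df
```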

When You Have Three or More Samples

Let’s pretend we want to think about the amount of fat donuts absorb when they are cooked. These numbers represent the amount of fat absorbed when 6 batches of donuts are cooked in 4 kinds of fat.

        Fat 1   Fat 2   Fat 3   Fat 4
        64      78      75      55
        72      91      93      66
        68      97      78      49
        77      82      71      64
        56      85      63      70
        95      77      76      68
Mean    72      85      76      62

If you are watching your diet, the lower the number, the better. There is good news and bad news about this example. The good news is there are 24 donuts in a single batch. The bad news is 100 has been subtracted from the actual amount in order to simplify the numbers.

The first question: why not just use a 2-sample (unpaired) test to compare the amount of fat absorbed?  There are two answers.  First, if we compare just 2 groups at a time, we fail to use information about the variation within the two remaining groups.  Second, if we compare just 2 groups at a time, we can make a total of 6 comparisons (1–2, 1–3, 1–4, 2–3, 2–4, 3–4).  And if we do that, the chance that we find at least one of the 6 comparisons to be statistically meaningful when all 4 groups are actually equivalent is about 1 in 4 (26%).  The more comparisons we make, the greater the chance that we find a comparison to be statistically meaningful simply because we are making more comparisons.
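That 26% figure is one line of arithmetic: if each of the 6 comparisons is tested at the usual 0.05 level and the comparisons are treated as independent (an approximation here, which is why the figure is an “about”), the chance of at least one false positive is 1 − 0.95^6.

```python
# Familywise error rate for 6 comparisons, each tested at alpha = 0.05,
# treating the comparisons as independent (an approximation)
alpha = 0.05
comparisons = 6
familywise = 1 - (1 - alpha) ** comparisons
print(round(familywise, 2))  # 0.26, i.e., about 1 in 4
```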

What’s the solution?  Use a procedure that initially compares all 4 groups at the same time.  One option is analysis of variance.  In analysis of variance, if the variation between groups is sufficiently larger than the variation within groups, that pattern would be unusual if the group means were truly equal.  Here, by analysis of variance, the amount of fat absorbed differs among the 4 fat types (P = 0.007).  You can then use other techniques to identify just which groups differ.
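For readers who want to see the mechanics, here is a plain-Python sketch (illustrative only) of the one-way analysis of variance for the donut data: partition the variation into between-group and within-group pieces and form the F ratio. An F ≈ 5.41 on (3, 20) degrees of freedom corresponds to the P = 0.007 quoted above.

```python
from statistics import mean

# Fat absorbed (minus 100) by 6 batches of donuts in each of 4 fat types
fats = [
    [64, 72, 68, 77, 56, 95],  # fat type 1
    [78, 91, 97, 82, 85, 77],  # fat type 2
    [75, 93, 78, 71, 63, 76],  # fat type 3
    [55, 66, 49, 64, 70, 68],  # fat type 4
]

grand = mean(x for g in fats for x in g)   # grand mean over all 24 batches
k = len(fats)                              # number of groups
n = sum(len(g) for g in fats)              # total number of observations

# Between-group and within-group sums of squares
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in fats)
ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in fats)

f_ratio = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(f_ratio, 2))  # F ≈ 5.41 on (3, 20) df, so P ≈ 0.007
```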

The Big Picture

No matter how many groups you want to compare, the idea is the same: you want to design the experiment to account for—as best you can—extraneous sources of variation (like individual differences) that can impact the thing you want to measure, and you want to use all the information you collected when you compare the groups.

References

  1. Curran-Everett D. Multiple comparisons: philosophies and illustrations. Am J Physiol Regul Integr Comp Physiol 279: R1–R8, 2000.
  2. Curran-Everett D. Explorations in statistics: hypothesis tests and P. Adv Physiol Educ 33: 81–86, 2009.
  3. Curran-Everett D. Explorations in statistics: permutation methods. Adv Physiol Educ 36: 181–187, 2012.
  4. Snedecor GW, Cochran WG. Statistical Methods (7th edition). Ames, IA: Iowa State Univ. Press, 1980, p 83–106, 215–237.


Doug Everett (Curran-Everett for publications) graduated from Cornell University (BA, animal behavior), Duke University (MS, physical therapy) and the State University of New York at Buffalo (PhD, physiology). He is now Professor and Head of the Division of Biostatistics and Bioinformatics at National Jewish Health in Denver, CO. In 2011, Doug was accredited as a Professional Statistician by the American Statistical Association; he considers this quite an accomplishment for a basic cardiorespiratory physiologist. Doug has written invited reviews on statistics for the Journal of Applied Physiology and the American Journal of Physiology; with Dale Benos he has written guidelines for reporting statistics; and he has written educational papers on statistics for Advances in Physiology Education. Doug and his wife Char Sorensen officiate for USA Swimming and US Paralympic Swimming. After 32 years in 6th-grade classrooms, Char is now on her Forever Summer schedule: she retired in May 2009.

 

Building Critical Thinking through Technical Writing: Are We Taking the Right Approach?

I religiously read (ahem… meaning I quickly skim the RSS feeds of) Faculty Focus for tips, tricks, the latest educational research trends, and general teaching strategies to help me overcome my classroom anxieties. Over the last year or so these blogs and articles have given me ideas and helped with issues in the courses that I teach. Recently, one particular blog post resonated with me: “How Assignment Design Shapes Student Learning” (Weimer, 2015). It spoke about how specific assignments guide students to think and perform in specific ways and how that influences their overall learning. You may be thinking: well, of course. But my current overall educational research question is: “How does writing lab reports contribute to a student’s understanding of the scientific method?”  This makes me wonder whether, in working to help students build critical thinking skills through the scientific method, we may be going about lab report writing assignments in the wrong way.

When I first started teaching undergraduate General Biology and Anatomy and Physiology courses three years ago, I was dismayed at what was the norm for student lab report writing. The first thing I asked myself was, “Was my science writing that bad when I was an undergrad?” My answer: a resounding yes!

In Spring 2013, I set out to develop a series of assessments that help students practice technical writing skills, along with clear rubrics to guide them as they develop this skill. I have collected data on students’ technical writing skills with the goal of correlating these new skills with student understanding and use of the scientific method.

The science technical writing assignments that I build into my lab courses help students, through low-stakes practice, to reflect on the labs performed and the implications of the data obtained. I use specific “chunked” assignments in which students write components of a lab report. For example, the first lab write-up may have students write their testable hypothesis and the methods used in the lab. For the second lab, students write their testable hypothesis and results. These assignments continue until the students have had a chance to practice all of the components of a lab report prior to writing a complete one. Our data show students who perform the chunked practice assignments do significantly better on the final lab report assignment (Hannah & Lisi, 2015).

Now, let’s mentally jump back to the blog post “How Assignment Design Shapes Student Learning” and the corresponding article “Private Journals versus Public Blogs: The Impact of Peer Readership on Low-stakes Reflective Writing” (Foster, 2015). Foster’s data illustrate that students have inherently different styles of writing depending on the target audience. Specifically, students who have open writing assignments (blogging to their peers), where they have to respond to peers and defend their information, are more mentally adventurous than when they write journal assignments for only the professor or teaching assistants to read.

My technical writing data suggest that students’ science technical writing improves with practice and regular, prompt feedback. But are they only practicing the “form” and the rules that I set up in the assignments, or are they truly working through the material and using the scientific method to develop their critical thinking skills? In the end, I want to help people explore science so that they can apply and evaluate scientific information to determine its impact on their daily lives. How well does the traditional lab report reflect a student’s ability to work through data? I would love to hear comments if you have thoughts or suggestions on how I might investigate students’ critical thinking skills using the blog format for science lab reports.

 

References

Foster, D. (2015). Private Journals versus Public Blogs: The Impact of Peer Readership on Low-stakes Reflective Writing. Teaching Sociology, 43(2), 104–114. http://doi.org/10.1177/0092055X14568204

Hannah, R., & Lisi, M. (2015). Technical Writing for Introductory Science Courses: Proficiency Building for Majors and Non-majors. 2015 Experimental Biology Meeting Abstracts, Abstract #678.25. Accessed June 10, 2015.

Weimer, M. (2015, August). How Assignment Design Shapes Student Learning. Retrieved June 10, 2015, from http://www.facultyfocus.com/articles/teaching-professor-blog/how-assignment-design-shapes-student-learning/

 

Rachel Hannah is a new Assistant Professor of Biological Sciences at University of Alaska, Anchorage. Previously, she was an Assistant Professor in the Math and Sciences Department at the University of Maine at Presque Isle. Helping people become scientifically literate citizens has become her major career focus as a science educator. As a classroom and outreach educator, Rachel works to help people explore science so they can apply and evaluate scientific information to determine its impact on one’s daily life. She is trained as a Neurophysiologist and her graduate degree is in Anatomy and Neurobiology from the University of Vermont College of Medicine. Recently Rachel’s research interests have migrated to science education and how students build critical thinking skills.