Lesson

A Classroom Intervention to Reduce Confirmation Bias

Author(s): Elizabeth N. Hane*1, Evelyn Brister1

Rochester Institute of Technology

Editor: Katie Burnette


Courses: Science Process Skills

Keywords: bias, process of science, metacognition, confirmation bias, cognitive bias



Abstract


STEM students are often unable to recognize cognitive bias in their own disciplines, and simply describing cognitive bias to students has been shown to be insufficient to improve critical thinking. However, habitual metacognitive techniques show promise for correcting cognitive biases such as confirmation bias, a maladaptive cognitive strategy that specifically threatens the objectivity of scientists. As part of a course on metacognition in science, first-year STEM students were asked to give an oral presentation about a controversial socioscientific topic (e.g., GMO crops, de-extinction, or hydrofracking). The first year the course was offered, presentations exhibited confirmation bias at a high rate, despite instructions to examine multiple viewpoints about the scientific issue. In subsequent years, an intervention consisting of an interactive lecture/discussion/activity about confirmation bias and two specifically designed homework assignments asked the students to reflect on evidence, search processes, and potential biases. This intervention was jointly developed by faculty members in biology and philosophy to focus on habitual metacognitive techniques. Compared to no intervention, the resulting presentations had a higher percentage of reliable sources and a lower percentage of citations that only supported their conclusion. These results indicate that after the intervention, students discriminated among sources more carefully (Mann-Whitney p=0.009) and used more sources from the other side of the argument, including presenting more reasons that refute their own ideas (Mann-Whitney p=0.003). We find that providing classroom instruction supported by deliberate practice to counteract confirmation bias improves students’ evaluation of scientific evidence.

Primary image: Venn diagram that illustrates the idea of confirmation bias.

Citation

Hane EN, Brister E. 2022. A classroom intervention to reduce confirmation bias. CourseSource. https://doi.org/10.24918/cs.2022.7

Society Learning Goals

Science Process Skills
  • Process of Science
    • Locate, interpret, and evaluate scientific information and primary literature
    • Interpret, evaluate, and draw conclusions from data
    • Construct explanations and make evidence-based arguments about the natural world
  • Modeling/Developing and Using Models
    • Build and evaluate models of biological systems

Lesson Learning Goals

Students will:
  • recognize cognitive biases that influence their ability to be objective in science.
  • understand and use reflective techniques to reduce confirmation bias when searching for evidence in a socioscientific problem.

Lesson Learning Objectives

Students will be able to:
  • describe types of cognitive bias that are commonly encountered in the process of scientific discovery.
  • utilize techniques to reduce confirmation bias in their own scientific experiences.
  • evaluate sources of information and their role in the process of science.
  • apply the process of science to a socioscientific problem, connecting science to societal issues.


Introduction

People like to be right. It is unpleasant to be wrong, and it is hard work to revise beliefs. In recent decades, psychologists have revealed a number of cognitive biases that lead us to make poor decisions about what to believe, at least in the complex information environment in which we now live. “Cognitive bias” refers to predictable, systematic deviations from objective reasoning that occur due to misapplications of otherwise effective strategies for evaluating evidence (1). Under the influence of cognitive biases, people feel like they are actively investigating evidence and searching for true beliefs, even though they are systematically prone to maintaining false ones. For instance, people tend to avoid challenges to their beliefs (confirmation bias); they tend to believe what the people around them believe, even when those beliefs are poorly justified (false consensus bias); they preserve consistent sets of beliefs by denying the validity of disconfirming evidence (backfire effect); and then, when confronted with evidence that they are cherry-picking what to believe, they deny the possibility that they are biased in this way (blind spot bias) (1-3).

These cognitive biases lead to systematic reasoning errors. At the same time, examining whether evidence supports or disconfirms hypotheses is central to the pursuit of scientific inquiry (5-6). Searching for exactly this type of evidence, which has the potential to disconfirm a belief one is otherwise motivated to accept, is not a cognitive process that comes naturally, but it is nonetheless a key skill for critical thinking and scientific reasoning.

Making decisions about socioscientific issues, which include social, technological, and scientific components, is a complex process (7), and it is an increasingly necessary skill for informed global citizenship. One reason the process is so complex is that it draws both on the knowledge and experiences of the individual and on the values shared with peer groups (7). In some cases, values may play an even greater role than the evidence itself, as Grace and Ratcliffe (8) found when examining decisions students made about endangered species. Additionally, assumptions and understanding about the scientific process itself can influence decision-making. For example, Sadler et al. (9) examined students’ decision-making when they were given contradictory information about global warming. The researchers found that the students’ decisions were influenced not just by prior knowledge and the evidence presented, but also by assumptions about the scientific process itself. In particular, students struggled with the idea that scientific knowledge varies in its stability: some knowledge is well established, while other knowledge is continuously being reevaluated and reinterpreted. Albe (7) found that after a classroom intervention about scientific evidence, some students changed their minds about a perceived link between the use of mobile phones and human health. In this case, the evidence was the key determinant in the students’ decisions, and how they understood the role of evidence in the scientific process elevated the evidence’s importance (7). Teaching the critical thinking skills involved in socioscientific decisions should be studied further to understand the interplay among students’ prior knowledge, use of evidence, individual and group values, and assumptions about the scientific process. Developing teaching strategies, and ways to assess them, will enable students to better navigate this complex decision-making process.

Students may encounter explicit critical thinking instruction in stand-alone courses offered by philosophy instructors. However, evidence suggests that general critical thinking skills do not transfer well between domains of instruction (10, 11). This means that better critical thinking learning outcomes are likely achievable through deliberate critical thinking instruction in discipline-specific courses. Although there is abundant literature in both philosophy and science pedagogy about how to teach the scientific method and the logic of scientific reasoning (e.g., 12-14), there is much less available about how to teach students to process scientific controversy and recognize their own biases. Additionally, it is not enough to teach students about cognitive biases – they must have the opportunity to practice these skills themselves within the context of their disciplines (15).

A recent emphasis on metacognition has brought to the forefront the idea of teaching students to monitor and evaluate their own learning processes (16-21). Metacognition involves not just being aware of one’s learning, but actively and intentionally planning, assessing, correcting, and re-directing learning processes. For example, Tanner (18) provides a framework for directing students to plan, monitor, and evaluate their own learning with a series of self-directed questions; she also provides a similar set of questions to help faculty apply metacognitive thinking to their own teaching. Metacognitive tools such as exam wrappers (22, 23), the muddiest point (24), concept maps (25), lateral transfer maps (26), and guided reflections (27) give instructors ways to intervene in the classroom and promote metacognitive practices.

This paper describes and evaluates a metacognitive classroom activity designed to explicitly teach STEM students how cognitive biases are involved in evaluating evidence. It focuses in particular on confirmation bias, the propensity to seek and use evidence that supports preexisting beliefs about an issue while paying less attention to disconfirming evidence. While this method focuses on techniques for mitigating individual bias, it also provides an opportunity to develop facility with identifying and discussing values and how they relate to scientific knowledge.

Intended Audience

Our students were first-year STEM students (science, engineering, and computing majors) in a science course devoted to metacognition (Metacognitive Approaches to Scientific Inquiry). However, the lesson could be adapted and used in any science class.

Required Learning Time

The lesson requires approximately one hour of instruction time, plus two additional homework assignments outside of class (one before the class, one after). In order to assess the efficacy of the lesson, the instructor should incorporate the ideas from the lesson into additional coursework. In our case, it took the form of a group presentation, which took two additional class periods. If additional class time is not available, alternatives might include a recorded presentation, an informal presentation in a discussion section, a written report, etc.

Prerequisite Student Knowledge

No prerequisite knowledge is needed.

Prerequisite Teacher Knowledge

The instructor should be familiar with the various kinds of cognitive bias. Potential sources for learning about these biases are referenced in the introduction.

Scientific Teaching Themes

Active Learning

The lesson includes a pre-class homework assignment that asks students to search for sources of information about scientific controversies (in the form of case studies) and form an opinion about the issues. During class time, students participate in a brief interactive lecture designed to introduce various forms of bias, focusing on confirmation bias in particular. The lecture has discussion components and includes a logic activity, a think-pair-share, and a reflection. In the middle of the lecture, students are placed in pairs to complete a logic puzzle. After the puzzle activity, the students participate in a discussion/debrief (think-pair-share) and reflection about what they observed about themselves and others during the puzzle solving. Finally, as a second homework assignment, they are asked to 1) explicitly apply what they have learned by re-examining their pre-class homework to determine whether they find evidence of confirmation bias in their own searches, and 2) brainstorm ways to recognize and avoid confirmation bias in the future.

Asking students to apply what they have learned to a real-world socioscientific problem is important to allow the students the opportunity to practice these skills. For example, we required the students to do a short group presentation about a scientific controversy. In preparing for this presentation, students were explicitly asked to pay attention to both sources of evidence and their own decision-making processes. Instructors can adjust this assignment to fit the level of their students and the time available (e.g., recorded presentations, informal presentations, written reports, etc.).

Assessment

Assessment was performed on two levels. First, the pre- and post-class homework assignments provided information about how the students self-reported their own learning. The pre-class homework (Supporting File S1. Confirmation Bias – Homework Assignment #1) asked the students to search for information about a controversy before the topic of confirmation bias was introduced. The post-class homework (Supporting File S4. Confirmation Bias – Homework Assignment #2) then asked them to re-examine their search methods and consider the role that confirmation bias might have played in those searches and in forming their opinion about a particular topic. A short reflection included in the homework encouraged students to explain what they found out about themselves and how their view of the topics may have shifted as a result of recognizing their own biases. They were also asked to brainstorm how they might reduce potential cognitive bias in the future.

Second, because the course was taught in multiple years, we were able to compare the performance of students on a final project presentation with and without the intervention. The first year (2014), students did not have an assignment or intervention about confirmation bias, and the second year (2015), we implemented the intervention that is included in this lesson. Students in both years completed the same final project, which was an oral or ASL signed presentation about a scientific controversy. By examining the quality of their evidence and potential confirmation bias of their sources and comparing them between years, we were able to examine the impact of the intervention on the subsequent selection and use of sources.

Inclusive Teaching

This activity is open to all students, independent of academic background, identity, or major. Our class was a blend of hearing and deaf/hard-of-hearing (D/HH) students, and the discussions and activities were designed to encourage participation by all. D/HH students are supported in the classroom by campus-based access services, which may include sign language interpreters or real-time captionists, depending on the student’s needs and preferences. Because our course was targeted to first-generation and D/HH students (21), early in the course all students were given instruction, and shown modeled behavior, on how to support an inclusive learning environment for individuals with disabilities.

The puzzle activity includes two roles, a guesser and a teller, and students are allowed to select the role with which they are most comfortable. The intervention supports the development of metacognitive skills, particularly monitoring one's own learning. Aspects of identity, including awareness of ways in which reasoning and identity intersect, can be included in classroom discussions. The controversial topics themselves often lead to classroom discussions around diversity, particularly cultural or political diversity. For example, one of the topics discussed in our classroom is hydrofracking, and the discussion about the pros and cons of this method for extracting energy often shifts to political and rural/urban differences in how this topic is viewed and how information about its utility and safety is presented. Modeling respectful behavior and active listening (28) is incorporated throughout the course.

Lesson Plan

Table 1. Lesson Timeline. At a minimum, the lesson includes one 50-minute class period and two out-of-class assignments (one pre-class and one post-class assignment).

Activity | Description | Estimated Time | Notes
Preparation for Class
Students Complete Homework #1 | Students complete first homework prior to coming to class. | 45-60 minutes | Students choose and read case studies and then search on the internet for a few sources about the topic. Students then identify their sources and briefly explain their position on the questions associated with each case. Supporting File S1. Confirmation Bias – Homework Assignment #1
Class Session 1
Introductory Lecture | Types of Cognitive Bias | 15 minutes | Lecture slides with notes are in Supporting File S2. Confirmation Bias – Cognitive Bias Lecture Slides
Puzzle Activity | Activity in pairs | 10 minutes | Students in each pair will each need separate written instructions, depending on their role (“guesser” or “teller”). Supporting File S3. Confirmation Bias – Puzzle Activity Handout
Pairs/Class Discussion | Discussion first in pairs, then as a class | 20 minutes | Discussion/reflection about what happened could follow Kolb’s Experiential Learning Cycle (33) (e.g., What happened? What does it mean? How can we apply this?)
Wrap-up Lecture | Summary about Cognitive Bias | 5 minutes | The rest of the lecture slides from Supporting File S2. Confirmation Bias – Cognitive Bias Lecture Slides
Post-class Assignment
Homework #2 | Students complete second homework after class. | 30 minutes | Students examine their first homework and reflect on confirmation bias in their search methods and conclusions. Supporting File S4. Confirmation Bias – Homework Assignment #2
Class Session 2 (or more - optional)
Group Presentations | Groups of students complete oral presentations about a case study. | Varies | Groups prepare and deliver a presentation on their resolution to the questions for a chosen case study, as well as their methods of how they researched the topic. Supporting File S5. Confirmation Bias – Presentation Instructions and Supporting File S6. Confirmation Bias – Presentation Rubric

Components of the Class

Homework #1

This pre-class assignment asks students to read one of the case studies and then search the internet for a few sources about the topic. Students then identify their sources and explain their position on the questions associated with each case. Supporting File S1. Confirmation Bias – Homework Assignment #1 includes three case studies that we have used successfully (i.e., cloning mammoths, GMOs (29), and hydrofracking (30)). Instructors could substitute any case study that is relevant to their class. We highly recommend using and adapting case studies from the National Center for Case Study Teaching in Science. Using case studies that are relevant to the course will help keep the interest of the students and provide avenues to introduce additional content, as needed. Case studies are a well-established way to promote engagement and retention (31, 32).

Class Session

The class session includes: 1) a short interactive lecture on confirmation bias, 2) a puzzle activity focused on critical thinking, and then 3) a class discussion and reflection about the activity:

1. Lecture:

The presentation introduces types of cognitive bias, along with some of the psychological evidence behind them. Example slides, notes, and additional references are included in Supporting File S2. Confirmation Bias – Cognitive Bias Lecture Slides. The notes also indicate places where the instructor can pause the presentation and ask questions to make it more interactive.

2. Puzzle Activity:

Students pair up with a neighbor and each partner gets a different set of instructions (Instruction Handout is included in Supporting File S3. Confirmation Bias – Puzzle Activity Handout). In large classes, the instructor may need assistance from a TA or LA to pass these out in a timely manner. Student #1 is the “guesser” whose job it is to guess the “pattern rule” that describes the relationship of a sequence of numbers. This student is given an example sequence that fits the pattern rule: 2, 4, 8, 16. Student #1 can ask their partner if sequences fit the pattern rule or not – as many sequences as they wish – before they have one chance to guess what the pattern rule is. The job of Student #2 (the “teller”) is to tell whether the sequence fits the pattern rule or not. Their role consists of saying “yes” or “no.” The pattern rule is that the numbers are increasing.

The activity is based on the online puzzle from the New York Times.

Here is an example exchange:

Guesser: “Does 3, 6, 12, 24 fit the rule?”

Teller: “Yes.”

Guesser: “Does 1, 2, 4, 8, 16 fit the rule?”

Teller: “Yes.”

Guesser: “Is the rule that the numbers are doubling?”

Teller: “No, that is not the rule.”

Remember that guessers can test as many sequences as they wish, but they get only one guess at what the pattern rule is.

3. Discussion unpacking the activity:

Most students will (incorrectly) guess that the pattern rule is “doubling” after trying just one or two sequences. They might guess that the pattern rule is “increasing” (correct), but it is quite unusual for anyone to try to get a negative answer – that is, to try a sequence they think will NOT fit the pattern rule – despite the fact that there is no penalty for testing a sequence that does not fit. Almost no one thinks to intentionally test something they believe is wrong! This leads to a broader discussion about the importance of exploring alternative views, including those that you feel are incorrect, and the relevance of testing assumptions as part of the scientific method. The discussion/debrief structure could follow Kolb (33), who describes four stages of experiential learning. First comes the experience itself (the puzzle), then three additional stages that can be summarized as 1) What happened? (observation), 2) So what? (meaning), and 3) Now what? (abstraction). We have found that guiding the students intentionally through these stages (and sharing the cycle with them) helps shape the discussion.

Homework #2

This assignment (Supporting File S4. Confirmation Bias – Homework Assignment #2) asks the students to go back and re-examine Homework #1, in particular reflecting on their previous sources and whether they tended to pick sources that reinforced their existing view. The homework prompts them to brainstorm strategies for exploring alternatives.

Presentations

Students are assigned to groups of four, and each group picks one of the case studies. Depending on the context of the class, these groups could be randomly assigned, based on lab partnerships, student selected, or formed in some other way. The groups prepare and deliver a presentation on their resolution to the questions for their chosen case study, as well as on how they researched the topic. Note: In our syllabus, we devote parts of several classes leading up to this assignment deadline so groups have time to work and plan (i.e., the last 15-20 minutes of a 75-minute class is dedicated to “group time” to allow the groups to meet and work on their presentation, ask questions, seek advice from the instructor, practice, etc.). The instructions and associated grading rubric are in Supporting File S5. Confirmation Bias – Presentation Instructions and Supporting File S6. Confirmation Bias – Presentation Rubric.

Teaching Discussion

Students were introduced to the idea of confirmation bias in three ways: 1) a short lecture/discussion about the topic of cognitive biases, 2) a logic puzzle that they completed in pairs, and 3) a homework assignment that asked them to re-examine their own research strategy about a case study to look for evidence of confirmation bias. The logic puzzle has such a simple solution that it is surprising how few participants (usually < 10%) guess the correct solution, and even fewer try sequences that they think are incorrect in order to test their answer. This result often leads to a rich class discussion about why people don’t seek a negative answer to test their solution. Following the learning cycle described by Kolb (33), the discussion first centers on the observation of what happened (How many people guessed the rule wrong? How many sequences did you try before you guessed the rule? Did you try a sequence that didn’t fit the rule? Why or why not?), then on what it means (Why do people not try things they think are wrong?), and finally on how we can generalize this behavior (If we know we behave this way, what does it mean for understanding evidence and data? How can we counteract this behavior? What can you do when researching your case study to alleviate this bias? What practices do scientists use to counteract this form of bias?).

Student Reflections

The second homework assignment asked students to reflect back on the first homework to look for bias in their own searches for evidence about the controversial socioscientific case studies. As a result of this homework, many students recognized that they did not search for multiple viewpoints, but frequently only looked for evidence that matched what they already thought about the topic. When asked to search for more information that supported alternative viewpoints and then reconsider their position, nearly half of the students shifted their ideas at least somewhat as a result. A quote from a student is illustrative: “I found I was looking for reasons hydrofracking was bad for the environment and didn’t even look for information about why it might be good. Even though it didn’t really change my mind, I didn’t even know the other arguments.”

Other students recognized that they missed important aspects of an issue by not searching for alternative viewpoints:

“Two out of the three websites that I visited before had supported my view on the topic of de-extinction. My view is that we should only bring animals back for the sake of research and nothing more. However, one of the articles I read was about the passenger pigeon and how bringing it back could positively affect the ecosystem it once lived in. That one article challenged and made me question my view on de-extinction.”

Another student was concerned about how to handle evidence and create a valid argument when not everyone shared the same value system:

“I’m looking for evidence on both sides of the argument. I’m looking for valid reasons to support each side that everyone can understand or relate to. For example, one argument for not bringing back extinct species has a religious affiliation as some say the process is playing God. While this is a respectable opinion, not everyone in science or in the world has a religious belief system so that argument may be invalid to a lot of people. However, an argument based on how the species may hurt or help the environment is an argument that can be understood by many because it is based on fact that doesn’t have any belief bias. Also, when I was working with my group, I brought up the idea that the whole idea of de-extinction could not actually be about bringing back an extinct species but a way to advance biological engineering and genetics as a whole. So I’m also looking for ways that the technology and theory behind de-extinction could impact science in general if it was successful.”

In this case, the student is looking for arguments that he believes will appeal to and be understood by a broader audience. The process of re-examination of evidence also leads him to expand the range of relevant considerations from immediate effects on the target species to broader effects on the ecosystem, on other uses of science and technology, and on ways of life.

Comparison of Cohorts

We also examined whether the pedagogical intervention affected how students used sources in a subsequent assignment by comparing an untreated cohort in 2014 (n=12) with a treated cohort in 2015 (n=11). IRB approval was obtained, information was gathered in accordance with university IRB standards, and all participating students signed informed consent forms. After the courses ended, a team of instructors and learning assistants scored the reference sources used in each presentation, which had been anonymized. The team evaluated the reliability of the sources cited in each presentation, rating them as either reliable (e.g., peer-reviewed, written by a recognized expert, university publication) or not reliable (e.g., written by a political or advocacy group, websites with no author listed, blogs). Each source was also scored to determine whether the reference supported or refuted the main conclusion of the presentation.

The quality of students’ evidence increased with the intervention. Compared to no intervention, the resulting presentations had a higher percentage of reliable sources (Mann-Whitney p=0.009; Figure 1A) and a lower percentage of citations that only supported their conclusion (Mann-Whitney p=0.003; Figure 1B). The intervention cohort had a more even distribution of citations between sources that supported and sources that questioned their conclusions, indicating that they gave more consideration to testing their view against objections (49% of citations supported the view presented, versus 63% the previous year). These results indicate that after the intervention exercise, students were discriminating among sources more carefully and were using more sources from the other side of the argument, including presenting more reasons that refute their own ideas. One explanation for why this intervention was more successful than the one studied by Kenyon and Beaulac (15) is that it is situated in the subject-matter-appropriate context of a science course, thereby providing a deeper understanding of the nature of scientific inquiry and a stronger motivation to present effective knowledge claims.

Improvements and Adaptations

This activity could be adapted to any course that uses evidence to support conclusions, as confirmation bias is not unique to science. However, the activity is particularly well-suited to examining evidence for socioscientific controversies for which there is a wide range of quality in the evidence available, and for which students often have pre-existing beliefs. The topics of the case studies in the homework assignments can (and should) be adjusted to match appropriate subject matter and level of the students. This is particularly important for non-majors courses where students’ interest is critical; the chosen case studies should be relevant and personal whenever possible. For example, hydrofracking as an energy source may be regionally relevant, while the science behind cloning a mammoth might not elicit as much interest. Additionally, the lessons could be paired with the exercises for non-majors described in Copley et al. (34), which focus on using historical documents to describe the relationship between science and society. The complementary discussions about use of evidence would also reinforce many of the ideas from this lesson.

Although its work focuses primarily on historical and civic case studies, the Stanford History Education Group’s collected materials and advice about students’ use of digital evidence may be useful for instructors. For example, a recent article analyzes the advice about evaluating sources from 50 universities’ library websites and offers guidance to instructors and librarians about how to help students evaluate digital sources without resorting to a “checklist” approach (35). The group’s website also includes a repository of research about how students examine historical evidence, as well as lesson plans for addressing the evaluation of evidence in the classroom.

For advanced majors courses, the case study topic could be tailored to the topic of the course. For example, in a Conservation Biology course, topics might include: 1) Is releasing a non-native biocontrol agent into an ecosystem to control invasive species desirable? or 2) Should we reintroduce into the wild American chestnut trees that have been genetically modified to resist blight? These topics are relevant to the course, and folding the lessons from the confirmation bias exercise into class discussions can lead to a richer conversation about how decisions are made. The National Center for Case Study Teaching in Science has a repository of peer-reviewed case studies on a wide variety of scientific subjects and includes teaching resources.

Additionally, we have found that sharing the grading rubric with students and specifically describing the elements of the rubric that concern the use of evidence has helped clarify expectations and motivated students to carefully select and use sources in their arguments. In subsequent years, we have engaged students in revising the rubric itself (e.g., “What makes a good presentation?” and “How do you recognize good evidence?”) so that it is co-created within the class and students feel they have input into the grading elements.

Conclusions

We found that the intervention improved the quality of the sources that students cited as evidence and that students were also more likely to incorporate ideas that ran counter to their own conclusions into a subsequent assignment. Guiding students through the process of recognizing their own biases appears to have a greater impact than simply describing the bias and telling them to avoid it. The combination of introductory lecture, activity, and reflection/revision of their own work has the potential to help students become metacognitively aware of their own confirmation bias and take actions to mitigate it.

SUPPORTING MATERIALS

  • S1. Confirmation Bias – Homework Assignment #1
  • S2. Confirmation Bias – Cognitive Bias Lecture Slides
  • S3. Confirmation Bias – Puzzle Activity Handout
  • S4. Confirmation Bias – Homework Assignment #2
  • S5. Confirmation Bias – Presentation Instructions
  • S6. Confirmation Bias – Presentation Rubric

Acknowledgments

We would like to thank Dr. Scott Franklin, co-instructor for the course and co-conspirator on many metacognitive adventures. This activity was part of a university curricular initiative in applied critical thinking, and we appreciate the support for EB in her time as the Fram Fellow for Applied Critical Thinking.

We would also like to thank our students and learning assistants, who have participated in the course and contributed to our understanding of how students view cognitive biases. All course material was gathered with approval from and in accordance with RIT’s IRB standards, including signed informed consent from students. The manuscript was improved by the thoughtful suggestions of anonymous reviewers and an editor. This material is based upon work supported by the National Science Foundation under Grant No. 1317450.

References

  1. Gilovich T. 1991. How we know what isn't so: The fallibility of human reason in everyday life. New York: The Free Press.
  2. Kahneman D. 2011. Thinking, fast and slow. New York: Farrar, Straus and Giroux.
  3. Bazerman M, Tenbrunsel A. 2011. Blind spots: Why we fail to do what’s right and what to do about it. Princeton: Princeton University Press.
  4. Popper K. 1959. The logic of scientific discovery. New York: Basic Books.
  5. Lakatos I. 1970. Falsification and methodology of scientific research programmes, p 91-196. In Lakatos, I & Musgrave A (eds), Criticism and the Growth of Scientific Knowledge. New York: Cambridge University Press.
  6. Popper K. 1972. Objective knowledge. New York: Oxford University Press.
  7. Albe V. 2008. Students’ positions and considerations of scientific evidence about a controversial socioscientific issue. Sci & Educ 17:805-827. doi 10.1007/s11191-007-9086-6
  8. Grace MM, Ratcliffe M. 2002. The science and values that young people draw upon to make decisions about biological conservation issues. International Journal of Science Education 24:1157-1169.
  9. Sadler TD, Chambers FW, Zeidler DL. 2004. Student conceptualisations of the nature of science in response to a socioscientific issue. International Journal of Science Education 26(4):387-410.
  10. Ennis R. 1989. Critical thinking and subject-specificity: Clarification and needed research. Educ Res 18: 4–10.
  11. McPeck J. 1990. Critical thinking and subject specificity: A reply to Ennis. Educ Res 19: 10–12.
  12. McPherson GR. 2001. Teaching & learning the scientific method. Am Biol Teach. 63:242-5.
  13. Windschitl M, Thompson J, Braaten M. 2008. Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education 92(5):941-967.
  14. Blachowicz J. 2009. How science textbooks treat scientific method: A philosopher’s perspective. Br J Philos Sci 60:303-344. http://www.jstor.org/stable/25592003
  15. Kenyon T, Beaulac G. 2014. Critical thinking education and debiasing. Informal Logic 34(4):341-363.
  16. Georghiades G. 2000. Beyond conceptual change learning in science education: Focus on transfer, durability, and metacognition. Educ Res 42:119-139.
  17. Pintrich PR. 2002. The role of metacognitive knowledge in learning, teaching and assessing. Theory into Practice 41: 219-225. doi 10.1207/s15430421tip4104_3
  18. Tanner K. 2012. Promoting student metacognition. CBE–Life Sci Educ 11:113-120. https://doi.org/10.1187/cbe.12-03-0033
  19. McGuire SY. 2015. Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills and motivation. New York: Stylus Publishing.
  20. Avargil S, Lavi R, Dori YJ. 2018. Students’ metacognition and metacognitive strategies in science education. In Dori Y.J., Mevarech Z.R., Baker D.R. (eds), Cognition, Metacognition, and Culture in STEM Education. Innovations in Science Education and Technology, vol 24. Cham: Springer. https://doi.org/10.1007/978-3-319-66659-4_3
  21. Hane E, Franklin S. 2019. Improving inclusivity and diversity in college STEM programs through metacognitive classroom practices. Paper presented at 2019 CoNECD - The Collaborative Network for Engineering and Computing Diversity, Crystal City, Virginia. https://peer.asee.org/31770
  22. Lovett MC. 2013. Make exams worth more than the grade: Using exam wrappers to promote metacognition, p 18-52. In Kaplan M, Silver N, LaVaque-Manty D, Meizlish D (eds), Using Reflection and Metacognition to Improve Student Learning: Across the Disciplines, Across the Academy. New York: Stylus Publishing.
  23. Chen P, Chavez O, Ong DC, Gunderson B. 2017. Strategic resource use for learning: A self-administered intervention that guides self-reflection on effective resource use enhances academic performance. Psychological Science 28:774-785. doi 10.1177/0956797617696456
  24. Angelo T, Cross K. 1993. Classroom assessment techniques: A handbook for college teachers, 2nd ed. San Francisco, CA: Jossey-Bass.
  25. Novak JD. 1990. Concept maps and Venn diagrams: Two metacognitive tools to facilitate meaningful learning. Instr Sci 19:29-52. https://doi.org/10.1007/BF00377984
  26. Hane EN, Quiñones de Magalhães RM, Nguyen E, Franklin SV. 2020. Lateral transfer maps as a metacognitive tool in first year STEM courses. CourseSource. https://doi.org/10.24918/cs.2020.45
  27. Dounas-Frazer DR, Reinholz DL. 2015. Attending to lifelong learning skills through guided reflection in a physics class. American Journal of Physics. 83(10):881-91.
  28. Spataro SE, Bloch J. 2018. “Can you repeat that?” Teaching active listening in management education. Journal of Management Education 42:168-198. https://doi.org/10.1177/1052562917748696
  29. Carris LM, Jacobson NL. 2010. Banana split: To eat or not to eat. National Center for Case Study Teaching in Science. https://www.nsta.org/ncss-case-study/banana-split-eat-or-not-eat
  30. Larrousse MM. 2014. Setting water on fire: A case study in hydrofracking. National Center for Case Study Teaching in Science. https://www.nsta.org/ncss-case-study/setting-water-fire-case-study-hydrofracking
  31. Herreid CF. 1994. Case studies in science: A novel method for science education. J Coll Sci Teach 23 (4): 221–229.
  32. Dochy F, Segers M, Van den Bossche P, Gijbels D. 2003. Effects of problem-based learning: A meta-analysis. Learning and Instruction 13: 533–568.
  33. Kolb DA. 2014. Experiential learning: Experience as the source of learning and development. Upper Saddle River, NJ: Pearson FT press.
  34. Copley SD, Babbs S, Losoff B. 2021. Science and society: Integrating historical science materials into an undergraduate biology course.  CourseSource. https://doi.org/10.24918/cs.2021.23
  35. Ziv N, Bene E. 2021. Preparing college students for a digital age: A survey of instructional approaches to spotting misinformation. Stanford Digital Repository. Available at https://purl.stanford.edu/qq566ny0539



About the Authors

*Correspondence to:  Gosnell School of Life Sciences, 85 Lomb Memorial Dr., Rochester, NY 14623.  enhsbi@rit.edu

Competing Interests

None of the authors has a financial, personal, or professional conflict of interest related to this work. The authors affirm that they either own the copyright or have received written permission to use the text, figures, tables, artwork, abstract, summaries, and supporting materials.
