Addressing Challenges to a Systematic Thinking Pattern of Scientists: A Literature Review of Cognitive Bias in Scientific Work

Rahmi Qurota Aini1 · Yustika Sya’bandari2 · Ai Nurlaelasari Rusmana2 · Minsu Ha2*

Abstract

Scientists play an essential role in making discoveries and inventions. However, the practice of science can be obstructed by human limitations and flaws in individual reasoning, and understanding scientists' thinking processes is essential to understanding how scientific products are influenced by various factors. Research on cognitive biases can illuminate these limitations among scientists. We discuss scientific activities in light of the history and philosophy of science, and in light of a sustainable future that demands creative problem solving, a process highly susceptible to cognitive biases. We also attend to the multilayered nature of current scientific work, in which scientists work individually and collaborate with colleagues. This paper therefore discusses a framework of cognitive biases related to scientific work, constructed by listing the biases involved in scientific activities against these considerations. We close by recommending that the framework be validated so that the resulting construct can support further study.

Keyword



Introduction

Science and technology play an essential role in making human life more convenient through discoveries and inventions. The stereotypical image of a scientist drawn by children is a person working in a laboratory with tools (Buldu, 2006; Quita, 2003), but scientists' day-to-day work also involves reading and seeking sources and going to the field for observations, and no scientific method can be separated from decision making. Accordingly, scientific work must be guided by a rational attitude, because human endeavors built on existing knowledge are prone to error. Cognitive bias is a psychological account of human thought patterns and judgment (Haselton et al., 2015) that bears on remembering, evaluation, information processing, and decision making (Hilbert, 2012; Tversky & Kahneman, 1974). A current concern, given the role of scientists, is how they can make rational decisions in light of cognitive bias.

Many processes in scientific work are vulnerable to shortcomings in human thinking. Historically, early research on rational choice found that humans err when making decisions (Ellis, 2018). Although humans are known as "rational animals," Bayesian-analysis experiments in the 1950s and 1960s showed that people can be biased in judgment and make poor decisions (Edwards et al., 1963; Ellis, 2018). Long before that, Francis Bacon promoted science as the gathering of data and the testing of knowledge through doubt and experiment (Williams, 2011). Bacon observed that humans tend to interpret data according to what they already believe; his philosophy of science is therefore grounded in inductive reasoning, generalizing from observation (Williams, 2011). What Bacon described about human nature is now explained as confirmation bias, the tendency of individuals to confirm prior beliefs. Psychology and behavioral economics have recorded similar systematically fallacious thinking patterns, called cognitive biases. We will discuss these systematic thinking patterns, for which a previous author has listed more than 150 biases classified by cause and coping strategy (Benson, 2021). Present theories of cognitive bias are challenging because they describe flaws of thinking rather than standards of logic (Haselton et al., 2015). In short, human cognition is prone to error, affecting not only decision making but also memory, behavior, and judgment.

Our objective is to focus on the main challenges to scientists' systematic thinking patterns, challenges that have previously received attention in behavioral economics, politics, and business. Amid today's abundance of information, scientists and those directly involved in science must also make sound decisions in the era of big data. Empirical advances in the study of cognitive bias, drawing mainly on cognitive science and education, may help tackle this challenge, and more research in this area could foster a better scientific community and promote responsible research. The present literature review works toward understanding the cognitive biases that can influence scientific work. We used critical literature synthesis through the following process: 1) defining common scientific activities across disciplines, supported by previous frameworks, 2) understanding each cognitive process, and 3) critically synthesizing the literature on the cognitive biases in each scientific activity.

The Process of Scientific Work

The history and philosophy of science (HPS) can serve as a guiding framework for understanding how science is done by scientists. Philosophers of science such as Francis Bacon, Karl Popper, and Thomas Kuhn offer significant insight into scientific methodologies and their implications. However, scientific work also differs in its details across disciplines, each with its own methodologies (Williams, 2011). For example, some geologists and environmental scientists gather data in the field, while other scientists model their ideas in the laboratory. Although science is a human endeavor, a scientist works according to scientific methodologies in order to produce valid and reliable studies. Science and scientific work are shaped by history, branches of philosophy, culture, and discipline-specific practice. Scientific methods are also multistep processes that contribute to solving specific problems; scientists therefore need creativity and diligence (Aydemir & Ulusu, 2020). For a sustainable future, the next generation of scientists and scholars should also be encouraged toward systems thinking and interdisciplinary science, which in turn requires critical thinking, problem solving, and creativity (Blatti et al., 2019). Most science programs nowadays likewise aim to improve future scientists' ability to solve problems and benefit society (Matthews, 2014). However, science as a way of knowing, being a human endeavor, involves exploration and investigation, making sense of evidence, and holding tentative explanations; throughout this process, scientific work carries human elements of creativity, subjectivity, and cultural influence. Accordingly, various decision biases have been identified as variables that lead to failure in creative problem solving (Todd et al., 2019). Being aware of these thinking processes is essential to understanding how the products of science are influenced by various factors.

Defining Cognitive Bias

Establishing a scientific product requires rational thinking in scientists' decision making. The concept of cognitive bias was introduced by Daniel Kahneman and Amos Tversky (1972, 1996), and its implications for practical judgment have been described in areas such as economics, management, and finance. In other contexts, cognitive bias can be used to understand individuals' reasoning patterns and the motivational processes behind their decisions. Kahneman and colleagues (2016, 2021) distinguish between judgment noise and bias: although both produce errors, the errors differ in kind relative to the accurate response. In their shooting-range illustration, bias appears as consistent error in the same spot, whereas noise appears as shots scattered across widely dispersed locations. Bias can therefore be examined only once the accurate response is known. Haselton et al. (2015) explained that cognitive mechanisms produce two types of error: false positives, adopting a belief that is false, and false negatives, failing to adopt a belief that is true. Additionally, people who hold such biases are more likely to believe that their thinking is right even when it is wrong. This underlies why it is hard to know one's own unknowns: biases in human thinking operate unconsciously (Fiske, 2002).
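To make this distinction concrete, the following minimal simulation (our illustration, not drawn from the cited sources) renders the shooting-range analogy in Python: a biased judge shows a consistent mean offset from the accurate response, while a noisy judge shows a large spread around it.

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.array([0.0, 0.0])  # the "bullseye": the accurate response

# A biased judge: consistent offset from the target, little scatter.
biased_shots = target + np.array([2.0, 1.0]) + rng.normal(0.0, 0.3, size=(100, 2))
# A noisy judge: centered on the target, but widely scattered.
noisy_shots = target + rng.normal(0.0, 2.0, size=(100, 2))

for name, shots in [("biased", biased_shots), ("noisy", noisy_shots)]:
    bias = shots.mean(axis=0) - target  # systematic deviation = bias
    noise = shots.std(axis=0)           # scatter around own mean = noise
    print(f"{name}: bias={np.round(bias, 2)}, noise={np.round(noise, 2)}")
```

As the output shows, bias can only be quantified against a known target, mirroring the point that bias is examined by knowing the accurate response.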

Kahneman explains these thinking patterns in terms of heuristics, or human mental shortcuts. He posits two systems of thinking: 1) System 1, which is fast, automatic, emotional, and intuitive, and 2) System 2, which is slow, effortful, conscious, and logical. According to Kahneman (2003, 2011), cognitive biases often arise when people rely heavily on System 1 thinking, which stems from associations stored in memory. Building on such work, a prior framework by Benson (2021) classifies known biases according to their cause and the coping strategy used to handle information: 1) too much information, 2) not enough meaning, 3) the need to act fast, and 4) what should we remember? Another framework, presented by Valdez et al. (2018), organizes biases by the stage of cognitive processing at which they may occur, to support the study of cognitive bias in visualization research. Table 1 presents this categorization of cognitive processing with example biases.

Table 1. Framework of cognitive bias in visualization research

http://dam.zipot.com:8080/sites/BDL/images/N0230110302_image/Table_BDL_11_03_02_T1.png

In other specific situations, the way individuals process information and make decisions will differ as well. In scientific work, although methodologies vary, the process of making rational judgments should follow a consistent pattern. It is therefore necessary to ensure that this process is minimally biased, so that it yields rational ideas from the scientist.

Identifying Cognitive Biases in Scientist Activities

Having discussed how scientists' work is shaped by history, philosophy, and discipline-specific methodologies, and by future needs for creative problem solving, we now situate these thinking fallacies within scientists' activities. These activities are defined as the common processes of scientific work across disciplines. The ability to accomplish complex activities such as defining problems and evaluating ideas has been shown to be substantial in scientific work and creative problem solving (Lonergan et al., 2004; Reiter-Palmon & Robinson, 2009; Todd et al., 2019). Current scientific activity also requires collaboration among researchers, scientists, scholars, and students rather than working alone. The challenge to scientists' systematic thinking patterns is therefore multilayered, involving individual cognition, environment, and context. At the individual level, solving problems requires scientists to identify and construct the problem, acquire and locate information, generate ideas, and evaluate and implement solutions (Carmeli et al., 2013). At the organizational level, four phases of cognitive processing in creative problem solving have been suggested: 1) problem construction, 2) information search and encoding, 3) generating solutions, and 4) evaluating ideas (Reiter-Palmon & Illies, 2004). It has also been noted that teams' creative problem solving warrants further attention, particularly in understanding how problem identification or construction occurs at the team level (Reiter-Palmon & Robinson, 2009). Liedtka (2015) likewise documented three-stage design-thinking process models that link creativity to innovation through a novel problem-solving methodology, noting the flaws of cognitive processing in innovative problem solving and their consequences. With regard to these obstacles for scientists, Table 2 shows how cognitive biases are involved in this multilayered process.

Table 2. Current framework of cognitive biases in scientific work and creative problem-solving

http://dam.zipot.com:8080/sites/BDL/images/N0230110302_image/Table_BDL_11_03_02_T2.png

Cognitive Biases in Identifying a Problem

Status quo bias occurs when individuals favor the current situation, maintain it out of loss aversion, or prefer the familiar over the willingness to accept something novel (Kahneman et al., 1991; Samuelson & Zeckhauser, 1988). These tendencies can close off other possibilities, resist variation, and cause scientists to fail to identify new problems, limiting their judgment and hindering creativity.

Pro-innovation bias is an excessive belief in an innovation's usefulness throughout society (Rogers, 2010): people adopt and accept the innovation while failing to identify its limitations and weaknesses. An individual with pro-innovation bias often neglects the significance of the current problem and fails to grasp the consequences of the innovation. Among scientists, pro-innovation bias limits the ability to anticipate an innovation's undesirable consequences.

Anchoring is a psychological heuristic in which individuals rely on the first piece of information they know or are shown, the "anchor" (Fiske, 2002). Different starting points yield different estimates, each biased toward its initial value. Although anchoring-and-adjustment can involve System 2's deliberate processing, anchoring as a priming effect evokes associated information and drives people, for example, to underestimate the probability of failure in complex systems (Kahneman, 2011).

Cognitive Biases in Generating Ideas

Projection bias is the tendency to over-predict that future preferences will match current preferences, owing to present emotional states, thoughts, and values (Loewenstein et al., 2003). Liedtka (2015) argues that this bias limits people's capacity for innovation: decision-makers predict the future from present states that may no longer be relevant, and so fail to generate new ideas.

Automation bias is the tendency to lean on an automated system, such as a computer running a particular program (Skitka et al., 1999). Depending on such a system can limit possibilities, because the system rests on machine training and the data fed into it. A person who depends on the computer is not thinking independently and may make more, or different, errors, or tend to consider less information (Skitka et al., 1999). A previous study in the academic field found that users often fail to acknowledge the errors an automation system can introduce (Goddard et al., 2012).

Functional fixedness, a bias related to problem solving, drives people to use objects in their standard ways and to resist using them differently (Duncker & Lees, 1945). In other words, this bias fixates on a single solution to a particular problem. Functional fixedness renders people unable to recognize alternative approaches and uses of elements, constraining creativity and thus limiting ideation and problem solving. It is also often demonstrated in pictorial creative-generation tasks, where provided examples induce fixation (Chrysikou & Weisberg, 2005; Chrysikou et al., 2016): because people keep seeing one particular solution to a problem, the ideas they generate and consider are constrained. This especially hampers problem-solving processes that require varied options or alternative functions for objects (German & Barrett, 2005).

Neglect of probability, introduced by Sunstein (2002), is a cognitive bias in which an individual, anxious about uncertainty, ignores probabilities. Making novel ideas may require taking risks under uncertain conditions, and uncertainty is a potential condition for creative solutions to emerge. In this case, however, emotion dominates rationality and leads people to avoid considering possible solutions (Sunstein, 2002); the bias thus blocks the creative processes that generate new and useful ideas.

Cognitive Biases in Idea Transformation

The bandwagon effect is favoring ideas or opinions already believed by the majority, the public, or a specific group (Schmitt‐Beck, 2015). Earlier theory attributes it to the mechanism of dissonance reduction (Carter, 1959) and to the spiral of silence, the individual's fear of social isolation (Scheufele & Moy, 2000). In politics, this bias often appears in voting (Nadeau et al., 1993). Although its most problematic effects may be seen in social science research (Schmitt‐Beck, 2015), it is also harmful to group decision making among scientists: an individual's idea may be suppressed, and a particular creative alternative may never surface.

Conformity bias can occur at the group level, where individuals tend to give feedback or judgments based on group norms (Moscovici & Faucheux, 1972). It may arise through informational influence, the tendency to match the group's judgment in order to be correct and gain more trust within the group (Boen et al., 2008), or through normative influence, based on the individual's fear of rejection by the group (Auweele et al., 2004). In either case, the group norm suppresses independent thinking: the group's choice shapes how individuals think about an issue, even against their personal judgment.

Authority bias can also shape group opinion: a person in a higher position is expected to have better ideas and make fewer mistakes (Hinnosaar & Hinnosaar, 2012). In the cases Howard (2019) describes, opinions or instructions from authority were usually followed, with negative impacts on clinicians. For collective creative problem solving, the result can resemble the bandwagon effect, with individuals' creative ideas withdrawn.

Cognitive Biases in Information Seeking and Selecting

The availability heuristic is the tendency to ignore base-rate or general information and, as a mental shortcut, to focus on information that is readily available in context (Tversky & Kahneman, 1974). When the availability heuristic recurs in problem solving, neglecting general base-rate information in favor of specific information narrows the space of possibilities for creative ideas.

The ostrich effect is a tendency to ignore negative financial information and attend only to positive aspects (Galai & Sade, 2006). Ignoring challenges leaves the mind made up within narrow limits, believing selectively in light of positive information. Beyond finance, scientists subject to the ostrich effect may likewise keep themselves in a pleasant situation by filtering for positive information. As a result, creative problem solving cannot proceed, because the information gathered is not relevant.

Confirmation bias is the tendency to seek, favor, and highlight information that confirms what we already believe or suspect (Oswald & Grosjean, 2004; Plous, 1993). Scientists can unconsciously ignore information they disagree with and focus on information they believe in. Moreover, scientists subject to confirmation bias not only search for confirming information but can also unintentionally interpret and remember information in line with their hypotheses (Oswald & Grosjean, 2004). Rather than challenging their ideas, this closes innovators off from new possibilities and creative solutions.

Cognitive Biases in Evaluating Ideas

Categorical thinking and essentialism. As humans take mental shortcuts, they tend to link one idea to another based on effects, properties, or the categories to which an idea belongs (Kahneman, 2011). These issues also apply to categorical thinking, even though categories in science are expected to be valid and useful (De Langhe & Fernbach, 2019). Categories can generate powerful illusions; they compress ideas and eliminate variety. Categorical thinking is closely related to essentialism, the tendency to believe that every object, thing, or person has an immutable essence and to classify it by those characteristics (Gelman, 2003). Although categorical thinking helps people understand and make sense of the world, it can lead decision-makers into significant errors, ignoring actual values and favoring one category over others (De Langhe & Fernbach, 2019). Ultimately, thinking categorically can also inhibit people from breaking through to innovation (De Langhe & Fernbach, 2019).

Framing effect and availability bias. When a group of scientists decides which option is best among presented data, framing bias may occur: people are influenced by how the information is presented, for example with negative or positive connotations, rather than by the information itself (Plous, 1993). This is related to availability bias (Kahneman & Tversky, 1972), in which salient examples or particular experiences are treated as more representative than the actual information. Both framing and availability bias hinder rational evaluation by fixing attention on the connotations of the information presented.

The decoy effect is a tendency for the valuation of options to change after a new, inferior option is presented (Slaughter et al., 1999). With regard to decision making when evaluating ideas, a study by Slaughter and colleagues (2011) indicates that this effect reflects Tversky and Kahneman's loss-aversion theory. The effect makes it harder for an organization to determine which option is the more rational choice.

The Semmelweis reflex is a tendency to reject new evidence, knowledge, or ideas because they challenge established norms, beliefs, or paradigms (Mortell et al., 2013). The bias has been documented in healthcare practice, and its occurrence in other environments cannot be ruled out. It is associated with the theory-practice gap (Allmark, 1995; Glynn et al., 2011); Mortell et al. (2013) introduced theory-practice-ethics as a new dimension mediating the gap between theory and practice outcomes. The rejection of new ideas in this context can become a barrier to the creative process.

http://dam.zipot.com:8080/sites/BDL/images/N0230110302_image/Fig_BDL_11_03_02_F1.png

Fig. 1. Cognitive biases related to scientists' activities

Measuring cognitive biases in scientists' activity is challenging because the construct is not directly observable; we refer to it hereafter as a latent variable. Nevertheless, a latent variable can be measured by examining observed variables, with items serving as indicators (Glynn et al., 2011). To compile the biases, we first reduced Benson's (2021) cognitive bias list to the 135 biases related to general scientific activities. From these, we then listed 43 biases by rationalizing them against our current framework of scientists' activities during creative problem solving at the individual and collaborative levels. It is then important to ensure that this construct is valid so that it can make significant contributions to further research.

Application and Discussion

Measuring Cognitive Bias

While the framework described earlier is specific to the application of cognitive bias to scientific work, more empirical research on cognitive bias can help us understand the obstacles to scientists' systematic thinking. We have argued that the challenges to scientists' decision making arise partly in individual settings and partly in group-work settings. However, most previous studies measuring bias within scientific work examine a single bias in the decision-making process. For instance, building on the framing effect, one study measured doctors' and teachers' personal prerequisites for decision making using the Melbourne Decision Making Questionnaire, Budner's Tolerance of Ambiguity Scale, and the Personality Factors of Decision-Making questionnaire (Kornilova & Kerimova, 2018). To measure overconfidence in biology students, a previous study used a confidence scale attached to a biology exam (Rachmatullah & Ha, 2019). The Cognitive Reflection Test (CRT) is also a popular instrument for measuring performance on heuristics-and-biases tasks (Frederick, 2005). To examine automation bias among general practitioners in the UK, another study measured their trust in, and frequency of use of, a CDSS simulator (Goddard et al., 2014). As a further example, the 42-item Need for Closure (NFC) scale was created to measure individual differences related to fixation and functional fixedness (Webster & Kruglanski, 1994).
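As a concrete illustration of one such measure, the sketch below computes a simple overconfidence score as mean confidence minus mean accuracy across exam items, a common bias-score formulation in calibration research; the data are hypothetical and not taken from Rachmatullah and Ha (2019).

```python
import numpy as np

# Hypothetical single-student responses on a 10-item exam:
# self-reported confidence (0-1) and correctness (1 = correct).
confidence = np.array([0.9, 0.8, 0.95, 0.7, 0.85, 0.9, 0.6, 0.8, 0.75, 0.9])
correct = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])

accuracy = correct.mean()                    # proportion correct
mean_confidence = confidence.mean()          # average confidence
overconfidence = mean_confidence - accuracy  # > 0 indicates overconfidence

print(f"accuracy={accuracy:.2f}, confidence={mean_confidence:.2f}, "
      f"overconfidence={overconfidence:+.2f}")
```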

However, investigations of cognitive bias in specific populations, in this case scientists, remain limited in number, as do measurements of biases across the whole scientific work process. For this reason, further study will create observed variables, or items, to measure cognitive biases in individual and group-work settings. Using the current framework (Table 2) and the final list of biases (Fig. 1), factor analysis will be used in questionnaire development to examine relations among items and to find common factors within the item set, as sketched below. Further empirical observation in scientists' work settings would also help bridge theoretical work and field observation. We believe the efforts and contributions of more cognitive bias research would benefit our understanding of this challenge.
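As a sketch of what such questionnaire development could look like, the example below runs an exploratory factor analysis on simulated Likert-style responses using scikit-learn; the item structure and factor labels are hypothetical assumptions, not the instrument proposed here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Simulate 300 respondents answering six items driven by two latent
# factors (say, individual-level vs. group-level bias susceptibility).
n_resp = 300
latent = rng.normal(size=(n_resp, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],   # items 1-3 -> factor 1
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])  # items 4-6 -> factor 2
items = latent @ loadings.T + rng.normal(scale=0.4, size=(n_resp, 6))

# Exploratory factor analysis with varimax rotation; the estimated
# loadings should recover the two-factor structure (items 1-3 vs. 4-6),
# mirroring how common factors are sought in a real item set.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(items)
print(np.round(fa.components_, 2))
```

In practice, the same workflow would be applied to respondents' answers on the bias items derived from Table 2 and Fig. 1, with the factor solution indicating which items cluster together.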

Educational Programs to Reduce Bias among Scientists

As discussed, individuals, including scientists, are susceptible to cognitive biases because they often rely heavily on System 1 thinking, which is a source of cognitive biases and poor decision making (Kahneman, 2003, 2011). The detrimental effects of biases underscore the importance of bias-reduction, or debiasing, programs. Intervention studies to reduce cognitive bias are still limited in number. People cannot fully eliminate their biases, but some programs may help them outsmart or reduce them (Soll et al., 2015). In prior studies, intervention activities have encouraged people to avoid or reduce biased thinking in both real and laboratory settings (Hacker et al., 2008; Nietfeld et al., 2006). In one study aimed at reducing overconfidence bias and improving metacognitive accuracy, participants were given monitoring exercises and weekly feedback (Nietfeld et al., 2006). In another, participants received awareness training, analogical sensitization, and analogical training to reduce ten biases, including the sunk cost fallacy, outcome bias, framing bias, and the planning fallacy (Aczel et al., 2015). Furthermore, cognitive strategies such as asking people to consider the opposite outcome, not only their desired outcome, when making judgments have been found to effectively reduce anchoring bias (Mussweiler et al., 2000).

An intervention program by Rusmana et al. (2020) introduced the full KAAR model (Knowledge, Awareness, Action, and Reflection) to reduce preservice science teachers' biases. Although the participants of that study were college students, the framework is flexible enough to be implemented across settings, including with scientists. The first step of the process is knowing that bias exists (Babcock & Loewenstein, 1997); knowledge of a bias, its causes, and its dangers can be the initial caution people need to avoid it (Soll et al., 2015). The second stage builds awareness by having people deliberately think through potential biases in past experiences, which may make them more alert to the pitfalls of bias and less biased (van Geene et al., 2016). Giving information about cognitive bias thus supplies the knowledge and awareness phases of the KAAR model, building scientists' awareness so that they work to decrease their bias. Moreover, awareness alone is not enough; Kahneman et al. (2011) emphasize "action" as the essential phase of reducing bias, one that can be taken individually and collectively. One can use other people's System 2 thinking to spot the biases in one's own System 1 thinking; this underscores the importance of group activities, and in their study Rusmana et al. conducted a group discussion as the intervention activity for this stage. The last stage is reflection, a form of System 2 thinking that leads people to weigh evidence rather than feelings or emotions in decision making; the reflection activities in the prior study were writing a reflective statement and a study plan (Rusmana et al., 2020).

Conclusion

Research on cognitive biases can help us understand the limitations on scientists' systematic thinking patterns. We have argued in this study that scientific work varies historically and philosophically and is not confined to a single rigid scientific methodology. We have also considered that a sustainable future will add creative problem solving to scientific work, and that current scientific work is multilayered, with scientists collaborating in teams and with colleagues. Weighing these considerations, we constructed a new framework by listing the biases relevant to scientific work. The framework should now be examined and validated so that the resulting construct can support future study.

Acknowledgements

This research was funded by the National Research Foundation of Korea (NRF-2020R1C1C1006167).

Authors Information

Aini, Rahmi Qurota: Middle Tennessee State University, Doctoral student, First author

Sya’bandari, Yustika: Kangwon National University, Researcher, Co-author

Rusmana, Ai Nurlaelasari: Kangwon National University, Researcher, Co-author

Ha, Minsu: Kangwon National University, Professor, Corresponding author

References

1 Aczel, B., Bago, B., Szollosi, A., Foldes, A., & Lukacs, B. (2015). Is it time for studying real-life debiasing? Evaluation of the effectiveness of an analogical intervention technique. Frontiers in Psychology, 6, 1120.

2 Allmark, P. (1995). A classical view of the theory‐practice gap in nursing. Journal of Advanced Nursing, 22, 18-23.

3 Auweele, Y. V., Boen, F., De Geest, A., & Feys, J. (2004). Judging bias in synchronized swimming: Open feedback leads to nonperformance-based conformity. Journal of Sport and Exercise Psychology, 26, 561-571.

4 Aydemir, D., & Ulusu, N. N. (2020). Identifying and solving scientific problems in the medicine: Key to become a competent scientist. Turkish Journal of Biochemistry, 45.

5 Babcock, L., & Loewenstein, G. (1997). Explaining bargaining impasse: The role of self-serving biases. Journal of Economic Perspectives, 11, 109-126.

6 Benson, B. (2021, February 15). Cognitive bias codex. Retrieved from https://busterbenson.com/piles/cognitive-biases

7 Blatti, J. L., Garcia, J., Cave, D., Monge, F., Cuccinello, A., Portillo, J., ... Schwebel, F. (2019). Systems thinking in science education and outreach toward a sustainable future. Journal of Chemical Education, 96, 2852-2862.  

8 Boen, F., Van Hoye, K., Vanden Auweele, Y., Feys, J., & Smits, T. (2008). Open feedback in gymnastic judging causes conformity bias based on informational influencing. Journal of Sports Sciences, 26, 621-628.

9 Buldu, M. (2006). Young children's perceptions of scientists: A preliminary study. Educational Research, 48, 121-132.

10 Carmeli, A., Gelbard, R., & Reiter‐Palmon, R. (2013). Leadership, creative problem‐solving capacity, and creative performance: The importance of knowledge sharing. Human Resource Management, 52, 95-121.

11 Carter, R. F. (1959). Bandwagon and sandbagging effects: Some measures of dissonance reduction. Public Opinion Quarterly, 23, 279-287.  

12 Chrysikou, E. G., & Weisberg, R. W. (2005). Following the wrong footsteps: Fixation effects of pictorial examples in a design problem-solving task. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 1134.

13 Chrysikou, E. G., Motyka, K., Nigro, C., Yang, S. I., & Thompson-Schill, S. L. (2016). Functional fixedness in creative thinking tasks depends on stimulus modality. Psychology of Aesthetics, Creativity, and the Arts, 10, 425.

14 De Langhe, B., & Fernbach, P. (2019). The dangers of categorical thinking: We're hardwired to sort information into buckets-and that can hamper our ability to make good decisions. Harvard Business Review, 97, 80-92.

15 Duncker, K., & Lees, L. S. (1945). On problem-solving. Psychological Monographs, 58, i.

16 Edwards, W., Lindman, H., & Savage, L. J. (1963). Bayesian statistical inference for psychological research. Psychological Review, 70, 193.

17 Ellis, G. (Ed.). (2018). Cognitive Biases in Visualizations. New York, NY, USA: Springer.

18 Fiske, S. T. (2002). What we know now about bias and intergroup conflict, the problem of the century. Current Directions in Psychological Science, 11, 123-128.  

19 Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19, 25-42.  

20 Galai, D., & Sade, O. (2006). The “ostrich effect” and the relationship between the liquidity and the yields of financial assets. The Journal of Business, 79, 2741-2759.

21 Gelman, S. A. (2003). The Essential Child: Origins of Essentialism in Everyday Thought. Oxford University Press, USA.  

22 German, T. P., & Barrett, H. C. (2005). Functional fixedness in a technologically sparse culture. Psychological Science, 16, 1-5.

23 Glynn, S. M., Brickman, P., Armstrong, N., & Taasoobshirazi, G. (2011). Science motivation questionnaire II: Validation with science majors and nonscience majors. Journal of Research in Science Teaching, 48, 1159-1176.

24 Goddard, K., Roudsari, A., & Wyatt, J. C. (2012). Automation bias: A systematic review of frequency, effect mediators, and mitigators. Journal of the American Medical Informatics Association, 19, 121-127.

25 Goddard, K., Roudsari, A., & Wyatt, J. C. (2014). Automation bias: Empirical results assessing influencing factors. International Journal of Medical Informatics, 83, 368-375.

26 Hacker, D. J., Bol, L., & Bahbahani, K. (2008). Explaining calibration accuracy in classroom contexts: The effects of incentives, reflection, and explanatory style. Metacognition and Learning, 3, 101-121.

27 Haselton, M. G., Nettle, D., & Murray, D. R. (2015). The evolution of cognitive bias. The Handbook of Evolutionary Psychology, 1-20.

28 Hilbert, M. (2012). Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making. Psychological Bulletin, 138, 211.  

29 Hinnosaar, M., & Hinnosaar, T. (2012). Authority Bias.

30 Howard, J. (2019). Bandwagon effect and authority bias. In Cognitive Errors and Diagnostic Mistakes (pp. 21-56). Springer, Cham.  

31 Kahneman, D., Sibony, O., & Sunstein, C. R. (2021). Noise: A Flaw in Human Judgment. New York: Little, Brown Spark.

32 Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58, 697.  

33 Kahneman, D. (2011). Thinking, Fast and Slow. Macmillan.  

34 Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430-454.

35 Kahneman, D., & Tversky, A. (1996). On the reality of cognitive illusions. Psychological Review, 103, 582-591.

36 Kahneman, D., Knetsch, J. L., & Thaler, R. H. (1991). Anomalies: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives, 5, 193-206.

37 Kahneman, D., Lovallo, D., & Sibony, O. (2011). Before you make that big decision. Harvard Business Review, 89, 50-60.

38 Kahneman, D., Rosenfield, A. M., Gandhi, L., & Blaser, T. (2016). Noise. Harvard Business Review, 38-46.

39 Kornilova, T., & Kerimova, S. (2018). Specifics of personal prerequisites of decision-making process (based on the framing effect) in doctors and teachers sample groups. Psychology. Journal of the Higher School of Economics, 15, 22-38.

40 Liedtka, J. (2015). Perspective: Linking design thinking with innovation outcomes through cognitive bias reduction. Journal of Product Innovation Management, 32, 925-938.  

41 Loewenstein, G., O'Donoghue, T., & Rabin, M. (2003). Projection bias in predicting future utility. The Quarterly Journal of Economics, 118, 1209-1248.

42 Lonergan, D. C., Scott, G. M., & Mumford, M. D. (2004). Evaluative aspects of creative thought: Effects of appraisal and revision standards. Creativity Research Journal, 16, 231-246.

43 Matthews, M. R. (2014). Science Teaching: The Contribution of History and Philosophy of Science. Routledge.  

44 Mortell, M., Balkhy, H. H., Tannous, E. B., & Jong, M. T. (2013). Physician ‘defiance’ towards hand hygiene compliance: Is there a theory–practice–ethics gap? Journal of the Saudi Heart Association, 25, 203-208.

45 Moscovici, S., & Faucheux, C. (1972). Social influence, conformity bias, and the study of active minorities. Advances in Experimental Social Psychology, 6, 149-202.

46 Mumford, M. D., Hester, K. S., & Robledo, I. C. (2012). Creativity in organizations: Importance and approaches. In Handbook of Organizational Creativity (pp. 3-16). Academic Press.

47 Mussweiler, T., Strack, F., & Pfeiffer, T. (2000). Overcoming the inevitable anchoring effect: Considering the opposite compensates for selective accessibility. Personality and Social Psychology Bulletin, 26, 1142-1150.

48 Nadeau, R., Cloutier, E., & Guay, J. H. (1993). New evidence about the existence of a bandwagon effect in the opinion formation process. International Political Science Review, 14, 203-213.

49 Nietfeld, J. L., Cao, L., & Osborne, J. W. (2006). The effect of distributed monitoring exercises and feedback on performance, monitoring accuracy, and self-efficacy. Metacognition and Learning, 1, 159.

50 Oswald, M. E., & Grosjean, S. (2004). Confirmation bias. In R. F. Pohl (Ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove and New York: Psychology Press.

51 Plous, S. (1993). The Psychology of Judgment and Decision Making. McGraw-Hill Book Company.

52 Quita, I. N. (2003). What is a scientist? Perspectives of teachers of color. Multicultural Education, 11, 29.  

53 Rachmatullah, A., & Ha, M. (2019). Examining high-school students’ overconfidence bias in biology exam: A focus on the effects of country and gender. International Journal of Science Education, 41, 652-673.

54 Reiter-Palmon, R., & Illies, J. J. (2004). Leadership and creativity: Understanding leadership from a creative problem-solving perspective. The Leadership Quarterly, 15, 55-77.

55 Reiter-Palmon, R., & Robinson, E. J. (2009). Problem identification and construction: What do we know, what is the future? Psychology of Aesthetics, Creativity, and the Arts, 3, 43.

56 Rogers, E. M. (2010). Diffusion of Innovations. Simon and Schuster.  

57 Rusmana, A. N., Roshayanti, F.,&Ha, M. (2020). Debiasing overconfidence among Indonesian undergraduate students in the biology classroom: An intervention study of the KAAR model. Asia-Pacific Science Education, 6, 228-254.  

58 Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1, 7-59.

59 Scheufele, D. A., & Moy, P. (2000). Twenty-five years of the spiral of silence: A conceptual review and empirical outlook. International Journal of Public Opinion Research, 12, 3-28.

60 Schmitt‐Beck, R. (2015). Bandwagon effect. The International Encyclopedia of Political Communication, 1-5.  

61 Skitka, L. J., Mosier, K. L., & Burdick, M. (1999). Does automation bias decision-making? International Journal of Human-Computer Studies, 51, 991-1006.

62 Slaughter, J. E., Kausel, E. E., & Quiñones, M. A. (2011). The decoy effect as a covert influence tactic. Journal of Behavioral Decision Making, 24, 249-266.

63 Slaughter, J. E., Sinar, E. F., & Highhouse, S. (1999). Decoy effects and attribute-level inferences. Journal of Applied Psychology, 84, 823.

64 Soll, J. B., Milkman, K. L., & Payne, J. W. (2015). Outsmart your own biases. Harvard Business Review, 93, 64-71.

65 Sunstein, C. R. (2002). Probability neglect: Emotions, worst cases, and law. The Yale Law Journal, 112, 61-107.  

66 Todd, E. M., Higgs, C. A., & Mumford, M. D. (2019). Bias and bias remediation in creative problem-solving: Managing biases through forecasting. Creativity Research Journal, 31, 1-14.

67 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.

68 Valdez, A. C., Ziefle, M., & Sedlmair, M. (2018). Studying biases in visualization research: Framework and methods. In Cognitive Biases in Visualizations (pp. 13-27). Springer, Cham.

69 van Geene, K., de Groot, E., Erkelens, C.,&Zwart, D. (2016). Raising awareness of cognitive biases during diagnostic reasoning. Perspectives on Medical Education, 5, 182-185.  

70 Webster, D. M., & Kruglanski, A. W. (1994). Individual differences in need for cognitive closure. Journal of Personality and Social Psychology, 67, 1049.

71 Williams, J. D. (2011). How Science Works: Teaching and Learning in the Science Classroom. A&C Black.