Introduction
Science and technology play an essential role in making human life more convenient through discoveries and inventions. The stereotypical image of scientists drawn by children is of people working in a laboratory with tools (Buldu, 2006; Quita, 2003), but the day-to-day work of scientists also involves spending time reading and seeking sources, going to the field for observations, and making decisions throughout the scientific method. The process of scientific work must therefore be guided by a rational attitude, because human endeavors based on existing knowledge are prone to error. Cognitive bias is a psychological account of human thought patterns and rational judgment (Haselton et al., 2015) that relates to remembering, evaluation, information processing, and decision-making (Hilbert, 2012; Tversky & Kahneman, 1974). A current concern is therefore how scientists can make rational decisions while accounting for cognitive bias.
Knowing that many processes in scientific work suffer from shortcomings in human thinking, early research on rational choice found that humans err when making decisions (Ellis, 2018). Although humans are known as "rational animals", experiments grounded in Bayesian analysis in the 1950s and 1960s showed that human judgment can be biased and lead to poor decisions (Edwards et al., 1963; Ellis, 2018). Francis Bacon, for instance, promoted science by gathering data and proving knowledge through doubt and experiment (Williams, 2011). Bacon held that humans tend to interpret data according to what they already believe; his philosophy of science is therefore built on inductive reasoning, making generalizations from observation (Williams, 2011). What Bacon described about human nature is now explained as confirmation bias, the tendency of individuals to confirm prior beliefs. Psychology and behavioral economics have recorded similar systematic fallacies in thinking, called cognitive biases. In what follows, we discuss these systematic thinking patterns; a previous author has listed more than 150 biases and classified them according to their causes and coping strategies (Benson, 2021). Current theories of cognitive bias are challenging because they describe flaws in thinking rather than the standards of logic (Haselton et al., 2015). Ultimately, human cognition is prone to error that affects not only decision making but also memory, behavior, and judgment.
Our objective is to focus on the main challenges to scientists' systematic thinking patterns, which have previously received researchers' attention in behavioral economics, politics, and business. In the current era of information abundance and big data, scientists and those directly involved in science must also make sound decisions. Empirical advances on cognitive bias in cognitive science and education may help tackle this challenge, and more research in this area could foster a better scientific community and promote responsible scientific research. The present literature review works to understand the cognitive biases that can influence scientific work. We used critical literature synthesis through three steps: 1) defining common scientific activities across disciplines, supported by previous frameworks, 2) understanding each cognitive process, and 3) critically synthesizing the literature on cognitive biases in each scientific activity.
The Process of Scientific Work
The history and philosophy of science (HPS) can serve as a guiding framework for understanding how science is done by scientists. Philosophers of science such as Francis Bacon, Karl Popper, and Thomas Kuhn offer significant insight into scientific methodologies and their implications. However, general scientific work differs in its details across disciplines, each with its own methodologies (Williams, 2011). For example, some geologists and environmental scientists gather data in the field, while other scientists model their ideas in the laboratory. Although science is a human endeavor, scientists work from scientific methodologies in order to produce valid and reliable studies. Science and scientific work are shaped by history, branches of philosophy, culture, and discipline-specific practice. Scientific methods are also multistep processes that contribute to solving specific problems; scientists therefore need both creativity and diligence (Aydemir & Ulusu, 2020). For a sustainable future, the next generation of scientists and scholars should also be encouraged toward systems thinking and interdisciplinary science, which, as part of the scientific method, should employ critical thinking, problem solving, and creativity (Blatti, 2019). In addition, most science programs today aim to improve future scientists' ability to solve problems and benefit society (Matthews, 2015). As a human endeavor, the ways of knowing in science include exploration and investigation, making sense of evidence, and holding tentative explanations. Throughout this process, scientific work carries human elements of creativity, subjectivity, and cultural influence. Accordingly, various decision biases have been identified as variables that lead to failure in creative problem solving (Todd et al., 2019).
Being aware of this process of systems thinking is an essential approach to understanding how scientific products are influenced by various factors.
Defining Cognitive Bias
The decision-making processes through which scientists establish a scientific product require rational thinking. Cognitive bias was introduced by Daniel Kahneman and Amos Tversky (1972, 1996) to describe its implications for practical judgment in areas such as economics, management, and finance. In other contexts, cognitive bias may be used to understand individuals' reasoning patterns and the motivational processes behind their decisions. Kahneman and colleagues (2016, 2021) distinguish judgment noise from bias: although both produce errors, bias is a systematic deviation from the accurate response. For instance, on a shooting range, bias appears as consistent errors clustered in the same wrong place, while noise appears as shots scattered across widely dispersed locations. Bias can therefore be examined once the accurate response is known. Haselton et al. (2015) explained that cognitive mechanisms produce two types of error: false positives, adopting a belief that is false, and false negatives, failing to adopt a belief that is true. Moreover, people with such biases are more likely to believe their thinking is right even when it is wrong. This underlies why it is hard to know one's own unknowns: biases in human thinking occur unconsciously (Fiske, 2002).
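The shooting-range distinction above can be sketched numerically. The following is a hypothetical illustration (not drawn from the cited studies): bias is modeled as a fixed offset of the aim from the target, and noise as dispersion around the aim point, so the two show up in the mean error and the spread, respectively.

```python
import random
import statistics

random.seed(0)

def shots(aim_offset, spread, n=1000):
    """Simulate n one-dimensional shots at a target located at 0.
    aim_offset models bias (systematic error); spread models noise."""
    return [aim_offset + random.gauss(0, spread) for _ in range(n)]

biased = shots(aim_offset=2.0, spread=0.2)   # tight cluster, wrong place
noisy  = shots(aim_offset=0.0, spread=2.0)   # centered, widely scattered

# Bias appears in the mean error; noise in the standard deviation.
print("biased:", round(statistics.mean(biased), 1), round(statistics.stdev(biased), 1))
print("noisy :", round(statistics.mean(noisy), 1), round(statistics.stdev(noisy), 1))
```

The biased shooter's average lands far from the target but with little spread, while the noisy shooter averages near the target with large spread, mirroring why bias can be detected only when the accurate response is known.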
According to Kahneman, this systematic thinking is explained by heuristics, or human mental shortcuts. Kahneman posits two systems of thinking: 1) System 1, which is fast, automatic, emotional, and intuitive, and 2) System 2, which is slow, effortful, conscious, and logical. Cognitive biases often arise when people rely heavily on System 1 thinking, which draws on associations stored in memory (Kahneman, 2003, 2011). A prior framework by Benson (2021) classifies known biases according to their causes and the coping strategies used to handle information: 1) too much information, 2) not enough meaning, 3) the need to act fast, and 4) what should we remember? Valdez et al. (2018) presented another framework that maps biases to different stages of cognitive processing, to support the study of cognitive bias in visualization research. Table 1 presents the detailed categorization of cognitive processing and examples of biases.
In specific situations, the ways individuals process information and make decisions differ as well. In scientific work, while methodologies vary, the process of making rational judgments and reasoning should follow a consistent form. It is therefore necessary to ensure that the process is as unbiased as possible, so that it yields rational ideas from the scientist.
Identifying Cognitive Biases in Scientist Activities
As discussed, scientists' work is influenced by history, philosophy, and discipline-specific methodologies, and must meet future needs for creative problem solving; we now situate these thinking fallacies within scientists' activities. These activities are defined as a common process of scientific work across disciplines. The ability to accomplish complex activities such as defining problems and evaluating ideas has been shown to be substantial for scientific work and creative problem solving (Lonergan et al., 2004; Reiter-Palmon & Robinson, 2009; Todd et al., 2019). Current scientific activity also requires collaboration among researchers, scientists, scholars, and students rather than solitary work. The challenge to scientists' systematic thinking is therefore complex and multilayered, spanning individual cognition, environment, and context. At the individual level, scientists solving problems need the ability to identify and construct the problem, acquire and find information, generate ideas, and evaluate and implement solutions (Carmeli et al., 2013). At the organizational level, four phases of cognitive processing in creative problem solving have been suggested: 1) problem construction, 2) information search and encoding, 3) generating solutions, and 4) evaluating ideas (Reiter-Palmon & Illies, 2004). In this sense, teams' creative problem solving needs further attention, particularly how problem identification and construction occur at the team level (Reiter-Palmon & Robinson, 2009). Liedtka (2015) likewise documented three stages of design-thinking process models, linking creativity to innovation through a novel problem-solving methodology, and noted the flaws of cognitive processing in innovative problem solving and their consequences.
Regarding the factors that can obstruct scientists, we discuss in Table 2 how cognitive biases are involved in this multilayered process.
Cognitive Biases in Identifying a Problem
Status quo bias can occur when individuals favor the current situation, maintaining it out of loss aversion or preferring the familiar over the willingness to accept something novel (Kahneman et al., 1991; Samuelson & Zeckhauser, 1988). These tendencies can obscure other possibilities, discourage acceptance of variation, and cause scientists to fail to identify new problems. They limit judgment to familiar characteristics and hinder creativity.
Pro-innovation bias is excessive belief in the usefulness of an innovation throughout society (Rogers, 2010): people readily adopt and accept an innovation while failing to identify its limitations and weaknesses. An individual with pro-innovation bias often neglects the significance of the current problem and fails to understand the consequences of the innovation. Among scientists, pro-innovation bias limits the ability to anticipate an innovation's undesirable consequences.
Anchoring is a psychological heuristic in which individuals rely on the first information they learn or are shown, the "anchor" (Fiske, 2002). Different starting points yield different estimates, which are biased toward the initial values. Although anchoring as adjustment can belong to System 2 thinking, since it involves deliberate processing, anchoring as a priming effect can evoke information that drives people to underestimate the probability of failure in complex systems (Kahneman, 2011).
Cognitive Biases in Generating Ideas
Projection bias is the tendency to over-predict that future preferences will match current preferences because of present emotional states, thoughts, and values (Loewenstein et al., 2003). Liedtka (2015) argues that this bias can limit innovation because it blocks new ideas: the decision-maker predicts the future from present states that may not be relevant.
Automation bias is the tendency to lean on an automated system, such as a computer built on a particular program (Loewenstein et al., 2003). Depending on such a system may limit possibilities because it is bounded by the machine's training and the data fed into it. A person who depends on the computer is not thinking differently, may make more or different errors, and tends to consider less information (Loewenstein et al., 2003). A previous study in the academic field found that users often fail to acknowledge the errors an automated system can introduce (Goddard et al., 2012).
Functional fixedness, related to problem solving, is a cognitive bias that drives people to use objects in the standard way and resist using them differently (Duncker & Lees, 1945). In other words, this bias fixes on a single solution to a particular problem. Functional fixedness makes people unable to recognize alternative approaches and uses of elements, which constrains creativity and thus limits ideation and problem solving. It is also often demonstrated in pictorial creative-generation tasks, where provided examples constrain output (Chrysikou & Weisberg, 2005; Chrysikou et al., 2016). When people keep seeing one particular solution to a problem, it affects which ideas are generated and considered. It especially hinders problem-solving processes that require varied options or alternative object functions (German & Barrett, 2005).
Neglect of probability, introduced by Sunstein (2002), is a cognitive bias in which the individual feels anxious about uncertainty and ignores probability. Generating novel ideas may require taking risks under uncertain conditions, and uncertainty is fertile ground for creative solutions. Under this bias, however, emotion dominates rationality and leads people to avoid considering possible solutions (Sunstein, 2002); it thus blocks the creative processes that generate new and useful ideas.
Cognitive Biases in Ideas Transformation
The bandwagon effect is favoring ideas or opinions already believed by the majority, the public, or specific groups (Schmitt-Beck, 2015). Previous theory ties it to the mechanism of dissonance reduction (Carter, 1959) and to the spiral of silence, in which individuals fear social isolation (Scheufele & Moy, 2000). In politics, this bias often occurs in voting elections (Nadeau et al., 1993). Although its most problematic effects may appear in social science research (Schmitt-Beck, 2015), it is also harmful to group decision making among scientists: an individual's idea may be suppressed, and a particular creative alternative may never be found.
Conformity bias can occur at the group level, where individuals tend to give feedback or judgment based on group norms (Moscovici & Faucheux, 1972). It may arise through informational influence, the tendency to align with the group's judgment in order to gain trust within the group (Boen et al., 2008), or through normative influence, based on the individual's fear of rejection by the group (Auweele et al., 2004). In either case, group norms suppress independent thinking: the group's choice shapes how individuals think, even against personal judgment.
Authority bias can also shape group opinion: a person in a higher position is expected to have better ideas and make fewer mistakes (Hinnosaar & Hinnosaar, 2012). In Howard's (2019) case, opinions or instructions from authority were usually followed and negatively affected clinicians. The result for collective creative problem solving may resemble the bandwagon effect, in which individuals' creative ideas are withdrawn.
Cognitive Biases in Information Seeking and Selecting
The availability heuristic is the tendency to ignore base-rate or general information and focus on contextually salient information as a mental shortcut (Tversky & Kahneman, 1974). When this heuristic recurs during problem solving, neglecting the base rate and focusing on specific information narrows the space of possibilities for creating creative ideas.
The ostrich effect is the tendency to ignore negative financial information and consider only the positive (Galai & Sade, 2006). Ignoring the challenge closes the mind, which then believes selectively in light of positive information. Beyond financial situations, scientists with an ostrich effect may keep themselves in a pleasant situation by filtering for positive information. As a result, creative problem solving cannot occur because the information available is not relevant.
Confirmation bias is the tendency to look for, favor, and highlight information that confirms what we already believe or suspect (Oswald & Grosjean, 2004; Plous, 1993). Scientists can unconsciously ignore information they disagree with and focus on information they believe in. Beyond the information they search for, scientists subject to confirmation bias can also unintentionally interpret and remember information only in line with their hypotheses (Oswald & Grosjean, 2004). Rather than challenging their ideas, this closes innovators off from new possibilities and creative solutions.
Biases in Evaluating Ideas
Categorical thinking and essentialism. As humans take mental shortcuts, people tend to link one idea to another based on effects, properties, or shared categories (Kahneman, 2011). This becomes an issue in categorical thinking, since categories in scientific ideas should be valid and useful (De Langhe & Fernbach, 2019). Categories can generate the power of illusions: they compress ideas and eliminate variety. Categorical thinking is closely related to essentialism, the tendency to believe that every object, thing, or person has an immutable essence and to classify it by those characteristics (Gelman, 2003). Although categorical thinking helps us understand and make sense of the world, it may lead decision-makers into significant errors, ignoring actual value and favoring one category over others (De Langhe & Fernbach, 2019). Ultimately, thinking categorically can also inhibit people from breaking through to innovation (De Langhe & Fernbach, 2019).
Framing effect and availability bias. When a group of scientists decides which option is best among the presented data, the framing effect may occur: people are influenced by how the information is presented, for instance with negative or positive connotations, rather than by the information itself (Plous, 1993). This is related to availability bias (Kahneman & Tversky, 1972), in which an example or a particular experience seems more representative than the actual information. Both framing and availability bias hinder people from making rational evaluations, fixing them to the connotations of the presented information.
The decoy effect is the tendency for preferences between options to change after a new, inferior option is presented (Slaughter et al., 1999). With regard to decision making when evaluating ideas, a study by Slaughter and colleagues (2011) indicates that this effect reflects Tversky and Kahneman's loss aversion theory. The effect can mislead an organization about which option is the more rational choice.
The Semmelweis reflex is the tendency to reject new evidence, knowledge, or ideas because they challenge established norms, beliefs, or paradigms (Mortell et al., 2013). This bias has been found in healthcare practice, and its occurrence in other environments cannot be ruled out. It is associated with the theory-practice gap (Allmark, 1995); Glynn et al. (2011) introduced theory-practice-ethics as a new dimension mediating the gap between theory and practice outcomes. In this context, the rejection of new ideas may become a barrier to the creative process.
Measuring cognitive biases in scientists' activity is challenging because the constructs are not directly observable; we refer to them hereafter as latent variables. Nevertheless, a latent variable can be measured by examining observed variables, with items serving as indicators (Glynn et al., 2011). To summarize the biases, we marked those in Benson's cognitive-bias list (Benson, 2021) that relate to general scientific activities, reducing the list to 135 biases. From these, we shortlisted 43 biases by rationalizing them against our current framework of scientists' activities during creative problem solving at the individual and collaborative levels. It is then important to ensure that this construct is valid so that it can contribute significantly to further research.
Application and Discussion
Measuring Cognitive Bias
While the framework described earlier is specific to the application of cognitive bias in scientific work, more empirical research on cognitive bias can help us understand the obstacles to systematic thinking among scientists. We have argued that the challenge of scientists' decision making arises partly in individual settings and partly in group work settings. However, most previous studies measuring bias within scientific work examine a single bias in the decision-making process. For instance, building on the framing effect, one study measured doctors' and teachers' personal prerequisites for decision making using the Melbourne Decision Making Questionnaire, Budner's Tolerance of Ambiguity Scale, and Personality Factors of Decision-Making questionnaires (Kornilova & Kerimova, 2018). To measure overconfidence in biology students, a previous study used a confidence scale on a biology exam (Rachmatullah & Ha, 2019). The Cognitive Reflection Test (CRT) is also a popular instrument for measuring performance on heuristics-and-biases tasks (Frederick, 2005). Furthermore, to examine automation bias in general practitioners in the UK, one study measured their trust in and frequency of use of a CDSS simulator (Goddard et al., 2014). As another example, the 42-item Need for Closure (NFC) scale was created to measure individual differences in fixation and functional fixedness (Webster & Kruglanski, 1994).
However, qualitative investigations of cognitive bias in a specific population, in this case scientists, remain limited in number, as do measurements of bias across the whole scientific work process. For this reason, further study will create observed variables, or items, to measure cognitive biases in individual and group work settings. Using the current framework (Table 2) and the final list of biases (Fig. 1), factor analysis will be used during questionnaire development to examine relations among items and find common factors in the item set. Further empirical observation in scientists' work settings would also help bridge theoretical work and field observation. We believe the efforts and contributions of more cognitive bias research would benefit our understanding of this challenge.
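The factor-analytic step described above can be sketched on synthetic data. The following is an illustrative assumption, not the actual instrument: two hypothetical latent bias factors (e.g., individual-level vs. group-level) each drive three Likert-style items, and a principal-component extraction on the item correlation matrix is used to recover the number of factors and the loadings.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500  # hypothetical number of respondents

# Simulate two latent traits and six observed Likert-style items
# (three items per trait); the counts and loadings are illustrative.
latent = rng.normal(size=(n, 2))
true_loadings = np.array([
    [0.8, 0.0], [0.7, 0.1], [0.9, 0.0],   # items tied to factor 1
    [0.0, 0.8], [0.1, 0.7], [0.0, 0.9],   # items tied to factor 2
])
items = latent @ true_loadings.T + rng.normal(scale=0.4, size=(n, 6))

# Principal-component extraction: eigen-decompose the item correlation
# matrix and retain components with eigenvalue > 1 (Kaiser criterion).
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))

# Estimated loadings: eigenvectors scaled by sqrt(eigenvalue).
est_loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
communalities = (est_loadings ** 2).sum(axis=1)
print(n_factors)
print(np.round(communalities, 2))
```

With well-separated item clusters, the extraction recovers the two simulated factors and high communalities for every item; in actual questionnaire development the same diagnostics (retained factors, loadings, communalities) indicate whether the items coherently measure the intended latent biases.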
Educational Program to Reducing Bias among Scientists
As discussed, individuals, including scientists, are susceptible to cognitive biases because they often rely heavily on System 1 thinking, which is a source of cognitive bias and poor decision making (Kahneman, 2003, 2011). The detrimental effects of bias underscore the importance of reducing bias through debiasing programs, yet intervention studies that reduce cognitive bias remain limited in number. People cannot fully eliminate their biases, but some programs may help them outsmart or reduce them (Soll et al., 2015). In prior studies, some programs provided intervention activities, in real-world or laboratory settings, encouraging people to avoid or reduce biased thinking (Hacker et al., 2008; Nietfeld et al., 2006). In a study aimed at reducing overconfidence bias and improving metacognitive accuracy, participants received monitoring exercises and weekly feedback (Nietfeld et al., 2006). In another example, participants were given awareness training, analogical sensitization, and analogical training to reduce ten biases, including the sunk cost fallacy, outcome bias, framing bias, and the planning fallacy (Aczel et al., 2015). Furthermore, cognitive strategies such as asking people to consider the opposite outcome, not only their desired outcome, when making judgments have been found to effectively reduce anchoring bias (Mussweiler et al., 2000).
The intervention program by Rusmana et al. (2020) introduced the full KAAR model (Knowledge, Awareness, Action, and Reflection) to reduce preservice science teachers' biases. Although the participants were college students, the framework is flexible enough to be implemented across settings, including with scientists. The first step is knowing that bias exists (Babcock & Loewenstein, 1997): knowledge of bias, its causes, and its dangers can be the initial caution that helps people avoid it (Soll et al., 2015). The second stage builds awareness by having people deliberately consider potential biases in their past experiences, which may make them more alert to the pitfalls of bias and less biased (van Geene et al., 2016). Giving information about cognitive bias thus delivers the knowledge and awareness stages of the KAAR model, building on scientists' awareness so that they work to decrease their bias. It has also been pointed out that awareness alone is not enough; "action" is the essential phase of reducing bias, and it can be taken individually and collectively (Kahneman et al., 2011). One can use other people's System 2 thinking to spot biases in one's own System 1 thinking, which points to the importance of group activities; in their study, a group discussion served as the intervention activity for this stage. The last stage is reflection, a form of System 2 thinking that leads people to weigh evidence rather than feelings or emotions in decision making. The reflection activities in the prior study were writing a reflective statement and a study plan (Rusmana et al., 2020).
Conclusion
Research on cognitive biases can help us understand the limitations of scientists' systematic thinking patterns. We have argued in this study that scientific work varies historically and philosophically and is not as rigid as a single scientific methodology. We also considered, for a sustainable future, scientific work that employs creative problem solving, as well as the multilayered nature of current scientific work, in which scientists collaborate with teams and colleagues. Weighing these considerations, a new framework was made by listing the biases involved in scientific work. The framework should be examined and validated so that this potential construct can be helpful for future studies.
Acknowledgements
This research was funded by the National Research Foundation of Korea (NRF-2020R1C1C1006167).

Authors Information
Aini, Rahmi Qurota: Middle Tennessee State University, Doctoral student, First author
Sya’bandari, Yustika: Kangwon National University, Researcher, Co-author
Rusmana, Ai Nurlaelasari: Kangwon National University, Researcher, Co-author
Ha, Minsu: Kangwon National University, Professor, Corresponding author