Heuristic
A heuristic (/hjʊˈrɪstɪk/; from Ancient Greek εὑρίσκω (heurískō) 'I find, discover'), or heuristic technique, is an approach to problem solving or self-discovery that uses a calculated guess derived from previous experience. Heuristics are mental shortcuts that ease the cognitive load of making a decision.[1][2] The opposite process is the application of an algorithm, which produces a calculated answer and eliminates guesswork.[3]
Examples of heuristics include trial and error, rules of thumb and educated guesses.
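The contrast with an algorithm can be made concrete. The sketch below is a hypothetical coin-change example (the coin system and function names are invented for illustration): a greedy heuristic repeatedly takes the largest coin that fits, while an exhaustive dynamic-programming algorithm guarantees the minimum number of coins. The shortcut is usually right with far less work, but for some coin systems it errs systematically, echoing the point above.

```python
def greedy_change(amount, coins):
    """Heuristic: repeatedly take the largest coin that still fits.
    Fast and usually right, but suboptimal for some coin systems."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

def optimal_change(amount, coins):
    """Algorithm: dynamic programming guarantees the fewest coins,
    at the cost of examining every sub-amount exhaustively."""
    best = [0] + [None] * amount      # best[k] = fewest coins summing to k
    for sub in range(1, amount + 1):
        options = [best[sub - c] for c in coins
                   if c <= sub and best[sub - c] is not None]
        best[sub] = min(options) + 1 if options else None
    result, sub = [], amount
    while sub > 0:                    # walk back through the table
        for c in coins:
            if c <= sub and best[sub - c] == best[sub] - 1:
                result.append(c)
                sub -= c
                break
    return result

print(greedy_change(6, [1, 3, 4]))   # [4, 1, 1] -- three coins
print(optimal_change(6, [1, 3, 4]))  # [3, 3]    -- two coins suffice
```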
Heuristics are strategies derived from previous experience with similar problems. These strategies depend on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines and abstract issues.[4][5] When an individual applies a heuristic in practice, it generally performs as expected; however, it can also lead to systematic errors.[6]
The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems. In mathematics, some common heuristics involve the use of visual representations, additional assumptions, forward/backward reasoning and simplification. Here are a few commonly used heuristics from George Pólya's 1945 book, How to Solve It:[7]
- If you are having difficulty understanding a problem, try drawing a picture of it (e.g., from several viewpoints: top, side, front).
- If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward"); a sketch of this appears after the list.
- If the problem is abstract, try examining a concrete example.
- Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
This is because only the general problem can provide a specific problem with a context from which to draw meaning.
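As a concrete illustration of working backward (an invented example, not one from Pólya's book): suppose the task is to reach a target number from 1 using only the moves "double" and "add 1". Searching forward branches at every step, but inverting the moves from the target leaves only one sensible step at a time.

```python
def steps_to(target):
    """Work backward from `target`: halve when even (inverting 'double'),
    subtract 1 when odd (inverting 'add 1'), until reaching 1."""
    ops, n = [], target
    while n > 1:
        if n % 2 == 0:
            ops.append("double")
            n //= 2
        else:
            ops.append("add 1")
            n -= 1
    return list(reversed(ops))

# 1 -> 2 -> 4 -> 5 -> 10 -> 20 -> 21
print(steps_to(21))  # ['double', 'double', 'add 1', 'double', 'double', 'add 1']
```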
In psychology, heuristics are simple, efficient rules, either learned or inculcated by evolutionary processes. These psychological heuristics have been proposed to explain how people make decisions, come to judgements, and solve problems. These rules typically come into play when people face complex problems or incomplete information. Researchers employ various methods to test whether people use these rules. The rules have been shown to work well under most circumstances, but in certain cases can lead to systematic errors or cognitive biases.[8]
History
The study of heuristics in human decision-making was developed in the 1970s and the 1980s by the psychologists Amos Tversky and Daniel Kahneman,[9] although the concept had originally been introduced by the Nobel laureate Herbert A. Simon. Simon's primary object of research was problem solving, and he showed that we operate within what he called bounded rationality. He coined the term satisficing, which denotes a situation in which people seek solutions, or accept choices or judgements, that are "good enough" for their purposes although they could be optimised.[10]
Rudolf Groner analysed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence,[11] proposing a cognitive style "heuristic versus algorithmic thinking", which can be assessed by means of a validated questionnaire.[12]
Adaptive toolbox
Gerd Gigerenzer and his research group argued that models of heuristics need to be formal to allow for predictions of behavior that can be tested.[13] They study the fast and frugal heuristics in the "adaptive toolbox" of individuals or institutions, and the ecological rationality of these heuristics; that is, the conditions under which a given heuristic is likely to be successful.[14] The descriptive study of the "adaptive toolbox" is done by observation and experiment, while the prescriptive study of ecological rationality requires mathematical analysis and computer simulation. Heuristics – such as the recognition heuristic, the take-the-best heuristic and fast-and-frugal trees – have been shown to be effective in predictions, particularly in situations of uncertainty. It is often said that heuristics trade accuracy for effort, but this is only the case in situations of risk. Risk refers to situations where all possible actions, their outcomes and probabilities are known. In the absence of this information, that is under uncertainty, heuristics can achieve higher accuracy with lower effort.[15] This finding, known as a less-is-more effect, would not have been found without formal models. The valuable insight of this program is that heuristics are effective not despite their simplicity but because of it. Furthermore, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organisations rely on heuristics in an adaptive way.[16]
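As an illustration, the following is a minimal sketch of the take-the-best heuristic (the city data, cue ordering and function names are invented for the example; the formal models in the literature also specify how cue validities are estimated from data):

```python
def take_the_best(a, b, cues):
    """Compare two options on cues ordered from most to least valid;
    the first cue that discriminates decides, and all later cues are ignored."""
    for cue in cues:
        va, vb = cue(a), cue(b)
        if va != vb:
            return a if va > vb else b
    return None  # no cue discriminates: guess

# Hypothetical task: which of two cities has the larger population?
cities = {
    "A-ville": {"is_capital": 0, "has_airport": 1},
    "B-burg":  {"is_capital": 1, "has_airport": 1},
}
cues = [
    lambda city: cities[city]["is_capital"],   # most valid cue first
    lambda city: cities[city]["has_airport"],
]
print(take_the_best("A-ville", "B-burg", cues))  # 'B-burg': first cue decides
```

Because the first discriminating cue settles the matter, most of the available information is never consulted; this frugality is precisely the simplicity to which the less-is-more results attribute the heuristic's accuracy.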
Cognitive-experiential self-theory
Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, cognitive-experiential self-theory (CEST) offers an adaptive view of heuristic processing. CEST distinguishes two systems that process information. At times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally and emotionally.[17] From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.[18]
Attribute substitution
In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness.[19] According to this theory, when somebody makes a judgement (of a "target attribute") that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a simpler problem, without the person being aware of the substitution.[19] This theory explains cases where judgements fail to show regression toward the mean.[20] Heuristics can be considered to reduce the complexity of clinical judgements in health care.[21]
Psychology
Informal models of heuristics
- Affect heuristic: a mental shortcut in which current emotion influences the decision. Affect plays the lead role in making the decision or solving the problem quickly or efficiently. It is used while judging the risks and benefits of something, depending on the positive or negative feelings that people associate with a stimulus. It can also be considered a gut decision: if the gut feeling is positive, the benefits are judged high and the risks low.[22]
- Anchoring and adjustment: describes the common human tendency to rely more heavily on the first piece of information offered (the "anchor") when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number (anchor). Children estimated the number of jellybeans to be closer to the anchor number that they were given.[23]
- Availability heuristic: a mental shortcut that occurs when people make judgements about the probability of events by the ease with which examples come to mind. For example, in a 1973 Tversky & Kahneman experiment, the majority of participants reported that there were more words in the English language that start with the letter K than for which K was the third letter. There are actually twice as many words in the English language that have K as the third letter as those that start with K, but words that start with K are much easier to recall and bring to mind.[24]
- Balance heuristic: applies to when an individual balances the negative and positive effects from a decision which makes the choice obvious.[25]
- Base rate heuristic: when a decision involves probability, this is a mental shortcut that uses relevant data to determine the probability of an outcome occurring. When using this heuristic, individuals commonly misjudge the likelihood of a situation. For example, if a test for a disease is 90% accurate, people may think there is a 90% chance that they have the disease, even though the disease only affects 1 in 500 people.[26] A worked version of this calculation appears after the list.
- Common sense heuristic: used frequently by individuals when the potential outcomes of a decision appear obvious. For example, when your television remote stops working, you would probably change the batteries.[25]
- Contagion heuristic: follows the law of contagion or similarity. It leads people to avoid things that they view as "contaminated" and to seek objects that have been associated with what seems good. Some things viewed as harmful are not really so, and so the heuristic can lead to irrational thinking on the part of the observer.[27]
- Default heuristic: in real-world settings, consumers commonly apply this heuristic by selecting the default option, regardless of whether that option matches their preference.[28]
- Educated guess heuristic: when an individual responds to a decision using relevant information they have stored relating to the problem.[29]
- Effort heuristic: the worth of an object is judged by the amount of effort put into producing it; objects that took longer to produce are deemed more valuable, while those that took less time are deemed less valuable. It also applies to the effort spent acquiring the object: the same object will be valued more if it was earned through work than if it was found on the side of the street.
- Escalation of commitment: describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit. This is related to the sunk cost fallacy.
- Fairness heuristic: applies to the reaction of an individual to a decision from an authoritative figure. If the decision is enacted in a fair manner the likelihood of the individual to comply voluntarily is higher than if it is unfair.[30]
- Familiarity heuristic: a mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. Especially prevalent when the individual experiences a high cognitive load.[31]
- Naïve diversification: when asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.
- Peak–end rule: a person's subjective perceptions during the most intense and final moments of an event are averaged together into a single judgment.[32] For example, a person might judge the difficulty of a workout by taking into consideration only the most demanding part of the workout (e.g., Tabata sprints) and what happens at the very end (e.g., a cool-down). In this way, a difficult workout such as the one described here could be perceived as "easier" than a more relaxed workout that did not vary in intensity (e.g., 45 minutes of cycling in aerobic zone 3, without cool-down).
- Representativeness heuristic: a mental shortcut used when making judgements about the probability of an event under uncertainty; that is, judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment,[9] participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. Eighty to ninety percent of participants, choosing from two options, chose that it was more likely for Linda to be a feminist and a bank teller than only a bank teller. The likelihood of two events occurring together cannot be greater than that of either event individually (see the note after the list). For this reason, the representativeness heuristic is exemplary of the conjunction fallacy.[24]
- Scarcity heuristic: as in economics, the scarcer an object or event is, the more value is attributed to it. The lack of abundance serves as an indicator of value and provides a mental shortcut: the more difficult an item is to acquire, or the more easily it might be lost to competitors, the more value it is assumed to have. In many situations an item's perceived abundance is used to quickly estimate its quality and/or utility. This can lead to systematic judgement errors or cognitive bias.[33]
- Simulation heuristic: a simplified mental strategy in which people determine the likelihood of an event based on how easy it is to mentally picture the event happening. People feel more regret over outcomes that are easier to imagine having turned out differently, and are thought to use this heuristic to predict the likelihood of another person's behaviour. It is believed that people do this by mentally undoing events they have experienced and then running mental simulations of the events with the corresponding input values of the altered model.[34]
- Social proof: also known as informational social influence, as described by Robert Cialdini in his 1984 book Influence. It is the tendency to copy the actions of others, and is more prominent when people are uncertain how to behave, especially in ambiguous social situations.[35]
- Working backward heuristic: when an individual assumes they have already solved a problem and works backwards in order to find how that solution could be reached.[26]
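The base rate example in the list above can be made precise with Bayes' rule. The short sketch below assumes, for illustration only, that the test's 90% accuracy applies to both its sensitivity and its specificity:

```python
prevalence = 1 / 500        # P(disease)
sensitivity = 0.90          # P(test positive | disease)
specificity = 0.90          # P(test negative | no disease); assumed equal

# P(positive) = true positives + false positives
p_positive = (sensitivity * prevalence
              + (1 - specificity) * (1 - prevalence))

# Bayes' rule: P(disease | positive)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"{p_disease_given_positive:.1%}")  # about 1.8%, far from 90%
```

The intuitive answer of 90% substitutes the test's accuracy for the probability actually asked about, an instance of the attribute substitution described above.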
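The conjunction rule behind the Linda example can be stated compactly: for any two events $A$ and $B$,

$$P(A \wedge B) \le \min\{P(A),\, P(B)\},$$

so the probability that Linda is both a bank teller and a feminist can never exceed the probability that she is a bank teller.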
Formal models of heuristics
- Elimination by aspects heuristic
- Fast-and-frugal trees
- Fluency heuristic
- Gaze heuristic
- Recognition heuristic
- Satisficing
- Similarity heuristic
- Take-the-best heuristic
Cognitive maps
Heuristics were also found to be used in the manipulation and creation of cognitive maps.[36] Cognitive maps are internal representations of our physical environment, particularly associated with spatial relationships. These internal representations are used by our memory as a guide in our external environment. It was found that when questioned about the imagery, distances, etc. of their maps, people commonly introduced distortions. These distortions took the shape of regularisation (i.e., images are represented as more like pure abstract geometric images than their actual irregular shapes).
There are several ways that humans form and use cognitive maps, with visual intake being an especially key part of mapping. The first is by using landmarks, whereby a person uses a mental image to estimate a relationship, usually distance, between two objects. The second is route-road knowledge, which is generally developed after a person has performed a task and is relaying the information of that task to another person. The third is survey knowledge, whereby a person estimates a distance based on a mental image that, to them, might appear like an actual map. This image is generally created when a person's brain begins making image corrections, which appear in five ways:
- Right-angle bias: when a person straightens out an image, like mapping an intersection, and begins to give everything 90-degree angles, when in reality it may not be that way.
- Symmetry heuristic: when people tend to think of shapes, or buildings, as being more symmetrical than they really are.
- Rotation heuristic: when a person takes a naturally (realistically) distorted image and straightens it out for their mental image.
- Alignment heuristic: similar to the previous, where people align objects mentally to make them straighter than they really are.
- Relative-position heuristic: people position landmarks in their mental image based on how well they remember them, rather than on the landmarks' actual relative positions.
Another method of creating cognitive maps is by means of auditory intake based on verbal descriptions. Using the mapping based on another person's visual intake, a listener can create a mental image, such as when following directions to a certain location.[37]
Philosophy
A heuristic device is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y.
A good example is a model that, as it is never identical with what it models, is a heuristic device enabling understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. The "ideal city" depicted in The Republic is not given as something to be pursued, or as an orientation-point for development. Rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one opted for certain principles and carried them through rigorously.
Heuristic is also often used as a noun to describe a rule-of-thumb, procedure, or method.[38] Philosophers of science have emphasised the importance of heuristics in creative thought and the construction of scientific theories.[39] Seminal works include Karl Popper's The Logic of Scientific Discovery and others by Imre Lakatos,[40] Lindley Darden, and William C. Wimsatt.
Law
In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.[41]
The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects. For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary deadline is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.
The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.[42]
Stereotyping
Stereotyping is a type of heuristic that people use to form opinions or make judgements about things they have never seen or experienced.[43] They work as a mental shortcut to assess everything from the social status of a person (based on their actions),[2] to whether a plant is a tree based on the assumption that it is tall, has a trunk and has leaves (even though the person making the evaluation might never have seen that particular type of tree before).
Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.[44][45]
Artificial intelligence
A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived from a function put into the system by the designer, or from adjusting the weight of branches based on how likely each branch is to lead to a goal node.
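A minimal sketch of such a search follows (the graph interface, grid example and Manhattan-distance heuristic are assumptions made for illustration; practical systems often use A* search, which also accounts for the cost already incurred along each path):

```python
import heapq

def greedy_best_first(start, goal, neighbors, h):
    """Always expand the frontier node whose heuristic value h(node) --
    the designer-supplied estimate of closeness to the goal -- is smallest."""
    frontier = [(h(start), start)]
    came_from = {start: None}
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:                      # reconstruct the path taken
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt in neighbors(node):
            if nxt not in came_from:
                came_from[nxt] = node
                heapq.heappush(frontier, (h(nxt), nxt))
    return None

# Hypothetical 2-D grid; h is the Manhattan distance to the goal node.
goal = (3, 2)
neighbors = lambda p: [(p[0] + 1, p[1]), (p[0] - 1, p[1]),
                       (p[0], p[1] + 1), (p[0], p[1] - 1)]
h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
print(greedy_best_first((0, 0), goal, neighbors, h))  # a path to (3, 2)
```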
See also
- Algorithm
- Behavioral economics
- Failure mode and effects analysis
- Heuristics in judgment and decision-making
- Ideal type
- List of biases in judgment and decision making
- Neuroheuristics
- Predictive coding
- Priority heuristic
- Social heuristics
- Thought experiment
References
- Myers, David G. (2010). Social psychology (Tenth ed.). New York, NY: McGraw-Hill. p. 94. ISBN 978-0-07337-066-8. OCLC 667213323.
- "Heuristics—Explanation and examples". Conceptually. Retrieved 23 October 2019.
- https://www.google.com/search?q=heuristic+vs.+algorithm
- Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, NY: Addison-Wesley. p. vii. ISBN 978-0-201-05594-8.
- Emiliano, Ippoliti (2015). Heuristic Reasoning: Studies in Applied Philosophy, Epistemology and Rational Ethics. Switzerland: Springer International Publishing. pp. 1–2. ISBN 978-3-319-09159-4.
- Sunstein, Cass (2005). "Moral Heuristics". The Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/S0140525X05000099. PMID 16209802. S2CID 231738548.
- Pólya, George (1945). How to Solve It: A New Aspect of Mathematical Method. Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5, ISBN 0-691-08097-6.
- Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond "Heuristics and Biases"" (PDF). European Review of Social Psychology. 2: 83–115. CiteSeerX 10.1.1.336.9826. doi:10.1080/14792779143000033. Retrieved 14 October 2012.
- Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (30 April 1982). Judgment Under Uncertainty. Cambridge, UK: Cambridge University Press. doi:10.1017/cbo9780511809477. ISBN 978-0-52128-414-1.
- "Heuristics and heuristic evaluation". Interaction-design.org. Retrieved 1 September 2013.
- Groner, Rudolf; Groner, Marina; Bischof, Walter F. (1983). Methods of Heuristics. Hillsdale, NJ: Lawrence Erlbaum.
- Groner, Rudolf; Groner, Marina (1991). "Heuristische versus algorithmische Orientierung als Dimension des individuellen kognitiven Stils" [Heuristic versus algorithmic orientation as a dimension of the individual cognitive style]. In K. Grawe; N. Semmer; R. Hänni (eds.). Über die richtige Art, Psychologie zu betreiben [About the right way to do psychology] (in German). Göttingen: Hogrefe. ISBN 978-3-80170-415-5.
- Gigerenzer, Gerd; Todd, Peter M.; and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK: Oxford University Press. ISBN 978-0-19512-156-8.
- Gigerenzer, Gerd; Selten, Reinhard, eds. (2002). Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press. ISBN 978-0-26257-164-7.
- Gigerenzer, Gerd; Hertwig, Ralph; Pachur, Thorsten (15 April 2011). Heuristics: The Foundations of Adaptive Behavior. Oxford University Press. doi:10.1093/acprof:oso/9780199744282.001.0001. hdl:11858/00-001M-0000-0024-F172-8. ISBN 978-0-19989-472-7.
- Gigerenzer, Gerd; Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology. 62: 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183. SSRN 1722019.
- De Neys, Wim (18 October 2008). "Cognitive experiential self theory". Perspectives on Psychological Science. 7 (1): 28–38. doi:10.1177/1745691611429354. PMID 26168420. S2CID 32261626. Archived from the original on 31 July 2013.
- Epstein, S.; Pacini, R.; Denes-Raj, V.; Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology. 71 (2): 390–405. doi:10.1037/0022-3514.71.2.390. PMID 8765488.
- Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge, UK: Cambridge University Press. pp. 49–81. ISBN 978-0-52179-679-8. OCLC 47364085.
- Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). American Economic Review. 93 (5): 1449–1475. CiteSeerX 10.1.1.194.6554. doi:10.1257/000282803322655392. ISSN 0002-8282. Archived from the original (PDF) on 19 February 2018.
- Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing. 26 (1): 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x. PMID 9231296.
- Slovic, Paul; Finucane, Melissa L.; Peters, Ellen; MacGregor, Donald G. (March 2007). "The affect heuristic". European Journal of Operational Research. 177 (3): 1333–1352. doi:10.1016/j.ejor.2005.04.006. ISSN 0377-2217.
- Smith, H. (1999). "Use of the anchoring and adjustment heuristic by children". Current Psychology. 18 (3): 294–300. doi:10.1007/s12144-999-1004-4. S2CID 144901306.
- Harvey, N (2007). "Use of heuristics: Insights from forecasting research". Thinking & Reasoning. 13 (1): 5–24. doi:10.1080/13546780600872502. S2CID 62523068.
- Ross, Derek (2012). "Ambiguous Weighting and Nonsensical Sense: The Problems of "Balance" and "Common Sense" as Commonplace Concepts and Decision-making Heuristics in Environmental Rhetoric". Social Epistemology. 26: 115–144. doi:10.1080/02691728.2011.634530. S2CID 145239368.
- Dale, Stephen (29 July 2018). "Heuristics and Biases – The Science Of Decision Making". The Future Of Work. Retrieved 25 April 2021.
- Rozin, Paul; Nemeroff, Carol (8 July 2002). "Sympathetic Magical Thinking: The Contagion and Similarity "Heuristics"". Heuristics and Biases. Cambridge University Press. pp. 201–216. doi:10.1017/cbo9780511808098.013. ISBN 978-0-52179-260-8.
- Bateman, Hazel (2017). "Default and naive diversification heuristics in annuity choice". Australian Journal of Management. 42: 32–57. doi:10.1177/0312896215617225. S2CID 220081277.
- Nadeau, Richard (1995). "Educated Guesses: The Process of Answering Factual Knowledge Questions in Surveys". Public Opinion Quarterly. 59 (3): 323–346. doi:10.1086/269480.
- van Dijke, Marius (2010). "Trust in authorities as a boundary condition to procedural fairness effects on tax compliance". Journal of Economic Psychology. 31: 80–91. doi:10.1016/j.joep.2009.10.005.
- Park, C. Whan; Lessig, V. Parker (September 1981). "Familiarity and Its Impact on Consumer Decision Biases and Heuristics". Journal of Consumer Research. 8 (2): 223. doi:10.1086/208859. hdl:1808/10100. ISSN 0093-5301.
- Kahneman, Daniel; Fredrickson, Barbara L.; Schreiber, Charles A.; Redelmeier, Donald A. (1993). "When more pain is preferred to less: Adding a better end". Psychological Science. 4 (6): 401–405. doi:10.1111/j.1467-9280.1993.tb00589.x. S2CID 8032668.
- Lynn, Michael (March 1992). "The Psychology of Unavailability: Explaining Scarcity and Cost Effects on Value". Basic and Applied Social Psychology. 13 (1): 3–7. doi:10.1207/s15324834basp1301_2. hdl:1813/71653. ISSN 0197-3533.
- Kahneman, Daniel; Tversky, Amos (15 May 1981). "Variants of Uncertainty". Cognition. Fort Belvoir, VA. 11 (2): 143–157. doi:10.21236/ada099503. PMID 7198958.
- Cialdini, Robert B.; Wosinska, Wilhelmina; Barrett, Daniel W.; Butner, Jonathan; Gornik-Durose, Malgorzata (October 1999). "Compliance with a Request in Two Cultures: The Differential Influence of Social Proof and Commitment/Consistency on Collectivists and Individualists". Personality and Social Psychology Bulletin. 25 (10): 1242–1253. doi:10.1177/0146167299258006. ISSN 0146-1672. S2CID 143225569.
- McNaughton, Bruce L.; Battaglia, Francesco P.; Jensen, Ole; Moser, Edvard I; Moser, May-Britt (August 2006). "Path integration and the neural basis of the 'cognitive map'". Nature Reviews Neuroscience. 7 (8): 663–678. doi:10.1038/nrn1932. ISSN 1471-003X. PMID 16858394. S2CID 16928213.
- Sternberg, Robert J.; Sternberg, Karin (2012). Cognitive Psychology (6th ed.). Belmont, CA: Wadsworth, Cengage Learning. pp. 310–315. ISBN 978-1-111-34476-4.
- Jaszczolt, K. M. (2006). "Defaults in Semantics and Pragmatics". Stanford Encyclopedia of Philosophy. ISSN 1095-5054.
- Frigg, Roman; Hartmann, Stephan (2006). "Models in Science". Stanford Encyclopedia of Philosophy. ISSN 1095-5054.
- Kiss, Olga (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking". Perspectives on Science. 14 (3): 302–317. doi:10.1162/posc.2006.14.3.302. S2CID 57559578.
- Gigerenzer, Gerd; Engel, Christoph, eds. (2007). Heuristics and the Law. Cambridge, MA: MIT Press. ISBN 978-0-262-07275-5.
- Johnson, Eric E. (2006). "Calibrating Patent Lifetimes" (PDF). Santa Clara Computer & High Technology Law Journal. 22: 269–314.
- Bodenhausen, Galen V.; et al. (1999). "On the Dialectics of Discrimination: Dual Processes in Social Stereotyping". In Chaiken, Shelly; Trope, Yaacov (eds.). Dual-process Theories in Social Psychology. New York, NY: Guilford Press. pp. 271–292. ISBN 978-1-57230-421-5.
- Kleg, Milton (1993). Hate Prejudice and Racism. Albany, NY: State University of New York Press. p. 135. ISBN 978-0-79141-536-8.
- Gökçen, Sinan (20 November 2007). "Pictures in Our Heads". European Roma Rights Centre. Retrieved 24 March 2015.
Further reading
- Michalewicz, Zbigniew; Fogel, David B. (2000). How To Solve It: Modern Heuristics. Springer Verlag. ISBN 3-540-66061-5.
- Russell, Stuart J.; Norvig, Peter (2003), Artificial Intelligence: A Modern Approach (2nd ed.), Upper Saddle River, New Jersey: Prentice Hall, ISBN 0-13-790395-2
- Diaconis, Persi (11 December 2002). The Problem of Thinking Too Much.