Decision making is the cognitive process that results in the selection of a course of action or belief from several possibilities. It can be thought of as a particular type of problem solving; the problem is considered solved when a solution that is deemed satisfactory is reached.
Heuristics
Heuristics are simple rules of thumb that people often use to form judgments and make decisions; think of them as mental shortcuts. Heuristics can be very useful in reducing the time and mental effort it takes to make most decisions and judgments; however, because they are shortcuts, they don't take into account all information and can thus lead to errors.
The Availability Heuristic
In psychology, availability is the ease with which a particular idea can be brought to mind. When people estimate how likely or how frequent an event is on the basis of its availability, they are using the availability heuristic. When an infrequent event can be brought easily and vividly to mind, this heuristic leads people to overestimate its likelihood. For example, people overestimate their likelihood of dying in a dramatic event such as a tornado or a terrorist attack. Dramatic, violent deaths are usually more highly publicized and therefore have a higher availability. On the other hand, common but mundane causes of death (like heart attacks and diabetes) are harder to bring to mind, so their likelihood tends to be underestimated. This heuristic is one of the reasons why people are more easily swayed by a single, vivid story than by a large body of statistical evidence. It affects decision making in a number of ways: people decide not to fly on a plane after hearing about a plane crash, but if their doctor says they should change their diet or they'll be at risk for heart disease, they may think, "Well, it probably won't happen." Because the plane crash leaps to mind more easily than the heart disease, people perceive it as more likely.
Lottery ticket
Lotteries take advantage of the availability heuristic: winning the lottery is a more vivid mental image than losing the lottery, and thus people perceive winning the lottery as being more likely than it is.
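This intuition can be made concrete with a back-of-the-envelope expected-value calculation. The jackpot, ticket price, and odds below are hypothetical, but any realistic numbers tell the same story: the vivid outcome (winning) contributes almost nothing once it is weighted by its tiny probability.

```python
# Hypothetical lottery: all numbers are illustrative, not real lottery odds.
jackpot = 100_000_000        # prize if you win
ticket_price = 2             # cost of one ticket
p_win = 1 / 300_000_000      # assumed chance of winning the jackpot

# Expected value of buying one ticket: the prize is weighted by its
# probability, then the certain cost of the ticket is subtracted.
expected_value = p_win * jackpot - ticket_price
print(f"Expected value per ticket: ${expected_value:.2f}")
```

Under these assumptions the expected value is negative (about minus a dollar and a half per ticket), even though the mental image of winning dominates the decision.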
The Representativeness Heuristic and the Base-Rate Fallacy
The representativeness heuristic is seen when people use categories—when deciding, for example, whether or not a person is a criminal. An individual object or person has a high representativeness for a category if it is very similar to a prototype of that category. When people categorize things on the basis of representativeness, they are using the representativeness heuristic. While it is effective for some problems, this heuristic attends to the particular characteristics of the individual while ignoring how common those categories are in the population (called the base rates). Thus, people can overestimate the likelihood that something has a very rare property, or underestimate the likelihood of a very common property. This error is called the base-rate fallacy, and it is the cause of many negative stereotypes based on outward appearance. Representativeness explains many of the ways in which human judgments break the laws of probability.
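The base-rate fallacy can be made precise with Bayes' rule. In the sketch below, all the probabilities are hypothetical; the point is that even a trait that is highly "representative" of a rare category is weak evidence once the base rate is taken into account.

```python
# Hypothetical numbers for a Bayes' rule sketch of the base-rate fallacy.
p_criminal = 0.01             # base rate: assumed fraction of criminals
p_looks_given_criminal = 0.9  # P(fits the stereotype | criminal)
p_looks_given_innocent = 0.2  # P(fits the stereotype | innocent)

# Total probability of fitting the stereotype, across both groups.
p_looks = (p_looks_given_criminal * p_criminal
           + p_looks_given_innocent * (1 - p_criminal))

# Bayes' rule: P(criminal | fits the stereotype).
p_criminal_given_looks = p_looks_given_criminal * p_criminal / p_looks
print(f"P(criminal | fits stereotype) = {p_criminal_given_looks:.3f}")
```

With these assumed numbers the posterior probability is only about 4%: because criminals are rare, most people who fit the stereotype are innocent. Judging by representativeness alone ignores the base rate entirely.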
The Anchoring-and-Adjustment Heuristic
Anchoring and adjustment is a heuristic used in situations where people must estimate a number. It involves starting from a readily available number—the "anchor"—and shifting either up or down to reach an answer that seems plausible. However, people typically do not shift far enough away from the anchor; the anchor contaminates the estimate, even when it is clearly irrelevant. In one experiment, subjects watched a number being selected from a spinning "wheel of fortune." They had to say whether a given quantity was larger or smaller than that number. For instance, they were asked, "Is the percentage of African countries that are members of the United Nations larger or smaller than 65%?" They then tried to guess the true percentage. Their answers correlated with the arbitrary number they had been given. Insufficient adjustment from an anchor is not the only explanation for this effect. The anchoring effect has been demonstrated by a wide variety of experiments, both in laboratories and in the real world. It remains when the subjects are offered money as an incentive to be accurate, or when they are explicitly told not to base their judgment on the anchor. The effect is stronger when people have to make their judgments quickly. Subjects in these experiments lack introspective awareness of the heuristic—that is, they deny that the anchor affected their estimates.
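The insufficient-adjustment account can be sketched as a toy model. The true value, the adjustment fraction, and the simulated subjects below are all assumptions, not data from the experiment; the sketch only shows how partial adjustment lets an irrelevant anchor leak into the final estimate.

```python
import random

random.seed(0)
true_value = 0.28  # hypothetical true answer to the estimation question
adjust = 0.6       # assumed fraction of the gap subjects close (< 1: insufficient)

# Toy model: each simulated subject starts at a random, irrelevant anchor
# and adjusts only part of the way toward the true value.
anchors = [random.random() for _ in range(1000)]
estimates = [a + adjust * (true_value - a) for a in anchors]

# With full adjustment (adjust = 1) every estimate would equal true_value;
# with partial adjustment, estimates still vary with the arbitrary anchor.
mean_est = sum(estimates) / len(estimates)
print(f"mean estimate: {mean_est:.3f} (true value {true_value})")
print(f"estimate spread: {max(estimates) - min(estimates):.3f}")
```

Because each estimate is a weighted average of the anchor and the truth, the estimates correlate perfectly with the anchors in this model—mirroring the correlation observed in the wheel-of-fortune experiment.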
The Framing Effect
The framing effect is a cognitive bias in which people react to the same choice in different ways depending on how it is presented (e.g., as a loss or as a gain). A famous example comes from a 1981 experiment in which subjects were asked to choose between two treatments for 600 imaginary people affected by a deadly disease. Treatment A was predicted to result in 400 deaths, whereas Treatment B had a 33% chance that no one would die but a 66% chance that everyone would die. This choice was presented to participants either with positive framing (how many people would live) or with negative framing (how many people would die), as delineated here:
Positive framing: "Treatment A will save 200 lives; Treatment B has a 33% chance of saving all 600 people and a 66% chance of saving no one."
Negative framing: "Treatment A will let 400 people die; Treatment B has a 33% chance of no one dying and a 66% chance of everyone dying."
Treatment A was chosen by 72% of participants when it was presented with positive framing, but by only 22% of participants when it was presented with negative framing, despite the fact that it was the same treatment both times. The framing effect has a large impact on how people make decisions. People tend to be risk-averse for gains but risk-seeking for losses: they won't gamble for a gain, but they will gamble to avoid a certain loss (e.g., choosing Treatment B when presented with negative framing).
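A quick calculation makes the equivalence of the two framings explicit. Using the 33% and 66% chances as stated above (the original study used probabilities of 1/3 and 2/3, under which Treatment B's expected outcome matches Treatment A exactly), both descriptions pick out the same numbers of lives:

```python
# The two framings describe the same outcomes; only the wording differs.
total = 600

# Treatment A: 200 live, 400 die -- with certainty.
a_saved = 200
a_die = total - a_saved

# Treatment B, using the stated 33% / 66% chances.
p_all_saved, p_none_saved = 0.33, 0.66
b_expected_saved = p_all_saved * total + p_none_saved * 0
b_expected_deaths = p_all_saved * 0 + p_none_saved * total

print(f"A: {a_saved} saved / {a_die} die (certain)")
print(f"B: ~{b_expected_saved:.0f} expected saved / ~{b_expected_deaths:.0f} expected deaths")
```

"Saving 200 of 600" and "letting 400 of 600 die" are arithmetically identical, yet the first framing made Treatment A the popular choice and the second made it unpopular.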