Decision-making

In psychology, decision-making (also spelled decision making and decisionmaking) is regarded as the cognitive process resulting in the selection of a belief or a course of action among several possible alternative options. It could be either rational or irrational. The decision-making process is a reasoning process based on assumptions of values, preferences and beliefs of the decision-maker.[1] Every decision-making process produces a final choice, which may or may not prompt action.

Sample flowchart representing a decision process when confronted with a lamp that fails to light.

Research about decision-making is also published under the label problem solving, particularly in European psychological research.[2]

Overview

Decision-making can be regarded as a problem-solving activity yielding a solution deemed to be optimal, or at least satisfactory. It is therefore a process which can be more or less rational or irrational and can be based on explicit or tacit knowledge and beliefs. Tacit knowledge is often used to fill the gaps in complex decision-making processes.[3] Usually, both of these types of knowledge, tacit and explicit, are used together in the decision-making process.

Human performance has been the subject of active research from several perspectives:

  • Psychological: examining individual decisions in the context of a set of needs, preferences and values the individual has or seeks.
  • Cognitive: the decision-making process is regarded as a continuous process integrated in the interaction with the environment.
  • Normative: the analysis of individual decisions concerned with the logic of decision-making, or communicative rationality, and the invariant choice it leads to.[4]

A major part of decision-making involves the analysis of a finite set of alternatives described in terms of evaluative criteria. Then the task might be to rank these alternatives in terms of how attractive they are to the decision-maker(s) when all the criteria are considered simultaneously. Another task might be to find the best alternative or to determine the relative total priority of each alternative (for instance, if alternatives represent projects competing for funds) when all the criteria are considered simultaneously. Solving such problems is the focus of multiple-criteria decision analysis (MCDA). This area of decision-making, although very old, has attracted the interest of many researchers and practitioners and is still highly debated, as there are many MCDA methods which may yield very different results when they are applied to exactly the same data.[5] This leads to the formulation of a decision-making paradox.

Logical decision-making is an important part of all science-based professions, where specialists apply their knowledge in a given area to make informed decisions. For example, medical decision-making often involves a diagnosis and the selection of appropriate treatment. But naturalistic decision-making research shows that in situations with higher time pressure, higher stakes, or increased ambiguities, experts may use intuitive decision-making rather than structured approaches. They may follow a recognition-primed decision that fits their experience, and arrive at a course of action without weighing alternatives.[6]
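
As a concrete illustration of the MCDA ranking task described above, the simplest aggregation rule is a weighted sum: score each alternative on every criterion, weight the criteria, and rank the totals. The sketch below uses hypothetical projects, criteria, scores, and weights; it is a minimal illustration, not a prescription of any particular MCDA method.

```python
# Minimal weighted-sum ranking of alternatives against several criteria.
# Alternatives, criteria, scores, and weights are hypothetical examples.

criteria_weights = {"cost": 0.5, "quality": 0.3, "speed": 0.2}

# Scores are normalized to 0-1, where higher is better on every criterion.
alternatives = {
    "Project A": {"cost": 0.9, "quality": 0.6, "speed": 0.4},
    "Project B": {"cost": 0.5, "quality": 0.9, "speed": 0.7},
    "Project C": {"cost": 0.7, "quality": 0.7, "speed": 0.9},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Total priority of one alternative under the weighted-sum model."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(alternatives.items(),
                 key=lambda kv: weighted_score(kv[1], criteria_weights),
                 reverse=True)

for name, scores in ranking:
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")
```

Different MCDA methods (different normalizations or aggregation rules) can rank the same data differently, which is one source of the decision-making paradox noted above.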

The decision-maker's environment can play a part in the decision-making process. For example, environmental complexity is a factor that influences cognitive function.[7] A complex environment is an environment with a large number of different possible states which come and go over time.[8] Studies done at the University of Colorado have shown that more complex environments correlate with higher cognitive function, which means that a decision can be influenced by the location. One experiment measured complexity in a room by the number of small objects and appliances present; a simple room had fewer of them. Cognitive function was greatly affected by the higher measure of environmental complexity, making it easier to think about the situation and make a better decision.[7]

Problem solving vs. decision making

It is important to differentiate between problem solving, or problem analysis, and decision-making. Problem solving is the process of investigating the given information and finding all possible solutions through invention or discovery. Traditionally, it is argued that problem solving is a step towards decision making, so that the information gathered in that process may be used towards decision-making.[9]

Characteristics of problem solving
  • Problems are merely deviations from performance standards.
  • Problems must be precisely identified and described.
  • Problems are caused by a change from a distinctive feature.
  • Something can always be used to distinguish between what has and has not been affected by a cause.
  • Causes of problems can be deduced from relevant changes found in analyzing the problem.
  • The most likely cause of a problem is the one that exactly explains all the facts while having the fewest (or weakest) assumptions (Occam's razor).
Characteristics of decision-making
  • Objectives must first be established.
  • Objectives must be classified and placed in order of importance.
  • Alternative actions must be developed.
  • The alternatives must be evaluated against all the objectives.
  • The alternative that is able to achieve all the objectives is the tentative decision.
  • The tentative decision is evaluated for further possible consequences.
  • The decisive actions are taken, and additional actions are taken to prevent any adverse consequences from becoming problems and starting both systems (problem analysis and decision-making) all over again.
  • There are steps that are generally followed that result in a decision model that can be used to determine an optimal production plan.[10]
  • In a situation featuring conflict, role-playing may be helpful for predicting decisions to be made by involved parties.[11]

Analysis paralysis

When a group or individual is unable to make it through the problem-solving step on the way to making a decision, they could be experiencing analysis paralysis. Analysis paralysis is the state a person enters when they are unable to make a decision, in effect paralyzing the outcome.[12][13] Among the main causes of analysis paralysis are an overwhelming flood of incoming data and the tendency to overanalyze the situation at hand.[14] There are said to be three different types of analysis paralysis.[15]

  • The first is analysis process paralysis. This type of paralysis is often spoken of as a cyclical process. The decision-maker is unable to make a decision because they get stuck going over the information again and again for fear of making the wrong decision.
  • The second is decision precision paralysis. This paralysis is cyclical, just like the first one, but instead of going over the same information, the decision-maker finds new questions and information in their analysis, which leads them to explore further possibilities rather than making a decision.
  • The third is risk uncertainty paralysis. This paralysis occurs when the decision-maker wants to eliminate any uncertainty, but the examination of the available information cannot remove all of it.

Extinction by instinct

On the opposite side of analysis paralysis is the phenomenon called extinction by instinct. Extinction by instinct is the state a person is in when they make careless decisions without detailed planning or a thorough systematic process.[16] It can often be remedied by introducing structure, such as checks and balances, into a group or into one's own routine. Analysis paralysis is the exact opposite, in which a group's schedule becomes saturated by too much structure and too many checks and balances.[16]

Extinction by instinct in a group setting

Groupthink is another occurrence that falls under the idea of extinction by instinct. Groupthink occurs when members of a group come to value “the group (and their being part of it) higher than anything else”, creating a habit of making decisions quickly and unanimously. In other words, a group stuck in groupthink is participating in the phenomenon of extinction by instinct.[17]

Information overload

Information overload is "a gap between the volume of information and the tools we have to assimilate" it.[18] Information is used in decision-making to reduce or eliminate uncertainty.[19] Excessive information affects problem processing and tasking, which in turn affects decision-making.[20] Psychologist George Armitage Miller suggests that humans' decision-making becomes inhibited because the human brain can only hold a limited amount of information at once.[21] Crystal C. Hall and colleagues described an "illusion of knowledge", meaning that when individuals encounter too much information, it can interfere with their ability to make rational decisions.[22] Other names for information overload are information anxiety, information explosion, infobesity, and infoxication.[23][24][25][26]

Decision fatigue

Decision fatigue occurs when a sizable amount of decision-making leads to a decline in decision-making skills. People who make decisions over an extended period of time begin to lose the mental energy needed to analyze all possible solutions. Impulsive decision-making and decision avoidance are two possible paths that extend from decision fatigue. Impulse decisions are made more often when a person is tired of analyzing situations or solutions; the response is to act rather than think.[27] Decision avoidance is when a person evades the situation entirely by never making a decision. Decision avoidance is different from analysis paralysis because it is about avoiding the situation entirely, while analysis paralysis is continually looking at the decisions to be made while still being unable to make a choice.[28]

Post-decision analysis

Evaluation and analysis of past decisions is complementary to decision-making. See also mental accounting and Postmortem documentation.

Neuroscience

Decision-making is an area of intense study in the fields of systems neuroscience and cognitive neuroscience. Several brain structures, including the anterior cingulate cortex (ACC), orbitofrontal cortex, and the overlapping ventromedial prefrontal cortex, are believed to be involved in decision-making processes. A neuroimaging study[29] found distinctive patterns of neural activation in these regions depending on whether decisions were made on the basis of perceived personal volition or following directions from someone else. Patients with damage to the ventromedial prefrontal cortex have difficulty making advantageous decisions.[30]

A common laboratory paradigm for studying neural decision-making is the two-alternative forced choice task (2AFC), in which a subject has to choose between two alternatives within a certain time. A study of a two-alternative forced choice task involving rhesus monkeys found that neurons in the parietal cortex not only represent the formation of a decision[31] but also signal the degree of certainty (or "confidence") associated with the decision.[32] A 2013 study found that rats and humans can optimally accumulate incoming sensory evidence to make statistically optimal decisions.[33] Another study found that lesions to the ACC in the macaque resulted in impaired decision-making over the long run in reinforcement-guided tasks, suggesting that the ACC may be involved in evaluating past reinforcement information and guiding future action.[34] It has recently been argued that the development of formal frameworks will allow neuroscientists to study richer and more naturalistic paradigms than simple 2AFC decision tasks; in particular, such decisions may involve planning and information search across temporally extended environments.[35]
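
Decision formation in 2AFC tasks is often formalized as the accumulation of noisy evidence toward a decision bound, as in drift-diffusion-style models. The sketch below is a simplified random-walk-to-bound simulation with made-up parameters; it is an illustrative toy model, not the specific model fitted in the studies cited above.

```python
# Simplified evidence-accumulation (random walk to bound) model of a
# two-alternative forced choice. Drift, noise, and bound values are hypothetical.
import random

def simulate_trial(drift=0.1, noise=1.0, bound=10.0, max_steps=10_000):
    """Accumulate noisy evidence until one of two bounds is reached.

    Returns (choice, steps_to_decision); "A" if the upper bound is hit,
    "B" if the lower bound is hit.
    """
    evidence = 0.0
    for step in range(1, max_steps + 1):
        evidence += drift + random.gauss(0.0, noise)
        if evidence >= bound:
            return "A", step
        if evidence <= -bound:
            return "B", step
    return "undecided", max_steps

trials = [simulate_trial() for _ in range(1000)]
accuracy = sum(choice == "A" for choice, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"Proportion choosing A: {accuracy:.2f}, mean steps to decide: {mean_rt:.0f}")
```

With a positive drift, the correct choice ("A") dominates, and raising the bound trades slower decisions for higher accuracy, mirroring the speed-accuracy tradeoff studied in such tasks.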

Emotions

Emotion appears able to aid the decision-making process. Decision-making often occurs in the face of uncertainty about whether one's choices will lead to benefit or harm (see also Risk). The somatic marker hypothesis is a neurobiological theory of how decisions are made in the face of uncertain outcomes.[36] This theory holds that such decisions are aided by emotions, in the form of bodily states, that are elicited during the deliberation of future consequences and that mark different options for behavior as being advantageous or disadvantageous. This process involves an interplay between neural systems that elicit emotional/bodily states and neural systems that map these emotional/bodily states.[37] A recent lesion mapping study of 152 patients with focal brain lesions conducted by Aron K. Barbey and colleagues provided evidence to help discover the neural mechanisms of emotional intelligence.[38][39][40]

Decision-making techniques

Decision-making techniques can be separated into two broad categories: group decision-making techniques and individual decision-making techniques. Individual decision-making techniques can also often be applied by a group.

Group

  • Consensus decision-making tries to avoid "winners" and "losers". Consensus requires that a majority approve a given course of action, but that the minority agree to go along with the course of action. In other words, if the minority opposes the course of action, consensus requires that the course of action be modified to remove objectionable features.
  • Voting-based methods:
    • Majority requires support from more than 50% of the members of the group. Thus, the bar for action is lower than with consensus. See also Condorcet method.
    • Plurality, where the largest faction in a group decides, even if it falls short of a majority.
    • Score voting (or range voting) lets each member score one or more of the available options, specifying both preference and intensity of preference. The option with the highest total or average is chosen. This method has experimentally been shown to produce the lowest Bayesian regret among common voting methods, even when voters are strategic.[41] It addresses issues of voting paradox and majority rule; a worked comparison with plurality voting appears after this list. See also approval voting.
    • Quadratic voting allows participants to cast their preference and intensity of preference for each decision (as opposed to a simple for or against decision). As in score voting, it addresses issues of voting paradox and majority rule.
  • Delphi method is a structured communication technique for groups, originally developed for collaborative forecasting but has also been used for policy making.[42]
  • Dotmocracy is a facilitation method that relies on the use of special forms called Dotmocracy sheets, which allow large groups to collectively brainstorm and recognize agreement on an unlimited number of ideas they have each written.[43]
  • Participative decision-making occurs when an authority opens up the decision-making process to a group of people for a collaborative effort.
  • Decision engineering uses a visual map of the decision-making process based on system dynamics and can be automated through a decision modeling tool, integrating big data, machine learning, and expert knowledge as appropriate.
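
To make the difference between these tallying rules concrete, the sketch below compares plurality (with a majority check) and score voting on the same hypothetical group decision; the options and ballots are invented purely for illustration.

```python
# Compare plurality and score-voting outcomes on the same hypothetical ballots.
from collections import Counter

options = ["Option X", "Option Y", "Option Z"]

# Plurality/majority: each member names a single favorite.
first_choices = ["Option X", "Option X", "Option X", "Option X",
                 "Option Y", "Option Y", "Option Y", "Option Z"]

counts = Counter(first_choices)
plurality_winner, top_votes = counts.most_common(1)[0]
has_majority = top_votes > len(first_choices) / 2
print(f"Plurality winner: {plurality_winner} "
      f"({'majority' if has_majority else 'no majority'})")

# Score voting: each member scores every option, e.g. on a 0-5 scale.
score_ballots = [
    {"Option X": 5, "Option Y": 4, "Option Z": 0},
    {"Option X": 5, "Option Y": 4, "Option Z": 1},
    {"Option X": 0, "Option Y": 4, "Option Z": 2},
    {"Option X": 1, "Option Y": 5, "Option Z": 5},
]
totals = {o: sum(ballot[o] for ballot in score_ballots) for o in options}
score_winner = max(totals, key=totals.get)
print(f"Score-voting winner: {score_winner} with totals {totals}")
```

In this invented example the broadly acceptable option wins under score voting even though another option has the most first-choice votes, illustrating how recording intensity of preference can change the outcome.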

Individual

Steps

A variety of researchers have formulated similar prescriptive steps aimed at improving decision-making.

GOFER

In the 1980s, psychologist Leon Mann and colleagues developed a decision-making process called GOFER, which they taught to adolescents, as summarized in the book Teaching Decision Making To Adolescents.[45] The process was based on extensive earlier research conducted with psychologist Irving Janis.[46] GOFER is an acronym for five decision-making steps:[47]

  1. Goals clarification: Survey values and objectives.
  2. Options generation: Consider a wide range of alternative actions.
  3. Fact-finding: Search for information.
  4. Consideration of Effects: Weigh the positive and negative consequences of the options.
  5. Review and implementation: Plan how to review the options and implement them.

Other

In 2007, Pam Brown of Singleton Hospital in Swansea, Wales, divided the decision-making process into seven steps:[48]

  1. Outline the goal and outcome.
  2. Gather data.
  3. Develop alternatives (i.e., brainstorming).
  4. List pros and cons of each alternative.
  5. Make the decision.
  6. Immediately take action to implement it.
  7. Learn from and reflect on the decision.

In 2008, Kristina Guo published the DECIDE model of decision-making, which has six parts:[49]

  1. Define the problem
  2. Establish or Enumerate all the criteria (constraints)
  3. Consider or Collect all the alternatives
  4. Identify the best alternative
  5. Develop and implement a plan of action
  6. Evaluate and monitor the solution and examine feedback when necessary

In 2009, professor John Pijanowski described how the Arkansas Program, an ethics curriculum at the University of Arkansas, used eight stages of moral decision-making based on the work of James Rest:[50]:6

  1. Establishing community: Create and nurture the relationships, norms, and procedures that will influence how problems are understood and communicated. This stage takes place prior to and during a moral dilemma.
  2. Perception: Recognize that a problem exists.
  3. Interpretation: Identify competing explanations for the problem, and evaluate the drivers behind those interpretations.
  4. Judgment: Sift through various possible actions or responses and determine which is more justifiable.
  5. Motivation: Examine the competing commitments which may distract from a more moral course of action and then prioritize and commit to moral values over other personal, institutional or social values.
  6. Action: Follow through with action that supports the more justified decision.
  7. Reflection in action.
  8. Reflection on action.

Group stages

There are four stages or phases that should be involved in all group decision-making:[51]

  • Orientation. Members meet for the first time and start to get to know each other.
  • Conflict. Once group members become familiar with each other, disputes, little fights and arguments occur. Group members eventually work it out.
  • Emergence. The group begins to clear up vague opinions by talking about them.
  • Reinforcement. Members finally make a decision and provide justification for it.

It is said that establishing critical norms in a group improves the quality of decisions, while norms that favor consensus (consensus norms) do not.[52]

Conflicts in socialization are divided into functional and dysfunctional types. Functional conflict mostly involves questioning the managers' assumptions in their decision-making, while dysfunctional conflict involves personal attacks and other actions that decrease team effectiveness. Functional conflict is the better route to higher-quality decision-making because it increases team knowledge and shared understanding.[53]

Rational and irrational

In economics, it is thought that if humans are rational and free to make their own decisions, then they would behave according to rational choice theory.[54]:368–370 Rational choice theory says that a person consistently makes choices that lead to the best situation for themselves, taking into account all available considerations including costs and benefits; the rationality of these considerations is from the point of view of the person themselves, so a decision is not irrational just because someone else finds it questionable.

In reality, however, there are some factors that affect decision-making abilities and cause people to make irrational decisions; for example, people may make contradictory choices when faced with the same problem framed in two different ways (see also Allais paradox).

Rational decision-making is a multi-step process for making choices between alternatives. It favors logic, objectivity, and analysis over subjectivity and insight. Irrational decision-making, by contrast, runs counter to logic: decisions are made in haste and outcomes are not considered.[55]

One of the most prominent theories of decision making is subjective expected utility (SEU) theory, which describes the rational behavior of the decision maker.[56] The decision maker assesses different alternatives by their utilities and the subjective probability of occurrence.[56]
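
A minimal sketch of this calculation follows: each alternative's subjective expected utility is the probability-weighted sum of the utilities of its possible outcomes, and the decision maker picks the alternative with the highest total. The alternatives, subjective probabilities, and utilities below are invented for illustration.

```python
# Subjective expected utility: weight each outcome's utility by its
# subjective probability and pick the alternative with the highest total.
# The alternatives, probabilities, and utilities are hypothetical.

alternatives = {
    "take new job": [(0.6, 80), (0.4, -20)],        # (subjective probability, utility)
    "stay in current job": [(0.9, 40), (0.1, -5)],
}

def subjective_expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(alternatives, key=lambda a: subjective_expected_utility(alternatives[a]))
for name, outcomes in alternatives.items():
    print(f"{name}: SEU = {subjective_expected_utility(outcomes):.1f}")
print(f"SEU-maximizing choice: {best}")
```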

Rational decision-making is often grounded in experience and in theories that are able to put this approach on solid mathematical grounds so that subjectivity is reduced to a minimum; see, for example, scenario optimization.

Rational decision is generally seen as the best or most likely decision to achieve the set goals or outcome.[57]

Children, adolescents, and adults

Children

It has been found that, unlike adults, children are less likely to use strategic search behaviors when making decisions. One such behavior is adaptive decision-making, which is described as funneling and then analyzing the more promising information as the number of options to choose from increases. Adaptive decision-making behavior is somewhat present in children aged 11–12 and older, but is increasingly absent the younger the children are.[58] The reason children are not as fluid in their decision-making is that they lack the ability to weigh the cost and effort needed to gather information in the decision-making process. Possible explanations for this inability are knowledge deficits and a lack of utilization skills. Children lack the metacognitive knowledge necessary to know when to use the strategies they do possess to change their approach to decision-making.[58]

When it comes to the idea of fairness in decision-making, children and adults differ much less. Children are able to understand the concept of fairness in decision-making from an early age: toddlers and infants aged 9–21 months understand basic principles of equality. The main difference found is that sensitivity to more complex principles of fairness in decision-making, such as contextual and intentional information, does not emerge until children are older.[59]

Adolescents

During their adolescent years, teens are known for their high-risk behaviors and rash decisions. Research[60] has shown that there are differences in cognitive processes between adolescents and adults during decision-making. Researchers have concluded that differences in decision-making are not due to a lack of logic or reasoning, but rather to the immaturity of psychosocial capacities that influence decision-making. Examples of these undeveloped capacities are impulse control, emotion regulation, delayed gratification, and resistance to peer pressure. In the past, researchers thought that adolescent behavior was simply due to incompetency regarding decision-making. Currently, researchers have concluded that adults and adolescents are both competent decision-makers, not just adults. However, adolescents' competent decision-making skills decline when psychosocial influences come into play.

Research[61] has shown that risk-taking behaviors in adolescents may be the product of interactions between the socioemotional brain network and its cognitive-control network. The socioemotional part of the brain processes social and emotional stimuli and has been shown to be important in reward processing. The cognitive-control network assists in planning and self-regulation. Both of these sections of the brain change over the course of puberty. However, the socioemotional network changes quickly and abruptly, while the cognitive-control network changes more gradually. Because of this difference, the cognitive-control network, which usually regulates the socioemotional network, struggles to do so when psychosocial influences are in play.

When adolescents are exposed to social and emotional stimuli, their socioemotional network is activated as well as areas of the brain involved in reward processing. Because teens often gain a sense of reward from risk-taking behaviors, their repetition becomes ever more probable due to the reward experienced. In this, the process mirrors addiction. Teens can become addicted to risky behavior because they are in a high state of arousal and are rewarded for it not only by their own internal functions but also by their peers around them. A recent study suggests that adolescents have difficulties adequately adjusting beliefs in response to bad news (such as reading that smoking poses a greater risk to health than they thought), but do not differ from adults in their ability to alter beliefs in response to good news.[62] This creates biased beliefs, which may lead to greater risk taking.[63]

Adults

Adults are generally better able to control their risk-taking because their cognitive-control system has matured enough to the point where it can control the socioemotional network, even in the context of high arousal or when psychosocial capacities are present. Also, adults are less likely to find themselves in situations that push them to do risky things. For example, teens are more likely to be around peers who peer pressure them into doing things, while adults are not as exposed to this sort of social setting.[64][65]

Cognitive and personal biases

Biases usually affect decision-making processes. They appear more often when a decision task involves time pressure, high stress, and/or high complexity.[66]

Here is a list of commonly debated biases in judgment and decision-making:

  • Selective search for evidence (also known as confirmation bias): People tend to be willing to gather facts that support certain conclusions but disregard other facts that support different conclusions. Individuals who are highly defensive in this manner show significantly greater left prefrontal cortex activity as measured by EEG than do less defensive individuals.[67]
  • Premature termination of search for evidence: People tend to accept the first alternative that looks like it might work.
  • Cognitive inertia is the unwillingness to change existing thought patterns in the face of new circumstances.
  • Selective perception: People actively screen out information that they do not think is important (see also Prejudice). In one demonstration of this effect, discounting of arguments with which one disagrees (by judging them as untrue or irrelevant) was decreased by selective activation of right prefrontal cortex.[68]
  • Wishful thinking is a tendency to want to see things in a certain (usually positive) light, which can distort perception and thinking.[69]
  • Choice-supportive bias occurs when people distort their memories of chosen and rejected options to make the chosen options seem more attractive.
  • Recency: People tend to place more attention on more recent information and either ignore or forget more distant information (see Semantic priming). The opposite effect, in which earlier information carries more weight, is termed the primacy effect.[70]
  • Repetition bias is a willingness to believe what one has been told most often and by the greatest number of different sources.
  • Anchoring and adjustment: Decisions are unduly influenced by initial information that shapes our view of subsequent information.
  • Groupthink is peer pressure to conform to the opinions held by the group.
  • Source credibility bias is a tendency to reject a person's statement on the basis of a bias against the person, organization, or group to which the person belongs. People preferentially accept statements by others that they like (see also Prejudice).
  • Incremental decision-making and escalating commitment: People look at a decision as a small step in a process, and this tends to perpetuate a series of similar decisions. This can be contrasted with zero-based decision-making (see Slippery slope).
  • Attribution asymmetry: People tend to attribute their own success to internal factors, including abilities and talents, but explain their failures in terms of external factors such as bad luck. The reverse bias is shown when people explain others' success or failure.
  • Role fulfillment is a tendency to conform to others' decision-making expectations.
  • Underestimating uncertainty and the illusion of control: People tend to underestimate future uncertainty because of a tendency to believe they have more control over events than they really do.
  • Framing bias: This is best avoided by increasing numeracy and presenting data in several formats (for example, using both absolute and relative scales).[71]
    • Sunk-cost fallacy is a specific type of framing effect that affects decision-making. It involves an individual making a decision about a current situation based on what they have previously invested in the situation.[54]:372 For example, an individual may refrain from dropping a class that they are likely to fail because they feel they have already put so much work into the course.
  • Prospect theory involves the idea that when faced with a decision-making event, an individual is more likely to take on risk when evaluating potential losses and more likely to avoid risk when evaluating potential gains. This can influence decision-making depending on whether the situation entails a threat or an opportunity (an illustrative value function appears after this list).[54]:373
  • Optimism bias is a tendency to overestimate the likelihood of positive events occurring in the future and underestimate the likelihood of negative life events.[72] Such biased expectations are generated and maintained in the face of counter-evidence through a tendency to discount undesirable information.[73] An optimism bias can alter risk perception and decision-making in many domains, ranging from finance to health.
  • Reference class forecasting was developed to eliminate or reduce cognitive biases in decision-making.
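
To make the prospect-theory asymmetry mentioned above concrete, the theory is often summarized with a value function that is concave for gains, convex for losses, and steeper for losses than for gains (loss aversion). The sketch below uses commonly cited illustrative parameter values (alpha near 0.88, lambda near 2.25), treated here purely as an example rather than as established constants.

```python
# Illustrative prospect-theory value function: gains are discounted (concave),
# and losses loom larger than gains (loss aversion, LAMBDA > 1).
# Parameter values are commonly cited illustrative estimates, not fixed constants.

ALPHA = 0.88   # curvature for gains and losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

for outcome in (100, -100):
    print(f"outcome {outcome:+}: subjective value {value(outcome):+.1f}")
# The asymmetry (|value(-100)| > value(+100)) reflects loss aversion:
# a loss of a given size is felt more strongly than an equal-sized gain.
```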

Cognitive limitations in groups

In groups, people generate decisions through active and complex processes. One method consists of three steps: initial preferences are expressed by members; the members of the group then gather and share information concerning those preferences; finally, the members combine their views and make a single choice about how to face the problem. Although these steps are relatively ordinary, judgements are often distorted by cognitive and motivational biases, including "sins of commission", "sins of omission", and "sins of imprecision".[74]

Cognitive styles

Optimizing vs. satisficing

Herbert A. Simon coined the phrase "bounded rationality" to express the idea that human decision-making is limited by available information, available time and the mind's information-processing ability. Further psychological research has identified individual differences between two cognitive styles: maximizers try to make an optimal decision, whereas satisficers simply try to find a solution that is "good enough". Maximizers tend to take longer making decisions due to the need to maximize performance across all variables and make tradeoffs carefully; they also tend to more often regret their decisions (perhaps because they are more able than satisficers to recognize that a decision turned out to be sub-optimal).[75]
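
The two styles can be contrasted as two search rules over the same options: a maximizer examines every option before choosing, while a satisficer accepts the first option that clears an aspiration threshold. The sketch below uses hypothetical options, scores, and a threshold purely for illustration.

```python
# Maximizing vs. satisficing as two search rules over the same options.
# Options, scores, and the aspiration threshold are hypothetical.

options = [("apartment A", 6.5), ("apartment B", 7.8),
           ("apartment C", 9.1), ("apartment D", 8.0)]

def maximize(options):
    """Examine every option and return the best one."""
    return max(options, key=lambda item: item[1])

def satisfice(options, threshold=7.0):
    """Return the first option that is 'good enough'; fall back to the last one."""
    for name, score in options:
        if score >= threshold:
            return name, score
    return options[-1]

print("Maximizer picks:", maximize(options))    # searches all four options
print("Satisficer picks:", satisfice(options))  # stops at apartment B
```

The satisficer stops searching sooner and may regret less, at the cost of sometimes missing the best option; the maximizer guarantees the best score but pays in search effort, matching the tradeoff described above.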

Intuitive vs. rational

The psychologist Daniel Kahneman, adopting terms originally proposed by the psychologists Keith Stanovich and Richard West, has theorized that a person's decision-making is the result of an interplay between two kinds of cognitive processes: an automatic intuitive system (called "System 1") and an effortful rational system (called "System 2"). System 1 is a bottom-up, fast, and implicit system of decision-making, while system 2 is a top-down, slow, and explicit system of decision-making.[76] System 1 includes simple heuristics in judgment and decision-making such as the affect heuristic, the availability heuristic, the familiarity heuristic, and the representativeness heuristic.

Combinatorial vs. positional

Styles and methods of decision-making were elaborated by Aron Katsenelinboigen, the founder of predispositioning theory. In his analysis on styles and methods, Katsenelinboigen referred to the game of chess, saying that "chess does disclose various methods of operation, notably the creation of predisposition-methods which may be applicable to other, more complex systems."[77]:5

Katsenelinboigen states that apart from the methods (reactive and selective) and sub-methods (randomization, predispositioning, programming), there are two major styles: positional and combinational. Both styles are utilized in the game of chess. The two styles reflect two basic approaches to uncertainty: deterministic (combinational style) and indeterministic (positional style). Katsenelinboigen's definitions of the two styles are the following.

The combinational style is characterized by:

  • a very narrow, clearly defined, primarily material goal; and
  • a program that links the initial position with the outcome.

In defining the combinational style in chess, Katsenelinboigen wrote: "The combinational style features a clearly formulated limited objective, namely the capture of material (the main constituent element of a chess position). The objective is implemented via a well-defined, and in some cases, unique sequence of moves aimed at reaching the set goal. As a rule, this sequence leaves no options for the opponent. Finding a combinational objective allows the player to focus all his energies on efficient execution, that is, the player's analysis may be limited to the pieces directly partaking in the combination. This approach is the crux of the combination and the combinational style of play."[77]:57

The positional style is distinguished by:

  • a positional goal; and
  • a formation of semi-complete linkages between the initial step and final outcome.

"Unlike the combinational player, the positional player is occupied, first and foremost, with the elaboration of the position that will allow him to develop in the unknown future. In playing the positional style, the player must evaluate relational and material parameters as independent variables. ... The positional style gives the player the opportunity to develop a position until it becomes pregnant with a combination. However, the combination is not the final goal of the positional player  it helps him to achieve the desirable, keeping in mind a predisposition for the future development. The pyrrhic victory is the best example of one's inability to think positionally."[78]

The positional style serves to:

  • create a predisposition to the future development of the position;
  • induce the environment in a certain way;
  • absorb an unexpected outcome in one's favor; and
  • avoid the negative aspects of unexpected outcomes.

Influence of Myers–Briggs type

According to Isabel Briggs Myers, a person's decision-making process depends to a significant degree on their cognitive style.[79] Myers developed a set of four bi-polar dimensions, called the Myers–Briggs Type Indicator (MBTI). The terminal points on these dimensions are: thinking and feeling; extroversion and introversion; judgment and perception; and sensing and intuition. She claimed that a person's decision-making style correlates well with how they score on these four dimensions. For example, someone who scored near the thinking, extroversion, sensing, and judgment ends of the dimensions would tend to have a logical, analytical, objective, critical, and empirical decision-making style. However, some psychologists say that the MBTI lacks reliability and validity and is poorly constructed.[80][81]

Other studies suggest that national or cross-cultural differences in decision-making exist across entire societies. For example, Maris Martinsons has found that American, Japanese and Chinese business leaders each exhibit a distinctive national style of decision-making.[82]

The Myers–Briggs typology has been the subject of criticism regarding its poor psychometric properties.[83][84][85]

General decision-making style (GDMS)

In the general decision-making style (GDMS) test developed by Suzanne Scott and Reginald Bruce, there are five decision-making styles: rational, intuitive, dependent, avoidant, and spontaneous.[86][87] These five different decision-making styles change depending on the context and situation, and one style is not necessarily better than any other. In the examples below, the individual is working for a company and is offered a job from a different company.

  • The rational style is an in-depth search for, and a strong consideration of, other options and/or information prior to making a decision. In this style, the individual would research the new job being offered, review their current job, and look at the pros and cons of taking the new job versus staying with their current company.
  • The intuitive style is confidence in one's initial feelings and gut reactions. In this style, if the individual initially prefers the new job because they have a feeling that the work environment is better suited for them, then they would decide to take the new job. The individual might not make this decision as soon as the job is offered.
  • The dependent style is asking for other people's input and instructions on what decision should be made. In this style, the individual could ask friends, family, coworkers, etc., but the individual might not ask all of these people.
  • The avoidant style is averting the responsibility of making a decision. In this style, the individual would not make a decision. Therefore, the individual would stick with their current job.
  • The spontaneous style is a need to make a decision as soon as possible rather than waiting to make a decision. In this style, the individual would either reject or accept the job as soon as it is offered.

Organizational vs. individual level

There are a few characteristics that differentiate organizational decision-making from individual decision-making as studied in lab experiments:[88]

  1. Unlike most lab studies of individual decision-making, ambiguity is pervasive in organizations. There is often only ambiguous information, and there is ambiguity about preferences as well as about interpreting the history of decisions.
  2. Decision-making in and by organizations is embedded in a longitudinal context, meaning that participants in organizational decision-making are part of ongoing processes. Even if they do not take on active roles in all phases of decision-making, they are part of the decision process and its consequences. Decisions in organizations are made in a sequential manner, and commitment may be more important in such processes than judgmental accuracy. In contrast, most lab studies of individual decision-making are conducted in artificial settings (labs) that are not connected to the subjects' ongoing activities.
  3. Incentives play an important role in organizational decision-making. Incentives, penalties, and their ramifications are real and may have long-lasting effects. These effects are intensified due to the longitudinal nature of decision-making in organizational settings. Incentives and penalties are very salient in organizations, and often they command managerial attention.
  4. Many executives, especially in middle management, may make repeated decisions on similar issues. Managers may develop a sense that they are using their skills (a sense which may be faulty); a feeling of having control and of using one's skills is pervasive in managerial thinking about risk taking. Many repeated decisions are made by following rules rather than by using pure information-processing modes.
  5. Conflict is pervasive in organizational decision-making. Many times power considerations and agenda setting determine decisions rather than calculations based on the decision's parameters. The nature of authority relations may have a large impact on the way decisions are made in organizations, which are basically political systems.

References

  1. Herbert Alexander Simon (1977). The New Science of Management Decision. Prentice-Hall. ISBN 978-0136161448.
  2. Frensch, Peter A.; Funke, Joachim, eds. (1995). Complex problem solving: the European perspective. Hillsdale, NJ: Lawrence Erlbaum Associates. ISBN 978-0805813364. OCLC 32131412.
  3. Brockmann, Erich N.; Anthony, William P. (December 2016). "Tacit knowledge and strategic decision making". Group & Organization Management. 27 (4): 436–455. doi:10.1177/1059601102238356. S2CID 145110719.
  4. Kahneman, Daniel; Tversky, Amos, eds. (2000). Choices, values, and frames. New York; Cambridge, UK: Russell Sage Foundation; Cambridge University Press. p. 211. ISBN 978-0521621724. OCLC 42934579.
  5. Triantaphyllou, Evangelos (2000). Multi-criteria decision making methods: a comparative study. Applied optimization. Vol. 44. Dordrecht, Netherlands: Kluwer Academic Publishers. p. 320. doi:10.1007/978-1-4757-3157-6. ISBN 978-0792366072.
  6. Klein, Gary (2008). "Naturalistic Decision Making". Human Factors: The Journal of the Human Factors and Ergonomics Society. 50 (3): 456–460. doi:10.1518/001872008x288385. ISSN 0018-7208. PMID 18689053. S2CID 11251289.
  7. Davidson, Alice Ware; Bar-Yam, Yaneer (2006) [2000]. "Environmental complexity: information for human–environment well-being" (PDF). In Bar-Yam, Yaneer; Minai, Ali (eds.). Unifying themes in complex systems. Berlin; New York: Springer. pp. 157–168. CiteSeerX 10.1.1.33.7118. doi:10.1007/978-3-540-35866-4_16. ISBN 978-3540358640.
  8. Godfrey-Smith, Peter (2001). "Environmental complexity and the evolution of cognition" (PDF). In Sternberg, Robert J.; Kaufman, James C. (eds.). The evolution of intelligence. Mahwah, NJ: Lawrence Erlbaum Associates. pp. 223–250. ISBN 978-0805832679. OCLC 44775038.
  9. Kepner, Charles Higgins; Tregoe, Benjamin B. (1997) [1965]. The new rational manager: an updated edition for a new world (Updated ed.). Princeton, NJ: Princeton Research Press. OCLC 37666447.
  10. Monahan, George E. (2000). Management decision making: spreadsheet modeling, analysis, and application. Cambridge, UK; New York: Cambridge University Press. pp. 33–40. ISBN 978-0521781183. OCLC 42921287.
  11. Armstrong, Jon Scott (2001). "Role playing: a method to forecast decisions". In Armstrong, Jon Scott (ed.). Principles of forecasting: a handbook for researchers and practitioners. International series in operations research & management science. Vol. 30. Boston, MA: Kluwer Academic Publishers. pp. 15–30. CiteSeerX 10.1.1.464.5677. doi:10.1007/978-0-306-47630-3_2. ISBN 978-0792379300.
  12. "analysis paralysis | Definition of analysis paralysis in US English by Oxford Dictionaries". Oxford Dictionaries | English. Archived from the original on January 7, 2018. Retrieved 2018-11-10.
  13. "Analysis Paralysis | Definition of Analysis Paralysis by Lexico". Lexico Dictionaries | English. Archived from the original on July 29, 2020. Retrieved 2020-04-09.
  14. "Avoid Analysis Paralysis—Use Data to Enable Decision-Making and Growth". TechNative. 2019-03-06. Retrieved 2020-04-09.
  15. Roberts, Lon (2010). Analysis paralysis: a case of terminological inexactitude. Defense AT&L. pp. 21–22.
  16. "Between 'Paralysis by analysis' and 'Extinction by instinct'". Long Range Planning. 28 (4): 127. August 1995. doi:10.1016/0024-6301(95)94294-9. ISSN 0024-6301.
  17. Hart, Paul't (June 1991). "Irving L. Janis' Victims of Groupthink". Political Psychology. 12 (2): 247–278. doi:10.2307/3791464. JSTOR 3791464.
  18. Paul Saffo quoted in: Foley, John (30 October 1995). "Managing information: infoglut". InformationWeek. Archived from the original on 2001-02-22. Retrieved 2015-07-26.
  19. Duncan (1972). "Characteristics of organizational environments and perceived environment uncertainty". Administrative Science Quarterly. 17 (3): 313–27. doi:10.2307/2392145. JSTOR 2392145.
  20. Kutty, Ambalika D.; Kumar Shee, Himanshu; Pathak, R. D. (November 2007). "Decision-making: too much info!". Monash Business Review. 3 (3): 8–9. doi:10.2104/mbr07056.
  21. Miller, George A. (1956). "The magical number seven, plus or minus two: some limits on our capacity for processing information". Psychological Review. 63 (2): 81–97. doi:10.1037/h0043158. hdl:11858/00-001M-0000-002C-4646-B. ISSN 1939-1471. PMID 13310704. S2CID 15654531.
  22. Hall, Crystal C.; Ariss, Lynn; Todorov, Alexander (July 2007). "The illusion of knowledge: when more information reduces accuracy and increases confidence" (PDF). Organizational Behavior and Human Decision Processes. 103 (2): 277–290. doi:10.1016/j.obhdp.2007.01.003.
  23. "Enemy of the good". Nature. 503 (7477): 438. November 2013. doi:10.1038/503438a. ISSN 0028-0836. PMID 24298564.
  24. Chamorro-Premuzic, Tomas; Furnham, Adrian (2014-04-08). Personality and Intellectual Competence. doi:10.4324/9781410612649. ISBN 978-1410612649.
  25. "Richard Saul Wurman: Information, Mapping, and Understanding", Architectural Intelligence, The MIT Press, 2017, doi:10.7551/mitpress/10971.003.0004, ISBN 978-0262343428
  26. Buckland, Michael. Information and society. Cambridge, Massachusetts. ISBN 978-0262339544. OCLC 978295031.
  27. Szalavitz, Maia (2011-08-23). "Mind over Mind? Decision Fatigue Saps Willpower — if We Let It". Time. ISSN 0040-781X. Retrieved 2020-04-09.
  28. McSweeney, Alan (2019-05-21), Stopping Analysis Paralysis And Decision Avoidance In Business Analysis And Solution Design, doi:10.13140/RG.2.2.21841.38243
  29. Walton, Mark E.; Devlin, Joseph T.; Rushworth, Matthew F. S. (November 2004). "Interactions between decision making and performance monitoring within prefrontal cortex". Nature Neuroscience. 7 (11): 1259–1265. doi:10.1038/nn1339. PMID 15494729. S2CID 26711881.
  30. Damasio, Antonio R. (1994). Descartes' error: emotion, reason, and the human brain. New York: Putnam. ISBN 978-0399138942. OCLC 30780083.
  31. Gold, Joshua I.; Shadlen, Michael N. (2007). "The neural basis of decision making". Annual Review of Neuroscience. 30: 535–574. doi:10.1146/annurev.neuro.29.051605.113038. PMID 17600525.
  32. Kiani, Roozbeh; Shadlen, Michael N. (May 2009). "Representation of confidence associated with a decision by neurons in the parietal cortex". Science. 324 (5928): 759–764. Bibcode:2009Sci...324..759K. doi:10.1126/science.1169405. PMC 2738936. PMID 19423820.
  33. Brunton, Bingni W.; Botvinick, Matthew M.; Brody, Carlos D. (April 2013). "Rats and humans can optimally accumulate evidence for decision-making" (PDF). Science. 340 (6128): 95–98. Bibcode:2013Sci...340...95B. doi:10.1126/science.1233912. PMID 23559254. S2CID 13098239. Archived from the original (PDF) on 2016-03-05.
  34. Kennerley, Steven W.; Walton, Mark E.; Behrens, Timothy E. J.; Buckley, Mark J.; Rushworth, Matthew F. S. (July 2006). "Optimal decision making and the anterior cingulate cortex". Nature Neuroscience. 9 (7): 940–947. doi:10.1038/nn1724. PMID 16783368. S2CID 8868406.
  35. Hunt, L. T.; Daw, N. D.; Kaanders, P.; MacIver, M. A.; Mugan, U.; Procyk, E.; Redish, A. D.; Russo, E.; Scholl, J.; Stachenfeld, K.; Wilson, C. R. E.; Kolling, N. (21 June 2021). "Formalizing planning and information search in naturalistic decision-making" (PDF). Nature Neuroscience. 24 (8): 1051–1064. doi:10.1038/s41593-021-00866-w. PMID 34155400. S2CID 235596957.
  36. Reimann, Martin; Bechara, Antoine (October 2010). "The somatic marker framework as a neurological theory of decision-making: review, conceptual comparisons, and future neuroeconomics research". Journal of Economic Psychology. 31 (5): 767–776. doi:10.1016/j.joep.2010.03.002.
  37. Naqvi, Nasir; Shiv, Baba; Bechara, Antoine (October 2006). "The role of emotion in decision making: a cognitive neuroscience perspective". Current Directions in Psychological Science. 15 (5): 260–264. CiteSeerX 10.1.1.137.4677. doi:10.1111/j.1467-8721.2006.00448.x. S2CID 14789591.
  38. Barbey, Aron K.; Colom, Roberto; Grafman, Jordan (March 2014). "Distributed neural system for emotional intelligence revealed by lesion mapping". Social Cognitive and Affective Neuroscience. 9 (3): 265–272. doi:10.1093/scan/nss124. PMC 3980800. PMID 23171618.
  39. Yates, Diana. "Researchers map emotional intelligence in the brain". University of Illinois News Bureau. University of Illinois.
  40. HealthDay (2013-01-28). "Scientists complete 1st map of 'emotional intelligence' in the brain". U.S. News & World Report.
  41. Verma, Dem (2009). Decision Making Style: Social and Creative Dimensions. New Delhi: Global India Publications Pvt Ltd. p. 43. ISBN 978-9380228303.
  42. Landeta, Jon (2006-06-01). "Current validity of the Delphi method in social sciences". Technological Forecasting and Social Change. 73 (5): 467–482. doi:10.1016/j.techfore.2005.09.002. ISSN 0040-1625. S2CID 143757211.
  43. Diceman, Jason (2010). Dotmocracy Handbook. Jason Diceman. pp. 1–2. ISBN 978-1451527087.
  44. Franklin, Benjamin (1975) [1772]. "To Joseph Priestley". In Willcox, William Bradford (ed.). The papers of Benjamin Franklin: January 1 through December 31, 1772. Vol. 19. New Haven: Yale University Press. pp. 299–300. ISBN 978-0300018653. OCLC 310601.
  45. Mann, Leon; Harmoni, Ros; Power, Colin (1991). "The GOFER course in decision making". In Baron, Jonathan; Brown, Rex V. (eds.). Teaching decision making to adolescents. Hillsdale, NJ: Lawrence Erlbaum Associates. pp. 61–78. ISBN 978-0805804973. OCLC 22507012. See also: Mann, Leon (July 1989). "Becoming a better decision maker". Australian Psychologist. 24 (2): 141–155. doi:10.1080/00050068908259558.
  46. Janis, Irving L.; Mann, Leon (1977). Decision making: a psychological analysis of conflict, choice, and commitment. New York: Free Press. ISBN 978-0029161609. OCLC 2542340.
  47. Mann, Leon; Harmoni, Ros; Power, Colin; Beswick, Gery; Ormond, Cheryl (July 1988). "Effectiveness of the GOFER course in decision making for high school students". Journal of Behavioral Decision Making. 1 (3): 159–168. doi:10.1002/bdm.3960010304.
  48. Brown, Pam (November 29, 2007), Career coach: decision-making, Pulse, retrieved July 12, 2012 (subscription required)
  49. Guo, Kristina L. (June 2008). "DECIDE: a decision-making model for more effective decision making by health care managers". The Health Care Manager. 27 (2): 118–127. doi:10.1097/01.HCM.0000285046.27290.90. PMID 18475113. S2CID 24492631.
  50. Pijanowski, John (February 2009). "The role of learning theory in building effective college ethics curricula". Journal of College and Character. 10 (3): 1–13. doi:10.2202/1940-1639.1088.
  51. Griffin, Emory A. (1991). "Interact system model of decision emergence of B. Aubrey Fisher" (PDF). A first look at communication theory (1st ed.). New York: McGraw-Hill. pp. 253–262. ISBN 978-0070227781. OCLC 21973427.
  52. Postmes, T; Spears, Russell; Cihangir, Sezgin (2001). "Quality of decision making and group norms". Journal of Personality and Social Psychology. 80 (6): 918–930. doi:10.1037/0022-3514.80.6.918. PMID 11414374.
  53. Brockmann, E.; Anthony, W. (2002). "Tacit knowledge and strategic decision making". Group & Organization Management. 27 (4): 436–455. doi:10.1177/1059601102238356. S2CID 145110719.
  54. Schacter, Daniel L.; Gilbert, Daniel Todd; Wegner, Daniel M. (2011) [2009]. Psychology (2nd ed.). New York: Worth Publishers. ISBN 978-1429237192. OCLC 755079969.
  55. Boundless. (n.d.). Boundless Management. Retrieved December 11, 2020, from https://courses.lumenlearning.com/boundless-management/chapter/rational-and-nonrational-decision-making/
  56. Crozier, W. Ray; Ranyard, Rob (1997). "Cognitive process models and explanations of decision making". In Ranyard, Rob; Crozier, W. Ray; Svenson, Ola (eds.). Decision making: cognitive models and explanations. Frontiers of cognitive science. London; New York: Routledge. pp. 5–20. ISBN 978-0415158183. OCLC 37043834.
  57. Djulbegovic, B. (2017) Rational decision making in medicine: Implications for Overuse and Underuse
  58. Gregan‐Paxton, Jennifer; John, Deborah Roedder (June 1997). "The Emergence of Adaptive Decision Making in Children". Journal of Consumer Research. 24 (1): 43–56. doi:10.1086/209492. ISSN 0093-5301.
  59. Jaroslawska, Agnieszka J.; McCormack, Teresa; Burns, Patrick; Caruso, Eugene M. (January 2020). "Outcomes versus intentions in fairness-related decision making: School-aged children's decisions are just like those of adults". Journal of Experimental Child Psychology. 189: 104704. doi:10.1016/j.jecp.2019.104704. ISSN 0022-0965. PMID 31634734.
  60. Steinberg, Laurence (March 2008). "A social neuroscience perspective on adolescent risk-taking". Developmental Review. 28 (1): 78–106. doi:10.1016/j.dr.2007.08.002. PMC 2396566. PMID 18509515.
  61. Steinberg, Laurence (March 2008). "A social neuroscience perspective on adolescent risk-taking". Developmental Review. 28 (1): 78–106. doi:10.1016/j.dr.2007.08.002. PMC 2396566. PMID 18509515.
  62. Moutsiana, Christina; Garrett, Neil; Clarke, Richard C.; Lotto, R. Beau; Blakemore, Sarah-Jayne; Sharot, Tali (October 2013). "Human development of the ability to learn from bad news". Proceedings of the National Academy of Sciences. 110 (41): 16396–16401. Bibcode:2013PNAS..11016396M. doi:10.1073/pnas.1305631110. PMC 3799330. PMID 24019466.
  63. Reyna, Valerie F. (November 2013). "Psychology: Good and bad news on the adolescent brain". Nature. 503 (7474): 48–49. Bibcode:2013Natur.503...48R. doi:10.1038/nature12704. PMID 24172899. S2CID 205236138.
  64. Gardner, Margo; Steinberg, Laurence (July 2005). "Peer influence on risk taking, risk preference, and risky decision making in adolescence and adulthood: an experimental study" (PDF). Developmental Psychology. 41 (4): 625–635. CiteSeerX 10.1.1.556.4973. doi:10.1037/0012-1649.41.4.625. PMID 16060809.
  65. Steinberg, Laurence (April 2007). "Risk taking in adolescence: new perspectives from brain and behavioral science". Current Directions in Psychological Science. 16 (2): 55–59. CiteSeerX 10.1.1.519.7099. doi:10.1111/j.1467-8721.2007.00475.x. S2CID 18601508.
  66. T, Maqsood; A, Finegan; D, Walker (2004). "Biases and heuristics in judgment and decision making: The dark side of tacit knowledge". Issues in Informing Science and Information Technology. 1: 0295–0301. doi:10.28945/740. ISSN 1547-5840.
  67. Blackhart, G. C.; Kline, J. P. (2005). "Individual differences in anterior EEG asymmetry between high and low defensive individuals during a rumination/distraction task". Personality and Individual Differences. 39 (2): 427–437. doi:10.1016/j.paid.2005.01.027.
  68. Drake, R. A. (1993). "Processing persuasive arguments: 2. Discounting of truth and relevance as a function of agreement and manipulated activation asymmetry". Journal of Research in Personality. 27 (2): 184–196. doi:10.1006/jrpe.1993.1013.
  69. Chua, E. F.; Rand-Giovannetti, E.; Schacter, D. L.; Albert, M.; Sperling, R. A. (2004). "Dissociating confidence and accuracy: Functional magnetic resonance imaging shows origins of the subjective memory experience" (PDF). Journal of Cognitive Neuroscience. 16 (7): 1131–1142. doi:10.1162/0898929041920568. PMID 15453969. S2CID 215728618.
  70. Plous, Scott (1993). The psychology of judgment and decision making. Philadelphia: Temple University Press. ISBN 978-0877229131. OCLC 26548229.
  71. Perneger, Thomas V.; Agoritsas, Thomas (December 2011). "Doctors and patients' susceptibility to framing bias: a randomized trial". Journal of General Internal Medicine. 26 (12): 1411–1417. doi:10.1007/s11606-011-1810-x. PMC 3235613. PMID 21792695.
  72. Sharot, Tali (2011). The optimism bias: a tour of the irrationally positive brain (1st ed.). New York: Pantheon Books. ISBN 978-0307378484. OCLC 667609433.
  73. Sharot, Tali; Korn, Christoph W.; Dolan, Raymond J. (October 2011). "How unrealistic optimism is maintained in the face of reality". Nature Neuroscience. 14 (11): 1475–1479. doi:10.1038/nn.2949. PMC 3204264. PMID 21983684.
  74. Forsyth, Donelson R. (2014) [1983]. Group dynamics (6th ed.). Belmont, CA: Wadsworth Cengage Learning. ISBN 978-1133956532. OCLC 826872491.
  75. Sparks, Erin (2007). "Satisficing". In Baumeister, Roy F.; Vohs, Kathleen D. (eds.). Encyclopedia of social psychology. Thousand Oaks, CA: SAGE Publications. pp. 776–778. ISBN 978-1412916707. OCLC 123119782.
  76. Kahneman, Daniel (2011). Thinking, fast and slow. New York: Farrar, Straus, and Giroux. ISBN 978-0374275631. OCLC 706020998.
  77. Katsenelinboigen, Aron (1997). The concept of indeterminism and its applications: economics, social systems, ethics, artificial intelligence, and aesthetics (PDF). Westport, CT: Praeger. ISBN 978-0275957889. OCLC 36438766. Archived from the original (PDF) on 2011-07-23. Retrieved 2015-07-27.
  78. Ulea, Vera (2002). A concept of dramatic genre and the comedy of a new type: chess, literature, and film. Carbondale: Southern Illinois University Press. pp. 17–18. ISBN 978-0809324521. OCLC 51301095.
  79. Myers, Isabel Briggs; Kirby, Linda K.; Myers, Katharine D. (1998) [1976]. Introduction to type: a guide to understanding your results on the Myers–Briggs Type Indicator. Introduction to type series (6th ed.). Palo Alto, CA: Consulting Psychologists Press. OCLC 40336039.
  80. Pittenger, David J. (2005). "Cautionary comments regarding the Myers–Briggs Type Indicator". Consulting Psychology Journal: Practice and Research. 57 (3): 210–221. doi:10.1037/1065-9293.57.3.210.
  81. Hogan, Robert (2007). Personality and the fate of organizations. Mahwah, NJ: Lawrence Erlbaum Associates. p. 28. ISBN 978-0805841428. OCLC 65400436. Most personality psychologists regard the MBTI as little more than an elaborate Chinese fortune cookie...
  82. Martinsons, Maris G. (December 2006). "Comparing the decision styles of American, Chinese and Japanese business leaders". Best Paper Proceedings of Academy of Management Meetings, Washington, DC, August 2001. SSRN 952292.
  83. Pittenger, David (1993). "Measuring the MBTI ... And Coming Up Short" (PDF). Journal of Career Planning and Employment. 54 (1): 48–52. Archived from the original (PDF) on 2006-12-06. Retrieved 2020-03-06.
  84. Schuwirth, Lambert; Cantillon, Peter (2004-05-22). "What the educators are saying". BMJ. 328 (7450): 1244. doi:10.1136/bmj.328.7450.1244. ISSN 0959-8138.
  85. Pittenger, David J. (2005). "Cautionary comments regarding the Myers–Briggs Type Indicator". Consulting Psychology Journal: Practice and Research. 57 (3): 210–221. doi:10.1037/1065-9293.57.3.210. ISSN 1939-0149.
  86. Scott, Susanne G.; Bruce, Reginald A. (1995). "Decision-making style: the development and assessment of a new measure". Educational and Psychological Measurement. 55 (5): 818–831. doi:10.1177/0013164495055005017. S2CID 143479230.
  87. Thunholm, Peter (March 2004). "Decision-making style: habit, style or both?". Personality and Individual Differences. 36 (4): 931–944. doi:10.1016/S0191-8869(03)00162-4.
  88. Shapira, Z. (2002). "Organizational Decision Making. Cambridge Series on Judgment and Decision Making", Cambridge University Press: pp. 4–6. ISBN 978-0521890502