Decision-making - Decision Theories and Individual Methods
Understand rational and irrational decision theories, common cognitive biases, and the stages of moral decision‑making.
Summary
Rational and Irrational Decision Theories
Introduction
Decision theory examines how people make choices. There are two main perspectives: theories about how people rationally should make decisions, and theories about how people actually make decisions (which often involves bias and shortcuts). Understanding both is crucial because exams typically test whether you can recognize when real decision-making deviates from rational theory—and why.
Rational Choice Theory
Rational choice theory assumes that individuals consistently choose options that maximize their personal benefit. The key word here is "consistently"—a rational decision-maker follows a logical process every time.
The core idea is straightforward: before choosing, you identify all options, evaluate the costs and benefits of each, and pick the one with the highest net benefit. This assumes you have perfect information and always act in your own self-interest.
Why it matters: This theory forms the foundation for how economists and organizations traditionally expect people to behave. However, the next sections will show why real human behavior often contradicts these assumptions.
Subjective Expected Utility Theory
This theory builds on rational choice theory by adding two important elements: utility and subjective probability.
Utility is the satisfaction or value you get from an outcome—not necessarily money. For example, winning $100 might have different utility for a wealthy person versus someone with little money. Subjective probability means you use your own beliefs about how likely something is to occur, rather than objective statistics.
Under subjective expected utility theory, you evaluate each option by multiplying its utility by your subjective probability that you'll receive it, then add up all these weighted outcomes. You choose the option with the highest expected utility.
Example: When deciding whether to buy lottery tickets, a rational actor using subjective expected utility would multiply the jackpot amount by their honest belief about winning probability, then compare this to the ticket cost. Most people overestimate their chance of winning (subjective probability bias), which partly explains why they buy tickets.
Why this matters: This theory better explains human behavior than simple rational choice because it acknowledges that people use their own perceptions and values, not objective facts. But it still assumes people think through their decisions carefully.
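The weighted-sum calculation above can be sketched in a few lines of Python. The jackpot value, ticket price, and subjective win probability below are made-up numbers for illustration only:

```python
# Sketch of subjective expected utility: sum of utility * subjective
# probability over all outcomes. All numbers here are assumptions.

def expected_utility(outcomes):
    """Each outcome is a (utility, subjective_probability) pair."""
    return sum(utility * prob for utility, prob in outcomes)

# Lottery ticket: jackpot worth 1,000,000 utility units, believed
# (subjectively) to hit with probability 1e-6; otherwise worth 0.
ticket = expected_utility([(1_000_000, 1e-6), (0, 1 - 1e-6)])

# Utility of simply keeping the $2 ticket price.
keep_money = 2

# A subjective-expected-utility maximizer buys only if the ticket's
# expected utility exceeds the utility of keeping the money.
buy = ticket > keep_money
print(ticket, buy)  # 1.0 False
```

Note that if the buyer inflates their subjective win probability (as most lottery players do), `ticket` rises and the comparison can flip, which is exactly the bias described above.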
Scenario Optimization
Mathematical models can ground decision-making by removing subjective elements. Rather than relying on someone's feelings or estimates, you can use optimization techniques to find the best choice based on constraints and defined outcomes.
For instance, consider scenario optimization applied to fixing a broken lamp. Instead of guessing what's wrong, you work through decision points systematically: Is the lamp plugged in? Is the bulb burned out? This structured approach minimizes guessing and emotion.
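The lamp troubleshooting walk-through can be sketched as plain conditionals. The specific checks and suggested fixes here are illustrative assumptions, not part of any formal model:

```python
# A minimal sketch of the lamp troubleshooting decision tree as plain
# conditionals; checks are evaluated in a fixed order, so the outcome
# never depends on feelings or estimates.

def diagnose_lamp(plugged_in: bool, bulb_ok: bool) -> str:
    """Walk the decision points in order and return the indicated fix."""
    if not plugged_in:
        return "plug in the lamp"
    if not bulb_ok:
        return "replace the bulb"
    return "repair or replace the lamp"

print(diagnose_lamp(plugged_in=True, bulb_ok=False))  # replace the bulb
```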
Why it matters: This demonstrates that rational decision-making doesn't require psychology—it requires a clear system. However, real-world decisions often lack such clear decision trees.
Illusions and Paradoxes: When Rational Theory Breaks Down
This is where decision theory gets interesting—and where exams often focus. Real people violate rational choice theory in predictable ways.
Framing Effects
One powerful illusion is the framing effect: the way a choice is presented dramatically changes how people decide, even though the actual outcomes are identical.
Classic example (the Asian Disease Problem): Imagine 600 people will die from a disease. You must choose between:
Option A (positive frame): "200 people will be saved."
Option B (positive frame): "There's a 1/3 chance all 600 are saved, 2/3 chance nobody is saved."
Most people choose Option A (the certain outcome). But now reframe the same problem:
Option A (negative frame): "400 people will die."
Option B (negative frame): "There's a 1/3 chance nobody dies, 2/3 chance all 600 die."
Most people now choose Option B (the gamble). The options are mathematically identical, but changing the frame from "lives saved" to "lives lost" flips how people decide.
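A quick arithmetic check confirms that the two frames describe identical gambles, measured in expected lives saved out of the 600 at risk:

```python
# Expected lives saved under each option, in both frames.
TOTAL = 600

# Positive frame
a_pos = 200                               # 200 saved for certain
b_pos = (1 / 3) * TOTAL + (2 / 3) * 0     # 1/3 chance all 600 saved

# Negative frame, converted to lives saved (saved = 600 - dead)
a_neg = TOTAL - 400                       # 400 die for certain
b_neg = (1 / 3) * (TOTAL - 0) + (2 / 3) * (TOTAL - TOTAL)

print(a_pos == a_neg, b_pos == b_neg)  # True True
```

Both A options save 200 lives for certain, and both B options save 200 lives in expectation; only the wording differs.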
The Sunk-Cost Fallacy
The sunk-cost fallacy occurs when people base current decisions on past investments rather than future outcomes. A sunk cost is money or effort already spent that cannot be recovered.
Example: You paid $50 for a concert ticket, but you're now sick on the day of the concert. A rational decision-maker would think: "The $50 is gone regardless; will I enjoy the concert despite being sick?" But many people think: "I paid $50, so I should go!" This logic is irrational—the $50 doesn't become more valuable if you suffer through the concert.
Why this is tricky: Humans naturally think about "getting their money's worth," which actually prevents rational decision-making. The money is already sunk; only future outcomes matter.
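The rational rule can be sketched as a comparison that simply has no slot for past spending. The utility numbers below are assumptions chosen to illustrate the concert example:

```python
# Sketch of a rational chooser: compare only future payoffs.
# Sunk costs never appear in the comparison.

def rational_choice(future_utilities: dict) -> str:
    """Pick the option with the highest future payoff."""
    return max(future_utilities, key=future_utilities.get)

# Being sick makes the concert unpleasant (-20) versus resting at
# home (+10). The $50 already paid appears nowhere in the inputs.
options = {"go to concert": -20, "stay home": 10}
print(rational_choice(options))  # stay home
```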
Prospect Theory
Prospect theory explains how people actually evaluate risk, and it reveals a striking asymmetry: people behave very differently when facing potential losses versus potential gains.
Risk-Aversion for Gains
When evaluating potential gains, people are generally risk-averse. They prefer a certain smaller gain to a gamble with a larger potential gain.
Example: Most people prefer a guaranteed $100 over a 50% chance to win $250 (and 50% chance to win $0), even though the gamble has higher expected value ($125).
Risk-Seeking for Losses
Interestingly, this flips when facing losses. People become risk-seeking—they'd rather gamble on avoiding a loss than accept a sure loss.
Example: Most people prefer a 50% chance to lose $0 and 50% chance to lose $250 over a guaranteed loss of $100, even though the gamble has worse expected value (–$125).
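The expected-value arithmetic behind these two examples can be verified directly:

```python
# Expected value of each gamble from the gain and loss examples.

def expected_value(gamble):
    """Each entry is a (probability, payoff) pair."""
    return sum(p * x for p, x in gamble)

# Gains: guaranteed $100 vs. 50/50 gamble for $250 or $0
gain_gamble = expected_value([(0.5, 250), (0.5, 0)])   # 125.0 > 100

# Losses: guaranteed -$100 vs. 50/50 gamble for -$250 or $0
loss_gamble = expected_value([(0.5, -250), (0.5, 0)])  # -125.0 < -100

print(gain_gamble, loss_gamble)  # 125.0 -125.0
```

A pure expected-value maximizer would take the gamble for gains and the sure thing for losses; prospect theory predicts most people do exactly the opposite.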
Why this matters: This is a critical concept on exams. It shows that people's risk tolerance depends on whether they frame a situation as gains or losses, not on the objective outcomes. This connects to framing effects—the frame determines whether you're in "gain mode" or "loss mode."
Optimism Bias
Optimism bias is the tendency to overestimate the likelihood of positive events and underestimate negative events. This directly influences risk perception and decision-making.
People consistently believe:
They're less likely to experience negative events than average (accidents, illness, job loss)
They're more likely to experience positive events than average (success, happiness, good health)
Example: When surveyed, around 90% of drivers rate themselves as "above average" drivers, far more than could plausibly be the case. This optimism bias leads them to underestimate accident risk and, consequently, take more driving risks.
Why this matters on exams: Optimism bias explains why people sometimes make seemingly irrational decisions—they don't believe the negative outcomes will happen to them, so they underestimate true risk.
<extrainfo>
One particularly interesting aspect: optimism bias is sometimes protective (it can reduce anxiety and depression), but it can also lead to poor decision-making in financial, health, and safety contexts. Most exams won't test this nuance, but understanding that optimism bias has both benefits and drawbacks deepens comprehension.
</extrainfo>
Prescriptive Decision-Making Models
While the previous section described how people actually decide (irrationally), this section covers how people should decide. These are prescriptive models—they prescribe a better approach.
Moral Decision-Making Stages (University of Arkansas Model)
This model focuses specifically on decisions with ethical dimensions. It emphasizes two types of reflection:
Reflection in Action
Reflection in action means evaluating your decisions while you're making them. This is real-time self-awareness: pausing to consider, "Is this decision aligned with my values? Am I overlooking something?"
This is more difficult than it sounds because it requires emotional regulation and the ability to slow down your automatic thinking.
Reflection on Action
Reflection on action means evaluating decisions after you've made them. You review what happened, consider whether you'd decide differently with hindsight, and extract lessons for future decisions.
Why this matters: These two reflective stages help counteract the automatic, biased thinking described earlier. By deliberately pausing to reflect—both during and after decisions—you can catch yourself falling into traps like the sunk-cost fallacy or overconfidence.
Practical implication: If an exam asks how to improve decision-making quality, "incorporating reflection in action and on action" is a textbook answer for ethical decision-making.
Flashcards
How do individuals make decisions according to Rational Choice Theory?
By consistently choosing options that maximize personal benefit while considering all costs and benefits.
Upon what factors do decision makers assess alternatives in Subjective Expected Utility Theory?
The utilities of the alternatives and the subjective probability of their occurrence.
How can rational decision-making minimize subjectivity according to Scenario Optimization?
By grounding decisions in mathematical models.
How does the sunk-cost fallacy influence individual decision-making?
It causes individuals to base current decisions on past investments rather than future outcomes.
How does risk perception change based on potential outcomes in Prospect Theory?
Risk-seeking when evaluating potential losses
Risk-averse when evaluating potential gains
How does Optimism Bias influence an individual's perception of risk?
Individuals overestimate the likelihood of positive events and underestimate the likelihood of negative events.
What is the definition of "reflection in action" in the context of moral decision-making?
Evaluating decisions while they are currently being made.
Quiz
Decision-making - Decision Theories and Individual Methods Quiz Question 1: In the Moral Decision‑Making Stages, what does “reflection in action” involve?
- Evaluating decisions while they are being made. (correct)
- Assessing decisions after they have been completed.
- Generating possible actions before recognizing a problem.
- Prioritizing community norms over personal values.
Question 2: Which statement is NOT an assumption of rational choice theory?
- Decisions are driven primarily by emotional impulses. (correct)
- Individuals evaluate all costs and benefits before choosing.
- People aim to maximize personal utility.
- Choices result from systematic comparison of alternatives.
Question 3: According to subjective expected utility theory, how is the value of an option determined?
- By summing each outcome’s utility multiplied by its perceived probability. (correct)
- By choosing the option with the highest objective probability.
- By ranking outcomes solely by their monetary value.
- By selecting the alternative with the least risk.
Question 4: In prospect theory, how do people typically behave when confronting potential losses?
- They become risk‑seeking. (correct)
- They become risk‑averse.
- They ignore the loss entirely.
- They treat losses the same as gains.
Key Concepts
Decision-Making Theories
Rational Choice Theory
Subjective Expected Utility Theory
Prospect Theory
Cognitive Biases
Framing Effect
Optimism Bias
Decision Optimization
Scenario Optimization
Moral Decision‑Making Stages
Definitions
Rational Choice Theory
A framework asserting that individuals consistently select options that maximize their personal benefit after weighing all costs and benefits.
Subjective Expected Utility Theory
A decision model where choices are evaluated based on personal utility values and the decision maker’s perceived probabilities of outcomes.
Scenario Optimization
A mathematical approach to decision‑making that seeks optimal solutions while minimizing reliance on subjective judgments.
Framing Effect
A cognitive bias where people’s decisions are influenced by how information is presented, such as describing identical outcomes in terms of gains versus losses.
Prospect Theory
A behavioral economics theory describing how people are risk‑seeking for losses and risk‑averse for gains.
Optimism Bias
The tendency to overestimate the likelihood of positive events and underestimate the likelihood of negative ones.
Moral Decision‑Making Stages
A prescriptive model outlining reflective evaluation of actions during (reflection in action) and after (reflection on action) decision processes.