Tversky & Kahneman (1974) emphasized the use of heuristics to compensate for a lack of information
Strategies that can be applied
easily to a wide variety of
situations and often lead to
reasonable decisions
They provide plausible conjectures, but not irrefutable conclusions.
Availability Heuristic
Decisions based on the most "available" memories
Notes:
Judgments based on the ease with which relevant instances can be retrieved from memory.
– E.g., estimate in 7 seconds how many flowers, or how many Russian novelists, you could name in two minutes.
Preference for recent anecdotal evidence
Is the letter 'r' more commonly the first or the third letter in words?
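A quick way to check this claim empirically is to count letter positions over a word list. A minimal Python sketch, assuming a newline-delimited word list at /usr/share/dict/words (the path and word list are assumptions for illustration, not part of the slides):

```python
# Count how often 'r' appears as the first vs. third letter of English words.
# Assumes a newline-delimited word list at /usr/share/dict/words (an assumption).

def count_positions(path="/usr/share/dict/words", letter="r"):
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if len(word) >= 3 and word.isalpha():
                first += word[0] == letter
                third += word[2] == letter
    return first, third

if __name__ == "__main__":
    first, third = count_positions()
    # 'r' is typically more common in the third position, even though
    # words that *start* with 'r' are easier to retrieve from memory.
    print(f"first: {first}, third: {third}")
```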
Media coverage makes certain causes of death seem more likely than others
Generally we are less confident in decisions when asked to produce more arguments in their support (Bless & Pham, 2011).
Representativeness Heuristic
If something or someone appears to fit a category, you will use what you know about that category to make judgments.
E.g., coin-flip sequences and subset (conjunction) judgments (see the sketch after this note block)
We should take statistics seriously
Notes:
We aren’t good with probabilities.
– Overconfidence takes over and we tend to think we can
beat the odds
– “statistics happen to other people.”
In risky financial markets this can get people into a lot of trouble.
E.g., most people lose their money in futures markets
– but the spectacular profits that can be gained draw in people who believe they will be the ones to win.
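To make the coin-flip point concrete: every specific sequence of six fair flips is equally likely, even though a mixed sequence looks more "representative" of randomness. A minimal Python sketch; the two target sequences are illustrative assumptions:

```python
import random

# Every specific sequence of 6 fair coin flips has probability (1/2)**6 ≈ 0.0156,
# but the mixed sequence "looks" more random, so people judge it as more likely.
TARGETS = ["HTHHTT", "HHHHHH"]  # illustrative sequences

def estimate(seq, trials=200_000):
    hits = 0
    for _ in range(trials):
        flips = "".join(random.choice("HT") for _ in range(len(seq)))
        hits += flips == seq
    return hits / trials

if __name__ == "__main__":
    for seq in TARGETS:
        # Both estimates hover around 0.0156.
        print(seq, round(estimate(seq), 4))
```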
Biases
Overconfidence
Confidence in decisions climbs as more information is obtained, even if the information is dubious
Notes:
This bias is greater in more difficult tasks.
– Estimating our potential productivity.
– "I can do the assigned paper in 3 hours, no problem."
However, an under-confidence bias may be even more problematic.
– May never make any decisions.
Loss Aversion
We weigh the prospect of losses more heavily: we sell gains and hold losses.
Kahneman and Tversky called this prospect theory
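A minimal sketch of the prospect-theory value function, which weighs losses more heavily than equivalent gains. The parameter values below are commonly cited estimates from Tversky & Kahneman's later work and are used here only for illustration:

```python
# Prospect-theory value function: losses loom larger than equivalent gains.
# Parameters are commonly cited estimates (Tversky & Kahneman, 1992),
# included here purely for illustration.
ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x):
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

if __name__ == "__main__":
    # A $100 loss feels roughly twice as bad as a $100 gain feels good.
    print(value(100))    # ≈ 57.5
    print(value(-100))   # ≈ -129.5
```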
The Endowment Effect
Place a higher value on what's mine. – The bias may be adaptive because losses could threaten survival.
Framing of the Problem
Framing Effects
We judge choices by comparing them to
others in the same category
Marketers often include a "decoy" product that no one wants to make other options look better
We tend to ignore base
rates, even when stated
explicitly
Notes:
Implications of base-rate analysis (see the sketch after this note block):
Testing the whole population for HIV may kill more people than it saves.
Should you get a full-body scan that randomly looks for many diseases?
– An initial diagnosis effectively raises the base rate, which makes specific follow-up tests more accurate.
Should we develop nation-wide databases for fingerprints and DNA?
– Only if we understand their limitations.
– E.g., a man from the US state of Oregon whose fingerprints wrongly matched prints from the 2004 Madrid train bombing.
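A minimal worked example of why the base rate matters for population-wide screening. The prevalence, sensitivity, and specificity figures are illustrative assumptions, not figures from the slides:

```python
# Bayes' rule for a screening test: even a very accurate test yields mostly
# false positives when the base rate (prevalence) is low.
def positive_predictive_value(prevalence, sensitivity, specificity):
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

if __name__ == "__main__":
    # Illustrative numbers: 0.1% prevalence, 99% sensitivity, 99% specificity.
    print(positive_predictive_value(0.001, 0.99, 0.99))  # ≈ 0.09: only ~9% of positives are true
    # If an initial diagnosis raises the base rate to 10%, the same test is far more informative.
    print(positive_predictive_value(0.10, 0.99, 0.99))   # ≈ 0.92
```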
We favor the guaranteed option when it is framed as a gain and the risky option when it is framed as a loss.
Framing affects important decisions, like organ donation: opt-in vs. opt-out.
Evidence of the effect of unimportant information
Influence of faulty information
Distortions in Judgments
Status Quo Bias
may be maintained by Loss Aversion
Adaptivity
We have limited memory, cognitive capacity, and time, so we make the best decisions we can rather than the best decisions possible.
We pick up a lot of valid information from the environment
Problems with Expected Utility theory
Notes:
Often doesn't fit the empirical data. – Leads to various paradoxes.
– The "sunk cost" fallacy
Probabilities and utilities may be subjective, based on our own experience.
– They could represent individual beliefs.
– Savage (1954) developed subjective expected utility theory.
Can think of expected utility theory as a normative theory
– what people should do, given certain assumptions (see the sketch below).
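A minimal sketch of expected utility as a normative benchmark: weight each outcome's utility by its probability and pick the option with the higher sum. The gambles and the log utility function are illustrative assumptions:

```python
import math

# Expected utility: EU(option) = sum of p_i * u(x_i) over the option's outcomes.
def expected_utility(outcomes, u):
    return sum(p * u(x) for p, x in outcomes)

if __name__ == "__main__":
    u = lambda wealth: math.log(wealth)      # illustrative risk-averse utility
    sure_thing = [(1.0, 1_000)]              # $1,000 for certain
    gamble = [(0.5, 2_100), (0.5, 1)]        # 50/50 between $2,100 and $1
    # The gamble has higher expected *money* but lower expected *utility*,
    # so a risk-averse agent should normatively take the sure thing.
    print(expected_utility(sure_thing, u))   # ≈ 6.91
    print(expected_utility(gamble, u))       # ≈ 3.83
```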