In gambling and economics, there’s an observed phenomenon called the favorite-longshot bias.
Here’s how it works …
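In a nutshell, bettors tend to put too much money on long shots and not enough on favorites, so a dollar bet on a favorite loses far less, on average, than a dollar bet on a long shot. Here’s a quick back-of-the-envelope sketch – the win probabilities and odds below are made up for illustration, not real track data.

# Hypothetical illustration of the favorite-longshot bias: the posted odds
# on a long shot imply a win probability well above its true chances,
# because bettors overback it.
# Expected return per $1 bet = true_win_prob * decimal_odds - 1

horses = [
    # (label, true win probability, posted decimal odds)
    ("Heavy favorite",   0.60,  1.60),   # odds imply ~62% -- close to the truth
    ("Mid-field runner", 0.15,  6.00),   # odds imply ~17%
    ("Long shot",        0.02, 26.00),   # odds imply ~4% -- double the true chance
]

for label, p_true, odds in horses:
    implied_p = 1 / odds
    expected_return = p_true * odds - 1
    print(f"{label:18s} true p={p_true:.2f}  implied p={implied_p:.2f}  "
          f"return per $1: {expected_return:+.2f}")

# Typical result: the favorite loses only a few cents per dollar wagered,
# while the long shot loses roughly half of every dollar.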
An interesting piece from the WSJ …
Psychology researchers have studied how people make decisions and concluded there are two basic styles.
“Maximizers” like to take their time and weigh a wide range of options—sometimes every possible one—before choosing.
“Satisficers” would rather be fast than thorough; they prefer to quickly choose the option that fills the minimum criteria (the word “satisfice” blends “satisfy” and “suffice”).
“Maximizers are people who want the very best.
Satisficers are people who want good enough,”
Take the quick test below to see if you’re a maximizer or a satisficer … and see what the implications are.
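For what it’s worth, the two styles map onto two very different search strategies. Here’s a toy sketch of the difference – the option scores and the “good enough” threshold are invented for illustration.

# Toy contrast between the two decision styles described above:
# a "maximizer" scores every option and takes the single best one,
# while a "satisficer" walks the list and stops at the first option
# that clears a "good enough" threshold.

options = {"Option A": 6.5, "Option B": 8.1, "Option C": 7.2,
           "Option D": 9.0, "Option E": 7.8}            # made-up scores

def maximize(scored_options):
    """Evaluate everything, return the best-scoring option."""
    return max(scored_options, key=scored_options.get)

def satisfice(scored_options, good_enough=7.0):
    """Return the first option that meets the minimum criteria."""
    for name, score in scored_options.items():
        if score >= good_enough:
            return name
    return None  # nothing cleared the bar

print("Maximizer picks :", maximize(options))   # Option D (best score: 9.0)
print("Satisficer picks:", satisfice(options))  # Option B (first score >= 7.0)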
Dan Lovallo, a professor and decision-making researcher, says: “Confirmation bias is probably the single biggest problem in business, because even the most sophisticated people get it wrong. People go out and they’re collecting the data, and they don’t realize they’re cooking the books.”
What’s this “confirmation bias” that Lovallo is talking about?
No surprise, people tend to seek out information that supports their existing beliefs.
You know, liberals watch MSNBC, read the NY Times, and listen to BBC podcasts; conservatives watch FOX, read the WSJ, and listen to Rush.
Behavioral psychologists call the dynamic “confirmation bias”.
=====
In socio-politics, confirmation bias tends to harden polarized positions: people just gather debate fodder rather than probing both sides of an issue.
In the realm of decision making, confirmation bias has a dysfunctional effect: it leads to bad decisions.
I know that Andy Grove of Intel says “only the paranoid survive”.
But, work relationships are sometimes corrupted by negative assumptions that take on a life all their own.
A jabrone speaks out against your idea in a meeting, and you naturally assume that he’s trying to sabotage you or embarrass you in front of the boss.
If this situation happens a couple of times, you might declare war and go on the offensive to neutralize or defeat him.
To interrupt this cycle, some organizational leaders urge their employees to “assume positive intent.”
Here’s an interesting study excerpted from Kahneman’s Thinking, Fast and Slow …
Let’s start with some background … straight from Wiki:
The “seven-year itch” is a psychological term that suggests that happiness in a relationship declines after around year seven of a marriage.
The phrase was first used to describe an inclination to become unfaithful after seven years of marriage in the play The Seven Year Itch by George Axelrod, and gained popularity following the 1955 film adaptation starring Marilyn Monroe and Tom Ewell.
The phrase has since expanded to indicate cycles of dissatisfaction not only in interpersonal relationships but in any situation such as working a full-time job or buying a house, where a decrease in happiness and satisfaction is often seen over long periods of time.
* * * * *
OK, so is the 7-year itch just folklore, or is it for real?
We’ve all been there …
We’re in a meeting, pitching an idea, when some jabrone pipes in:
“Let me play the role of devil’s advocate …”
He then blasts your idea with half-baked criticisms.
As you aggressively defend your cherished idea, he backs off:
“Hey man, I’m just playing devil’s advocate”.
“Say what? You mean you just made up those cheap shots?”
I’ve been reading books on decision making this summer.
A couple have praised the use of so-called devil’s advocates to validate ideas and arguments.
Here’s what they’re talking about …
A classic “framing” question from Kahneman’s Thinking, Fast and Slow …
Here’s the situation:
A woman has bought two $80 tickets to the theater.
When she arrives at the theater, she opens her wallet and discovers that the tickets are missing.
$80 tickets are still available at the box office.
Will she buy two more tickets to see the play?
Most (but not all) survey respondents answer that the woman will go home without seeing the show.
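The standard explanation (Kahneman calls it “mental accounting”): the arithmetic is the same either way, but the lost tickets get charged to the theater account, so re-buying makes the evening feel like a $320 play. A quick back-of-the-envelope look, just restating the numbers above:

# Back-of-the-envelope look at the ticket problem above.
# Objectively, the $160 already lost is sunk either way; the only live
# question is whether seeing the play tonight is worth another $160.
# But "mental accounting" posts both purchases to the same account,
# so the evening now feels like a $320 play.

ticket_price = 80
tickets = 2

sunk_loss = ticket_price * tickets         # the lost tickets: $160, gone regardless
cost_of_decision = ticket_price * tickets  # the actual choice on the table: $160
felt_cost = sunk_loss + cost_of_decision   # what the "theater account" registers

print(f"Economically relevant cost of seeing the play: ${cost_of_decision}")
print(f"What the evening now feels like it costs:      ${felt_cost}")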
Let’s try another situation …
Behavioral theorists have long observed that most people are risk averse and, due in part to an “endowment effect”, they “value” losses more heavily than comparable gains.
Endowment Effect: People tend to ascribe a higher value to things that they already own than to comparable things that they don’t own. For example, a car-seller might think his sleek machine is “worth” $10,000 even though credible appraisers say it’s worth $7,500. Sometimes the difference is due to information asymmetry (e.g. the owner knows more about the car’s fine points), but usually it’s just a cognitive bias – the Endowment Effect.
The chart below illustrates the gains & losses concept.
=====
For example, would you take any of these coin flip gambles?
Most people pass on #1 and #2, but would hop on #3 and #4.
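That pattern is easy to reproduce with a simple loss-averse valuation, where losses are felt roughly twice as strongly as gains. The four gambles below are illustrative stand-ins – not the ones from the chart – and the loss-aversion coefficient of 2 is a ballpark figure from the behavioral literature.

# Why an even-odds coin flip can look unattractive even when its expected
# value is positive: losses loom larger than gains. Gains count at face
# value; losses are multiplied by a loss-aversion coefficient (~2).

LOSS_AVERSION = 2.0

def felt_value(amount):
    """Subjective value of a dollar outcome for a loss-averse decision maker."""
    return amount if amount >= 0 else LOSS_AVERSION * amount

def evaluate_coin_flip(win, lose):
    """50/50 gamble: win `win` dollars on heads, lose `lose` dollars on tails."""
    expected_value = 0.5 * win - 0.5 * lose
    felt = 0.5 * felt_value(win) + 0.5 * felt_value(-lose)
    return expected_value, felt

# Illustrative gambles (stand-ins, not the ones charted in the post)
gambles = [(100, 100), (150, 100), (250, 100), (400, 100)]

for win, lose in gambles:
    ev, felt = evaluate_coin_flip(win, lose)
    verdict = "take it" if felt > 0 else "pass"
    print(f"Win ${win} / lose ${lose}: EV = ${ev:+.0f}, felt value = {felt:+.0f} -> {verdict}")

# With a coefficient of 2, the 50/50 flip doesn't feel worth taking until
# the potential gain is more than about twice the potential loss.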
OK, now let’s show how all of this relates to ObamaCare.
Here’s a classic test of intuitive skills excerpted from Daniel Kahneman’s Thinking, Fast and Slow …
As you consider this question, please assume that Steve – the subject – was selected at random from a representative sample.
Steve has been described by a neighbor as follows: “Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.”
* * * * *
Is Steve more likely to be a librarian or a farmer?
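The pull of the description is “representativeness” – Steve sounds like a librarian. What intuition tends to skip is the base rate: male farmers vastly outnumber male librarians (Kahneman cites a ratio of more than 20 to 1). A rough Bayes-style sketch, with the “fits the description” probabilities invented for illustration:

# Base rates vs. resemblance. Roughly 20 male farmers for every male
# librarian (the ratio Kahneman cites); the "fits the description"
# probabilities below are invented for illustration.

farmers_per_librarian = 20
p_fits_if_librarian = 0.40   # guess: 40% of librarians match the sketch
p_fits_if_farmer = 0.10      # guess: only 10% of farmers match it

# Out of 21 random men (20 farmers + 1 librarian), how many fit the sketch?
matching_librarians = 1 * p_fits_if_librarian                 # 0.4
matching_farmers = farmers_per_librarian * p_fits_if_farmer   # 2.0

p_librarian = matching_librarians / (matching_librarians + matching_farmers)
print(f"P(librarian | fits the description) ~ {p_librarian:.0%}")

# ~17%: even though the sketch "sounds like" a librarian, the base rate
# makes Steve far more likely to be a farmer.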
Interesting study on cognitive biases from Daniel Kahneman’s Thinking, Fast and Slow …
Patients undergoing a painful medical procedure – think colonoscopy without anesthesia – recorded their pain levels during the procedure on a scale from no pain (zero) to excruciating (10).
Some of the procedures were short in duration … others were longer.
Below is the pain chart for 2 representative patients.
The patients were asked – after the fact – how painful the procedure was.
What’s your bet? Which patient claimed to have undergone the more painful procedure?
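For context, Kahneman’s finding is usually summarized as the “peak-end rule”: remembered pain tracks the worst moment and the final moments, while total duration is largely neglected. Here’s a sketch with made-up pain traces standing in for the two charted patients.

# The "peak-end rule": the remembered rating is predicted by the average
# of the peak pain and the pain at the very end; duration barely matters.
# The two series below are made up, but mimic the classic pattern:
# Patient A's procedure is short and ends at its peak; Patient B's runs
# longer (more total pain) but tapers off before it ends.

def peak_end_score(pain_by_minute):
    """Predicted retrospective rating: average of peak pain and final pain."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

patient_a = [2, 5, 8]                      # short, ends abruptly at the peak
patient_b = [2, 5, 8, 7, 5, 4, 3, 2, 1]    # longer, pain trails off at the end

for name, series in [("Patient A", patient_a), ("Patient B", patient_b)]:
    print(f"{name}: total pain = {sum(series):2d}, "
          f"predicted remembered pain = {peak_end_score(series):.1f}")

# Patient B endures far more total pain, yet the peak-end prediction (and
# patients' actual reports) says A remembers the worse experience.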
According to Chip & Dan Heath in the Rotman Management article “The 4 Villains of Decision Making” …
“Research in Psychology over the last 40 years has identified a broad set of biases in our thinking that doom our decision making. If we aspire to make better choices, we must learn how these biases work and how to fight them.”
According to the Heath Brothers – academics & popular authors – there are 4 decision-making villains that have to be confronted.
I often confess to my students that I figured only about 51% of my business decisions were correct … and that was despite my analytical predisposition and the benefit of a highly proficient team of managers and analysts.
Apparently, my record was about on par.
According to decision scientists Chip & Dan Heath in Decisive: How to Make Better Choices …
I’ll explain the picture later, but first, the back story.
A couple of interesting dots got connected last week.
First, I started watching The Voice.
I liked the talent and the bantering among the coaches, but wondered why they used the turning-chairs gimmick. You know, the judges can’t see the performers; they can only hear them.
It became apparent when Usher turned his chair and was surprised to see that the high-pitched soul singer was a big white guy.
Hmmm.
=====
Second, for the course I’m currently teaching, I’ve been reading a book called The Art of Thinking Clearly – a series of short essays on cognitive biases, those sneaky psychological effects that impair our decision-making.
Since psychological studies first began, people have given themselves top marks for most positive traits.
While most people do well at assessing others, they are wildly positive about their own abilities.
The phenomenon is known as illusory superiority.
Illusory superiority is everywhere
Ironically, the most incompetent are also the most likely to overestimate their skills, while the ace performers are more likely to underrate themselves.
Psychologists say illusory superiority happens for several reasons:
The remedy for illusory superiority?
Since people are generally more accurate in assessing other people (than assessing themselves), get — and take to heart — constructive criticism from others.
Yeah, right.
Source: Why We’re All Above Average
* * * * *
President Obama is clearly perplexed about why the dogs aren’t eating the ObamaCare food.
He’s trying to give people a better “product” … and they just don’t get it.
What the heck is going on?
Well, shoving the roll-out snafus aside, much of the answer lies in good old behavioral economics.
Last week we talked “loss aversion” and the “endowment effect”.
======
Today, let’s look at the “developer’s curse” …
======
Let me explain …
Answer: Real valuable.
A perennial question for Ivy-aimed high-schoolers is: which is better, an A in a regular course or a B in an AP course?
Admissions officers always say, “Take the AP course and get an A in it.”
Easier said than done sometimes.
Fast forward to college and b-school admissions.
If you want to get into a highly ranked b-school, is it better to get average grades in hard courses at an academically challenging college … or high grades in easier courses at an easier or grade-inflated school?
Here’s the answer …
OK, here’s a test for you …
Rank the following by the odds that somebody in the group, or exposed to the risk, is likely to die.
Make #1 the highest risk of dying in the next year; make #7 the lowest-risk circumstance.
And the answer is …
The effect is called “anchoring” … and it’s a well known cognitive bias.
When somebody is “primed” with a number, they will tend to internalize it and subconsciously anchor their mind on the number.
Any estimates they then make are more often than not fine-tuning adjustments around the anchor point.
“Any number that you are asked to consider as a possible solution to an estimation problem will induce an anchoring effect.”
For example, researchers consistently find that home appraisals and offer bids are invariably influenced by listing prices … even if objective, professional agents are involved … and even if they’re explicitly told to ignore the listing price.
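One common way to describe the mechanism is “anchor and (insufficiently) adjust”: the estimate starts at the supplied number and only partially corrects toward what the evidence alone would support. A toy sketch – the listing prices, the evidence-based value, and the adjustment fraction are all invented for illustration.

# Toy "anchor and insufficiently adjust" model of the appraisal example:
# the appraiser starts from the listing price and only moves part of the
# way toward the value the evidence alone would support.

ADJUSTMENT = 0.5   # fraction of the gap actually corrected (insufficient: < 1.0)

def anchored_appraisal(listing_price, evidence_value, adjustment=ADJUSTMENT):
    """Appraisal pulled toward the listing price it was supposed to ignore."""
    return listing_price + adjustment * (evidence_value - listing_price)

evidence_value = 300_000   # what comps and condition alone would support

for listing_price in (250_000, 300_000, 350_000):
    appraisal = anchored_appraisal(listing_price, evidence_value)
    print(f"Listing ${listing_price:,} -> appraisal ${appraisal:,.0f}")

# Same house, same evidence -- yet a higher listing price drags the
# appraisal up and a lower one drags it down: the anchoring effect the
# researchers keep finding, even with professional agents.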
Anchoring effects explain why, for example, arbitrary rationing is an effective marketing ploy.
A few years ago, supermarket shoppers in Sioux City, Iowa, encountered a sales promotion for Campbell’s soup at about 10% off the regular price.
On some days, a sign on the shelf said limit of 12 per person.
On other days, the sign said no limit per person. Shoppers purchased an average of 7 cans when the limit was in force, twice as many as they bought when the limit was removed.
Anchoring is not the sole explanation.
Rationing also implies that the goods are flying off the shelves, and shoppers should feel some urgency about stocking up.
But we also know that the mention of 12 cans as a possible purchase produces an anchoring effect all on its own.
So, to boost sales, tell customers that there’s a limit on the number of items they can buy.
They’ll get anchored on the limiting number … and often buy up to the limit.
The same effect occurs when products are priced as multiples … say, 3 for $6.
Shoppers will tend to buy 3, even if the retailer is only charging $2 each regardless of how many are bought.
Excerpted from Kahneman, Thinking, Fast and Slow