It’s unwise to rely on one’s instincts to decide when to rely on one’s instincts, redux
We’ve written frequently on the subject of cognitive biases and how to design decision-making processes to account for them. A good process entails astute management of the social, political, and emotional aspects of decision making, and it addresses, or at least acknowledges, the underlying biases of the participants.
We recently came across this piece in the archives at HBS Working Knowledge, which introduces research on the “fundamental attribution error” (the bias behind snap judgments) and how resistant that bias is to cures. Apparently it is so deeply rooted in our decision-making processes that even highly trained people, warned explicitly of its dangers, remain susceptible.
People make snap judgments all the time. That woman in the sharp business suit must be intelligent and successful; the driver who just cut me off is a rude jerk.
These instant assessments, when we attribute a person’s behavior to innate characteristics rather than external circumstances, happen so frequently that psychologists have a name for them: “fundamental attribution errors.” Unable to know every aspect of a stranger’s back-story, yet still needing to make a primal designation between friend and foe, we watch for surface cues: expensive pants—friend; aggressive driving—foe.
The research looks at highly trained professionals – college admissions officers and hiring managers – and reports “how difficult it was to counteract the fundamental attribution error, and, particularly, how strongly its effects could be seen in these records.”
The first study asked professional university admissions officers to evaluate nine fictional applicants, whose high schools were reportedly uniform in quality and selectivity. Only one major point of variance existed between the schools: grading standards, which ranged from lenient to harsh. Predictably, students from “lenient” schools had higher GPAs than students from “harsh” schools—and, just as predictably, those fictional applicants got accepted at much higher rates than their peers.

“We see that admissions officers tend to pick a candidate who performed well on easy tasks rather than a candidate who performed less well at difficult tasks,” says Gino, noting that even seasoned professionals discount information about the candidate’s situation, attributing behavior to innate ability.
The second study produced similar results. Here the researchers asked business executives to evaluate twelve fictional candidates for promotion. In this scenario, certain candidates had performed well at an easier job (managing a relatively calm airport), while others had performed less well at a harder job (managing an unruly airport).
As with the admissions officers, the executives consistently favored employees whose performance had benefited from the easier situation—which, while fortuitous for those lucky employees, can be disastrous on a company-wide scale. When executives promote employees based primarily on their performance in a specific environment, a drop in that employee’s success can be expected once they begin working under different conditions, Gino explains…
“We thought that experts might not be as likely to engage in this type of error, and we also thought that in situations where we were very, very clear about [varying external circumstances], that there would be less susceptibility to the bias,” she says. “Instead, we found that expertise doesn’t help, and having the information right in front of your eyes is not as helpful.”
The researchers do not yet have hiring recommendations to offer, but we might have one in The Library at St. Pete: Who: The A Method for Hiring by Geoff Smart and Randy Street. The book outlines a hiring process that reduces the risk of making a bad hire, the costs of which can be great.