Sunday, 26 April 2015

An experiment on overconfidence


A nice experiment was presented at NIBS last week – Zahra Murad, "Confidence Snowballing in Tournaments". (No paper available yet.)

The talk started from a psychological idea: people are sometimes overconfident about their own performance, and get more overconfident after a few successes. This might explain, say, the overweening confidence of CEOs and "Masters of the Universe" bankers.

In the experiment, subjects competed, in pairs, on a task which could be either easy or difficult. Then winners were matched with other winners and played again – like a football tournament. (Losers were matched with other losers.) The winners of the second round played each other again, and so on.

Before each round, subjects had to bet on their own performance. From this, we can learn what chance they gave themselves of winning the round.

The beauty of this design is that having won a round tells you nothing about your chance of winning the next one, because you will be playing someone else who has also just won! So, a reasonable person would not get more confident after winning a round.*
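
To make that intuition concrete, here is a minimal simulation sketch (my own illustration, not from the paper). It assumes every player's skill is drawn from the same prior and that a match is won by whoever draws the higher skill-plus-noise. Winning round one is evidence that your skill is above average, but your round-two opponent has exactly the same evidence about themselves.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # simulated pairs of round-1 matches

def beats(a, b):
    # Match rule (an assumption): the higher skill-plus-noise draw wins.
    return a + rng.standard_normal(a.shape) > b + rng.standard_normal(b.shape)

# Everyone's skill is drawn from the same prior (standard normal here).
you, your_rival = rng.standard_normal(N), rng.standard_normal(N)
opp, opps_rival = rng.standard_normal(N), rng.standard_normal(N)

# Round 1: you and your future opponent each play a random rival.
you_won = beats(you, your_rival)
opp_won = beats(opp, opps_rival)
both_won = you_won & opp_won  # winners are rematched with winners

print("P(win round 1):                   ", you_won.mean())        # ~0.50
print("mean skill of round-1 winners:    ", you[both_won].mean())  # > 0
print("P(beat another winner in round 2):",
      beats(you[both_won], opp[both_won]).mean())                  # ~0.50
```

The first and last numbers should both come out near 0.5, even though the middle one confirms that winners really are more skilled than average: whatever you learn about yourself from winning, your next opponent has learned the same thing about themselves.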

In fact, on easy tasks, winners did get more confident. On difficult tasks, losers got less confident, which is just as wrong and for much the same reason: losers are rematched against other losers, so losing tells you nothing about the next round either.

What's good about this experiment?
  • It is real "behavioural economics"
It puts an insight from psychology, the hard-easy effect, into a social setting. So, it is not just psychology. But it is theory-driven: it is not trying to draw inferences about a real social situation directly from behaviour in a feeble laboratory pastiche of that setting.
  • It uses the lab to create an elegant simplification
In the real world, it's hard to judge how much somebody's confidence ought to increase after, say, making millions off a deal. In the experiment's stripped-down paradigm, there's a natural baseline: every time you win, you are rematched against other winners, so you should not get more confident.

In experiments, just as in formal models, "it's realistic" is a terrible reason to add a feature. These guys put in only what was needed.
  • It makes a nice parable...
Among the many things lab experiments can do – test theory, estimate psychological characteristics, explore institutions – is to serve as "parables" or "existence proofs".

Greek parables, like that of Icarus flying too close to the sun, have survived the centuries because they tell us about recurring patterns, helping us to recognize them in life and history. The Icarus myth's pattern is: hubris, insane arrogance, leads to nemesis, divine revenge. Hubris and nemesis are still all around us (hullo neo-cons! hullo Eurozone!). You would not expect them always to happen – imagine a foolish political scientist estimating the per cent prevalence of hubris in international relations – but it is useful to know that they can.

Experiments can be modern parables.  Zimbardo's prison guards and Milgram's torturers have entered the folklore. They don't always apply, but they are things that can happen. (Hence "existence proof": an experiment shows, irrespective of external validity, that something has happened at least once.)
  • ... and fleshes it out
Parables are fine and underrated, but social science must investigate phenomena, not just exhibit them. The experiment shows that social interactions can cause overconfidence, and it also explores when that happens (easy tasks) and when the opposite, underconfidence, appears instead (difficult tasks). This opens up some avenues for real-world study. (If only easy tasks make people overconfident, then what explains the presumed overconfidence of very successful people whose work seems quite difficult, like investment bankers?)

I see a lot of experiments and think either "this was obvious" or "I don't know what behaviour here tells us about the real world". This one clears both hurdles.


* Nerd note: there could be some set of priors for which a Bayesian updater would get more confident. So, increasing confidence is not necessarily irrational in the technical sense, just "unreasonable" in a common-sense way.
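
For example (my own toy construction, not anything from the talk): suppose you believe your own ability is drawn from a more dispersed distribution than everyone else's. Winning is then stronger evidence about you than your opponent's win is about them, and the rational chance you give yourself of beating another winner rises above 50%:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

def beats(a, b):
    # Same match rule as before: higher skill-plus-noise wins.
    return a + rng.standard_normal(a.shape) > b + rng.standard_normal(b.shape)

# Asymmetric priors: your ability ~ N(0, 4), everyone else's ~ N(0, 1).
you = 2.0 * rng.standard_normal(N)
opp, your_rival, opps_rival = rng.standard_normal((3, N))

you_won = beats(you, your_rival)
both_won = you_won & beats(opp, opps_rival)

print("P(win round 1):                   ", you_won.mean())  # ~0.50
print("P(beat another winner in round 2):",
      beats(you[both_won], opp[both_won]).mean())             # > 0.50
```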
