Cognitive Biases & The Judgement of Global Risk
Here is a paper detailing which cognitive biases are most likely to lead to poor judgments of global risk. Given our current “crisis,” I think it's important to understand how we might make psychological mistakes in allocating capital.
Article Introduction (Via Singularity Institute for AI)
All else being equal, not many people would prefer to destroy the world. Even faceless corporations, meddling governments, reckless scientists, and other agents of doom, require a world in which to achieve their goals of profit, order, tenure, or other villainies. If our extinction proceeds slowly enough to allow a moment of horrified realization, the doers of the deed will likely be quite taken aback on realizing that they have actually destroyed the world. Therefore I suggest that if the Earth is destroyed, it will probably be by mistake.
The systematic experimental study of reproducible errors of human reasoning, and what these errors reveal about underlying mental processes, is known as the heuristics and biases program in cognitive psychology. This program has made discoveries highly relevant to assessors of global catastrophic risks. A general principle underlying the heuristics-and-biases program is that human beings use methods of thought – heuristics – which quickly return good approximate answers in many cases; but which also give rise to systematic errors called biases. An example of a heuristic is to judge the frequency or probability of an event by its availability, the ease with which examples of the event come to mind.
Article Excerpts (Via Yudkowsky’s Paper)
1. Availability – Suppose you randomly sample a word of three or more letters from an English text. Is it more likely that the word starts with an R (“rope”), or that R is its third letter (“park”)? Most people judge the first case more likely, because words beginning with R come to mind more easily – yet in English text, R actually occurs more often in the third position.
2. Hindsight Bias – occurs when subjects, after learning the eventual outcome, give a much higher estimate for the predictability of that outcome than subjects who predict the outcome without advance knowledge. Hindsight bias is sometimes called the I-knew-it-all-along effect.
3. Black Swans – are an especially difficult version of the problem of the fat tails: sometimes most of the variance in a process comes from exceptionally rare, exceptionally huge events.
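The fat-tails point above can be made concrete with a small sketch (the numbers here are hypothetical, chosen only for illustration): a process that is ordinary Gaussian noise almost all the time, plus a rare, huge jump. Even though the jump fires one time in a thousand, it supplies most of the total variance.

```python
# Hypothetical process: everyday noise with sigma = 1, plus an independent
# jump of size 100 that occurs with probability 0.001 (a "black swan").
p, jump, sigma = 0.001, 100.0, 1.0

var_noise = sigma ** 2              # variance of the everyday fluctuation
var_jump = jump ** 2 * p * (1 - p)  # variance of the Bernoulli jump term
var_total = var_noise + var_jump    # the two components are independent

share = var_jump / var_total
print(f"The jump fires {p:.1%} of the time "
      f"but contributes {share:.1%} of the variance")
```

With these (made-up) parameters, roughly 91% of the variance comes from an event that almost never happens – which is exactly why a sample that contains no black swan badly understates the risk.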
4. The conjunction fallacy – More generally, people tend to overestimate conjunctive probabilities and underestimate disjunctive probabilities. (Tversky and Kahneman 1974.) That is, people tend to overestimate the probability that, e.g., seven events of 90% probability will all occur.
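The arithmetic behind the seven-events example is worth spelling out: the conjunction of seven independent 90%-likely events is less than an even chance, while the matching disjunction (at least one of seven 10%-likely failures) is better than even.

```python
# Seven independent events, each with probability 0.9 (the paper's example).
p = 0.9
all_seven = p ** 7                     # conjunction: every event occurs
at_least_one_failure = 1 - p ** 7      # disjunction: some event fails

print(f"P(all seven occur)    = {all_seven:.3f}")            # ~0.478
print(f"P(at least 1 failure) = {at_least_one_failure:.3f}") # ~0.522
```

People intuitively rate the conjunction well above 50% and the disjunction well below it – the opposite of the true values.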
5. Confirmation Bias – Subjects free to choose their information sources will seek out supportive rather than contrary sources.
6. Anchoring, Adjustment, & Contamination – Information that is visibly irrelevant still anchors judgments and contaminates guesses. When people start from information known to be irrelevant and adjust until they reach a plausible-sounding answer, they under-adjust. People under-adjust more severely in cognitively busy situations and under other manipulations that make the problem harder.
7. The Affect Heuristic – The affect heuristic refers to the way in which subjective impressions of “goodness” or “badness” can act as a heuristic, capable of producing fast perceptual judgments, and also systematic biases.
8. Scope Neglect – (example) – (2,000 / 20,000 / 200,000) migrating birds die each year by drowning in uncovered oil ponds, which the birds mistake for bodies of water. These deaths could be prevented by covering the oil ponds with nets. How much money would you be willing to pay to provide the needed nets? In the study (Desvousges et al. 1992), the three groups answered roughly $80, $78, and $88 respectively – the response was nearly flat across a hundredfold change in the number of birds.
9. Over-Confidence – applies forcefully to the domain of planning, where it is known as the planning fallacy: people routinely predict that their own projects will finish sooner than they actually do.
10. Bystander Apathy – The bystander effect is usually explained as resulting from diffusion of responsibility and pluralistic ignorance. Being part of a group reduces individual responsibility. Everyone hopes that someone else will handle the problem instead, and this reduces the individual pressure to the point that no one does anything.