Why 100% of anything looks good: the appeal of one hundred percent
Meng Li and Gretchen B. Chapman, Rutgers University

Abstract
People overweight certainty, even when certainty is only an illusion. A vaccine that was described as 100% effective against 70% of disease targets was preferred to one described as 70% effective against 100% of disease targets (Studies 1 and 2). The appeal of 100% extends beyond the probability attribute. In Study 2, participants preferred both of the vaccines above to normatively equivalent vaccines that were less than 100% effective toward fewer than 100% of targets. In Study 3, participants preferred a 100% discount on a cup of coffee every 10 days to other more frequent, but lower amount, discounts. This preference evaporated, however, when savings were framed as points rather than as percentage discounts. We propose that people view 100% as a salient reference point and overweight it in those domains where it cannot be exceeded (e.g., probability, discount); the overweighting is weaker in domains where 100% can be exceeded (e.g., target range, points).
Introduction
“Not to be absolutely certain is, I think, one of the essential things in rationality,” wrote the British philosopher Bertrand Russell (1949). People nonetheless tend to overweight outcomes that are considered certain relative to outcomes that are merely possible, a phenomenon that Kahneman and Tversky (1979) labeled the certainty effect.
One famous example of the certainty effect is the Allais paradox (Allais, 1953), shown in Table 1. The modal
response was choosing A in Choice 1, but D in Choice 2, even though Choice 2 is derived by deducting a .89 chance of $1 million from both options in Choice 1. The preference for certainty was demonstrated further by Kahneman and Tversky (1979). As shown in Table 1, participants preferred A to B, but also preferred D to C, even though Choice 2 was derived by dividing the probabilities in Choice 1 by a common ratio. These examples illustrate the general appeal of certainty. Prospect theory (Kahneman & Tversky, 1979) originally described the weight that people attach to probabilities as a function that is shallow for intermediate
probabilities and that changes abruptly near the endpoints of certainty. Subsequent research suggests an inverse S-shaped probability weighting function: concave for low probability and convex for high probability (Camerer & Ho, 1994; Gonzalez & Wu, 1999; Hartinger, 1999; Tversky & Kahneman, 1992; Wu & Gonzalez, 1996). Tversky and Kahneman hypothesized diminishing sensitivity to be the underlying psychological mechanism for the shape of the weighting function: People become less sensitive to changes in probability as they move away from the two natural reference points of 0 (certainly will not happen) and 1 (certainly will happen).
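Both the inverse S-shape and the common-ratio reversal can be checked numerically. The sketch below uses the Tversky and Kahneman (1992) weighting function with their median gains parameter (γ = 0.61, an assumption; this article fixes no parameter value), a linear value function for simplicity, and the classic Kahneman and Tversky (1979) common-ratio gambles, which are assumed here because Table 1 is not reproduced in this excerpt:

```python
def w(p, gamma=0.61):
    """Tversky & Kahneman (1992) probability weighting function.
    gamma = 0.61 is their median estimate for gains (assumed here)."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# Inverse S-shape: low probabilities are overweighted, high ones
# underweighted, with veridical weights only at the endpoints 0 and 1.
assert w(0.0) == 0.0 and w(1.0) == 1.0
assert w(0.01) > 0.01 and w(0.99) < 0.99

# Diminishing sensitivity: the same .01 step in probability matters far
# more next to certainty than in the middle of the scale.
assert w(1.00) - w(0.99) > w(0.50) - w(0.49)

# Classic common-ratio gambles (Kahneman & Tversky, 1979), assumed to
# match Table 1: A = $3,000 for sure vs. B = ($4,000, .80); dividing both
# probabilities by 4 gives C = ($3,000, .25) vs. D = ($4,000, .20).
def decision_value(amount, p):
    return w(p) * amount  # linear value function, for simplicity

# Expected value favors the larger gamble in both choices...
assert 0.80 * 4000 > 1.00 * 3000 and 0.20 * 4000 > 0.25 * 3000
# ...yet the weighting function reproduces the modal pattern: A over B
# (certainty overweighted), but D over C.
assert decision_value(3000, 1.00) > decision_value(4000, 0.80)
assert decision_value(4000, 0.20) > decision_value(3000, 0.25)
```

Because the function is flat in the middle and steep at the ends, dividing both probabilities by a common ratio shrinks the weight of the formerly certain option far more than the weight of the gamble, producing the reversal.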
In the present article, we first demonstrate a real-world application of the pseudocertainty effect on a timely topic: the human papillomavirus (HPV) vaccine. Second, we extend the certainty effect to a general 100% effect. As mentioned earlier, overweighting of certainty can be explained by diminishing sensitivity to probabilities away from the reference point. We demonstrate that 100% on other attributes can be overweighted because of the same principle. Finally, we explore the boundary conditions under which the 100% effect occurs.
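The normative equivalences that the studies turn on are simple arithmetic, checked below with the values from the abstract. The coffee price and the one-cup-per-day purchase rate are illustrative assumptions; this excerpt does not list the exact discount schedules compared in Study 3:

```python
# Studies 1-2: both vaccines prevent the same expected fraction of disease.
coverage_a = 1.00 * 0.70  # 100% effective against 70% of disease targets
coverage_b = 0.70 * 1.00  # 70% effective against 100% of disease targets
assert coverage_a == coverage_b == 0.70

# Study 3: a 100% discount on one coffee every 10 days saves as much as a
# 10% discount on each of those 10 coffees. The price (in integer cents,
# to sidestep float rounding) and purchase rate are assumptions.
price_cents, coffees_per_cycle = 300, 10
free_cup_savings = 100 * price_cents * 1 // 100            # 100% off one cup
small_discount_savings = 10 * price_cents * coffees_per_cycle // 100
assert free_cup_savings == small_discount_savings == 300   # $3.00 either way
```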
Not only do people overweight certainty, they are also subject to an illusion of certainty, even when outcomes are not certain, and they overweight this pseudocertainty relative to other probabilities. For example, Slovic, Fischhoff, and Lichtenstein (1982) demonstrated that people were more attracted to a vaccine described as eliminating a 10% risk for one of two equiprobable diseases than to one described as reducing the risk for one disease from 20% to 10%. This preference violates normative expected utility theory (Savage, 1954), because the net risk reduction is the same across the two versions. The Slovic et al. study demonstrated a pseudocertainty effect: Certainty contingent on a condition (e.g., risk elimination within a specified risk subset) is overweighted, just as true certainty is (Kahneman & Tversky, 1979).
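The equivalence of the two Slovic et al. framings can be verified in a few lines; the numbers come directly from the description above:

```python
# Framing 1 (pseudocertainty): two equiprobable diseases, 10% risk each;
# the vaccine *eliminates* the risk from one of them.
risk_before_1 = 0.10 + 0.10
risk_after_1 = 0.00 + 0.10

# Framing 2: one disease; the vaccine cuts its risk from 20% to 10%.
risk_before_2 = 0.20
risk_after_2 = 0.10

# Net risk reduction is 10 percentage points under either description,
# so expected utility theory treats the two vaccines as identical.
assert risk_before_1 - risk_after_1 == risk_before_2 - risk_after_2 == 0.10
```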
Findings
This 100% effect is consistent with a general principle: Sensitivity to changes diminishes as the distance from the
reference point increases (Tversky & Kahneman, 1992). Arkes (1991) pointed out that nonlinear functions with this attribute save cognitive effort, although errors occur as a side effect. For example, the brain appears to spend less time processing certainty than processing other probabilities. In a study using gambles that either did or did not incorporate a certain outcome, participants’ response times and brain activation both indicated less computation and evaluative processing when certainty was present (Dickhaut et al., 2003). It is possible that people process 100% of other attributes in a similar way, with less computation and more intuition, for better or for worse.
The appeal of 100% has important implications for health promotion, consumer decisions, and public policy. It is another cognitive bias that can intentionally or unintentionally sway decisions, especially because it is subject to a framing effect: A proportion of a whole set is 100% of a subset; risk reduction is elimination of specific risks; a small savings is a 100% discount on a single item. After all, anything can be described as “100% of something.”