This is the third installment in a short series about inequality. The first presented the logic of the evolutionary origins of inequality and its relationship to violence. The second showed how inequality leads to risk-taking. Today’s concerns something more blatantly political: how inequality incites people to manipulate public information for personal gain.
There’s a good chance that you’re familiar with the Homeland Security Advisory System (HSAS), even if you don’t know it by name. Replaced in 2011 by the National Terrorism Advisory System, it was that color-coded notification system developed in the wake of 9/11 that ostensibly functioned to alert the public to a probable terrorist threat to the United States. Green indicated a low risk of attack; blue, a general risk; yellow, an elevated risk; orange, a high risk; and red, a severe risk. In spite of the supposed “generality” of the blue level, the advisory had not once dropped below an “elevated” yellow in its nine years of existence. Instead, it vacillated between yellow and orange for the most part, though it did strike red in 2006.
If you ever had the suspicion that the HSAS was used, at least in part, as a political tool by the government that developed it, you certainly weren’t alone. Even Tom Ridge, the Secretary of Homeland Security from 2003 to 2005 and steward of the HSAS during this time, alleged that senior members of the George W. Bush White House had tried to massage the advisory scheme in their favor prior to the 2004 election. But does this sort of subterfuge work?
Robb Willer, Pat Barclay, and Stephen Benard have given us good reasons to think that it does. As an advisory system, the HSAS is subject to what biologists call the problem of honest signaling: parties with conflicting interests have no automatic guarantee of conveying truthful (i.e. accurate) information to one another. The underlying logic, in evolution just as in economics, is cost-benefit: a behavior, deceptive signaling included, pays whenever its benefit exceeds its cost, and signals stay honest only when faking them costs more than it gains.
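To make the cost-benefit logic concrete, here is a back-of-the-envelope sketch. The function, its parameters, and the numbers are illustrative assumptions, not figures from any of the studies discussed:

```python
def expected_gain(benefit: float, cost: float,
                  p_detect: float, penalty: float) -> float:
    """Expected net gain from sending a deceptive signal: pay the
    signal's cost up front, collect the benefit, and risk a penalty
    with probability p_detect if the deception is exposed."""
    return benefit - cost - p_detect * penalty

# A cheap-to-send alert whose accuracy the audience cannot check:
print(expected_gain(benefit=10.0, cost=0.5, p_detect=0.0, penalty=100.0))
# 9.5 -- deception pays

# The same alert when the audience can verify it half the time:
print(expected_gain(benefit=10.0, cost=0.5, p_detect=0.5, penalty=100.0))
# -40.5 -- honesty is the safer bet
```

Signals stay honest exactly when this quantity is negative for would-be deceivers; an audience that cannot verify a claim pushes the detection term toward zero.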
Which brings us back to the possibility of tampering with the HSAS. To estimate the actual threat level requires information unavailable to the average Jane or John. Terrorism risk data are collected and analyzed by various intelligence agencies.
This information is closely guarded and, when made more widely known, considerably distilled. Consequently, when a terror alert is raised, the public is hardly in a position to know the difference between a real threat and the appearance of one. The costs of deception, in other words, can be very small indeed.
And there are benefits. In 2004, Willer analyzed the effect that raising the threat level from yellow to orange had on presidential approval ratings as measured by Gallup polls. In conjunction with other public terrorist attack advisories, Willer found that Mr. Bush’s approval increased with the advertised threat level. Amazingly, even President Bush’s economic approval ratings appeared to increase with terrorism warnings, despite there being no real connection between the two.
Willer’s finding is bolstered by other work demonstrating that simply reminding participants of 9/11 was enough to boost support for the former President. But this sort of thing may be a part of a larger pattern: our tendency to cooperate within groups when faced with outside challenges, such as threats from other groups or from natural disasters (see here and here for a few examples).
Given this, Barclay and Benard sought to study whether people might take advantage of systems like the HSAS when they have something to gain and little to lose. The details of their beautifully crafted experiments are below, but here’s a synopsis. The experiments entailed:
- a task that measured cooperation among participants by contributions to their group;
- inequality of income;
- a costly, hard-to-predict, external threat to the group; and
- a manipulable warning system of the risk of this threat.
Specifically, participants played a series of Public Goods Games—a “tragedy of the commons” wherein players could selfishly keep money for themselves or selflessly contribute it, at a cost, to their group. Participants were informed that one member of each group would be given more money to play with in each round than the others; this was the “high-ranking” player, and the other players were the “low-ranking” ones. In the contestable rank condition, players could withhold their funds from the group to try to grab (or keep) that top spot, but in the random rank condition, players were randomly assigned that role each round.
Whatever money participants contributed to the group not only helped the other players, but also protected the group from the threat of “failure” (determined at random by a computer before each round): if a group failed, then the players lost all the money they had previously earned. Participants had no idea what the actual risk of failure was and, therefore, how much they ought to contribute to the group to protect against it. However, they could pay to raise or lower the announced threat level to the group. This announced level wouldn’t change the actual risk, but it could change what the players thought the risk might be.
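The round structure described above can be sketched in a few lines of code. The multiplier, baseline failure risk, and protection rate below are invented for illustration; they are not Barclay and Benard’s actual parameters:

```python
import random

MULTIPLIER = 1.5   # pooled contributions are multiplied, then shared equally
FAIL_BASE = 0.5    # assumed baseline chance that the group "fails"
PROTECTION = 0.01  # each unit contributed shaves this much off the risk

def play_round(endowments, contributions, rng):
    """One public-goods round: each player keeps their endowment minus
    their contribution, plus an equal share of the multiplied pot;
    pooled contributions also lower the chance that everyone loses
    this round's earnings."""
    pot = sum(contributions) * MULTIPLIER
    share = pot / len(endowments)
    payoffs = [e - c + share for e, c in zip(endowments, contributions)]
    fail_risk = max(0.0, FAIL_BASE - PROTECTION * sum(contributions))
    failed = rng.random() < fail_risk
    return ([0.0] * len(payoffs) if failed else payoffs), failed

# One high-ranking player (a larger endowment) and three low-ranking ones:
payoffs, failed = play_round([30, 10, 10, 10], [5, 5, 5, 5],
                             random.Random(42))
```

Note that with equal absolute contributions, the high-ranking player contributes a smaller share of their endowment yet collects the same share of the pot—exactly the asymmetry the rank manipulation sets up.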
What Barclay and Benard found is astonishing. Increasing the announced threat level led to more cooperation within groups, although high-ranking players contributed proportionately less than did low-ranking players.
High-ranking players spent more money than their low-ranking counterparts to raise the announced threat level.
This effect of rank on deception was magnified by the contestability of the higher rank. Moreover, players cooperated less when rank was contestable, leading groups to fail more often.
Does this seem familiar?
We will probably never know whether terror alerts have been distorted by political influence. Nevertheless, the findings above provide strong support for the Orwellian argument that those in power will seek to deceive those below them in an effort to safeguard their positions. Worse, this kind of manipulation often has its intended effect, which is to mollify the unlucky majority and preserve the perks of a charmed minority.