Small rewards get people to see the truth in politically unfavorable news


Understanding why so many people are willing to share misinformation online is a major goal among behavioral scientists. It's easy to think that partisanship explains everything: people simply share things that make their side look good or their opponents look bad. But the reality is a bit more complicated. Studies have indicated that many people don't seem to carefully assess the accuracy of links, and that partisanship may be secondary to the rush to get lots of likes on social media. Given this, it's unclear what would get users to stop sharing things that a little verification would reveal to be false.

So a team of researchers tried the obvious: paying people to stop and rate the accuracy of a story. The work shows that small payments, and even minimal non-monetary rewards, boost the accuracy of people's ratings of stories. Almost all of this effect comes from people recognizing that stories that don't favor their political position can still be factually accurate. And while money further improved conservatives' accuracy, they started out so far behind liberals in judging accuracy that a substantial gap remains.

Money for accuracy

The basic outline of the new experiments is quite simple: gather a group of people, ask them about their political leanings, then show them a set of headlines as they would appear on a social networking site such as Facebook. Headlines were scored based on their accuracy (i.e., whether they were true or false) and on whether they would be more favorable to liberals or conservatives.

Consistent with past work, participants were more likely to believe that headlines favoring their political leanings were true. As a result, much of the misinformation rated as true was accepted because it aligned with people's political leanings. While this held on both sides of the political spectrum, conservatives were considerably more likely to rate misinformation as true, an effect observed so often that the researchers cite seven different papers as having previously shown it.

On its own, this sort of replication is useful but not especially interesting. Things got interesting when the researchers started varying the procedure. The simplest variation paid participants a dollar for each story they correctly identified as true.

In news that won't surprise anyone, people got better at figuring out when stories weren't true. In raw numbers, participants got an average of 10.4 accuracy ratings (out of 16) right in the control condition, but more than 11 out of 16 in the payment condition. The effect also appeared when, instead of being paid, participants were simply told that the researchers would give them an accuracy score after the experiment ended.

The most striking thing about this experiment is that almost all of the improvement came when people were asked to rate the accuracy of statements that favored their political opponents. In other words, the reward made people better at recognizing the truth in statements they would, for political reasons, prefer to consider false.

A smaller gap, but a gap nonetheless

The reverse happened when the experiment was modified so that people were asked to identify stories their political allies would like. Here, accuracy dropped. This suggests that participants' mindset played an important role: getting them to focus on politics made them focus less on accuracy. Notably, the effect was about as large as that of the financial reward.

The researchers also created a condition in which participants weren't told the source of a headline, so they couldn't tell whether it came from partisan media. This didn't make a significant difference in the results.

As noted above, conservatives were generally worse at this than liberals, with the average conservative getting 9.3 out of 16 right and the typical liberal 10.9. Both groups saw their accuracy improve when given prompts, but the effect was larger for conservatives, raising their accuracy to an average of 10.1 rig...
