New Research Shows How Meta's Algorithms Shaped Users' Election Feeds in 2020

Nearly three years ago, Meta announced that it was partnering with more than a dozen independent researchers to study the impact of Facebook and Instagram on the 2020 election. Meta and the researchers both promised that the project, which would draw on troves of internal data, would provide an independent look at issues such as polarization and misinformation.

We now have the first results of this research in the form of four peer-reviewed papers published in the journals Science and Nature. The studies offer an intriguing look at how Facebook and Instagram's algorithms affected what users saw in the run-up to the 2020 presidential election.

The papers are also a milestone for Meta. The company has at times had strained relations with independent researchers and has been accused of "transparency theater" in its efforts to make more data available to those who want to understand what is happening on its platforms. In a statement, Meta policy chief Nick Clegg said the research suggests Facebook may not be as influential in shaping its users' political beliefs as many believe. "Experimental studies add to a growing body of research showing that there is little evidence that key features of Meta's platforms alone cause harmful 'affective' polarization, or have meaningful effects on key political attitudes, beliefs or behaviors," he wrote.

The researchers' initial results seem to paint a more complex picture, however.

A study published in Nature examined the effect of so-called "echo chambers" — that is, when users are exposed to a large number of "like-minded" sources. While the researchers confirm that most users in the United States see a majority of content from "like-minded friends, pages, and groups," they note that not all of it is explicitly political or news-related. They also found that decreasing the amount of "like-minded" content reduced engagement, but did not measurably change users' beliefs or attitudes.

Although the authors note that the results do not account for the "cumulative effects" that years of social media use may have had on their subjects, they suggest that the effects of echo chambers are often poorly characterized.

Another study published in Nature examined the effect of chronological versus algorithmically ranked feeds. This question has taken on particular importance since 2021, thanks to revelations from whistleblower Frances Haugen, who has argued for a return to chronological feeds. Unsurprisingly, the researchers concluded that Facebook and Instagram's algorithmic feeds "strongly influence user experiences."

"The chronological feed significantly reduced the time users spent on the platform, reduced how much users engaged with content while on the platform, and changed the mix of...
