This blog post is adapted in part from an original story that appears on the Syracuse University site.
What is the role of social media in shaping our political attitudes? New research published in Nature sets out to understand whether and how the information people see on social media shapes their political views. Entitled "Like-minded sources on Facebook are prevalent but not polarizing," this groundbreaking research uses an on-platform experiment to examine what happens when Facebook users see dramatically less content from people who share their political leanings.
The lead researchers, Professors Brendan Nyhan of Dartmouth College, Jaime Settle of William & Mary, Emily Thorson of Syracuse University, and Magdalena Wojcieszak, professor of communication at the University of California, Davis, ran a study for three months in 2020 that reduced the volume of content from politically like-minded sources in the Feeds of consenting participants.
The researchers found that the majority of content in Facebook users' feeds comes from politically like-minded sources, while political information and news represent only a small fraction of what users see.
In addition to decreasing exposure to content from like-minded sources, the experimental intervention also decreased exposure to uncivil language and increased exposure to posts from sources with politically dissimilar views.
However, the researchers found that these changes to a person's Facebook feed had no impact on a variety of beliefs and attitudes, including affective polarization, ideological extremity and belief in false claims.
These findings are part of a broader research project examining the role of social media in U.S. democracy. Known as the U.S. 2020 Facebook and Instagram Election Study, the project is the first of its kind to provide social scientists with access to social media data that had previously been largely inaccessible.
Seventeen academics from U.S. colleges and universities, including UC Davis, teamed up with Meta to conduct independent research on what people see on social media and how it affects them. The project built in several safeguards to protect the researchers' independence. All the studies were preregistered, and Meta could not restrict or censor the findings. The academic lead authors had final authority on all writing and research decisions.
The research was divided into two parts.
In the first part, from June to September 2020, the researchers measured how often all adult Facebook users saw content from politically aligned sources. The results showed that for the median Facebook user, slightly over half the content they saw came from politically like-minded sources, and just 14.7% came from sources with different political leanings.
In the second part, from September to December 2020, the researchers conducted a multi-wave experiment with 23,377 consenting adult Facebook users in the U.S. The study reduced the volume of content from like-minded sources to gauge the effect on political attitudes. People in the treatment group saw about one-third less content from like-minded sources. Their total engagement with content from like-minded sources decreased, but their rate of engagement increased: when they did see content from like-minded sources, they were more likely to click on it, a pattern that illustrates how human behavior can compensate for algorithmic changes.
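The distinction between total engagement and the engagement rate is easy to blur, so here is a minimal sketch of the arithmetic. The numbers below are entirely hypothetical, chosen only to match the study's qualitative pattern (roughly one-third less exposure, fewer total clicks, a higher click rate); they are not figures from the paper.

```python
# Hypothetical illustration of counts vs. rates under reduced exposure.
# All numbers are invented for exposition; none come from the study.

baseline_posts_seen = 90   # like-minded posts a user sees in some period
baseline_clicks = 9        # clicks on those posts

treated_posts_seen = 60    # roughly one-third fewer like-minded posts
treated_clicks = 8         # fewer clicks in total...

baseline_rate = baseline_clicks / baseline_posts_seen  # 0.10
treated_rate = treated_clicks / treated_posts_seen     # ~0.13

# Total engagement fell (9 -> 8 clicks), but the engagement *rate* rose
# (10% -> ~13%): the user clicked a larger share of the like-minded
# content still shown, partially offsetting the algorithmic change.
print(f"baseline rate: {baseline_rate:.2%}, treated rate: {treated_rate:.2%}")
```

In other words, reducing how much like-minded content the algorithm serves does not reduce engagement proportionally, because users compensate by interacting more readily with what remains.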