Facebook provides each of its 2.7 billion users with a personalized stream of news, advertisements, and recommendations tailored to what Facebook thinks each user will like.

Many believe that the rise of personalized news and information on a massive scale has changed how people view the world. In 2010, Eli Pariser coined the term “filter bubble” to describe the way algorithms like Facebook’s reinforce users’ beliefs by showing them more of what they already prefer. According to a 2020 Gallup-Knight survey, 60 percent of Americans feel big tech companies are furthering divides on social issues. And Facebook itself, in an internal report, found that its recommendation engine fueled polarization and radicalization.

But it’s difficult to determine the effects of this personalization because independent researchers can’t easily access Facebook’s platform to study them.

Our interactive dashboard, Split Screen, gives readers a peek into the content Facebook delivered to people of different demographic backgrounds and voting preferences who participated in our Citizen Browser project.

Using Citizen Browser, our custom Facebook inspector, we perform daily captures of Facebook data from paid panelists. These captures collect the content that was displayed on their Facebook feeds at the moment the app performed its automated capture. From Dec. 1, 2020, to March 2, 2021, 2,601 paid participants contributed their data to the project.

To measure what Facebook’s recommendation algorithm displays to different groupings of people, we compare data captured from each over a two-week period. We labeled our panelists based on their self-disclosed political leanings, gender, and age. We describe each pairing in more detail in the Pairings section of this article.

For each pair, we examine four types of content served by Facebook: news sources, posts with news links, hashtags, and group recommendations. We compare the percentage of each grouping that was served each piece of content to that of the other grouping in the pair. For more information on the data we collect, the panel’s demographic makeup, and the extensive redaction process we undertake to preserve privacy, see our methodology, How We Built a Facebook Inspector.

Our observations should not be taken as proof of Facebook’s choosing to target specific content at specific demographic groups. There are many factors that influence any given person’s feed that we do not account for, including users’ friends and social networks. See the Limitations section for more details.

To calculate population percentages in the groupings, we divide the number of unique panelists shown a particular piece of content by the number of panelists in that same grouping during the date range. To rank content based on population differences, we look for the largest absolute difference in population percentage points between two groupings (disregarding anything with a difference of less than 0.1 percentage points). So if a piece of content is seen by 5 percent of Trump supporters and 11 percent of Biden supporters, the difference is 6 percentage points.

In order to contextualize these values, we look at their distribution in data we have already collected to determine which values are common and which are unusual. The histogram below shows the distribution of all percentage point differences across all pairings based on content collected over the course of 92 days, from Dec. 1, 2020, to March 2, 2021, a total of 17,332 observations.

![Histogram: distribution of percentage point differences across all pairings]()

The most common percentage point difference we observed was between 1 and 2 percentage points. Citizen Browser cannot reverse-engineer Facebook’s recommendation algorithm, and none of our observations should be treated as causal claims that Facebook has targeted a specific piece of content at a specific grouping. A recent Pew study suggests that “echo chambers” are not an accurate reflection of people’s experiences on social media but rather that most people are exposed to a wide array of political viewpoints and individually choose to block and filter what they do not wish to see. We do not take into consideration social network information, such as what groups and friends the panelists have voluntarily subscribed to, connected to, or joined, which also influences what content is available for their feed.
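As an illustration, the population-percentage and ranking arithmetic described in this methodology can be sketched in Python. The function names and data shapes here are our own assumptions for the sketch, not code from the Citizen Browser project; the thresholds and the 5-versus-11-percent example come from the text above.

```python
from collections import defaultdict

def population_percentages(views, panelists):
    """For one grouping, return {content_id: percent of panelists shown it}.

    views: iterable of (panelist_id, content_id) capture records
    panelists: set of panelist ids in the grouping during the date range
    """
    seen = defaultdict(set)
    for panelist_id, content_id in views:
        if panelist_id in panelists:
            seen[content_id].add(panelist_id)  # count unique panelists only
    return {c: 100.0 * len(p) / len(panelists) for c, p in seen.items()}

def rank_by_difference(pct_a, pct_b, min_diff=0.1):
    """Rank content by absolute percentage point difference between two
    groupings, disregarding differences below min_diff points."""
    all_content = set(pct_a) | set(pct_b)
    diffs = {c: abs(pct_a.get(c, 0.0) - pct_b.get(c, 0.0)) for c in all_content}
    return sorted(((c, d) for c, d in diffs.items() if d >= min_diff),
                  key=lambda item: item[1], reverse=True)

# Content seen by 5 percent of one grouping and 11 percent of the other
# differs by 6 percentage points; a 0.05-point gap falls below the cutoff.
trump = {"post_1": 5.0, "post_2": 2.0}
biden = {"post_1": 11.0, "post_2": 2.05}
print(rank_by_difference(trump, biden))  # → [('post_1', 6.0)]
```

Tracking unique panelists in a set mirrors the methodology's rule that a panelist shown the same content several times is counted once per grouping.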