Facebook Bad Science Study


It was Eli Pariser who first drew widespread attention to the possibility of a filter bubble within Facebook “due to data drawn from intensely personalized connections”, starting the debate on “echo chambers” (in which individuals are exposed only to information from people of similar convictions and beliefs) and “filter bubbles” (in which content is selected by algorithms based on the end-user’s previous behavior).

Since then the debate has raged over whether Facebook (a closed social network) is creating filter bubbles through the algorithm that determines which news items its News Feed will display for individuals to read.

The latest study, carried out by Facebook researchers and published in Science, shows conclusively that there is a filter bubble, one exacerbated by personal choices and leading to an ever-tightening spiral of content that shows users the world they expect to see, amplified through the lens of their own bias.

Why Are Filter Bubbles Bad?

If you believe that social media is a great way to connect with others, share opinions and information, discover new things and broaden your knowledge base, then a filter bubble is a bad thing indeed. It reduces you to a captive, held on a reservation where the only people you will get to see are other captives just like you, who share much the same ideology and professed belief system.

Considering that research on social media news sharing shows that a diverse “marketplace of ideas” is, amongst other things, key to a healthy democracy, a filter bubble created to narrowly contextualize content and capitalize on hyper-targeted advertising is good neither for democracy nor for Facebook users.

The Facebook Study is Bad Science

The Facebook study compared three distinct factors that affect what content users see:

  • Homophily – people’s similar, expressed ideology
  • Algorithmic ranking – what Facebook does to rank and present content in its News Feed
  • Selective avoidance of attitude-challenging items – the choices we make on content we decide to click on, or not

It then appeared to compare how personal selection of content measured up against Facebook’s ranking of news items in its News Feed, and concluded that:

“on average in the context of Facebook, individual choices more than algorithms limit exposure to attitude-challenging content.”

Impartial as this may sound, it is actually wrong, and the calculation itself is based upon bad science. As Microsoft’s Christian Sandvig makes abundantly clear, in order to compare the effects of personal choice against algorithmic ranking in the surfacing of content, the two must be independent of each other and not controlled by the agency that is studying them.

In this case, not only is the algorithmic ranking of Facebook’s News Feed controlled completely by Facebook (which changes it frequently), but the personal choices users make about what content to see (the specific items they click on) are themselves observed by Facebook’s News Feed ranking algorithm, which then further reinforces the bias when deciding what to show and what not to show.
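This feedback loop is why the two factors cannot be cleanly compared. The point can be illustrated with a toy simulation (a hypothetical sketch, not the study’s actual model or Facebook’s real algorithm): a user who clicks agreeable content slightly more often trains a ranker that learns from those clicks, so the ranker drifts toward showing ever more agreeable content. “Individual choice” and “algorithmic ranking” are entangled from the first round, and neither can be measured independently of the other.

```python
import random

random.seed(42)

# Assumed, illustrative click rates: the user clicks agreeable items
# slightly more often than attitude-challenging ones.
AGREEABLE_CLICK_RATE = 0.6
CHALLENGING_CLICK_RATE = 0.4

def run_feed(rounds=50, items_per_round=20):
    """Simulate a ranker that starts with an unbiased 50/50 content mix
    and updates toward the click mix it observes each round."""
    share_agreeable = 0.5
    history = []
    for _ in range(rounds):
        shown_agreeable = round(items_per_round * share_agreeable)
        shown_challenging = items_per_round - shown_agreeable
        clicks_agreeable = sum(random.random() < AGREEABLE_CLICK_RATE
                               for _ in range(shown_agreeable))
        clicks_challenging = sum(random.random() < CHALLENGING_CLICK_RATE
                                 for _ in range(shown_challenging))
        total_clicks = clicks_agreeable + clicks_challenging
        if total_clicks:
            # The ranker drifts toward the observed click mix -- but that
            # mix was itself shaped by what the ranker chose to show.
            observed = clicks_agreeable / total_clicks
            share_agreeable = 0.8 * share_agreeable + 0.2 * observed
        history.append(share_agreeable)
    return history

history = run_feed()
print(f"agreeable share shown: round 1 = {history[0]:.2f}, "
      f"round 50 = {history[-1]:.2f}")
```

In this toy setup the agreeable share climbs steadily above its unbiased starting point, even though the user’s underlying preference never changes: the “personal choice” signal the ranker measures is already a product of the ranker’s own earlier decisions, which is exactly the confound Sandvig describes.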

To add insult to injury, Facebook’s study group of end users appears to have been chosen under parameters that select for very specific, non-representative behavior. This makes the sample Facebook used, some 4% of its active-user population, all but useless for extracting meaningful data signals.

Princeton’s Zeynep Tufekci makes much the same point when she suggests that Facebook buried its research findings “…as deep as it could, using a mix of convoluted language and irrelevant comparisons”. 

Is Facebook’s Study Any Good?

From a purely marketing point of view, Facebook proved that its algorithm is good at picking up on what people do not want to see and not showing it to them. This makes content diversity poor and makes the role of “discovery” via content sharing problematic for brands, unless of course they pay to increase the visibility of their content.

It is tempting to make this the basis of yet another Facebook-bashing post. The world’s largest social network is not going anywhere any time soon, but beyond linking up with friends and family, its members may not get as much value out of their time on the platform as they might hope.


Exposure to ideologically diverse news and opinion on Facebook
Supporting Materials for Exposure to Ideologically Diverse News and Opinion on Facebook (pdf)
What Facebook’s Algorithm Change Means for Brands, Publishers, and the Future of Media