Facebook: friend or foe?


Photo: Zamawawi Isa/Shutterstock



It was hard to escape the news in early 2016 about Facebook’s “trending” feature. First we learned there were human curators behind what the social media company claimed was an automated algorithm, then that these employees may have been suppressing conservative voices in users’ feeds.

Facebook denied the allegations, which were first reported by Gizmodo. But shortly after, The Guardian U.K. published leaked documents detailing Facebook’s policy for both injecting and blacklisting stories. Facebook had worked hard to cultivate the belief that it was delivering impartial, mathematically and algorithmically determined news to users. The official description of its “trending” feature does not mention a human element:

“Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location.”

Because the topics each user sees depend on their own likes and location, the result is what Upworthy co-founder Eli Pariser calls a “filter bubble”: a situation where the content a user is exposed to reinforces their pre-existing beliefs and limits their exposure to new and different ideas.
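To make the mechanism concrete, here is a minimal sketch of engagement-based personalization, written in Python. Everything in it is hypothetical, from the topic labels to the scoring rule; Facebook’s actual ranking system is proprietary and vastly more complex. The point is only that ranking by affinity with past likes mechanically narrows what a user sees next:

```python
# Hypothetical sketch of the filter-bubble mechanism; not Facebook's
# actual algorithm. Posts matching past likes are ranked first.
from collections import Counter

def rank_feed(posts, liked_topics):
    """Order posts so topics the user already likes come first."""
    affinity = Counter(liked_topics)  # more past likes -> higher weight
    return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

liked_topics = ["left-politics", "left-politics", "tech"]
posts = [
    {"id": 1, "topic": "right-politics"},
    {"id": 2, "topic": "left-politics"},
    {"id": 3, "topic": "tech"},
]

print(rank_feed(posts, liked_topics))
# The post matching the user's dominant topic ranks first and the
# dissenting one last, so each interaction narrows the next feed.
```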

The backlash against Facebook for using people, with all their fallibility and personal bias, to filter the news reveals an important characteristic of the contemporary age: algorithms are considered more trustworthy than human beings. This perspective misses or ignores the reality that bias exists within the math itself. Code is written by people, and people carry explicit and implicit biases. These biases manifest in myriad ways, sometimes subtly, sometimes blatantly.

In 2015, researchers from Carnegie Mellon University and the International Computer Science Institute found that Google’s ad-targeting system showed ads for higher-paying jobs to men visiting employment websites more often than to women. Bias can also show up as a lack of attention to detail: early versions of Apple’s health application, for example, included no feature to track menstrual cycles.
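A toy example of how this can happen without any explicitly discriminatory rule: if an ad-serving system simply optimizes against historical click-through data, it reproduces whatever skew that data already contains. The numbers and names below are invented for illustration:

```python
# Invented illustration: a "neutral" optimization rule inheriting bias
# from skewed historical data. All click rates here are made up.
historical_ctr = {
    ("exec-job-ad", "men"): 0.031,
    ("exec-job-ad", "women"): 0.012,
}

def should_serve(ad, group, min_ctr=0.02):
    """Serve the ad only where expected click-through clears a threshold."""
    return historical_ctr.get((ad, group), 0.0) >= min_ctr

for group in ("men", "women"):
    print(group, should_serve("exec-job-ad", group))
# men True, women False: no one wrote "exclude women", yet the
# higher-paying job ad is shown to one group and not the other.
```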

Gatekeeper to content

Algorithms prioritize some content over other content, and the stakes are high. Facebook’s algorithm favours video and photos over links and status updates, another preference built into the system by humans. In response, and as a sign of Facebook’s tremendous impact, content creators and marketers have adapted to the algorithm, investing in more video content and graphic designers, and altering how we experience the Internet in the process.
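As a sketch of what “favouring” one format over another might look like in code, consider a scoring function that multiplies raw engagement by a per-format weight. The weights below are invented, not Facebook’s real values; the point is that a human has to choose them:

```python
# Hypothetical per-format weights; not Facebook's actual values.
FORMAT_WEIGHTS = {"video": 4.0, "photo": 3.0, "link": 1.5, "status": 1.0}

def score(post):
    # A human-chosen multiplier per format means the preference for
    # video is a design decision baked into the "neutral" math.
    return FORMAT_WEIGHTS.get(post["format"], 1.0) * post["engagement"]

posts = [
    {"format": "status", "engagement": 120},
    {"format": "link", "engagement": 90},
    {"format": "video", "engagement": 50},
]

for post in sorted(posts, key=score, reverse=True):
    print(post["format"], score(post))
# video (200.0) outranks status (120.0) despite far less raw engagement,
# which is one reason marketers shifted budgets toward video.
```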

A 2015 study by the analytics firm Parse.ly showed that more than 40% of web traffic to news sites came from Facebook, beating Google by an inch and Twitter by a mile as a source of referrals. Facebook therefore acts as a gatekeeper to content, and it profits from that role. Only a small fraction of a user’s followers will see a given post organically, so Facebook charges users to reach a bigger or more targeted audience.

This model has huge implications for news organizations. One Morgan Stanley analyst told The New York Times, “In the first quarter of 2016, 85 cents of every new dollar spent in online advertising will go to Google or Facebook.” Content creators, publishers, businesses and organizations must use these platforms to reach their audiences, and Facebook in particular, with its 1.5 billion active users, holds enormous power: publishers must work with the social media giant for their content to be seen.

Facebook Live is the latest example of the control the company exerts over news and cultural consumption. Facebook is paying publishers and celebrities to use the new service, which livestreams events to the Facebook app on mobile phones. “In practical terms,” explained an article in Recode, “that means Facebook’s News Feed algorithm prioritizes live video while the broadcast is ongoing, meaning it will appear higher in people’s feeds. It’s also doing things like sending out push notifications when videos are live, and alerting people when they’ve missed a live video.”

As digital and social platforms grow their audiences, traditional media is struggling to adapt. As examined elsewhere in this issue of the Monitor, the dissemination of news and journalism has been heavily affected by changing distribution models, and these changes will only accelerate as time goes on.

It’s important to remember that Facebook is not a content creator; it is a platform. While independent publishers can use the platform to reach their audiences, mainstream media outlets still dominate the field and largely determine what is considered newsworthy. Facebook is not in the business of creating the news, just sharing it.

Facebook’s domination of media risks further homogenizing information with a distinctly corporate bias. Facebook is a private corporation with no obligation to anyone but its shareholders. As users of the platform, we must remain sceptical of any product with so much power and so little oversight. And as our lives increasingly go digital, we must be critical of the unconscious, unseen and subtle biases that exist around us, especially in those mechanisms and tools that claim to be impartial.

Reprinted with permission from Monitor Vol. 23 No. 2, July/August 2016.

Davis Carr is a communications assistant with the Canadian Centre for Policy Alternatives (CCPA) and the co-founder of JustChange, a micro-granting organization that gives $1,000 every two months to a group or person effecting social, environmental or economic change in Ottawa.

