Facebook knew that its algorithm would promote extremism and did nothing

Facebook does not have the best track record on information legitimacy or fairness in promoting content. Evidence of this is the number of fake-news posts that gain traction on the social network, and the extremist content that surfaces without any defense mechanism for users.

According to a report in the Wall Street Journal, however, Mark Zuckerberg's company knew about at least one of these two problems in advance. Citing a presentation shown only to select employees (and since discarded, never applied to Facebook's official product communications), the newspaper argues that the company's leadership was aware of its algorithm's potential to promote divisive content and amplify polarization.

Mark Zuckerberg, Facebook CEO (Image: Disclosure / Facebook)

“Our algorithms exploit the human brain's attraction to divisiveness. If left unchecked, Facebook may present users with more and more polarized content in an effort to gain more audience attention and increase the time (spent) on the platform,” says a slide from the presentation that, according to the Wall Street Journal, was summarily discarded and its warnings ignored.

The newspaper also notes that Facebook's chief policy officer, Joel Kaplan, believed at the time that the changes behind the new algorithm would have disproportionately affected conservative users and publications, though he did not explain in what way (whether it suppressed or promoted them, for example).

Responding to the story, Facebook issued a statement:

“We have learned a lot since 2016 and we are not the same company today. We have built a robust team focused on integrity, reinforced our policies and practices in order to limit harmful content, and used research to understand the impact of our platform on society so that we could improve.”

In other words, Facebook did not deny the information reported by the Wall Street Journal. The American newspaper pointed out, however, that even before the social network created this “integrity team”, a company researcher named Monica Lee found in 2016 that “64% of all extremist groups join (Facebook) through our recommendation tools”. There was even an attempt to adjust the algorithm to curb this behavior, but the idea was reportedly shelved for being “anti-growth”.

Recently, the company headed by Mark Zuckerberg appointed an oversight board that targets precisely the promotion of polarizing and extremist content. Brazilian Ronaldo Lemos, a lawyer and director of the Institute for Technology and Society of Rio de Janeiro, is part of this group.

  • Read more: Brazilian joins the Freedom Supervision Committee on Facebook