Facebook Algorithms Promote Extremists & Make You Addicted to Social Media: Report

Facebook has struggled to control extremist and violent content on its platform since its inception. The social network giant has found itself embroiled in controversy time and again following events in Myanmar, New Zealand, and elsewhere.

A new report from the Wall Street Journal suggests that Facebook’s top executives were aware that the platform was promoting extremist groups but chose not to take any action against it. One of the company’s internal presentations from 2018 showed how Facebook’s algorithms amplified extremist content in some cases.

One of the slides from the presentation warned that these algorithms, if left unchecked, would keep feeding Facebook users divisive content. It read:

Our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked, Facebook would feed users more and more divisive content in an effort to gain user attention & increase time on the platform.

In a statement, Facebook claimed that it had learned a lot since 2016, built an “integrity team”, and strengthened its policies to tackle such issues. Regardless, a Facebook researcher named Monica Lee found that 64% of all extremist group joins were due to the platform’s own recommendation tools.

Facebook sought to tackle these problems by tweaking its algorithms and forming temporary teams to deal with the issue, but these proposals were shot down as “anti-growth”.

In the end, the social network giant did very little in the name of upholding “free speech”, a wall Facebook has chosen to hide behind on multiple similar occasions.


