A few days ago, the US Senate Consumer Protection and Data Security Committee announced bipartisan concerns about Facebook and its role in fueling political division and strife. The committee also asserted that the site is not safe for children and that the company has insisted on running ads that harm this age group.
This comes after the former Facebook employee Frances Haugen, who leaked internal company documents, said that Facebook is focused on making profits while concealing practices that may harm people, and that the company has hidden agendas and works against the public interest.
Haugen said the documents she collected and shared with the Wall Street Journal and US law enforcement show that the company is lying to the public about making significant progress against hate, violence, and disinformation. She added that “the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
More importantly, Haugen emphasized that Facebook proved it could do more to address these problems when it changed its content policies for several weeks around the 2020 US election, deliberately deprioritizing political content at the time. But the platform soon reverted to its old algorithms, which value engagement above all else, contributing to the January 6 riot at the Capitol. Haugen explained that the company knows that if it changed the algorithm to be safer, people would spend less time on the site, click on fewer ads, and Facebook would make less money.
Indeed, what the former Facebook employee said is not new at all. Recent years have witnessed an accelerating spread of abusive and hateful speech on the Internet, and the problem has now escalated to the point where toxic online comments have provoked violence on the ground, from religious-nationalist violence in Myanmar to neo-Nazi propaganda in the United States.
A large number of studies and research papers in many European and Arab countries, drawing on opinion polls, have also unequivocally confirmed that Facebook is heavily implicated in political division and strife. These studies indicate that the prevalence of inflammatory content on Facebook has grown to an alarming extent since many tech giants replaced human moderators with algorithms in their content review processes.
Beyond this near-total reliance on algorithms, the platform’s critics point to a lack of transparency in its content removal policy, even though Facebook’s standards explicitly state that “organizations and people dedicated to promoting hate are not allowed to be on the platform.” In other words, Facebook encourages its users to report content they consider harmful, yet the decision to remove or keep the reported content is never explained to them.
Undoubtedly, hate speech is ravaging Facebook in many parts of the world, driven by racial, religious, political, sectarian, and other affiliations. Those who once expressed such sentiments timidly in closed rooms have found in Facebook a public space through which their hate speech can reach thousands, and sometimes millions, multiplying its impact.
Furthermore, the most dangerous outcome is for this inflammatory rhetoric on Facebook to reach a critical mass in its numerical density and emotional intensity, transforming online conversations into real hate crimes. This phenomenon is known as digital wildfires, as described in the Global Risks Report issued by the World Economic Forum in 2013, which placed such digital wildfires on the list of risks the world would face over the following ten years. There must therefore be real deterrent measures to reduce the risks the world suffers as a result of the hate-inciting policies Facebook pursues for material profit. The questions that come to mind now are: how will the Facebook platform’s operations be affected after what happened? And are we on the verge of strict measures against the company, especially with Congress promising dire consequences?
Dr Marwa El-Shinawy is an Assistant Prof. at the International American University for Specialized Studies (IAUS)