With roosters crowing in the background as he spoke from the crowded refugee camp in Bangladesh that has been his home since 2017, Maung Sawyeddollah, 21, described what happened when violent hate speech and misinformation targeting the Rohingya minority in Myanmar began circulating on Facebook.
“We’re kind to most of the people there. But some types of people are very narrow-minded and very nationalist, and they fueled hatred of the Rohingya on Facebook,” he said. “And the good people, who had close ties to the Rohingya, changed their minds against the Rohingya and turned to hate.”
For years, Facebook, now known as Meta Platforms Inc., has claimed that it was a neutral platform in Myanmar that was exploited by bad actors, and that despite its efforts to remove violent and hateful material, it unfortunately fell short. That narrative echoes its response to its role in other conflicts around the world, whether the 2020 election in the US or hate speech in India.
But a new and comprehensive report by Amnesty International says Facebook’s preferred narrative is false. The platform, Amnesty says, was more than just a passive site overwhelmed by unmoderated content. Instead, Meta’s algorithms worked to “actively amplify and promote content” on Facebook that incited violent hatred against the Rohingya, beginning as early as 2012.
Amnesty found that, despite warnings dating back years, the company not only failed to remove violent hate speech and misinformation targeting the Rohingya, but actively spread and amplified it, until it culminated in the 2017 massacre. The timing coincided with Facebook’s surging popularity in Myanmar, where for many people it served as their only connection to the online world. That effectively made Facebook the internet for a large portion of Myanmar’s population.
More than 700,000 Rohingya fled to neighboring Bangladesh that year as Myanmar’s security forces were accused of rape, mass killings and the burning of thousands of Rohingya-owned homes.
“Meta – through dangerous algorithms and relentless pursuit of profit – has significantly contributed to serious human rights abuses against the Rohingya,” the report reads.
A spokesperson for Meta declined to answer questions about the Amnesty report. In a statement, the company said it “stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people.”
“Our safety and integrity work in Myanmar remains guided by feedback from local civil society organizations and international institutions, including the UN Fact-Finding Mission on Myanmar; the Human Rights Impact Assessment we commissioned in 2018; as well as our ongoing human rights risk management,” Rafael Frankel, Director of Public Policy for Emerging Markets, Meta Asia-Pacific, said in a statement.
Like Sawyeddollah, who was cited in Amnesty’s report and spoke to the AP on Tuesday, most of those who fled Myanmar (about 80 percent of the Rohingya living in Myanmar’s western Rakhine state at the time) remain in refugee camps. And they are demanding that Meta pay reparations for its role in the violent repression of Rohingya Muslims in Myanmar, which the US declared a genocide earlier this year.
Amnesty’s report, out Wednesday, is based on interviews with Rohingya refugees, former Meta employees, academics, activists and others. It also draws on documents disclosed to Congress last year by whistleblower Frances Haugen, a former Facebook data scientist. It notes that digital rights activists say Meta has improved its engagement with civil society and some aspects of its content moderation practices in Myanmar in recent years. In January 2021, after a violent coup toppled the government, it banned the country’s military from its platform.
But critics, including some of Facebook’s own employees, have long maintained that such an approach will never really work. It leaves Meta playing catch-up, trying to remove harmful material while its algorithms are designed to push “engaging” content that is more likely to enrage people, essentially working against its own moderation efforts.
“These algorithms are really dangerous to our human rights,” said Pat de Brun, Amnesty’s researcher and adviser on artificial intelligence and human rights.
“The company has shown a complete unwillingness or inability to address the root causes of its human rights impact.”
After the UN’s Independent International Fact-Finding Mission on Myanmar highlighted the “significant” role Facebook played in the atrocities perpetrated against the Rohingya, Meta admitted in 2018 that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.”
In the years since, the company has “touted certain improvements in its community engagement and content moderation practices in Myanmar,” Amnesty said, adding that its report “shows that these measures have proven wholly inadequate.”
For example, in 2020, three years after the violence in Myanmar killed thousands of Rohingya Muslims and displaced 700,000 more, Facebook investigated how a video by a leading anti-Rohingya hate figure, U Wirathu, was spreading on its site.
The investigation found that more than 70% of the video’s views came from “chaining,” that is, it was suggested to people who had watched a different video, appearing under “up next.” Facebook users were not seeking out or searching for the video; it was fed to them by the platform’s algorithms.
Wirathu had been banned from Facebook since 2018.
“Even a well-resourced approach to content moderation, in isolation, would likely not have sufficed to prevent and mitigate these algorithmic harms. This is because content moderation fails to address the root cause of Meta’s algorithmic amplification of harmful content,” the Amnesty report says.
Rohingya refugees are seeking unspecified reparations from the Menlo Park, California-based social media giant for its role in the genocide. Meta, the subject of twin lawsuits in the US and UK seeking $150 billion for Rohingya refugees, has so far refused.
“We believe that the genocide against the Rohingya was only possible because of Facebook,” Sawyeddollah said. “They communicated with each other to spread hate, they organized campaigns through Facebook. But Facebook was silent.”