Facebook apologises for role in anti-Muslim violence in Sri Lanka in 2018

Facebook’s failure to remove hate speech and falsehoods may have contributed to real-world anti-Muslim violence in Sri Lanka in 2018, a human rights report has found.

The report, published by Article One on May 11, was the product of a two-year partnership with the social media giant, which involved conducting human rights assessments in Sri Lanka and Indonesia.

In a statement to Bloomberg, Facebook said it ‘deplored’ the misuse of its platform, adding, “We recognize, and apologize for, the very real human rights impacts that resulted.”

The Article One report focuses heavily on the violence in Kandy during broader anti-Muslim and Islamophobic riots. Ethnonationalist and Buddhist nationalist groups torched Muslim homes and businesses, and mosques were also damaged. At least one fatality was reported: a man trapped inside a burning house.

Hidayath Saththar, a provincial council member identified by the New York Times in its 2018 coverage, reported that four mosques, 37 houses, 46 businesses and multiple vehicles were damaged.

The Reuters news agency reported that Buddhist nationalists had more broadly “protested against the presence in Sri Lanka of Muslim Rohingya asylum-seekers from mostly Buddhist Myanmar, where Buddhist nationalism has also been on the rise.”

Muslims in the village of Mullegama, speaking to the Associated Press, described how the police did nothing to stop the violence.

Coverage from multiple international sources reported that the violence was spurred on by the death of a Buddhist youth, with many rioters defying a police curfew. There was no evidence of a bias motive in the attack on the youth, but anti-Muslim voices used his death as a pretext for further anti-Muslim violence.

The extent of the anti-Muslim and Islamophobic violence prompted the government to impose a temporary ban on Facebook.

Before and in the aftermath of the violence, extremists on Facebook used dehumanising language in their calls for violence against Muslims, even posting detailed instructions for making petrol bombs.

But for years, politicians and civil society groups had warned the social media giant that ethnonationalist and Buddhist nationalist groups were using the platform to target Muslims, women, and other minorities. One influential report, published by the think tank Centre for Policy Alternatives in 2014, identified twenty ethnonationalist hate groups targeting women and minorities, the vast majority of which were still active on Facebook four years later, according to a detailed report in BuzzFeed News.

A New York Times investigation, drawing on testimonies from victims, officials, and ordinary people, argued that Facebook’s newsfeed had played a “central role in nearly every step from rumor to killing.” Facebook declined to answer specific questions about its role in the violence at length; a spokeswoman, quoted over email, referred to the steps it takes to remove content when made aware of it.

The most infamous anti-Muslim falsehood originated on Sinhala-language Facebook, with memes and unfounded claims that Muslims were spiking food with sterilisation pills to destroy the country’s ethnic majority.

It resulted in a crowd attacking a Muslim restaurateur and destroying his shop before a mosque was set alight. The restaurateur’s so-called ‘confession’ was filmed and would become another viral anti-Muslim falsehood.

A similar conspiracy theory about Muslims and sterilisation surfaced in a Reuters investigation one year later, which found no evidence to substantiate claims that a Muslim doctor had secretly sterilised 4,000 Sinhala Buddhist women.

The Reuters investigation also revealed that the falsehood appeared one week after Buddhist nationalists had used the Easter terror attacks, which killed 269 people in coordinated suicide bombings, as a pretext to destroy Muslim homes, stores, and mosques in the country’s northwestern province.

Academics have documented examples of the exponential spread of anti-Muslim and Islamophobic content on Facebook from Buddhist nationalists in Sri Lanka. One page, with over 13,000 ‘likes’, had produced 71 anti-Muslim and Islamophobic image graphics between January and May 2013. The group’s English name, ‘Safe Buddhism’, masked a more menacing name in Sinhala, which read, “End Muslim barbarity against Sinhala Buddhist harmony”.

Facebook also lacked the staff to monitor hate speech in Sinhala and Burmese, according to academics. BuzzFeed News established through officials and civil society groups that, as of March 2018, the company had just two Sinhala-speaking moderators, both of whom lived outside Sri Lanka, for a country with over 4 million active users.

According to the Article One assessment,
“Facebook’s lack of formal human rights due diligence in Sri Lanka prior to
this HRIA and the limited cultural and language expertise among Facebook staff
at the time of the May 2018 Kandy incident may have contributed to offline harm
stemming from online engagement.” The issue was exacerbated further by a now-removed algorithm designed to drive engagement on the platform.

Facebook also failed to enforce its Community Standards effectively, allowing forms of harassment and hate speech to spread or remain on the platform, Article One found. Examples of dehumanising anti-Muslim language cited in the report included, “Kill all Muslims, don’t even save an infant; they are dogs.”

The Article One report also highlights a decade of frustration for civil society groups, who felt shut out by the platform despite repeated attempts (including phone calls and published reports) to highlight the misuse of its platform and the offline violence, up until the government shutdown of social media in 2018.

Facebook published detailed statements in response to the Article One reports on May 12, adding that it is “using proactive detection technology to identify potentially violating hate speech, developing machine learning capabilities in Sinhala and Bahasa Indonesia.”

The company has also employed more policy leads and program managers in Sri Lanka, Indonesia and Cambodia, and is hiring more content reviewers who speak Sinhala, Tamil, Bahasa Indonesia, Javanese and Khmer.

Despite significant criticism, Article One found that a majority of the 74 stakeholders engaged found value in Facebook in Sri Lanka when managed effectively, adding that marginalised groups, including human rights activists and LGBTQI+ individuals, had found safe spaces on the platform. But the report did highlight how their fundamental rights, and those of women and children, may have been upended.

Article One also called on Facebook to acknowledge the harms it may have caused and to make the findings of such reports public.

The post Facebook apologises for role in anti-Muslim violence in Sri Lanka in 2018 appeared first on TELL MAMA.

Categories: News, Sri Lanka