Efforts to expunge hate speech from social media platforms in Myanmar are hampered by the volume of traffic, multiple languages and a shortage of moderators and fact checkers.
By KYAW YE LYNN & SU MYAT MON | FRONTIER
For the Myanmar Pressphoto Agency, the crowd of thousands – more than 4,000 by some media estimates – that flocked to hear nationalist Buddhist monk U Wirathu speak in Yangon in October was a major news story.
The digital media outlet posted photos with captions from the event on its Facebook page, only to realise soon after that its page could not be accessed.
“At first, we did not know what happened to the page,” MPA founder Ko Paing Soe Oo told Frontier. “We later found out that Facebook banned it, saying we breached their community standards.”
Facebook took down MPA’s pages for nearly 10 hours after Paing Soe Oo posted photos of the rally, at which Wirathu condemned international calls for Tatmadaw leaders to be held accountable for rights violations against the Rohingya in Rakhine State.
Facebook’s algorithms blocked the page because the platform had been taking action against accounts and pages by and about the monk, who is notorious for his anti-Muslim rhetoric.
“It seems Facebook is automatically removing all posts that include some words they banned. In our case, we believe their [artificial intelligence] noticed U Wirathu photos and banners of [Tatmadaw Commander-in-Chief] Senior General Min Aung Hlaing, so the system took down the page,” Paing Soe Oo said.
“It is like they are doing [this in] the same way that China censored words relating to the ‘Tiananmen Square’ massacre,” he said, referring to the violent crackdown against student-led democracy protests in Beijing in 1989.
How did this “collateral damage” happen? Facebook had been attempting to clean up the spread of hate speech online. Yet the experience of MPA, along with other news outlets, highlighted how attempted solutions can create unexpected problems, and how they remain, at best, a work in progress.
Paing Soe Oo said Facebook should have learned from its mistake in 2017, when its algorithms took down all mentions, in Burmese and English, of kalar, which was originally a neutral term for foreigners but can also be used as a derogatory term for people of South Asian descent.
Although Facebook’s algorithms detect questionable words, it is the social media giant’s content reviewers who decide if they are to be removed. Last year, Facebook lacked reviewers proficient in Burmese. This has since changed, a Facebook spokesperson said in an email interview. “Automation helps surface potentially violating content, but we have real people looking at the reported content,” said the spokesperson.
The reviewers have a big workload: 34 percent of Myanmar’s 53 million people are active social media users, mainly through smartphones.
As of November 2018, Facebook had 99 Myanmar-language experts to review Burmese posts, the Facebook spokesperson said, adding that “we expect to have at least 100 by the end of this year”. She said Burmese content was being reviewed around the clock and added that Facebook had the ability to review other “dialects” spoken in Myanmar.
In the third quarter of 2018, Facebook had “proactively identified 63 percent of the hate speech we removed in Myanmar”, she added, up from 13 percent in the last quarter of 2017 and 52 percent in the second quarter of 2018.
“Our reviewers come from many backgrounds to reflect the diversity of our community in Myanmar and around the world and go through vigorous training before they start working for us,” the spokesperson said.
However, the Myanmar tech community questions Facebook’s claimed capacity to track content in languages other than Burmese. More than 100 languages are spoken in Myanmar, which has eight major communities among its 135 officially recognised ethnic groups.
Most of Facebook’s content reviewers are in Ireland and Malaysia, said Ms Victoire Rio, who leads the Myanmar Tech Accountability Network. “Ireland jobs require having permission to work in Ireland which means most reviewers tend to come from the [Myanmar] diaspora, and often have not lived in Myanmar since the country became [widely] connected [to the internet],” she said in an interview.
“We’ve seen a visible increase in moderation errors being made in Q3 [third quarter of 2018], which is greatly problematic,” Rio added.
Still, Rio acknowledged that Facebook has taken important steps to address hate speech, including by supporting the shift to Unicode fonts in Burmese, in coordination with the Myanmar Unicode Migration Team. “This is incredibly important to the future of Myanmar’s digitalisation,” she said.
Myanmar uses two competing text encodings – Zawgyi and Unicode – which Facebook says poses challenges to its monitoring efforts. Unicode text is easier to track because it provides only one way to encode a given Burmese word, whereas Zawgyi allows the same word to be typed in several different ways.
“As a first step in support of Myanmar’s transition to Unicode, we have switched off Zawgyi locale for new users to Facebook,” said the Facebook spokesperson.
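The monitoring problem this creates can be illustrated outside Burmese. The snippet below is a rough analogy in standard Unicode, not actual Zawgyi or Burmese text: the same visible word, “café”, can be stored as two different codepoint sequences that render identically, so a naive banned-word filter misses one of the spellings unless the text is first normalised to a single canonical form. The `matches_keyword` helper is a hypothetical sketch of that idea, not Facebook’s actual tooling.

```python
import unicodedata

# The same visible word stored two different ways:
composed = "caf\u00e9"      # "café" with 'é' as a single codepoint (NFC)
decomposed = "cafe\u0301"   # "café" as 'e' + combining acute accent (NFD)

# They render identically, yet a naive byte-for-byte filter
# treats them as different strings.
assert composed != decomposed

def matches_keyword(text: str, keyword: str) -> bool:
    """Normalise both strings to one canonical form (NFC) before
    matching, so one banned-word entry catches every encoding variant."""
    return unicodedata.normalize("NFC", keyword) in unicodedata.normalize("NFC", text)

# After normalisation, the filter catches both spellings.
assert matches_keyword(decomposed, composed)
```

Zawgyi poses a harder version of this problem, since its alternative typing orders are not resolved by standard Unicode normalisation – which is one reason a platform-wide migration to Unicode simplifies automated monitoring.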
Rio believes that the ratio of Facebook reviewers to users in Myanmar remains too low. With about 99 reviewers for 20 million Myanmar users, there is just one reviewer for roughly every 202,000 users, she said. “Given the situation and prevalence of hate speech, it’s hard to explain why we are so far below the global average (of 1:115,000),” Rio said.
In April 2018, six Myanmar-based advocacy groups wrote an open letter to Facebook CEO Mr Mark Zuckerberg to complain about the shortage of moderators.
“If you are serious about making Facebook better, however, we urge you to invest more into moderation – particularly in countries such as Myanmar, where Facebook has rapidly come to play a dominant role in how information is accessed and communicated,” the letter said.
Ko Myat Thu, a freelance researcher familiar with Facebook’s operations in Myanmar, said the social media company’s response to concerns over hate speech had been too slow.
“I don’t think Facebook is doing well and winning the war against hate speech and fake news [on its platform] in the country,” he said.
Myat Thu said Myanmar relies on only a handful of groups to check facts on Facebook pages, including ThinkBeforeYouTrust and Real or Not.
Ko Phyo Wai and three colleagues launched Think Before You Trust more than a year ago because they were troubled by messages spreading on Facebook and Facebook Messenger that warned Buddhists and Muslims to prepare for a confrontation on September 11, 2017.
“They were spreading very fast countrywide and no one seemed to really care about the potential impact,” said Phyo Wai. “We decided to do something voluntarily to counter it.”
As of late November, the group’s page had 45,891 followers. “We receive a lot of very positive comments and messages for doing this. That’s why we keep doing fact-checking as much as we can,” he said.
Ma Phyu Phyu Thi, research director at Myanmar ICT for Development Organization (MIDO), a digital rights organisation promoting media and digital literacy, said most groups trying to fight hate speech rely on volunteers. “It is disappointing that there are so few groups doing this work to counter the spread of fake news online,” she said.
This report has been published through a grant to the Southeast Asian Press Alliance (SEAPA) Fellowship programme for 2018-2019 from the United Nations Office of the High Commissioner for Human Rights. The views expressed do not reflect the official opinion of OHCHR.