OPINION
Fresh testimony from a Facebook whistleblower suggests the company’s failure to protect users in Myanmar, particularly during the Rohingya crisis, could amount to gross negligence.
By OLIVER SPENCER | FRONTIER
The newly published book Careless People, by Facebook’s former director of public policy Sarah Wynn-Williams, alleges that the company’s management was “deeply unconcerned” about its role in Myanmar, which helped “enable posts that led to horrific sexual violence and genocide” against the Rohingya minority group.
Although Facebook has acknowledged mishandling its role in Myanmar, as documented by civil society and the United Nations, Wynn-Williams’s book provides an unprecedented look at the company’s internal decision-making.
If her allegations are true, they suggest that Facebook’s failures were not simply the result of slowness to respond, but amounted to negligence, and possibly gross negligence, by senior management. This has been the key argument in legal action against the company in the United States and United Kingdom, and in a complaint to the US Securities and Exchange Commission. In a human rights context, negligence means failing to exercise reasonable care in protecting rights, while gross negligence implies a reckless disregard for safety.
Facebook’s domination of internet use in Myanmar has often been summarised with the phrase “Facebook is the internet”. This dominance was achieved through aggressive expansion strategies: the platform came pre-installed on mobile phones, with data often offered free via “Free Basics”.
Wynn-Williams writes that Myanmar was “the one country where Mark [Zuckerberg]’s dream for Internet.org kind of came true”. The resulting monopoly on access meant that many people had little exposure to other sources of information.
Despite its dominance, “Facebook provided no meaningful safeguards to protect users from misinformation, nor did it invest in understanding the political and social landscape of Myanmar.” This left Myanmar’s information ecosystem entirely at the mercy of malign actors and Facebook’s algorithms.
Civil society has long warned of the military’s 700-strong “information ops” teams that spread hatred, disinformation and propaganda on Facebook directly and by hijacking verified accounts. The teams were one of the “first examples of an authoritarian government using the social network against its own people”.
World’s ‘worst hate speech’
Facebook has said it is committed to countering harmful content, such as “coordinated inauthentic behaviour”, and has banned certain individuals and organisations. Wynn-Williams recounts that Facebook’s security team discovered a secret group of 571 members coordinating spear-phishing attacks on high-profile accounts.
Yet she questions whether Facebook’s response was at all proportionate, noting that Myanmar was described as hosting the world’s “first Facebook genocide” and that the country suffers from “worse hate speech and fake news than we’ve seen in any other country”.
The quasi-civilian government of former army general U Thein Sein tried to curb online incitement when it suited its interests. Wynn-Williams describes a viral post in July 2014 that falsely claimed a Buddhist woman had been raped by a Muslim teashop owner, and was amplified by the ultranationalist Buddhist monk Ashin Wirathu, fuelling riots in Mandalay.
As tensions escalated, the government asked Facebook to remove Wirathu’s posts and blocked the platform in the meantime. According to Wynn-Williams, Facebook’s Dublin-based moderation team initially refused to act, arguing that the post contained “factual reports and calls to political action – not hate speech”.
Wynn-Williams writes that “Facebook’s own moderators didn’t understand the words they were supposed to police”. One case officer admitted that they could not properly evaluate the content because “Google Translate doesn’t do Burmese”. When the post was later removed, Facebook was unblocked in four minutes.
Although Facebook hired a Burmese-speaking contractor based in Dublin, the moderation system remained deeply flawed. Wynn-Williams recounts that one of the two Burmese-language moderators in Dublin appeared to allow racist content while disproportionately removing posts from civil society and human rights defenders. “Worried he might be in cahoots with the [military], we raise this with the content team, who tell us there’s nothing to worry about,” she claims.
A notable failure was Facebook’s refusal to ban a racial slur often used against the Rohingya. “We ask [Facebook managers] to ban the word kalar, if they won’t deal with the moderator. They refuse,” Wynn-Williams recalls. The slur was not banned until years later.
Although the number of Burmese-language moderators later reportedly grew to about 100, Wynn-Williams suggests this was still not enough. “To properly cover the country as well as we cover, say, Germany, we’d need hundreds of skilled moderators. And in China we had promised four hundred initially, rising to over two thousand Facebook employees.”
A ‘low priority’ country
Wynn-Williams also highlights failures in Facebook’s content reporting system in Myanmar. Although a global mechanism existed for people to report problematic content, few reports were made in Myanmar and there was little interest in investigating why.
“Within weeks, we learn just how crudely Facebook is functioning in Myanmar,” she writes. The reporting button was not available in the Burmese language, and many people used third-party apps to access Facebook that did not even have a report function. “Of course they weren’t filing reports. Of course we took no action. Our users weren’t using apps capable of any of that.”
Another critical failure was that Facebook never published its Community Standards in Burmese, so most users were unaware of what was allowed or prohibited.
Despite Facebook’s apparent role in fuelling conflict and violence, the company’s leadership treated Myanmar as a low priority. “They think they can ignore this because it’s happening in Myanmar,” Wynn-Williams recounts.
Myanmar’s low-priority status meant solutions were delayed. Moreover, many people in Myanmar used a non-Unicode font, known as Zawgyi, that Facebook’s systems could not interpret, undermining moderation. Wynn-Williams notes that “engineering are very aware of this problem with Burmese, and it’s eminently solvable”. Although a solution was eventually implemented, engineers said that at the time “nobody would listen”.
When Wynn-Williams’s team tried translating Facebook’s Community Standards into Burmese, the communications team refused to help, saying: “Myanmar isn’t a priority country in [Southeast Asia].”
‘Blindly unconcerned’
In the book’s most damning conclusion, Wynn-Williams argues that Facebook is an “astonishingly effective machine to turn people against each other” that is “helping some of the worst people in the world do terrible things”. The company operated a business model that rewarded inflammatory content, allowed the Myanmar military and its allies to weaponise the platform, and reacted too late when that incitement turned into violence.
Wynn-Williams asserts, “The truth here is inescapable. Myanmar would’ve been far better off if Facebook had never arrived there.” She states that, “At every juncture, there was an opportunity to make different choices” but Facebook’s senior management “seemed deeply and blindly unconcerned about any of this”, in a case of “lethal carelessness”.
Wynn-Williams’s claims corroborate long-held suspicions among civil society in Myanmar and suggest that Facebook was not merely a growing company grappling with unforeseen challenges. According to her account, written years after she was “thrown out” of the company, Facebook’s management was repeatedly warned, internally and externally, about the specific risks in Myanmar, yet it chose not to invest the necessary resources or prioritise the situation until it was too late.
This raises serious questions about whether Facebook’s conduct constituted negligence or, more gravely, gross negligence – a reckless disregard for human rights and safety that goes far beyond a failure of oversight. If these allegations are accurate, they point to systemic failures at the highest levels of the company, and serious culpability that requires greater accountability than a mere apology.
Wynn-Williams’s allegations relate to a period several years ago and Facebook, now known as Meta, claims to have learned from past mistakes by reviewing its role in Myanmar and recruiting a dedicated, capable team.
However, deeply problematic Myanmar content remains on the platform. Meta’s recent global policy changes – made in response to shifts in American governance and seemingly without sufficient regard for the global majority – reinforce Wynn-Williams’s central allegation that countries such as Myanmar remain a low priority for its “careless” management.
Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism, by Sarah Wynn-Williams, was published by Macmillan in March 2025.
Oliver Spencer is a human rights lawyer who has worked on Myanmar for 20 years.