Rohingya lawsuit against Facebook a ‘wake-up call’ for social media

Will the landmark suit, which argues that the spread of hate speech on the platform facilitated the genocide of Rohingya Muslims in Myanmar, be a turning point for Big Tech?

[Image: Rohingya refugees sit on a makeshift boat as they are interrogated by the Border Guard Bangladesh. More than 1 million Rohingya refugees live in dense camps in Cox’s Bazar, Bangladesh. Image: REUTERS/Navesh Chitrakar/File Photo]

A landmark lawsuit by Rohingya refugees against Meta Platforms Inc, formerly known as Facebook, is a “wake-up call” for social media firms and a test case for courts to limit their immunity, human rights and legal experts said.

The $150 billion class-action complaint, filed in California on Monday by law firms Edelson PC and Fields PLLC, argues that Facebook’s failure to police content and its platform’s design contributed to violence against the Rohingya community.

British lawyers also submitted a letter of notice to Facebook’s London office.

While analysts are split over the merits of the case and its chances of success, Rohingya activists said that being deemed illegal immigrants in Myanmar left them with few other options.

“The Rohingya lost everything. But in Myanmar, there is no law for the Rohingya,” said Nay San Lwin, co-founder of advocacy group Free Rohingya Coalition, who has faced abuse on Facebook.

“Facebook profited from our suffering. The survivors have no option other than a lawsuit against Facebook. It will be an injustice if Rohingya survivors are not compensated for their losses,” he told the Thomson Reuters Foundation.

Meta did not respond to a request for comment.

In an earlier statement in response to the lawsuit, a Meta spokesperson said the company was “appalled by the crimes committed against the Rohingya people in Myanmar.”


“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw (Myanmar military), disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content.”

A day after the lawsuit was filed, Meta said it would ban several accounts linked to the Myanmar military, and said on Wednesday it had built a new artificial intelligence system that can adapt more quickly to take action on new or evolving types of harmful content.

It was a sign that the tech giant was rattled, said Debbie Stothard, founder of the Alternative ASEAN Network on Burma (ALTSEAN), an advocacy group.

“The timing of these announcements shows the lawsuit is a wake-up call. The lawsuit itself is quite a bold move, but the Rohingya clearly felt there were sufficient grounds,” she said.

“Strategic litigation like this - you never know where it can go. In recent times we have seen climate-change litigation becoming more commonplace and getting some wins,” she added.

No precedent

More than 730,000 Rohingya Muslims fled Myanmar’s Rakhine state in August 2017 after a military crackdown that refugees said included mass killings and rape. Rights groups documented killings of civilians and burning of villages.

Myanmar authorities say they were battling an insurgency and deny carrying out systematic atrocities.

United Nations human rights investigators said in 2018 that the use of Facebook had played a key role in spreading hate speech that fuelled the violence against the Rohingya.

A Reuters investigation that year, cited in the U.S. complaint, found more than 1,000 examples of posts, comments and images attacking the Rohingya and other Muslims on Facebook.

But in the United States, a law known as Section 230 shields platforms such as Facebook from liability over content posted by their users.

The Rohingya complaint says it seeks to apply Myanmar law to the claims if Section 230 is raised as a defence.

“Based on the precedents, this case should lose,” said Eric Goldman, a professor of law at Santa Clara University School of Law. “But you’ve got so much antipathy towards Facebook nowadays - anything is possible.”

While the technology industry and others have long held that Section 230 is a crucial protection, the statute has become increasingly controversial as the power of internet companies has grown.

Earlier this year, Meta chief executive Mark Zuckerberg laid out steps to reform the law, saying that companies should have immunity from liability only if they follow best practices for removing damaging material from their platforms.

The lawsuit is a good test case for courts to limit how much immunity platforms are afforded, said David Mindell, a partner at Edelson PC, one of the law firms that brought the suit.

“This case is about what happens when a powerful company has this unchecked power over the world,” he said.

Whistleblower complaints 

Goldman and Mindell said that recent whistleblower complaints from inside Facebook, which allege the company did not act even when it knew its platform was being used for human rights abuses, could buttress the lawsuit, as could the company’s admission that it was “too slow” to contain the abuse.

The lawsuit highlights that “a company can apologise all they like, but at the end of the day, people were harmed,” said David Kaye, a human rights lawyer who chairs the board of the Global Network Initiative, a group that includes Facebook and other tech firms.

“And those stateless people can’t go to the government of Myanmar for remedy. And if they can’t go to the company - what’s the remedy?”

The International Criminal Court has opened a case into the accusations of crimes against the Rohingya. In September, a U.S. federal judge ordered Facebook to release records of accounts connected to anti-Rohingya violence in Myanmar that the social media giant had shut down.

The progress of the lawsuit would be keenly watched by not just the Rohingya, but also other groups and individuals who have been harmed by online hate speech, said Stothard.

“Refugees, migrants, LGBT people, other minorities - they have all suffered serious harm,” she said.

“The question to ask is not, will the lawsuit succeed, but why was it necessary? It’s about making social media companies accountable,” she said.

This story was published with permission from Thomson Reuters Foundation, the charitable arm of Thomson Reuters, which covers humanitarian news, climate change, resilience, women’s rights, trafficking and property rights. Visit http://news.trust.org/climate.
