Press "Enter" to skip to content

Facebook’s Actions in Myanmar Reflect a Failure to Follow U.N. Guidance on Business and Human Rights

An Exploration of Corporate Accountability, Human Rights and Content Regulation — Part 4

Facebook’s role in the Rohingya genocide has prompted me to embark on a series of blog posts exploring corporate accountability, human rights and social media platform content regulation. Last month, the U.N. Fact-Finding Mission on Myanmar released its full report, which found that “Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the internet.” Last week, the New York Times revealed that the Myanmar military engaged in a systematic campaign — involving as many as 700 military personnel and taking place over several years — to use Facebook as a tool for ethnic cleansing.

Facebook let this happen.

Facebook is not alone in its failures. The United Nations Working Group on Business and Human Rights recently released a report indicating that most businesses seem unaware of their human rights responsibilities or unwilling to implement human rights due diligence. The Working Group serves to promote the U.N. Guiding Principles on Business and Human Rights, which clarify that all businesses have an independent responsibility to respect human rights. The Principles require that businesses “avoid causing or contributing to adverse human rights impacts” and state that businesses “should treat the risk of causing or contributing to gross human rights abuses as a legal compliance issue wherever they operate.”

The Guiding Principles have been in existence since 2011, before Facebook even existed in Myanmar. While they are merely voluntary (they are, after all, guidelines), they nonetheless offer a globally recognized normative framework — a social, rather than a legal, expectation that businesses will prevent adverse impacts on the rights and dignity of people.

The U.N. Guiding Principles provide that, at a minimum, businesses should conduct human rights due diligence to avoid adverse human rights impacts and seek to mitigate harm when adverse impacts happen. The Guiding Principles explain that, in assessing impacts, businesses should seek to understand the concerns of the people who may be affected, either by consulting them directly or, if that is not possible, by consulting credible experts, such as human rights defenders and others from civil society. The Principles further provide that businesses should engage in this assessment at regular intervals, such as in response to changes in the operating environment due to rising social tensions. If a business learns that it has contributed to human rights violations, it should: take the necessary steps to cease its contribution; mitigate any remaining impact; track the effectiveness of its response; and communicate how it has addressed the issue.

If Facebook had followed the U.N. Guiding Principles, it would not have become a tool for ethnic cleansing in Myanmar. It would have, at a minimum, heeded the warnings and acted much sooner.

For years, researchers and human rights activists warned Facebook that it was aiding the spread of hate speech in Myanmar. Earlier this year, U.N. investigators reported that Facebook played a “determining role” in the deteriorating human rights situation there. It was not until late August, however, that Facebook removed several accounts and pages and acknowledged that it had been too slow to act.

The damage in Myanmar had already been done. According to the U.N. Fact-Finding Mission report, a carefully crafted hate campaign has led to a negative perception of Muslims and the branding of the entire Rohingya population as “illegal Bengali immigrants.” In addition, the report found numerous allegations that posts and messages on Facebook increased discrimination and violence in Myanmar.

The U.N. Guiding Principles state that, when a business has had a negative impact on human rights, it should establish or participate in a grievance procedure for communities that have been harmed. The Principles acknowledge that such a procedure “can only serve its purpose if the people it is intended to serve know about it, trust it and are able to use it.” The Principles, however, do not articulate standards for what appropriate remediation entails.

Apart from removing content, Facebook has responded to the Myanmar crisis by hiring an agency to conduct a human rights impact assessment, increasing its Myanmar-language content review staff and “introducing locally designed tips on how to spot fake news.” It is also currently advertising for a human rights product policy director. None of these measures offers any direct remedy to the 700,000 Rohingya displaced by the Myanmar military’s ethnic cleansing campaign — or to the Rohingya who remain in Myanmar under threat of discrimination and violence. So far, there is no indication that Facebook has implemented a grievance procedure or plans to provide any direct remedy to victims.

Admittedly, it is not clear what remediation should entail here. As noted in previous posts, Facebook cannot be held liable in U.S. courts over this issue. Nor is the Myanmar government going to take any action. It was the Myanmar military, after all, that used Facebook as an aid to ethnic cleansing. And even if the U.N. Guiding Principles were more specific on this issue, the fact that they are voluntary means, in the end, it will be entirely up to Facebook to decide what remediation means in Myanmar — and how to avoid contributing to human rights abuses in the future.
