Public Comment on the Oversight Board's "Cambodian Prime Minister" Case
On March 16th, the Oversight Board announced the following call for public comments regarding an appeal to remove a video of a speech by Cambodia's prime minister, Hun Sen, from his official Facebook page:
CALL FOR PUBLIC COMMENT
Cambodian prime minister (2023-003-FB-MR)
User appeal to remove content from Facebook, and case referred by Meta
On January 8, 2023, seven months before Cambodia’s next general election, a live video was streamed on the official Facebook page of Cambodia’s prime minister, Hun Sen. Hun Sen has been in power since 1985 and is seeking re-election. The video shows a lengthy speech delivered by the prime minister in Khmer, Cambodia’s official language, during a ceremony marking the opening of a national road expansion project in Kampong Cham. In it, he responds to allegations that his ruling Cambodian People’s Party (CPP) stole votes during the country’s local elections in 2022. He asks his political opponents who made the allegations to choose between the “legal system” and “a bat,” and says they can choose the legal system, or he “will gather CPP people to protest and beat you up.” He adds, “if you say that’s freedom of expression, I will also express my freedom by sending people to your place and home.” He names individuals, warning that they “need to behave,” and says he may “arrest a traitor with sufficient evidence at midnight.” However, he says “we don’t incite people and encourage people to use force.” After the live broadcast, the video was automatically uploaded onto Hun Sen’s Facebook page, where it has been viewed about 600,000 times and was shared fewer than 3,000 times. The speech was covered by local and regional media outlets. In recent years Cambodia has seen increasing crackdowns on political opponents, activists, trade union leaders and critical media outlets.
Three users reported the video five times for violating Meta’s Violence and Incitement Community Standard. This prohibits “[t]hreats that lead to serious injury (mid-severity violence),” including “[s]tatements of intent to commit violence.” Generally, content is prioritized for human review based on its severity, virality and likelihood of violating content policies. In this case, Meta’s technology automatically closed out these user reports without human review, as they were deprioritized and not reviewed within 48 hours of reporting. After the users who reported the content appealed, it was reviewed by two human reviewers, who found it did not violate Meta’s policies. At the same time, the content was escalated to policy and subject matter experts within Meta. They determined that it violated the Violence and Incitement Community Standard but applied a newsworthiness allowance to allow it to remain on the platform. A newsworthiness allowance allows otherwise violating content to remain on Meta’s platforms where its public interest value outweighs the risk of it causing harm.
One of the users who reported the content appealed Meta’s decision to the Board. Separately, Meta referred the case to the Board, saying that it involves a difficult balance between the values of “Safety” and “Voice” in determining when to allow potentially violent speech from a political leader. Meta asked the Board for guidance on how to apply the Violence and Incitement Community Standard to threatening statements made by political leaders outside of conflict or crisis situations.
The Board selected this case because it raises relevant policy questions around how the company should treat speech from political leaders which appears to violate Meta’s content policies. This is particularly relevant in the context of potentially violent threats against political opponents from a national leader before an election in a country with a history of electoral violence and irregularity. This case falls within the Board’s “Government use of Meta’s platforms” priority area and is also relevant to its “Elections and civic space” priority.
The Board would appreciate public comments that address:
- The political and human rights situation in Cambodia and the potential impact of Hun Sen’s speech on the elections scheduled for July 2023.
- Meta’s moderation of content posted by state actors, particularly moderating content they post about opposition figures, and its implications.
- Meta’s responsibilities in elections and key moments for political participation, particularly in states like Cambodia where Meta’s platforms are of greater importance to civic life and local media is restricted.
- What the criteria should be for deciding when to apply a “newsworthiness allowance” to potentially harmful content posted by political leaders.
Dangerous Speech Project Responds
Tonei Glavinic, Director of Operations of the Dangerous Speech Project, responded on behalf of the DSP in a public comment:
Dangerous Speech Project
Tonei Glavinic, Director of Operations
2023-003-FB-MR
1. Influential figures should be subject to more scrutiny, not less.
Private companies should not stifle public discourse, but risks such as inspiring group violence can outweigh the public interest in viewing content. Further, Facebook cannot silence people who hold office, since they can easily speak to the public without social media. Also, the newsworthiness exception’s effect is circular, since newsworthiness is not a fixed value; it rises and falls with access. Giving political figures direct access to a bigger public gives them more influence, making their content more newsworthy.
A witness at the UN Tribunal for Rwanda described the gradual process of incitement, recounting how a notorious radio station groomed its listeners to commit and condone unthinkable violence. The witness said, “In fact, what RTLM [Radio Télévision Libre des Mille Collines] did was almost to pour petrol, to spread petrol throughout the country little by little, so that one day it would be able to set fire to the whole country.”
This case is a textbook example of spreading petrol: while Hun Sen said that individuals should not commit violence based on this speech, he clearly expressed support for intimidation and violence as a response to political opposition, which is likely to increase his audience’s willingness to commit or condone such violence in the future.
2. Deciding when to take action
In any such process, it’s difficult to decide which drop of petrol is the first “actionable” one; while this example is quite blatant, others are much less so. We have urged Meta in a previous comment (PC-09337, case 2021-001-FB-FBR) to focus on real-world impact and consequences by identifying influential users who spread disinformation and use language that tends to increase fear, threat, and a sense of grievance, and by monitoring for significant shifts in the sentiment of followers’ posts – especially signs that a critical mass of followers understand the political figure to be endorsing or calling for violence. Where such shifts occur, Meta could reach out to the accountholder to inform them of the situation, giving them an opportunity to denounce that understanding – and, in the event they failed to do so, putting them on notice that further such content would be understood as violating platform policies and subject to removal.
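Purely as an illustration of the kind of threshold logic such monitoring might involve – not a description of Meta's actual systems, and with the classifier, thresholds, and data structures all invented for the example – a minimal sketch could look like this:

```python
from dataclasses import dataclass


@dataclass
class FollowerPost:
    text: str
    endorses_violence: bool  # output of a hypothetical classifier, not a real Meta system


def violence_endorsement_rate(posts: list[FollowerPost]) -> float:
    """Share of follower posts labeled as endorsing or calling for violence."""
    if not posts:
        return 0.0
    return sum(p.endorses_violence for p in posts) / len(posts)


def significant_shift(baseline: list[FollowerPost],
                      recent: list[FollowerPost],
                      critical_mass: float = 0.10,
                      min_increase: float = 0.05) -> bool:
    """Flag a shift when the share of followers understood to endorse violence
    both reaches a 'critical mass' threshold and has risen meaningfully above
    the account's historical baseline. Thresholds here are arbitrary placeholders."""
    before = violence_endorsement_rate(baseline)
    after = violence_endorsement_rate(recent)
    return after >= critical_mass and (after - before) >= min_increase


# Example: 5% of follower posts endorsed violence historically, 20% do now,
# so this shift would trigger outreach to the accountholder.
baseline = [FollowerPost("...", False)] * 95 + [FollowerPost("...", True)] * 5
recent = [FollowerPost("...", False)] * 80 + [FollowerPost("...", True)] * 20
print(significant_shift(baseline, recent))  # True
```

The point of the sketch is only that the trigger is a measurable change in how followers are reacting, rather than a judgment about any single post in isolation.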
3. Meta’s decisions carry more weight in contexts where its platforms play a major role in civic life.
The Board has noted that Cambodia is one of many contexts where Meta’s platforms play a crucial role in civic life, and the ability of media and citizens to express dissenting views or even provide objective information is restricted. There are some parallels between this context and Palestine, as commenters like Lara Friedman (PC-10159) shared in case 2021-009-FB-UA, and we would encourage the Board to review the relevant comments from that case in making a decision here.
In such contexts, Meta’s decisions about whether to restrict harmful content spread by powerful figures carry significant weight, since Meta is often seen as an objective and powerful third party whose opinions are respected. Choosing to restrict or remove harmful content in such environments not only limits the content’s reach (and, in this case, its capacity to inspire violence), but also sends an important normative message: inciting violence against another group is unacceptable and a violation of human rights.
In deciding whether to remove or merely restrict harmful content, we do urge Meta to evaluate whether the value of creating space for counterspeech and dissent outweighs the harm of the content itself; however, we expect that in cases like this one, where it is clearly unsafe for citizens to express dissent, such an evaluation will fall on the side of removal.