With a United Nations fact-finding mission today calling for an investigation into Myanmar’s military, or Tatmadaw, for war crimes, crimes against humanity and genocide, it could have been expected that top soldiers might go to ground.
However, the disappearance of the Facebook pages of Commander-in-Chief Senior General Min Aung Hlaing and other Tatmadaw leaders, almost exactly as the damning report was released, signaled something else entirely: Facebook is finally taking action in Myanmar, a country where it has come under fire for allowing hate speech to spread across its platform.
Discussion of social media’s role in perpetuating violence and conflict in Myanmar has to date focused on hate speech spread largely by private individuals rather than state actors.
While hate speech is a major issue, as Asia Times reported previously here and here, Facebook has also been used to drive campaigns of misinformation and disinformation for propaganda purposes through military and state channels.
That includes state claims spread on Facebook that the UN’s World Food Program was essentially feeding Rohingya rebels, claims used to legitimize the government’s clampdown on humanitarian access to conflict areas, as well as accusations that Rohingya rape victims fabricated their stories against security forces.
Such campaigns of psychological warfare go beyond overt state propaganda in Myanmar. To wit: “During a recent investigation, we discovered that they used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military,” Facebook said in a press release today.
The social media giant announced it had uncovered a misinformation and disinformation campaign involving “coordinated inauthentic behavior.” Such attempts to promote the illusion of grassroots consensus on an issue are commonly known as “astroturfing.”
While the exact scale and nature of the campaign are not entirely clear in Myanmar’s context, Facebook supplied six examples of “violating” and “non-violating” material in a press release which said it had removed a total of 18 Facebook accounts, one Instagram account and 52 Facebook pages followed by almost 12 million people.
The company said it was preserving the data, including content, on the accounts and pages it had taken down.
In one of the supplied examples, a fake Facebook account promoted the idea that Rohingya were engaged in a frantic campaign of arson, burning down their own villages to make the destruction look like the work of rampaging security forces.
The notion gained popular currency in Myanmar and was often repeated by the government and state media to deny security forces’ role in the arson attacks.
The question now is just how much Facebook’s echo chamber of military-backed news outlets and fake accounts had to do with reinforcing such messages and perpetuating violence against the Rohingya.
Time may or may not tell. As the push for a UN investigation gathers steam, such material could be used to establish the crucial “intent” component of the crime of genocide.
A Facebook representative told Asia Times, “We are committed to working with and providing information to the relevant authorities as they investigate these atrocities. Upon receipt of legal process, we will respond to such requests in accordance with applicable laws and our terms of service.”
This, of course, could be bad news for investigators and researchers who may not have total access to the removed data. How the online environment affects real-world conflict dynamics is still under-researched and poorly understood.
While today’s takedowns in Myanmar signal some much-welcomed transparency from Facebook, the question remains why the US-based social media giant waited so long to act.
There are also questions about whether any of these pages promoted their material through Facebook’s paid advertising model, a key money-spinner for the publicly listed company. The Ministry of Information and the army have been known to pay to “boost” their posts.
Facebook will no doubt face calls for transparency over how much it profited from the spread of violence-promoting material and over the known effects of its amplification.
Myanmar’s situation may represent the first genocide to have played out in a real-time information warfare environment carried out in part over social media. Facebook has access to hard data that researchers could use to determine cause and effect.
While important lessons will no doubt eventually be learned about online-driven violence from Myanmar’s example, there is no guarantee without stronger safeguards and monitoring that future social media-sparked atrocities won’t be committed.
Manual and artificial intelligence-based interventions to censor social media content are of questionable efficacy, analysts say. By late Monday, a new Facebook account for Min Aung Hlaing had been created, though notably without Facebook’s “blue tick” verification stamp.