FANNY POTKIN and POPPY McPHERSON
ONE of two Muslims allowed to run for the ruling party in Buddhist-majority Myanmar’s general election on Sunday, Sithu Maung, worries fake news on Facebook could damage his chances.
Within a torrent of racist abuse and misinformation posted about him ahead of the polls are false claims he plans to close Buddhist monastic schools and to advocate for the teaching of Arabic.
“They use race and religion to attack me,” the 33-year-old told Reuters in the commercial capital of Yangon, where he is standing for a seat won by the ruling party in the last election.
“These days people use social media more than ever … and when they see false information 10 times it becomes the truth.”
Social media companies face a global challenge to stop disinformation around elections, including the 2020 U.S. vote. In Myanmar, the stakes for Facebook are particularly high after previous accusations it helped incite genocide.
Half of Myanmar’s 53 million people use Facebook, which for many is synonymous with the internet.
Facebook executives told Reuters hate speech in Myanmar was at “near historic lows” after the company invested in resources ranging from artificial-intelligence language and photo detection to measures to slow the spread of viral content.
But civil society groups have found dozens of networks of accounts, pages, and groups spreading ethnically and religiously charged falsehoods that they fear could lead to strife and undermine the second election since the end of hardline army rule in 2011.
Reuters separately found more than two dozen inter-connected pages and accounts with a combined reach in the hundreds of thousands. The majority were removed after Reuters flagged them to Facebook.
“There’s a short-term immediate concern of all this disinformation and hate speech fuelling real-world violence,” said Jes Kaliebe Petersen, CEO of tech hub Phandeeyar, part of the Myanmar Tech Accountability Network (MTAN), a civil society group coordinating efforts to reduce risks posed by social media.
Harmful content, he said, is “spreading like wildfire”.
The government of leader Aung San Suu Kyi, her ruling National League for Democracy (NLD) and the election commission did not respond to requests for comment.
Although the NLD is widely expected to win the election easily, as it did in 2015, there is precedent for social media hate speech leading to violence in Myanmar.
Anti-Muslim rumours on Facebook were widely seen as helping to trigger deadly riots in 2012 and 2014. In 2017, violent speech on Facebook was blamed for fuelling an army crackdown on Rohingya Muslims that drove more than 730,000 people to flee Myanmar.
But Rafael Frankel, Facebook’s director of public policy for Southeast Asia, told Reuters ahead of the election: “What we have seen so far is typical and nothing in any way out of the ordinary from what we would see in other parts of the world when an election is happening.”
Even before campaigning got underway, Facebook deleted 280,000 items in Myanmar for hate speech in the second quarter of 2020, up from 51,000 in the first quarter.
Meanwhile, it is verifying accounts for some politicians – including Sithu Maung – and giving them a direct line for complaints.
Facebook also said it had taken down hundreds of accounts for “coordinated inauthentic behaviour” including about 70 it traced to members of Myanmar’s military on Oct. 8.
Among the blocked army-linked accounts were two that had attacked Sithu Maung with ethnic and religious slurs.
Twenty pro-military pages and accounts found by Reuters published posts simultaneously; some were newly created replicas of others that Facebook had blocked only recently.
The army did not respond to a request for comment.
Separately, Facebook said in a monthly report on coordinated inauthentic behaviour on Friday that it had dismantled a network of 36 accounts and six pages run by a Myanmar public relations firm that used fictitious people to profess support for the military-backed Union Solidarity and Development Party (USDP).
It is not the first time Facebook has blocked pages linked to Myanmar’s army: In 2018, it banned 20 top military officials and organisations for inauthentic behaviour, including commander-in-chief Senior General Min Aung Hlaing.
This week, Min Aung Hlaing accused unspecified social media platforms of bias in their treatment of Myanmar politics as he questioned the credibility of the elections more generally. Those claims were widely spread by pro-military accounts and pages found by Reuters and removed by Facebook.
But it is not only the army that has been using Facebook to spread disinformation, researchers say.
Some pages that Facebook has taken down had content supportive of Suu Kyi’s NLD.
Opposition politicians, including from the USDP, the ruling NLD’s biggest rival, have also been among the targets: branded as Muslim sympathisers or close to China by accounts whose origins are unclear.
The USDP did not respond to a request for comment.
Researchers say that the online vitriol is amplified by numerous “inauthentic” networks that spread narratives through inter-connected pages, groups, and fake accounts. Some pages, including dozens found by Reuters, style themselves as independent news or entertainment sources, and post the same content simultaneously on Facebook and other social media.
Others have taken out paid ads to promote politicians, including army chief Min Aung Hlaing – reaching more than a million people – but ran them without a disclaimer in violation of Facebook policies on political advertising.
“Facebook has taken down a few networks of assets involved in this type of problematic behaviour, but there is a lot more,” said MTAN researcher Victoire Rio.
“These malicious actors always seem to come back, often using the same names and logos, which calls into question Facebook’s ability to tackle these issues comprehensively.” – Thomson Reuters Foundation.