August 5, 2022
MANILA – Over 6 million Facebook and Instagram posts by Filipino users during the May elections were taken down by Meta for violating its policies against violence and incitement, hate speech, and bullying, according to a report that tracked poll-related social media content in the Philippines, known as “patient zero” in the global war against disinformation.
The social network also observed large-scale “inauthentic behavior” on its platforms, referring to fake accounts spamming election-related posts, according to Meta’s latest quarterly adversarial threat report, which monitored content from the Philippines in the four months leading up to the May 9 polls and the week after.
The bulk of the removed posts, or about 5 million, were flagged for breaching the platforms’ violence and incitement policies, said Melissa Chin, Meta’s content policy manager for Asia-Pacific.
These include posts pushing for “high-severity violence,” threats leading to serious injury and statements expressing violence related to voting, voter registration or election results, the report showed.
Some 670,000 posts on Facebook and Instagram were removed for containing hate speech, which Meta defined as a “direct attack on people based on their protected characteristics,” referring to race, ethnicity, religious affiliation, gender identity, sexual orientation or serious disease.
The types of attack include dehumanizing speech, harmful stereotypes and expressions of disgust, among others.
More than 550,000 posts that violated Meta’s bullying and harassment policies were also deleted. These included attacks on public figures using “severe sexualizing content, negative physical descriptions tagged to, mentioned or posted on the public figure’s account.”
“We removed a wide range of text because we know bullying and harassment can come in many different forms,” Chin said.
Meta’s policies also cover attacks on human rights defenders and journalists, which the social media giant refers to as “involuntary public figures.”
Meta’s removal of millions of posts over a four-month period was one of the biggest actions taken by the social network in the Philippines, whose citizens rank among the most avid users of social media in the world.
According to the Digital 2022 report, there were 83.85 million Facebook users and 18.65 million Instagram users in the Philippines in early 2022.
The same report found that Filipinos age 16 to 64 spent an average of four hours and six minutes using social media each day within the past 12 months, making the Philippines the world’s second most active country on social media.
Filipinos’ heavy use of social media has made them highly vulnerable to manipulation by coordinated and well-funded false information channels. In 2018, Facebook’s global politics and government outreach director Katie Harbath referred to the Philippines as “patient zero” in the global war against disinformation.
In response, David Agranovich, director of threat disruption at Meta, said the company had been expanding initiatives to take down more fake accounts.
“Think back to 2017, we initially built this network disruption model which aimed to disrupt mainly the activity of coordinated networks that were seeking to manipulate public debates and deceive people using fake accounts, what we now call coordinated inauthentic behavior,” he said.
“We want to address the threats we’re seeing in our platform and … we want to evolve how we respond to adversaries as they adapt to our enforcement and try to stay afloat,” Agranovich added.
He noted that more than 15,000 accounts in the Philippines were taken down for violating Meta’s inauthentic behavior (IB) policies.
“They used IB tactics to inflate the distribution of content that included election-related posts, including some that used politics as a spam lure at the time when people were interested in following these topics,” the report noted.
In April, Meta reported that it had removed networks of more than 400 accounts, pages and groups in the Philippines as part of its efforts to fight disinformation ahead of the May elections.
In September 2020, Meta removed 155 accounts, 11 pages, nine groups and six Instagram accounts for violating its policy against foreign or government interference and for showing coordinated inauthentic behavior on behalf of a foreign or government entity.
The contents of the accounts and pages were supportive of former President Rodrigo Duterte and the possible 2022 presidential bid of his daughter, then Davao City Mayor Sara Duterte, who eventually ran for and won the vice presidency.
Nathaniel Gleicher, head of Facebook security policy, said fake accounts were also traced to individuals from China’s Fujian province posting en masse about global news and matters relating to the West Philippine Sea and operations of the US Navy.
In March 2019, Facebook also took down over 200 accounts belonging to a network managed by Nic Gabunada, Duterte’s social media strategist in the 2016 presidential election, because of similar behavior.