Study identifies signs of 'political manipulation' on social media
Mico Abarro, ABS-CBN News
Published Feb 02, 2022 09:36 PM PHT

MANILA - A study identified five signs of supposed 'networked political manipulation' on social media as the 2022 Philippine elections neared.
These actions, which may be seen as a systematic effort to push a political agenda, are part of the findings of the cross-platform review of the Philippine Election Digital Landscape by the Philippine Media Monitoring (PMM) Laboratory, presented on Wednesday.
The research group's "Digital Pulse Project" studied election-related discussions on Facebook, Twitter, and YouTube between May and October 2021.
During that period, they found indicators of 'networked political manipulation' on those platforms, such as the mass removal of content attacking certain election personalities and social media accounts adopting the look and style of news media and other institutions to gain credibility.
Here are five characteristics of content that may be part of political manipulation on social media.
1. Highly influential accounts that became suspended and unavailable.
One of the signs was highly influential accounts suddenly becoming suspended and unavailable on Twitter and Facebook, indicating they had been taken down by the platforms, but only after they had already damaged discussion of the elections.
"During data collection, these accounts were one of the most influential on the platform," said Assistant Professor Fatima Gaw, Digital Pulse Project co-lead. "Most retweeted, they are replied to, and shared in the platform. But during data analysis, when we actually click the link and open their kind of content, they're suddenly suspended, they're unavailable for further analysis."
"During data collection, these accounts were one of the most influential on the platform," said Assistant Professor Fatima Gaw, Digital Pulse Project co-lead. "Most retweeted, they are replied to, and shared in the platform. But during data analysis, when we actually click the link and open their kind of content, they're suddenly suspended, they're unavailable for further analysis."
"Whatever transgressive activities they're doing, the content they're pushing out there might have already negatively impacted the discourse because by the time they're taken down, their content has already gone viral."
"Whatever transgressive activities they're doing, the content they're pushing out there might have already negatively impacted the discourse because by the time they're taken down, their content has already gone viral."
2. 'Unidentified' accounts being the most shared and interacted with.
Another sign was the presence of accounts that were highly active on Facebook and Twitter but whose identities could not be verified. PMM Laboratory said these accounts could be mobilized to manufacture public support on the platforms without consequences.
"They cannot be verified as real people, or as having personal traces in other social media accounts. So it's possible that they are real people being anonymous, they're using an anonymous account for their protection," Gaw said. "But they might also be trolls projecting as unique individuals or even automated bots."
"They cannot be verified as real people, or as having personal traces in other social media accounts. So it's possible that they are real people being anonymous, they're using an anonymous account for their protection," Gaw said. "But they might also be trolls projecting as unique individuals or even automated bots."
3. 'Cloaked' political content in non-political accounts.
The study also noted that political content was being shared by Facebook pages that were normally non-political, such as entertainment, gaming, and meme pages. This could be a manipulative tactic that "obfuscated" the content's purpose, allowing it to influence voters without being identified as part of political campaigning.
"These are what we call 'cloaked' accounts because they cloak or veil themselves in generic conventions or categories outside of news and politics. But they themselves are sharing again political content, pushing political content for political candidates," Gaw said.
"These are what we call 'cloaked' accounts because they cloak or veil themselves in generic conventions or categories outside of news and politics. But they themselves are sharing again political content, pushing political content for political candidates," Gaw said.
4. 'Hit and run' posting strategy.
On YouTube, as well as Facebook and Twitter, accounts were also observed removing content en masse after it had generated "massive" views and engagement. PMM Laboratory said these posts could have been published on purpose to incite controversy against particular personalities, then taken down to remove evidence of hostile activity.
"The motives behind mass removal of content is unclear," Gaw said. "What is common among them is really the content that were taken down. It's inflammatory, it's hostile in nature against particular candidates and groups. So what we surmise is that this mass removal is purposive, there's an intent behind it."
"The motives behind mass removal of content is unclear," Gaw said. "What is common among them is really the content that were taken down. It's inflammatory, it's hostile in nature against particular candidates and groups. So what we surmise is that this mass removal is purposive, there's an intent behind it."
5. Co-optation of institutional and professional logics and practices.
Some accounts on Facebook and YouTube were also seen to present themselves as news media sites and social movements "to harness credibility and invoke people’s trust in these institutions." PMM Laboratory considered doing this without exercising professional standards "deceptive and malicious."
"Their overt strategy in communicating or launching political content actually adapts the stylistic, discursive, and cultural practices of professional media and knowledge industry like the academe or research agencies to make it look like their content is legitimate, have undergone the proper research, and they are trustworthy," Gaw said.
"Their overt strategy in communicating or launching political content actually adapts the stylistic, discursive, and cultural practices of professional media and knowledge industry like the academe or research agencies to make it look like their content is legitimate, have undergone the proper research, and they are trustworthy," Gaw said.
PMM Laboratory warned that anti-democratic actors were hijacking political discourse on social media by co-opting these institutions while at the same time attacking them.
It also found that across Facebook, YouTube, and Twitter, non-political actors like influencers and bloggers were "catapulting" the growth of politicians on social media.
Mainstream media, meanwhile, while maintaining a large presence on social media, was "detached" from other audiences on the platforms and "limited" in influence.
"The fact that most news media, they're located within the same clusters, we see Rappler, Inquirer, ABS-CBN, they're located in just one cluster, its sort of indicative they're reaching the same set of newsreading audiences," said DPP research co-lead, Assistant Professor Jon Benedik Bunquin.
"The fact that most news media, they're located within the same clusters, we see Rappler, Inquirer, ABS-CBN, they're located in just one cluster, its sort of indicative they're reaching the same set of newsreading audiences," said DPP research co-lead, Assistant Professor Jon Benedik Bunquin.
"The declining centrality of news media as an actor that's overtaken by other actors such as obscure accounts on Twitter or political pages on Facebook is detrimental to the political knowledge of voters who rely on social media for political information."
"The declining centrality of news media as an actor that's overtaken by other actors such as obscure accounts on Twitter or political pages on Facebook is detrimental to the political knowledge of voters who rely on social media for political information."
The same study found that pages supportive of former senator and presidential aspirant Bongbong Marcos "grew in influence" on Facebook, while election-related discussions about Marcos grew on Twitter between May and October 2021.