Four views, one issue: Four personalities share solutions against disinformation

By Raphael Polintan

Another session of the Edu2022 and SUCs2022 Voter Conversations, entitled “The Truth Wars: Combating Fake News and Disinformation,” was held on Friday, January 28, 2022, featuring Rappler CEO and Nobel Laureate Maria Ressa, communications researcher Fatima Gaw, Ateneo Professional Schools Vice President John Paul Vergara, and former Department of Tourism Undersecretary Vicente “Enteng” Romano III.

The webinar discussed the fight for the truth from the perspectives of journalism, research, technology, and social influence to educate members of the public on disinformation practices and help ensure the integrity of historical records and the upcoming elections.

Behavior Manipulation

Ressa began her discussion by quoting Edward O. Wilson, “The biggest crisis that we are facing right now are our Paleolithic emotions, our medieval institutions, and our god-like technology.”

“Our emotions are driving us, our institutions have failed to protect us, and the technology…is shaping what we are becoming,” Ressa explained.

She explained that, through the amplification of lies and the stigmatization of journalists, fact-checkers, and institutions, politicians are able to undermine democracy and manipulate the behavior of voters by appealing to their biology.

“If you can make people believe lies are the facts, then you can control them,” she reiterated.

She further explained that this “bottom-up” stage, in which targeted entities are portrayed as criminals and false information becomes mainstream, is followed by a “top-down” stage, in which those in power (e.g., politicians) step in to take advantage of the shifted public perception.

Moreover, pro-Duterte and pro-Marcos communities have grown through social media algorithms, overwhelming fact-checking communities.

Ressa remarked, “When you hear a lie a million times, to you, because of your biology, it becomes a fact.”

In December 2021, Ressa was attacked by numerous tweets demanding the shutdown of Rappler after she retweeted a study by the University of the Philippines on Bongbong Marcos’ diploma. An investigation by Rappler determined that the accounts that slammed her were relatively new, with most created in October 2021, the same month Marcos filed his candidacy for the presidency.

Rappler then published an article entitled “Marcos Network Tries to Take Over Twitter with Freshly-Made Accounts” on January 18, 2022. A few days later, on January 21, Twitter suspended over 300 accounts in the Marcos network for inauthentic engagement.

The phenomenon, according to Ressa, is astroturfing: the deceptive practice of hiding the true sponsors of a message by making it appear to come from ordinary members of the public.

To end her discussion, Ressa presented FactsFirstPH, a recently launched movement to battle false information on social media platforms. More than one hundred organizations have so far joined the initiative in support of truth.

Five Insights, Five Lessons

Gaw succeeded Ressa and shared insights from her research to paint a bigger picture of how false information influences voter discourse and perceptions of candidates and issues.

Firstly, Gaw pointed out that determining the author of a message is as important as judging whether or not its content is true. Information from fake accounts, she stated, could lead researchers to disinformation campaigns.

She proceeded to present evidence pointing toward the existence of organized disinformation operations. One example involved pseudo-news channels, such as BANAT NEWS TV and JUST in BALITA, which have become the most recommended channels on YouTube.

Although the channels brand themselves as objective figures, underneath their camouflage, they are actually “political apparatuses meant to deceive people.”

Meanwhile, Twitter has seen a rise in “unidentifiable accounts,” or accounts that lack the personal information needed to verify their identity. Although some could belong to legitimate users, most unidentifiable accounts engage in manipulation and push certain political views.

For her second insight, Gaw highlighted that a common characteristic among disinformation actors is their hostile stance toward institutions; they ridicule and defame researchers, experts, journalists, and the collective groups who preserve the truth.

As an example, Gaw referenced the influencers Maharlika and Sangkay Janjan TV, who claimed, respectively, that ABS-CBN spreads lies and that history books cover up the true past.

“Without clear-cut evidence of any of these claims, they are conspiratorial, and this conspiracy versus the media and the academe is planting doubt on our integrity and credibility. When that happens, people seek for alternative sources of information and who else would they turn to but these same people?” she said.

“What they effectively do is break our confidence and trust in professional institutions and pave the way for their own prominence as the new gatekeepers of knowledge,” Gaw further explained.

Thirdly, Gaw highlighted the power of accounts spreading information en masse: the influence of disinformation operations is amplified by the sheer number of videos presenting fake data and narratives, and people are compelled to believe false information when they see many others discussing and agreeing with it.

Gaw’s research also observed a new tactic within the Marcos-Sara account network, where content creators post videos of “kalye surveys,” or on-the-ground surveys asking people whom they will vote for. Gaw was quick to clarify that although the strategy involves surveys, the results presented to the public are prone to error and manipulation.

“These surveys are not scientific at all. They’re not representative of Filipino voters…What this does is manufacture political reality, a more exaggerated and a hyperbolic version.”

Gaw added in her fourth insight that disinformation is potent and prevalent because of the spreaders’ mastery of social media affordances, limitations, and loopholes.

She explained, “Even if the topic at hand is history…they are able to infiltrate that space and eventually take over the list. They do this by using clickbait or controversial headlines like ‘untold history’ or ‘hidden truth.’ They tag their content in keywords that pertain to popular keywords like Duterte or even bloggers like Wil Dasovich but the content isn’t really about that.”

“Because they rank high, regular users would believe that they are credible sources of knowledge because that’s how they trust these search engines,” she added.

Finally, Gaw suggested that malignant operations no longer merely spread disinformation but instead amplify propaganda. Widespread counter-discourse is directed at certain entities, such as ABS-CBN, emphasizing arguments that the media company is “anti-democratic” or “is simply being held accountable.”

“When [they] see all this counter discourse, [they] feel antagonistic versus ABS-CBN and [they] feel similarly against other media outlets that are against the administration, for instance, because it seeds doubt and manufactures public opinion as if it is the majority opinion of the Filipino people.”

Gaw furthered her point by presenting the ad hominem campaigns that these entities launch against candidates opposing the personalities they support.

“They use crude language like ‘epal’ [attention-seeker], pardon my language, ‘gago’ [fool], ‘bulok’ [rotten], ‘mandaraya’ [cheater], all these as a political strategy. This is quite unproductive especially in the election season because it pushes people away from engaging with issue-based dialogue and deliberating the public interest in the face of the election.”

Having shared her insights from the studies she co-authored, Gaw then presented to the audience five lessons on what people can do to fight disinformation.

Firstly, she urged removing preconceptions about disinformation, stressing the importance of seeing it as a phenomenon that requires evidence to be understood. She also stated that we must be honest and acknowledge the declining image of institutions, and that we must be more inclusive and invite neglected communities into the conversation.

Moreover, she recommended that the academe provide safe and accessible spaces for individuals open to political discourse. She added that the media must be supported and that a culture of consuming reliable news should be adopted among the youth.

Finally, Gaw emphasized the importance of understanding how social media platforms function, particularly the process of distributing and recommending content to users.

“Only through this knowledge can we truly know the real damage of disinformation and demand accountability from those big-tech platforms,” she remarked.

A Technology-Based Perspective

Speaking after Gaw, Vergara approached the war on disinformation from the perspective of technology and data science.

Artificial intelligence (AI) and data science play a critical role in the automatic and efficient detection of trolls by modeling their behavior, said Vergara.

“In summary, the way AI is applied in this context is we obtain features from a post like number of characters, keywords, extracted topics, sentiments, syntax, comprehension, what time the post happened, how it contrasts with other posts in the thread, etc. and then feed these into a training algorithm to separate troll posts from normal posts,” he said.
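A minimal sketch of the kind of pipeline Vergara describes: features are extracted from posts and fed into a training algorithm that separates troll posts from normal ones. The sample posts, labels, and choice of scikit-learn below are illustrative assumptions, not part of any system described in the webinar.

```python
# Illustrative sketch only: a toy troll-post classifier in the spirit of
# Vergara's description (features extracted from posts, fed to a training
# algorithm). The sample data and the use of scikit-learn are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled posts: 1 = troll post, 0 = normal post.
posts = [
    "SHUT DOWN this fake news outlet now",
    "Here is the commission's full report on the audit",
    "Paid hacks spreading lies and fake news again",
    "The senate hearing is scheduled for Thursday morning",
]
labels = [1, 0, 1, 0]

# TF-IDF converts each post's words into numeric features; logistic
# regression then learns a boundary between troll and normal posts.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# A new post sharing vocabulary with the troll examples should score as 1.
print(model.predict(["Paid hacks spreading fake news lies again"]))
```

In practice, the feature set would also include the post-level signals Vergara lists, such as posting time, sentiment, and contrast with other posts in the thread.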

Vergara added that an account’s profile and behavior could also be used to determine its authenticity. Aggregate data on account creation times and dates, IP addresses, and account names could bolster detection capabilities, since troll farms likely produce accounts with subtle consistencies.
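One way to picture this, as a hedged sketch rather than a described method: group accounts by the metadata traits Vergara mentions and flag unusually uniform clusters. The account records and the grouping threshold below are hypothetical.

```python
# Illustrative sketch: flag clusters of accounts sharing creation date and
# IP address, which may suggest batch creation by a troll farm.
# The records and the threshold are hypothetical.
from collections import defaultdict
from datetime import date

accounts = [
    {"name": "juan_2021_01", "created": date(2021, 10, 5), "ip": "203.0.113.7"},
    {"name": "juan_2021_02", "created": date(2021, 10, 5), "ip": "203.0.113.7"},
    {"name": "maria_reads",  "created": date(2019, 3, 14), "ip": "198.51.100.2"},
]

# Group accounts by (creation date, IP); large groups hint at coordination.
clusters = defaultdict(list)
for acct in accounts:
    clusters[(acct["created"], acct["ip"])].append(acct["name"])

for key, names in clusters.items():
    if len(names) > 1:  # threshold chosen arbitrarily for illustration
        print(f"Possible coordinated batch {key}: {names}")
```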

Although algorithmic methods are promising in the war against disinformation, such methods are limited in their capability to identify malignant accounts, according to Vergara. 

Following his discussion, Vergara advised the public to “not feed the trolls.” He said that emotions must be kept in check, research must be conducted, and the truth must be upheld.

“Technology here is context. As far as solutions go for our misinformation dilemma, it can offer few solutions but, in the end, it is a social engineering game…It’s about disruptive ideas towards good or ill. Best to keep your emotions in check and apply critical thinking as you deal with information.”

He also pointed out the intrinsic bias of an individual when confronted with opposing information. A person’s advocacies and biases, he stated, can be triggered and exploited, leading to the truth being bent.

“Suppose we do have that perfect automated system where we could block lies and ensure only facts are posted in social media…facts can still be twisted towards conclusions that align with advocacies and agenda.”

Vergara concluded his presentation by urging the public to fight for not just facts and truth, but also for respectful discourse. He also reaffirmed Gaw’s position that institutions should create secure places for discourse, stating the need for universities to be places where healthy discourse could be facilitated.

Using Their Own Medicine Against Them

Romano spoke after Vergara and began by criticizing society’s seemingly nonchalant stance toward preventing disinformation over the previous decades.

“Sabi ni Winston Churchill [Winston Churchill said] at some point, ‘History is written by the victors.’ Unfortunately, I think tayo [we], who were the victors of EDSA revolution, we did not do a good job at writing it kasi [because] even in our textbook, I understand we only devote a page and a half to martial law…and we even allowed the vanquished, and I’m talking about the Marcoses here, to write their own version of history as far as martial law is concerned and we allowed them to peddle that story to the Filipino people, and where did it get us?” said Romano.

According to him, numerous individuals now believe that former president and dictator Ferdinand Marcos was a mythical president who led the Philippines to a better future while former senator Ferdinand “Bongbong” Marcos Jr. was his father’s rightful heir.

Romano proceeded to offer a solution, proposing the implementation of his own project, Strategic Truth-Seeding in Their Battlefield, in which evidence-based content is planted among false content to disrupt disinformation operations.

He added that because confirmation bias, the tendency to process information in a manner that supports preconceived beliefs, makes it difficult for people to change their minds, it is imperative to establish an emotional connection through formats the common person is accustomed to.

“[People] are very selective in the content that they consume. Kung aligned ‘yun sa paniniwala nila [If it aligns with their beliefs], then they read it, they will pass it on, they will share it. Pero kung [But if it is] contrary to what they already believe in, they will simply reject it no matter what the source.”

Romano’s truth-seeding project began in November 2021, debunking false components of pro-Marcos narratives on YouTube and Facebook. He styled his videos in a manner similar to videos with revisionist content, emulating elements such as thumbnails and titles, to “get into the mix” and influence the algorithm to infiltrate disinformation networks.

He also supported the role of the academe in controlling disinformation as a long-term solution, recommending that the martial law narrative be strengthened and adopted into Philippine curricula as an important component. He further suggested that students be taught how to discern between fake and reliable narratives, since their higher exposure to the Internet makes them prone to disinformation.

Moreover, citing Germany’s legal provisions that prohibit Holocaust denial and the promotion of Nazism, Romano raised the possibility of pursuing legal action and passing laws against the distortion of the martial law narrative.

Photo source: Ateneo de Manila University on Facebook