
Facebook Owner to Help Train Australian Politicians, Influencers in Run-up to Election

Facebook owner Meta Platforms (FB.O) will help train Australian political candidates on aspects of cyber security and coach influencers to stop the spread of misinformation in a bid to boost the integrity of an upcoming election, it said on Tuesday.

Australia has not yet set a date for its next election, which is due by May. Authorities are already on high alert for electoral interference, having previously highlighted foreign interference attempts aimed at all levels of government and targeting both sides of politics.

“We’ll stay vigilant to emerging threats and take additional steps, if necessary, to prevent abuse on our platform while also empowering people in Australia to use their voice by voting,” Josh Machin, the company’s Australian chief of public policy, said in a statement due to be posted online.

The social media giant added that it had drafted in a university to help with fact-checking operations in Australia and would require disclosure of the names of those paying for election-related advertisements, in what it called its most comprehensive election strategy.

The steps show how social media firms are seeking to combat online distortion and abuse of information in the lead-up to an election, a time when such activity is typically at its most heated.

The Facebook Protect security program for high-profile individuals launched in Australia in December, with the company vowing to work with election officials and political parties to offer training for candidates on its policies and tools and ways to keep safe.

To avert hacking, it will prompt candidates to upgrade security to two-factor authentication. The company said it would also coach influencers, or those who earn advertising income from online commentary, to spot fake news.

People seeking to run election-related ads will need to furnish government-issued identification and disclose who is funding the ads, it said.

Ads run by unauthorized parties or lacking funding disclosure would be taken down and stored in a public archive for seven years, it added.

RMIT University, which joined Meta’s third-party fact-checking effort, said it would review posts the company identified as potential misinformation and try to verify them via interviews with primary sources and checks of public data.

“A continuing focus of our work is to identify the super spreaders of misinformation and the ecosystems in which they operate,” said RMIT FactLab Director Russell Skelton in a statement. “High impact misinformation disrupts evidence-based public policy and debate and so it is crucial we gain a better understanding of what drives this.”

Source: Voice of America
