Social media giant Facebook says it will crack down on misinformation on the platform in the lead-up to the federal election in May, but Reset Australia says the plan is a “woefully inadequate” distraction from the bigger problems embedded in the business.
Facebook’s parent company Meta announced this week that it will ramp up third-party fact checking of content on the platform ahead of the election, examining potentially misleading or harmful content. Even where content is found to be false, Facebook won’t necessarily take it down; instead it will “significantly reduce its distribution” and alert users who share it.
Melbourne’s RMIT University, which already does media fact checking through its FactLab, will partner with Facebook alongside Agence France-Presse and the Australian Associated Press to review and rate content for accuracy.
Meta’s Australian public policy boss Josh Machin said they’ll also be on the lookout for abuse directed at politicians.
“We’re particularly talking about misinformation, but we’ll have experts who can cover hate speech and violent organisations, and the full gamut of potential community standards violations that we might see in a really critical important time,” he said.
“We’ll stay vigilant to emerging threats and take additional steps, if necessary, to prevent abuse on our platform while also empowering people in Australia to use their voice by voting.”
Meta is no doubt keen to get on the front foot on election misinformation and manipulation. The taint of the 2016 US election, in which around 120 fake Russian-backed pages created 80,000 posts that reached 29 million Americans directly and 129 million in total, is likely still front of mind.
The platform is also in hot water over its role in spreading misinformation in the lead up to the January 6 Capitol riot last year and mining billionaire Andrew Forrest has launched legal action against Meta over scam ads featuring him that appeared on Facebook.
And many remain wary of Facebook after it blocked and took down posts by hospitals, arts organisations, charities, ASX-listed companies and essential services, among others, in February last year, during its fight with the federal government over paying media companies for news.
“We’re helping equip the Australian community with media literacy skills that can cover misinformation wherever they encounter that – whether that’s online, if it’s potentially on our services, or potentially on other online services, but also offline,” Machin said.
“And we know misinformation can be spread in conversations with family and friends.”
Fake ads approved
But Facebook critic Reset Australia described Meta’s efforts as “too late, woefully inadequate and another attempt to distract from the main threat to our election — the algorithmic amplification of problematic content”.
Reset Australia recently embarrassed the social media platform with an experiment in which it designed five Facebook ads containing election disinformation commonly used during the last US election, to show how easy it is to subvert the platform’s ad approval mechanism. They ranged from claims that the election had been cancelled due to COVID-19, to claims that both Labor and the Liberals had been caught printing ballots, as well as a “no jab, no say” message purporting to be from the AEC. All were approved.
While Machin said in the company’s defence that the ads “did not go live”, that was by design: Reset had scheduled them to run several months later to ensure they were never seen.
Dhakshayini Sooriyakumaran, Reset’s tech policy director, said Meta’s election plan was inadequate in three key ways.
“Meta has announced these inadequate election safeguards at the very last minute, on the eve of the election, framing them as preventative. Proactive action requires systemic, upstream, legally binding regulation,” she said.
“The company has attempted to delay this through their co-drafting of an incredibly weak, voluntary, opt-in Australian Code of Practice on Disinformation and Misinformation (through its industry peak body DIGI).
“Meta’s proposed actions continue to place responsibility on individuals and the public to be more aware of the harmful content that the platform itself chooses to serve them. This is gross neglect of responsibility and accountability.”
Sooriyakumaran said third-party fact checking is too slow in the election context, and occurs after the platform has served harmful content to Australian voters, with the Facebook algorithm already amplifying it.
“During the last US election Meta temporarily adapted the algorithm to reduce the distribution of sensational material, and prioritise content from authoritative sources. This has not been considered in Australia,” she said.
She added that Meta under-invests in Australia and Reset’s January experiment demonstrated that Facebook’s political ad transparency measures aren’t as effective as they claim.