
Australia’s eSafety commissioner launches legal action against Elon Musk’s X Corp

December 21, 2023
Photo: AdobeStock
Australia’s online safety regulator, the eSafety Commission, has launched legal action against Elon Musk’s rebranded Twitter company, X Corp, over its alleged failure to comply with requests to explain how the social media site was dealing with child sexual exploitation and abuse material.

The civil case, filed in the Federal Court, follows several months of back-and-forth between the eSafety Commission and Twitter, as it was then known, and its local lawyers, as the regulator sought clarification on how the platform was dealing with child abuse material.

eSafety alleges X Corp failed to comply with the notice because it did not prepare a report in the manner and form specified, and either failed to respond, or failed to respond truthfully and accurately, to certain questions in the notice. eSafety has published the history of the company’s responses to its notices.

eSafety issued an infringement notice for $610,500 against X Corp in September 2023 for its failure to comply. X Corp did not pay it and instead sought review of eSafety’s reliance on the transparency notice, the giving of the service provider notification, and the infringement notice itself.

In October eSafety Commissioner Julie Inman Grant released the Commission’s second report investigating how big tech companies are tackling issues of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse under the Online Safety Act.

Twitter was approached in February this year along with Google, TikTok, Twitch and Discord.

Twitter/X and Google did not comply with the notices issued to them.

Inman Grant said at the time that the report, which summarises their answers, highlights serious shortfalls in how some companies detect, remove and prevent child sexual abuse material and grooming, and inconsistencies in dealing with that material across their different services, among other issues.

“We really can’t hope to have any accountability from the online industry in tackling this issue without meaningful transparency which is what these notices are designed to surface,” she said.

“What we are talking about here are serious crimes playing out on these platforms committed by predatory adults against innocent children and the community expects every tech company to be taking meaningful action.”

Twitter/X’s non-compliance was found to be more serious, with the company failing to provide any response to some questions and leaving some sections entirely blank. In other instances, its responses were incomplete and/or inaccurate. The Musk-owned business did not respond to a number of key questions, including the time it takes the platform to respond to reports of child sexual exploitation; the measures it has in place to detect child sexual exploitation in livestreams; and the tools and technologies it uses to detect child sexual exploitation material.

The report found that in the three months after Twitter/X’s change in ownership in October 2022, the proactive detection of child sexual exploitation material fell from 90% to 75%, although the company said its proactive detection rate had subsequently improved in 2023.

Inman Grant said at the report’s release that “Twitter/X has stated publicly that tackling child sexual exploitation is the number 1 priority for the company, but it can’t just be empty talk, we need to see words backed up with tangible action”.

She flagged at the time that X Corp’s failure to pay the fine would leave it open for eSafety to take further action.

The Commission said it launched legal action because “it is important that X Corp. and other providers are deterred from non-compliance with statutory notices”.