Australia Fines X for Not Providing Information on Child Abuse Content
Australia said on Sunday that it would fine X for failing to provide information about its efforts to combat child exploitation. Officials also disclosed that the social media service had told them its automated detection of abusive material declined after Elon Musk bought the company.
The fine is 610,500 Australian dollars, or about $384,000.
X, formerly known as Twitter, did not comply with a national law that requires platforms to disclose what they are doing to fight child exploitation on their services, Australian officials said. They said they had sent legal notices to X, Google, Discord, TikTok and Twitch in February, asking the companies for details about their measures for detecting and removing child sexual abuse material.
“Companies can make empty statements like ‘Child exploitation is our top priority,’ so what we’re saying is show us,” Julie Inman Grant, Australia’s commissioner in charge of online safety, said in an interview. “This is important not only in terms of deterrence in the types of defiance we are seeing from the companies but because this information is in the public interest.”
Mr. Musk bought Twitter for $44 billion last October. Since then, he has renamed the service X and loosened the platform’s content moderation rules. The company said this year that it was suspending hundreds of thousands of accounts for sharing abusive material, but a New York Times review in February found that such imagery persisted on the platform.
X told Australian officials that its detection of child abuse material on the platform had fallen to 75 percent from 90 percent in the three months after Mr. Musk bought the company. The detection has since improved, X told them.
Google and X failed to answer all of the regulator’s questions, Australian officials said. While Google received a warning, they said, X’s failure to respond was more extensive.
Tech companies take varied approaches to detecting and eradicating child sexual abuse material. Some use automated scanning tools on all parts of their platforms, while others use them only in certain circumstances. Several of the companies said they responded to reports of abuse within minutes, while others took hours, according to a report from Australia’s eSafety Commissioner.
X can appeal the fine. The company did not immediately have a comment. Lucinda Longcroft, a director of government affairs and public policy for Google, said in a statement, “Protecting children on our platforms is the most important work we do.” She added, “We remain committed to these efforts and collaborating constructively and in good faith with the safety commissioner, government and industry on the shared goal of keeping Australians safer online.”
X also told the Australian regulator that it maintained a “zero-tolerance policy” on child sexual abuse material and was committed to finding and removing the content on its platform. The company said it uses automated software to detect abusive images and has experts who can review content shared on the platform in 12 languages.
Asked whether children might be targeted for grooming on X, the company told the regulator, “Children are not our target customer, and our service is not overwhelmingly used by children.”
Linda Yaccarino, X’s chief executive, recently said at a conference that Generation Z was the company’s fastest-growing demographic, with 200 million teenagers and young adults in their 20s visiting the platform each month.