Several of the world’s top tech companies face potential fines of nearly $800,000 per day if they don’t provide the Australian government with twice-yearly reports detailing their efforts to combat child abuse material on their platforms.
On Wednesday, the eSafety Commissioner issued new legal notices to major tech firms like Apple, Google, Meta, and Microsoft under the Online Safety Act. These notices mandate that the companies submit biannual reports outlining the measures they have taken to address this critical issue. The legal notices also extend to Discord, Snap, Skype, and WhatsApp. These companies must report on their strategies to prevent the distribution of child abuse material, livestreamed abuse, online grooming, sexual extortion, and AI-generated deepfake abuse content.
“We’re ramping up the pressure on these companies to step up their efforts,” said eSafety Commissioner Julie Inman Grant. “They need to show us they are making progress by reporting every six months.”
Deadline for Compliance
Tech firms have until February 15 next year to comply with these reporting requirements. Companies that fail to meet these deadlines will incur daily fines of $782,500, with the eSafety Commissioner ready to pursue legal action to enforce these penalties.
Ignoring the Problem
One of the most troubling revelations was that Apple and Microsoft were not proactively detecting child abuse material on their cloud services. Despite widespread knowledge that these platforms can harbor such content, the companies failed to act. It was also found that Microsoft took an average of two days to respond to user reports of child sexual exploitation and abuse, and in some cases up to 19 days if a re-review was required. Meta made 27 million reports of child sexual abuse to authorities, whereas Apple reported only 267 cases, a stark disparity.
“The problem is they’re willfully ignoring the issue,” Inman Grant told ABC RN Breakfast. “They’re not looking closely at what’s on their platforms and not allowing users to report such content.” The eSafety Commission plans to publish summaries of the reports from these tech firms, adding a layer of public accountability.
Similar notices were issued to these tech companies last year, and their concerning responses have now prompted stricter enforcement. “In our follow-up discussions, we haven’t seen significant improvements in addressing these safety issues,” Inman Grant added.
Support and Feedback
The new reporting requirements have been welcomed by the International Justice Mission, an NGO focused on human rights and law enforcement. “These companies have so far failed to ensure a safe online environment for children,” said David Braga, the country director of International Justice Mission Australia. “We hope that increased transparency will push big tech companies to review their content distribution systems.”
Legal Actions and Future Measures
“These transparency measures will work alongside our mandatory codes and standards,” Inman Grant said. “For those who ignore the law or fail to pay fines, like X Corp, we will pursue legal action.”
The eSafety Commission imposed its first fine under this scheme on X (formerly Twitter) last year after the company failed to address inquiries about child abuse material on its platform. X has refused to pay the fine of more than $600,000 and is currently in a legal battle with the eSafety Commission. Similar notices were also issued earlier this year requiring tech firms to report on their efforts to combat the spread of terrorism content on their platforms.
For support, you can contact: Lifeline on 13 11 14, 1800RESPECT on 1800 737 732, Kids Helpline on 1800 551 800, Beyond Blue on 1300 22 46 36, Headspace on 1800 650 890, MensLine Australia on 1300 789 978, QLife (for LGBTIQ+ people) on 1800 184 527, or the Suicide Call Back Service on 1300 659 467.