Australia Probes Meta, Snap, TikTok, YouTube Over Child Safety

Australia’s online safety regulator is conducting formal investigations into Meta, Snap, TikTok, and YouTube regarding potential non-compliance with the country’s ban on social media for children under 16. The investigations stem from a recent report highlighting concerns about the platforms’ age verification processes and existing loopholes.

Investigation Details and Concerns

The eSafety Commissioner’s office released its first compliance report on Tuesday, revealing “significant concerns” about the enforcement of the legislation, which took effect on December 10, 2025. The report identified gaps in the companies’ enforcement measures, including insufficient safeguards against underage account creation and systems that let users retry age verification repeatedly until they pass.

Landmark Legislation and Global Impact

The law, prohibiting those under 16 from holding accounts on major platforms, is being closely monitored globally as other jurisdictions consider similar measures to protect young people from online harms. Despite a reported decrease in under-16 accounts over the past four months, the regulator noted that substantial numbers of children are still accessing these platforms.

Regulator’s Statement and Expectations

“These platforms have the capability to comply today and we certainly expect companies operating in Australia to comply with our safety laws,” stated eSafety Commissioner Julie Inman Grant. She warned of escalating consequences for non-compliance, including potential reputational damage.

Platform Responses

Meta, the parent company of Facebook and Instagram, reaffirmed its commitment to the ban. A spokesperson stated, “We are committed to complying with Australia’s social media ban and working constructively with eSafety and the government.” The company also acknowledged the difficulty of accurately determining age online and advocated for robust age verification and parental approval at the app store level.

Broader Context and Legal Precedents

The regulatory scrutiny follows a recent US case where Meta and Google were found liable for damages related to a woman’s mental health struggles attributed to social media addiction. This verdict has sparked debate about potential legal responsibility for tech platforms regarding user experiences. The eSafety investigation covers Facebook, Instagram, Snapchat, TikTok, and YouTube.

Enforcement and Future Standards

To successfully enforce the ban, the regulator must demonstrate that the platforms failed to take reasonable steps to prevent underage users from maintaining accounts. The outcomes of these cases could establish important precedents for the enforcement of Australia’s social media ban and the expected standards of compliance from global tech companies.