Australia Watchdog: Tech Giants Failing Child Protection

SYDNEY – Australia’s online safety regulator has sharply criticised major technology firms, specifically Google’s YouTube and Apple, for failing to properly safeguard children from sexual abuse material appearing on their platforms.

In a transparency report released on Wednesday, eSafety Commissioner Julie Inman Grant stated that these companies have not implemented essential protections such as cloud scanning, hash matching, and language-analysis tools to identify grooming, sexual extortion, livestreamed abuse, and synthetic or AI-generated child sexual abuse material.

The report found that Apple and YouTube did not track basic metrics: they failed to report how many abuse reports they received or how quickly they responded, nor did they disclose staffing levels in their Trust & Safety teams.

“In many cases, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services,” Commissioner Inman Grant said in a formal statement.

The biannual disclosure was issued under Australia’s Online Safety Act, which requires eight major platforms, including Apple, Google, Meta, Microsoft, Discord, Snap, Skype, and WhatsApp, to report every six months on how they combat child sexual exploitation online. The latest findings show minimal progress since earlier reports in 2022 and 2023.

Google responded by disputing the report’s interpretation, arguing that its figures stem from internal reporting metrics rather than performance shortcomings, and emphasising that over 99% of reported abuse content on YouTube is automatically removed before it is viewed. Apple and the other companies had not commented by the time of publication.

The controversy comes amid broader reforms in Australia’s online safety landscape. The government recently decided to include YouTube in a social media ban for under-16s, following eSafety’s advice, despite the platform’s educational role. Critics warn the policy could inadvertently reduce platform transparency and remove certain child-safe features, increasing overall risk.

Australia’s legislation under the Online Safety Amendment (Social Media Minimum Age) Act 2024 will come into effect in December 2025, imposing fines of up to AUD 50 million for non-compliance. The amendment also requires age verification for social media services, expanding beyond earlier exemptions.

Commissioner Inman Grant urged tech firms to act swiftly.

