Friday, December 13, 2024

Australian regulator demands Apple, others share CSAM measure plans

Apple is being sued for failing to curtail child sexual abuse material on iCloud.

An Australian regulator has sent legal letters to Apple and Facebook owner Meta Platforms demanding they share their strategies for stamping out child abuse material on their platforms or face fines, according to Reuters.

Apple announced measures to prevent the spread of Child Sexual Abuse Material (CSAM) in August 2021, but was widely criticized over security and censorship concerns. Australia’s e-Safety Commissioner has sent what are described as legal letters to Apple and Meta under new Australian laws that allow the regulator to compel such disclosures.

“This activity is no longer confined to hidden corners of the dark web, but is prevalent on the mainstream platforms we and our children use every day,” Commissioner Julie Inman Grant told Reuters. “As more companies move towards encrypted messaging services and deploy features like livestreaming,” she continued, “the fear is that this horrific material will spread unchecked on these platforms.”

On August 6, 2021, Apple previewed new child safety features coming to its various devices that were due to arrive later that year, and the announcement stirred up considerable controversy. Here’s the gist of Apple’s announcement: Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search will provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

However, in a September 3, 2021, statement to 9to5Mac, Apple said it was delaying its controversial CSAM detection system and child safety features.

Here’s Apple’s statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.