Sunday, December 22, 2024
News

Apple removes all mentions of CSAM from its Child Safety webpage

Apple is being sued for failing to curtail child sexual abuse material on iCloud.

As noted by MacRumors, Apple has removed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

On August 6, Apple previewed new child safety features due to arrive on its various devices later in the year, and the announcement stirred up considerable controversy. Here’s the gist of it: Apple said it was introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools would enable parents to play a more informed role in helping their children navigate communication online. The Messages app would use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS would use new applications of cryptography to help limit the spread of CSAM online while designing for user privacy. CSAM detection would help Apple provide valuable information to law enforcement about collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search would provide parents and children with expanded information and help if they encountered unsafe situations. Siri and Search would also intervene when users tried to search for CSAM-related topics.

However, in a September 3 statement to 9to5Mac, Apple said it was delaying the controversial CSAM detection system and related child safety features.

Here’s Apple’s statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Now that Apple has removed all mentions of CSAM from its Child Safety webpage, the plan’s future is uncertain.

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.
