Saturday, September 7, 2024
News

Apple accused of not effectively monitoring its platforms to prevent sexual abuse of children

A proposed class action lawsuit claims that Apple hides behind privacy arguments in order to avoid stopping the storage of child sexual abuse material on iCloud, and alleged grooming over iMessage.

Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, reports The Guardian.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products, the article adds. NSPCC says child predators use iCloud, iMessage and FaceTime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales, according to The Guardian. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC). That stands in stark contrast to its big tech peers: Google reported more than 1.47 million and Meta more than 30.6 million, per NCMEC’s annual report.

The tech giant apparently can’t win. On August 6, 2021, Apple previewed new child safety features coming to its various devices later that year, and the announcement stirred up some controversy.

Here’s what Apple had planned. The company said it was introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools would enable parents to play a more informed role in helping their children navigate communication online. The Messages app would use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS would use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection would help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search would provide parents and children expanded information and help if they encountered unsafe situations. Siri and Search would also intervene when users tried to search for CSAM-related topics.

These features were supposed to arrive in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
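To make the detection idea concrete, here is a deliberately simplified sketch in Swift of hash-based image matching against a set of known fingerprints. It is an illustration only, not Apple’s actual system: Apple’s design relied on a perceptual “NeuralHash” and private set intersection rather than the plain SHA-256 comparison and the hypothetical knownFingerprints set shown here.

import Foundation
import CryptoKit

// Illustrative only: a naive check of an image file against a set of known
// fingerprints. Apple's actual design used a perceptual hash plus private set
// intersection; this sketch does not reproduce that.
let knownFingerprints: Set<String> = [
    // Hypothetical hex digest of a known abusive image, as might be supplied
    // by a clearinghouse such as NCMEC.
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a SHA-256 hex digest of the file's contents.
func fingerprint(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Return true if the file's digest matches a known fingerprint.
func isKnownImage(_ fileURL: URL) -> Bool {
    guard let hash = try? fingerprint(of: fileURL) else { return false }
    return knownFingerprints.contains(hash)
}

// Example usage:
// let flagged = isKnownImage(URL(fileURLWithPath: "/path/to/photo.jpg"))
// print(flagged ? "Matches a known fingerprint" : "No match")

A real deployment would need a perceptual hash (so that resized or re-encoded copies still match), a privacy-preserving matching protocol, and a reporting threshold, which is where most of the controversy described below arose.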

The Controversy

However, the announcement was met with major controversy. For example, Fight for the Future, a nonprofit advocacy group for digital rights, launched nospyphone.com, a web page designed to give individuals a way to directly contact Apple to oppose its controversial child protection plan involving scanning devices for child abuse images.

Also, employees spoke out internally about Apple’s plans, reports Reuters. The August 13, 2021, article said employees of the tech giant had flooded an internal Apple Slack channel with more than 800 messages on the plan announced a week earlier. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests.

What’s more, PC Mag noted that the intent was to stop crime, but it could come at the expense of an iPhone owner’s privacy. The main concern was that Apple was creating a beachhead to monitor activity on a consumer’s personal hardware, a step beyond simply surveilling the data on a social network, video site, or email service.

On September 3, 2021, in a statement to 9to5Mac, Apple said it was delaying its controversial CSAM detection system and child safety features.

Here’s Apple’s statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.