
Some Apple employees concerned over company’s child protection plans (updated)

A backlash over Apple’s move to scan U.S. customer phones and computers for child sex abuse images has grown to include employees speaking out internally, reports Reuters.

The article says employees of the tech giant have flooded an internal Apple Slack channel with more than 800 messages about the plan announced a week ago. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests.

Apple’s senior vice president of software engineering, Craig Federighi, has defended Apple’s child safety features. You can see his comments in an interview with The Wall Street Journal.

About the child safety features

On August 6, Apple previewed new child safety features coming to its various devices later this year. And the announcement has stirred up some controversy.

You can read the entire “Expanded Protections for Children” article for details, but here’s the gist of it:

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search will provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
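
The Messages feature described above amounts to on-device gating: a local model scores an incoming image and, if it looks sensitive, the app blurs it and warns the child before it is shown, with nothing sent to Apple. Here is a minimal, hypothetical Swift sketch of that pattern; the sensitivityScore function and the threshold are placeholders for illustration, not Apple’s actual model or API.

import Foundation

// Hypothetical stand-in for an on-device classifier. Apple's model and APIs
// are not public, so this simply returns a fixed score for illustration.
func sensitivityScore(for imageData: Data) -> Double {
    // A real implementation would run a local ML model over the image here.
    return 0.0
}

// Decides locally whether to blur the image and warn the user before display.
// The check runs entirely on the device; nothing is uploaded or reported.
func shouldWarn(about imageData: Data, threshold: Double = 0.8) -> Bool {
    return sensitivityScore(for: imageData) >= threshold
}

let incoming = Data()  // placeholder for a received image's bytes
if shouldWarn(about: incoming) {
    print("Blur the image and show a warning before displaying it.")
} else {
    print("Display the image normally.")
}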

The backlash

However, as noted by the Financial Times (a subscription is required to read the entire article), security researchers are concerned about Apple’s plan. Specifically, the plan uses a “NeuralHash” system to compare photos on a user’s iPhone against known CSAM (Child Sexual Abuse Material) images before they’re uploaded to iCloud Photos, and researchers worry about the precedent that this kind of on-device matching sets.
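
Conceptually, that matching step boils down to checking a fingerprint of each photo against a database of fingerprints derived from known CSAM before the photo is uploaded. The Swift sketch below illustrates only that general pattern, using an ordinary cryptographic hash and a made-up database; NeuralHash itself is a perceptual hash designed to match images even after resizing or recompression, so this is an illustration of the lookup, not Apple’s system.

import Foundation
import CryptoKit

// Hypothetical database of fingerprints for known images. A real system would
// use perceptual hashes supplied by child-safety organizations; SHA-256 is used
// here only to keep the sketch self-contained, and it matches exact bytes only.
let knownHashes: Set<String> = [
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"  // SHA-256 of "foo"
]

// Returns a lowercase hex digest of the given data.
func hexDigest(of data: Data) -> String {
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Conceptual pre-upload check: a photo is flagged only when its fingerprint
// matches an entry in the known database; anything else passes through untouched.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    return knownHashes.contains(hexDigest(of: photoData))
}

print(matchesKnownDatabase(Data("foo".utf8)))  // true: matches the sample entry
print(matchesKnownDatabase(Data("bar".utf8)))  // false: no match, nothing flagged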

PC Mag notes that the intent is to stop crime, but it could come at the expense of an iPhone owner’s privacy. The main concern is that Apple is creating a beachhead to monitor the activity on a consumer’s personal hardware, a step beyond simply surveilling the data on a social network, video site, or email service.  

To address these concerns, Apple has provided additional commentary about its plans. The company’s known-CSAM detection system will be limited to the United States at launch, and Apple confirmed to MacRumors that it will consider any global expansion on a country-by-country basis only after conducting a legal evaluation, a safeguard against governments that might try to abuse the system.

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.