News

Australian commission criticizes Apple, Microsoft for not doing enough to stop child exploitation on their platforms

Sometimes you just can’t win. On Dec. 7, Apple said it was abandoning its plans to scan photos users stored in iCloud for child sexual abuse material (CSAM). The tool was designed to detect child sexual abuse images on iPhones and iPads.

However, the plan was widely criticized, and Apple says that in response to the feedback and guidance it received, the CSAM detection tool for iCloud photos is dead, reports Wired. Now, though, an Australian regulator, after using new powers to compel tech giants to share information, has accused Apple and Microsoft of not doing enough to stop child exploitation content on their platforms, reports Reuters.

On August 30, Australian e-Safety Commissioner Julie Inman Grant demanded that the two tech companies, as well as others such as Facebook and Snapchat, share details about their methods for preventing child abuse and exploitation on their platforms. The companies had 29 days to comply or face fines.

In apparently the first two of these reviews, Apple and Microsoft have been found wanting. According to Reuters, Inman Grant criticized the two companies for not proactively scanning user files on iCloud or OneDrive for CSAM. She was also displeased that there is no algorithmic detection in place for FaceTime or Skype.

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.