Sometimes you just can’t win. On Dec. 7, Apple said it was abandoning its plan to scan photos users stored in iCloud for child sexual abuse material (CSAM), a system designed to detect child sexual abuse images on iPhones and iPads.
The plan was widely criticized, and Apple says that in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead, reports Wired. Now, however, an Australian regulator, using new powers to compel tech giants to share information about their practices, has accused Apple and Microsoft of not doing enough to stop child exploitation content on their platforms, reports Reuters.
On August 30, Australian e-Safety Commissioner Julie Inman Grant demanded that the two tech companies, along with others such as Facebook and Snapchat, share details about their methods for preventing child abuse and exploitation on their platforms. The companies had 29 days to comply or face fines.
Apparently the first two tech giants to be reviewed, Apple and Microsoft have been found wanting. According to Reuters, Grant criticized the two companies for not proactively scanning user files on iCloud or OneDrive for CSAM. She was also displeased that there is no algorithmic detection in place for FaceTime or Skype.