Apple will scan iPhones for nude photos, will report users to police if it finds child sexual abuse material
India Today
Apple's new technology will scan users' iCloud accounts for sexually explicit content involving children.
Apple announced on Thursday that it will launch new software to analyse users' iCloud Photos for images of child sexual abuse. The software will check the photos stored in iCloud and determine whether they contain child sexual abuse material; if such material is found, the user will be reported to the relevant authorities. While Apple deserves a pat on the back for taking cognisance of issues such as child sexual abuse, the move may not go down well with privacy advocates.

As per Mark Gurman's report on Bloomberg, Apple has also announced a feature to analyse every picture sent and received in the Messages app for child sexual abuse material. If any image matches the kind of material Apple is hunting for, it will be reported back to Apple's servers.

Things don't end with Messages: Apple has also enlisted Siri, its voice assistant, in the fight against child sexual abuse. The report states that Siri will now intervene whenever a user searches for sexually explicit content involving children.
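The report doesn't describe how the matching works. Systems of this kind typically compare a fingerprint of each photo against a database of fingerprints of known flagged material, rather than inspecting image content directly. A minimal, hypothetical sketch of that match-and-report flow is below; the database name and fingerprint scheme are illustrative only (Apple's actual system is reported to use perceptual hashing, which tolerates resizing and re-encoding and is not shown here):

```python
import hashlib

# Hypothetical database of fingerprints of known flagged images.
# In a real deployment this list would be supplied by child-safety
# organisations, not hard-coded.
KNOWN_FINGERPRINTS = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image contents."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Check whether an image's fingerprint matches a known entry."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(is_flagged(b"test"))         # True: matches the stand-in entry
print(is_flagged(b"other image"))  # False: unknown fingerprint
```

Note that an exact cryptographic hash like this only catches byte-identical copies; any re-compression of the photo changes the digest, which is why production systems use perceptual rather than cryptographic hashes.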