Apple’s feature to slow the spread of child sex abuse material will not be released anytime soon
India Today
Apple had previously said that it wants to protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material (CSAM).
Apple wants to make an informed decision before rolling out the child protection features it announced last month. In a fresh statement, the Cupertino giant said it would not release the features without first making improvements based on feedback. Apple had announced that the new system would scan users’ photos for child sexual abuse material, but the move was heavily criticised by privacy advocates, who argued that it breaches user privacy and could be exploited by governments.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson told The Verge in a statement.