American technology giant Apple has delayed the rollout of the child safety features it announced last month, which were designed to check iPhones for images of child sexual abuse.
Apple said it would take more time to collect input and make improvements before releasing the features, after criticism of the system on privacy and other grounds mounted from both inside and outside the company.
Last month, more than 90 policy and rights groups around the world urged Apple to abandon its plans to scan children’s messages for nudity and adults’ phones for images of child sexual abuse.
Apple’s proposed features included scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM); Communication Safety, which warns children and their parents when they receive or send sexually explicit photos; and expanded CSAM guidance in Siri and Search.
Critics of the plan argued that the feature could be exploited by repressive governments seeking other material for censorship or arrests, and that it would be impossible for outside researchers to verify whether Apple was checking only a small set of on-device content.
The technology giant countered that it would enable security researchers to verify its claims, but the company has now said it will take more time to make changes to the system.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said.
Apple has taken several steps to dispel misunderstandings and reassure users, releasing detailed information, FAQs, and other new documents, and making company executives available for interviews.
The company had planned to roll out the features to iPhones, iPads, and Macs in the US via software updates later this year. It is now unclear when Apple will release these “critically important” features.