On Friday, Apple said it would delay the testing launch of its new technology to combat child exploitation.
Last month, the tech giant made headlines after revealing that it would launch a new tool to scan iPhone users’ devices and iCloud for child sexual abuse imagery.
The announcement was widely criticized, with some accusing the company of undermining user privacy.
Back then, the tech manufacturer explained that its technology would identify child sexual abuse images and report them when they were uploaded to its online storage in the United States.
However, digital rights organizers criticized the move, arguing that the Silicon Valley giant’s adjustments to its operating systems could quickly create a “back door” on devices, allowing governments and other organizations or powerful groups to spy on users.
Apple did not provide specific details when announcing the tool’s launch but said it was “intended to help protect children from predators who use communication tools to recruit and exploit them.”
The new technology would also include an opt-in feature to warn minors and their parents about sexually explicit images sent or received in iMessage.
But Apple has stopped the tool’s rollout, saying it needs to collect more feedback and make some improvements.
On Friday, Apple (AAPL) said it would pause the feature, citing comments from customers, researchers, advocacy groups, and others.
“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the tech firm said in a statement.
Critics of the tool welcomed Apple’s decision to backtrack on implementing it.
The digital rights group Fight for the Future said the tool threatened security, privacy, democracy, and freedom.
But some child safety advocates urged the tech giant not to give in to its critics.
Andy Burrows, head of child safety online at the National Society for the Prevention of Cruelty to Children, called the delay “disappointing,” adding: “Apple had adopted a proportionate approach that sought to balance user safety and privacy, and should have stood their ground.”