A global coalition of more than 90 civil rights and policy organizations is pressing Apple to drop its plans to monitor consumer images as a measure to counter child abuse material.

Earlier this month, the tech giant announced it would add a feature to the iPhone that scans photographs for known child sexual abuse material, or CSAM, when they are uploaded to its servers.

But the system quickly faced backlash, as experts warned that the tool could be repurposed for other ends in the wrong hands.

Those concerns are echoed in an open letter the groups sent to Apple CEO Tim Cook on Thursday, August 19.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material,” it reads, “we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

The software would use artificial intelligence to detect when children send or receive photographs containing nudity and notify their parents. Another feature would intervene when a user tries to use Siri or Search to look up CSAM-related terms.

But the groups reminded Apple that algorithms for detecting sexually explicit material remain notoriously error-prone, and they noted that the parental-alert feature has no way to verify that the account holders are actually parents, or that the flagged user is their child.

According to the signatories, the move could put at risk children in intolerant households, as well as those seeking educational material. They also argued that the change would undermine iMessage’s end-to-end encryption, which Apple has adamantly defended in other instances.

“An abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing,” they alleged.

Apple said users’ photos uploaded to iCloud would be scanned. If concerning material is detected, the company would report it to law enforcement in a way it says preserves user privacy.

But the groups argued that not only would users risk having their personal photo libraries monitored, the system would also hand authoritarian governments a ready-made tool to turn Apple’s software toward aims other than protecting children.

“The company and its competitors will face enormous pressure—and potentially legal requirements—from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” they wrote. 

“Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them,” the group added. “… Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.”

Apple’s child protection features have also drawn controversy within the company itself, not just among technology policy groups, Reuters reported last week.

The outlet said more than 800 messages from employees had flooded an internal Apple Slack channel on the feature alone. Many of them raised concerns about privacy and the potential for governments to misuse the system.