Apple will begin scanning iPhones in the United States for child sexual abuse images, a move that is already raising security and privacy concerns.

The tech giant plans to roll out software that will scan Americans’ iPhones to detect child sexual abuse images and report them to law enforcement in a way it says will preserve user privacy, TechCrunch reported.

TechCrunch cited a source at Apple who said that detection of child sexual abuse material (CSAM) is one of several new features intended to protect children from online harm. Another feature will intervene when a user tries to search for CSAM-related terms through Siri and Search.

The decision marks a shift for Apple, which has long resisted scanning users’ files on its iCloud servers. Most other cloud providers, including Dropbox, Google, and Microsoft, already scan user files for content that violates their terms of service or might be illegal.

NeuralHash will feature on iOS 15 and macOS Monterey, which are due to be released in a few months

The new software, called NeuralHash, will run on a user’s iPhone or Mac and convert photos into a unique string of numbers and letters, known as a hash. Before a photo is uploaded to iCloud, its hash is checked against a database of known hashes of child abuse imagery, supplied by various child protection organizations.
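As a rough illustration of that matching flow, the sketch below checks an image’s hash against a set of known hashes before upload. It is a simplified stand-in, not Apple’s implementation: NeuralHash is a perceptual hash produced by a neural network, so the generic digest and in-memory hash set here are placeholders only.

```python
import hashlib
from pathlib import Path

# Illustrative only: NeuralHash is a perceptual hash produced by a neural
# network, not a cryptographic digest. SHA-256 stands in for it here so the
# matching flow can be shown end to end.
def compute_hash(image_path: Path) -> str:
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

# Hypothetical database of known CSAM hashes supplied by child protection
# organizations, represented here as an in-memory set.
KNOWN_HASHES: set[str] = set()

def should_flag_before_upload(image_path: Path) -> bool:
    """Return True if the image's hash matches a known entry.

    The check happens on-device, before the photo is uploaded to iCloud.
    """
    return compute_hash(image_path) in KNOWN_HASHES
```

In practice the matching relies on perceptual hashes, so resized or re-encoded copies of the same image still match, something a cryptographic digest like the one above cannot do.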

Apple then uses a cryptographic technique called threshold secret sharing, which allows it to decrypt the matching content only if an account crosses a specified threshold of known child abuse imagery in iCloud Photos.
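Apple has not published code for this step, but the general idea behind a threshold scheme can be illustrated with Shamir’s secret sharing: a secret (standing in here for a decryption key) is split into shares, and it can only be reconstructed once at least a threshold number of shares are gathered. The parameters and field size below are illustrative, not Apple’s.

```python
import random

PRIME = 2**127 - 1   # Mersenne prime; all arithmetic is done in this prime field
THRESHOLD = 3        # illustrative: shares needed before the secret is recoverable

def make_shares(secret: int, n_shares: int, k: int = THRESHOLD):
    """Split `secret` into n_shares points on a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [
        (x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
        for x in range(1, n_shares + 1)
    ]

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0 (needs >= k shares)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Any THRESHOLD shares recover the secret; fewer reveal nothing about it.
shares = make_shares(secret=123456789, n_shares=5)
assert reconstruct(shares[:THRESHOLD]) == 123456789
```

As described above, each match would contribute toward the threshold, so the flagged content only becomes decryptable once an account’s match count crosses it.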

NeuralHash will land in iOS 15 and macOS Monterey, slated for release in the next month or two. Apple said the feature will roll out in the United States first but would not say if or when it would expand internationally.

Despite broad support for efforts to combat child sexual abuse, security experts and users worry that the new technology could snowball into uses beyond detecting child abuse material.

A big question is why Apple is making this move now and not sooner.