Apple to tackle child abuse by monitoring images on phones and in messages

Apple is introducing a series of new tools that analyze the photos in a user’s iCloud account for known child sexual abuse images, intervene in searches for abusive material, and blur explicit images sent to children.

Apple’s existing image analysis already lets users search their photos quickly using face detection and recognition, and can improve pictures by recognizing the type of scene being photographed. Building on this technology, Apple will compare iCloud photos against a database of known child sexual abuse images, and if a threshold of matches is reached, a report will be sent to the National Center for Missing and Exploited Children (NCMEC).

The system turns each image into a unique set of numbers, known as a NeuralHash, which can then be compared against the NeuralHash values of images in the Child Sexual Abuse Material (CSAM) database. The system ensures that non-matching images are never revealed, and according to Apple’s white paper, the chance of incorrectly flagging an account is less than one in one trillion per year.
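For illustration, here is a minimal Swift sketch of threshold-based hash matching in the spirit of the flow described above. Apple’s actual NeuralHash is a perceptual hash produced by a neural network, and the comparison happens under additional cryptographic protections; in this sketch an ordinary SHA-256 digest stands in for the hash, and all function names are hypothetical.

```swift
import Foundation
import CryptoKit

/// Stand-in for a NeuralHash value. Real NeuralHash values are designed so
/// that visually similar images produce the same hash; SHA-256 has no such
/// property and is used here only to keep the sketch runnable.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Counts how many of the user's image hashes appear in the database of
/// known hashes, and signals a report only once a threshold is crossed,
/// so individual non-matching images are never surfaced.
func shouldReport(userHashes: [String],
                  knownHashes: Set<String>,
                  threshold: Int) -> Bool {
    let matches = userHashes.filter { knownHashes.contains($0) }.count
    return matches >= threshold
}
```

The threshold is the key design choice: no single match triggers anything, which is how the system keeps its claimed false-flag rate so low.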

In addition, Siri and search functions will warn users who search for child abuse material, and will provide resources for filing reports of suspected abuse. Parents will also be able to turn on a feature in Apple’s Messages app that blurs explicit images sent to their child’s phone.

A child can still view a blurred image with a second tap, but choosing to do so will alert the parents. The phone will also show several warnings before the image is displayed, giving the child a chance to back out without seeing the picture.
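Purely to illustrate that decision flow, here is a hypothetical Swift sketch; none of these types or names are Apple’s actual API.

```swift
/// The child's choice after the warnings have been shown.
enum ViewChoice { case cancel, viewAnyway }

struct ExplicitImageGate {
    var parentalAlertsEnabled: Bool

    /// Returns true if the image should be revealed. Cancelling at any
    /// warning step keeps the image blurred and sends nothing; choosing
    /// to view triggers the parental alert when it is enabled.
    func handle(choice: ViewChoice, notifyParent: () -> Void) -> Bool {
        switch choice {
        case .cancel:
            return false            // image stays blurred, no alert sent
        case .viewAnyway:
            if parentalAlertsEnabled {
                notifyParent()      // parents are alerted on viewing
            }
            return true             // image is revealed
        }
    }
}
```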

According to Bloomberg, which first reported the story, these features are expected to go into use later this year.
