Apple continues to face criticism over its recently announced child safety feature for detecting child sexual abuse material. An international coalition of more than 90 civil society organizations has urged the company in an open letter to abandon its plans to build surveillance capabilities into iPhones, iPads and other Apple products.
The coalition says the feature will create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world.
Apple has said the planned tool, called “neuralMatch”, is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) will be notified.
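In broad strokes, what Apple described is a hash lookup against a database of known images, with a human review step before any enforcement action. The sketch below illustrates that flow in Swift; it substitutes a plain SHA-256 exact match for the perceptual hashing and blinded on-device matching Apple actually announced, and all type and function names are hypothetical, not Apple API.

```swift
import Foundation
import CryptoKit

// A minimal sketch of the hash-match-then-review flow described above.
// Simplification: a SHA-256 exact match stands in for Apple's perceptual
// hash. PhotoScanner, knownHashes and shouldFlagForReview are invented
// names for illustration only.
struct PhotoScanner {
    /// Hex-encoded SHA-256 digests of known abuse images (hypothetical database).
    let knownHashes: Set<String>

    private func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Returns true when an image matches the database and should be
    /// escalated to the human review step the article mentions; a match
    /// alone never triggers enforcement directly.
    func shouldFlagForReview(_ imageData: Data) -> Bool {
        knownHashes.contains(digest(of: imageData))
    }
}
```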
Apple said it will also scan users’ encrypted messages for sexually explicit content as a child safety measure.
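As Apple described it, the Messages feature runs an on-device classifier and, for child accounts, blurs flagged images and can notify a parent. The following Swift sketch captures that decision logic under stated assumptions; the score threshold and every name in it are invented for illustration.

```swift
// Hypothetical decision logic for the Messages feature: an on-device
// classifier score is compared against a cutoff. The 0.9 threshold and
// all identifiers are assumptions, not Apple's implementation.
enum MessageImageAction {
    case deliverNormally
    case blurAndWarn             // child sees a warning before viewing
    case blurWarnAndNotifyParent // parental notification, per Apple's description
}

func actionFor(classifierScore: Double,
               isChildAccount: Bool,
               parentalNotificationEnabled: Bool) -> MessageImageAction {
    let explicitThreshold = 0.9 // hypothetical confidence cutoff
    guard classifierScore >= explicitThreshold, isChildAccount else {
        return .deliverNormally
    }
    return parentalNotificationEnabled ? .blurWarnAndNotifyParent : .blurAndWarn
}
```

It is exactly this alert path that the coalition's letter singles out, as quoted below.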
However, the international coalition says the features can be misused:
“The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.”
The organizations say that once the hash-scanning capability for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images those governments find objectionable.