Governments were already discussing the misuse of CSAM scanning technology even before Apple announced its plans, security researchers say.

The biggest concern raised when Apple said it would scan iPhones for Child Sexual Abuse Material (CSAM) was that scope creep would follow, with governments insisting the company scan for other types of images as well, and now there appears to be good evidence …

Background

Apple insisted that solid safeguards were in place to protect privacy and prevent abuse: it would only match images against known CSAM databases; it would check at least two databases and require an image to appear in both; action would only be triggered by 30 or more matching images; and there would be a manual review before law enforcement was alerted.
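As a rough illustration of how those safeguards were described, the sketch below models the decision flow under stated assumptions (the names and structure are hypothetical; Apple has not published implementation code): a hash only counts if it appears in both databases, at least 30 matches must accumulate, and even then the outcome is only a flag for human review.

```swift
// Hypothetical sketch of the publicly described safeguards; not Apple's actual implementation.
struct SafetyVoucherReview {
    let databaseA: Set<String>   // hashes from the first child-safety organisation
    let databaseB: Set<String>   // hashes from a second, independent organisation
    let matchThreshold = 30      // Apple's stated threshold before any action

    // An image hash only counts if it appears in BOTH databases.
    func isKnownCSAM(_ imageHash: String) -> Bool {
        databaseA.contains(imageHash) && databaseB.contains(imageHash)
    }

    // Even past the threshold, the result is a manual-review flag,
    // not an automatic report to law enforcement.
    func shouldEscalateForManualReview(accountHashes: [String]) -> Bool {
        let matches = accountHashes.filter(isKnownCSAM).count
        return matches >= matchThreshold
    }
}
```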

However, I and others were quick to point out that such promises would be impossible to keep.

As the company has said in relation to a previous controversy, Apple complies with local laws in each of the countries in which it operates …


