Apple to start scanning iPhones for child sex abuse images
Apple has announced that, beginning with the upcoming iOS version, it will scan iPhones for Child Sexual Abuse Material (CSAM). If known CSAM images are detected in iCloud Photos, they will be reviewed manually and, if necessary, reported to the National Center for Missing & Exploited Children (NCMEC).
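The core idea behind detecting "known" images is matching each photo against a database of hashes of previously identified material, rather than analyzing image content directly. Apple's system reportedly uses a perceptual hash (NeuralHash), which tolerates resizing and re-encoding; the sketch below simplifies this to exact SHA-256 set membership, and every name and value in it is hypothetical, chosen only to illustrate the matching step.

```python
import hashlib

# Hypothetical stand-in database of digests of known images.
# (A real system would hold perceptual hashes, not exact digests.)
KNOWN_DIGESTS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-hash set.

    In the described flow, a match only flags the image for manual
    review; it is not reported automatically.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_DIGESTS
```

Note the design consequence: exact hashing misses any altered copy of a known image, which is why production systems use perceptual hashing instead.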
…