Apple Plans to Scan Photos for Child Abuse Content

News

Apple has recently announced a series of measures intended to help protect children across its ecosystem of apps.

The features aim to curb Child Sexual Abuse Material (CSAM) in three ways: automatic detection of known CSAM in photos saved to iCloud Photos, on-device detection of sensitive content in the Messages app, and new Siri and Search interactions that point children in unsafe situations to helpful resources.
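The iCloud Photos portion reportedly works by comparing fingerprints of a user's photos against a database of hashes of known CSAM. The Swift sketch below illustrates that general idea only; it is not Apple's implementation (Apple is reported to use a perceptual hashing scheme rather than a plain cryptographic digest), and the hash set, function names, and sample data here are invented for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only — not Apple's CSAM-detection code.
// A SHA-256 digest stands in for a perceptual image hash.

/// Hypothetical database of known-bad image fingerprints (hex strings).
/// The single entry below is the SHA-256 digest of empty data, used for the demo.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Computes a hex-encoded SHA-256 fingerprint of raw image data.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the image's fingerprint appears in the known-hash set.
func matchesKnownContent(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Demo: empty Data hashes to the digest seeded above, so this prints "true".
print(matchesKnownContent(Data()))
```

In practice, a perceptual hash is designed so that visually similar images (resized, cropped, recompressed) map to the same fingerprint, which an exact cryptographic digest like the one above does not do; the sketch is only meant to show the match-against-a-known-database pattern.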

Apple stated that these features are set to arrive later this year, starting with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

For more information about the initiative, see this article.

Photo by zhang kaiyv on Unsplash
