Apple has recently revealed a series of steps it will take to help protect children within its ecosystem of apps.
The features are designed to cut down on Child Sexual Abuse Material (CSAM) in three ways: automatic detection of known CSAM in photos saved to iCloud Photos, on-device detection of sensitive content in the Messages app, and new Siri and Search interactions that provide resources for children in unsafe situations.
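To give a rough sense of the first feature, the sketch below shows a naive exact-hash lookup against a hypothetical database of known-image fingerprints. This is only an illustration of the general shape of "compare a photo's fingerprint to a known list"; Apple's announced system relies on perceptual hashing and cryptographic matching techniques and is far more sophisticated, and the names here (KnownImageDatabase, knownFingerprints) are invented for the example.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a naive exact-match lookup, not Apple's algorithm.
struct KnownImageDatabase {
    // Hypothetical set of hex-encoded fingerprints of known images.
    let knownFingerprints: Set<String>

    // SHA-256 stands in for a real perceptual hash in this simplified example.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Returns true if the image's fingerprint appears in the known list.
    func matches(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }
}
```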
Apple stated that these features are set to come later this year, with support starting in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
For more information about these new features, see this article.