A safety feature that uses AI technology to scan messages sent to and from children will soon hit British iPhones, Apple has announced.
The feature, referred to as “communication safety in Messages”, allows parents to turn on warnings for their children’s iPhones. When enabled, all photos sent or received by the child using the Messages app will be scanned for nudity.
If nudity is found in a photo received by a child with the setting turned on, the photo is blurred, and the child is warned that it may contain sensitive content and pointed towards resources from child safety groups. If nudity is found in a photo a child is about to send, similar protections kick in: the child is encouraged not to send the image and is given the option to “Message a Grown-Up”.
All the scanning is carried out “on-device”, meaning that the images are analysed by the iPhone itself, and Apple never sees either the photos being analysed or the results of the analysis, it said.
“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” the company said in a statement. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”
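For developers, Apple later exposed a public version of this kind of on-device check through its SensitiveContentAnalysis framework (iOS 17 and later). The sketch below is illustrative only: Apple has not published the Messages implementation, and the function name `shouldBlur` and the fail-open error handling are assumptions for this example, not Apple's own code.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch of an on-device nudity check, in the spirit of the
// feature described above. Uses Apple's public SensitiveContentAnalysis
// framework (iOS 17+); requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement and a
// user who has enabled sensitive-content warnings in Settings.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not opted in, the policy is .disabled and no
    // analysis is performed.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs locally; the image never leaves the device,
        // matching the "on-device" design Apple describes.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Assumption for this sketch: on failure, do not blur.
        return false
    }
}
```

The notable design choice, mirrored in Apple's description, is that the result stays on the device: nothing in this flow reports the detection to a server or to another person.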
Apple has also dropped several controversial options from the update before release. In its initial announcement, the company suggested that parents would be automatically alerted if children under 13 sent or received such images; in the final release, those alerts are nowhere to be found.
The company is also introducing a set of features intended to intervene when content related to child exploitation is searched for through Siri, Spotlight or Safari search.