Apple Says iPhones in the U.K. and Australia to Have Child Safety Feature That Scans Messages for Nudity

(Photo: Michael M. Santiago/Getty Images)

Consumer electronics giant Apple has announced that iPhones in the U.K. and Australia will soon have a safety feature that uses AI technology to scan messages sent to and from kids. The child safety feature, which Apple refers to as "communication safety in Messages", allows parents to turn on warnings for their children's iPhones.

The Guardian reported that when the safety feature is enabled, all photos sent or received by the child through the Messages app will be scanned for nudity. If nudity is detected in a picture received by a child with the setting turned on, the photo will be blurred and the child will see a warning that the image may contain sensitive content, along with a nudge toward resources from child safety groups.

All image scanning is carried out "on-device", meaning the photos are analyzed by the iPhone itself. Apple has made clear that it never sees either the pictures being analyzed or the results of that analysis.

Apple makes it clear that it does not get access to the messages

The company issued a statement regarding the child safety feature, saying, "Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages. The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else."

According to Gizmodo, the feature will also be made available in Australia. The photo in question will be blurred, with a brief warning to Australian users that the image may be sensitive, and children will be given the option to view the picture anyway or forward it to an adult they trust. Children can also reach out for help through services such as Kids Helpline.

Apple has also decided to drop several controversial options from the update before its much-awaited release. In its initial announcement, Apple suggested that parents would be automatically alerted if their young children, those under the age of 13, sent or received such photos. Such alerts are nowhere to be found in the final release, however.

Apple intends to intervene when users search for child exploitation content

Apple also plans to introduce a set of features intended to intervene when a user searches for content related to child exploitation in Siri, Safari, or Spotlight. As Apple originally announced in the summer of last year, the search warnings and communication safety in Messages were part of a trio of features intended to arrive alongside iOS 15 that autumn.

According to iMore, the third of those features, which would scan pictures before they were uploaded to iCloud and report any photos that matched known child sexual exploitation imagery, proved extremely contentious. Apple decided to delay the launch of all three features while it negotiated with privacy and child safety groups.
