iOS 15.2 Beta is now out with a new Communication Safety feature

In conjunction with the release of the iOS 15.2 beta, Apple has included a new feature to help parents protect their children from explicit photos shared with them through Messages. The feature detects nudity in incoming photos and blurs any image in which it is found. It also warns children about the content and gives them the option to reach out to a trusted adult for help.

Communication Safety arrived in the second beta of iOS 15.2 and is available to all users on that build. It checks photos sent through Messages for nudity, and the check runs entirely on the device, so it works alongside the app’s end-to-end encryption: Apple itself receives no indication that a nude photo was ever detected on the phone.
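Apple has not published how the detector itself works, and the real classifier is private to iOS, but conceptually the on-device flow amounts to scoring an image locally and blurring it past a threshold. The Swift sketch below is purely illustrative: the sensitivityScore closure is a hypothetical stand-in for Apple’s private classifier, and Core Image’s standard CIGaussianBlur stands in for whatever blur treatment Messages actually applies.

```swift
import CoreImage

/// Conceptual sketch only: Apple has not published the internals of the
/// on-device check. `sensitivityScore` stands in for the private classifier
/// and is injected here so the example stays self-contained.
func blurIfSensitive(_ image: CIImage,
                     sensitivityScore: (CIImage) -> Float,
                     threshold: Float = 0.9) -> CIImage {
    // The whole decision is made locally; nothing leaves the device,
    // which is how the check can coexist with end-to-end encryption.
    guard sensitivityScore(image) >= threshold else { return image }

    // Blur the flagged photo before it is displayed. CIGaussianBlur is a
    // standard Core Image filter; the real treatment may differ.
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(30.0, forKey: kCIInputRadiusKey)
    return blur.outputImage?.cropped(to: image.extent) ?? image
}
```

Because everything in a flow like this happens on the device, neither Apple nor a carrier ever has to see the photo or the verdict, which is the property that lets the feature coexist with encrypted messaging.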

This was not always the plan. Initially, Communication Safety would have notified the parents of children under 13 if their child viewed a detected nude image. That notification was dropped, however, over concerns that it could endanger children in abusive households.

Instead, when Messages detects that a nude photo has been received on an iPhone, it now gives the child the option to message a trusted contact for help. Communication Safety is offered as part of Family Sharing, and parents must enable it on their children’s devices before it takes effect.

According to the company, Communication Safety is the latest step in Apple’s effort to build a safer ecosystem for children who use its products. In August, Apple announced a Child Sexual Abuse Material (CSAM) detection system that would have scanned photos uploaded to iCloud, alongside related protections in Messages and Siri. Potentially objectionable material would have been flagged and referred for further review.
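Apple’s published technical summary described the system as perceptual-hash matching: each photo’s hash would be compared against a database of hashes of known CSAM, with human review only after a threshold of matches was crossed. The Swift sketch below is a deliberately simplified illustration of that idea; the real design used NeuralHash and cryptographic private set intersection, both of which are replaced here with plain set lookups.

```swift
import Foundation

// Toy illustration of the matching scheme Apple described: a perceptual
// hash of each photo is compared against a database of known hashes, and
// nothing is surfaced for human review until a threshold of matches is
// crossed. The real design used NeuralHash and private set intersection;
// this sketch replaces both with plain values for clarity.
struct MatchChecker {
    let knownHashes: Set<Data>   // hashes of known abusive images
    let reviewThreshold: Int     // matches required before any review

    // `photoHashes` would come from a perceptual-hash function (hypothetical
    // here); exact set lookup stands in for NeuralHash's fuzzy matching.
    func shouldFlagForReview(photoHashes: [Data]) -> Bool {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return matches >= reviewThreshold
    }
}
```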

Naturally, the plan sparked widespread anxiety about Apple searching through customers’ photos, and the company responded by putting the feature on hold. Apple later issued a public explanation of the feature’s intended use, assuring users that it would not employ the system to censor other content or share users’ data with any other authority. Even so, the criticism did not abate.

Apple has ultimately chosen to postpone CSAM detection for iCloud for the time being. The new Communication Safety feature, by contrast, is focused squarely on children’s online safety and has been received far better, with Apple revising it a couple of times in response to feedback. Its arrival on the iPhone has been eagerly awaited by parents.
