Apple Justifies Its Child Safety Policy

Apple has been in the headlines for some time over its child safety policies. The company recently announced a new feature that will scan customers’ iCloud Photos for images related to child sexual abuse and report any findings.

Apple’s announcement quickly drew the attention of privacy advocates, who slammed the decision as “discriminatory.” The company has since clarified that the technology will only scan for photos that have already been identified by clearinghouses in several countries.

According to the news agency Reuters, Apple said that 30 matching photographs must be detected on a person’s device before the system alerts the company, at which point a human reviewer evaluates the material and decides whether it should be reported to law enforcement. Apple stated that it will begin with this threshold of 30 images, a number it expects to lower over time as the system improves.

The cryptographic architecture prevents Apple’s servers from decrypting any match data, or even counting the number of matches for a particular account, until that threshold is reached.
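Apple’s technical summary describes the mechanism as a combination of private set intersection and threshold secret sharing. Purely as an illustration of the threshold property, the minimal Python sketch below uses Shamir’s t-of-n secret sharing; the names, parameters, and toy prime field are assumptions for the demo and are not taken from Apple’s implementation.

```python
# Illustration of a t-of-n threshold scheme (Shamir's secret sharing).
# Demo only: real systems use vetted cryptographic libraries.

import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret


def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, count + 1):
        y = 0
        for c in reversed(coeffs):  # Horner's rule to evaluate the polynomial
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares


def recover_secret(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


if __name__ == "__main__":
    THRESHOLD = 30  # mirrors the reported 30-image threshold
    key = random.randrange(PRIME)  # stand-in for an account's decryption key
    shares = make_shares(key, THRESHOLD, 100)

    # 29 shares reveal nothing useful: interpolation yields garbage, not the key.
    assert recover_secret(shares[:29]) != key
    # Any 30 shares recover the key exactly.
    assert recover_secret(shares[:30]) == key
    print("threshold property holds")
```

The point of the construction is that 29 shares reveal essentially nothing about the key, while any 30 recover it exactly, which mirrors the reported behavior that Apple learns nothing about an account until the 30-image threshold is crossed.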

Once the threshold is exceeded, Apple’s servers can decrypt only the vouchers that correspond to positive matches, and they learn nothing about any other images. In a lengthy technical document, Apple explained that each encrypted voucher gives its servers access to a visual derivative, such as a low-resolution version, of the matched image.
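To make the voucher idea concrete, here is a hypothetical sketch of how a voucher might couple one key share with a sealed visual derivative, so that the derivative becomes readable only after enough shares have been collected to recover the key. The data layout, field names, and toy XOR cipher below are all illustrative assumptions, not Apple’s actual format.

```python
# Hypothetical safety-voucher layout. Field names are invented for this
# sketch, and the SHA-256 keystream cipher is a toy, not production crypto.

import hashlib
from dataclasses import dataclass


def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256-derived keystream.

    Applying it twice with the same key returns the original input.
    """
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))


@dataclass
class SafetyVoucher:
    share: tuple[int, int]     # one threshold share of the account key
    sealed_derivative: bytes   # visual derivative, encrypted under that key


def make_voucher(account_key: int, share: tuple[int, int],
                 derivative: bytes) -> SafetyVoucher:
    """Client side: seal the low-resolution derivative under the account key."""
    key_bytes = hashlib.sha256(account_key.to_bytes(16, "big")).digest()
    return SafetyVoucher(share, keystream_xor(key_bytes, derivative))


def open_voucher(recovered_key: int, voucher: SafetyVoucher) -> bytes:
    """Server side: possible only once enough shares have recovered the key."""
    key_bytes = hashlib.sha256(recovered_key.to_bytes(16, "big")).digest()
    return keystream_xor(key_bytes, voucher.sealed_derivative)


if __name__ == "__main__":
    key = 123456789
    voucher = make_voucher(key, (1, 42), b"low-res thumbnail bytes")
    assert open_voucher(key, voucher) == b"low-res thumbnail bytes"
    print("derivative readable only with the recovered key")
```

In this sketch the server holds vouchers it cannot open; only after the threshold number of shares has been combined into the account key does the visual derivative inside each matching voucher become decryptable.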

The Reuters report also said Apple acknowledged that it had handled communications around the upcoming technology poorly, though the company declined to say whether it had changed its policy in response to the criticism.

The Cupertino-based company said the technology is still in development and that adjustments are likely before the final rollout.

Earlier reports claimed that Apple’s own employees were unhappy with the company’s child safety measures. Employees posted more than 800 comments in an internal Slack channel expressing concerns about the decision. They worried that authoritarian governments, such as China, could exploit the feature, using it to track content unrelated to child sexual abuse and forcing Apple to spy on users at a government’s behest.

Apple had previously stated on its blog that the feature will launch first in the United States, with plans to roll it out to additional countries later in the year.
