Apple Will No Longer Go Ahead With Its iCloud Screening Plan

Apple has formally scrapped one of its most divisive ideas: a program to check iCloud photos for evidence of child sexual abuse material (CSAM). The on-device scanning feature, announced for iOS in the summer of 2021, was designed to quietly sift through individual users’ photo libraries for signs of illicit material. Any matches would be escalated to human reviewers, who could in turn refer confirmed CSAM to law enforcement.
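To make the matching idea concrete, here is a minimal, purely illustrative sketch of checking photos against a set of known image fingerprints. Apple’s actual proposal relied on a perceptual “NeuralHash” combined with private set intersection and threshold reporting, not the plain SHA-256 shown here; every name in this snippet is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: exact-hash matching against a known fingerprint set.
// Apple's proposed system used a perceptual hash and cryptographic blinding,
// which this toy example does not attempt to reproduce.
struct CSAMMatcherSketch {
    /// Fingerprints of known illegal images (in the real proposal, these shipped
    /// to devices only in a blinded, non-reversible form).
    let knownFingerprints: Set<String>

    /// Hex-encoded SHA-256 of the raw image bytes (a stand-in for a perceptual hash).
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Returns true if the photo matches a known fingerprint and, under the
    /// proposed scheme, would be queued for human review.
    func shouldFlag(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }
}

// Usage example with dummy data.
let matcher = CSAMMatcherSketch(knownFingerprints: [])
let photo = Data("example image bytes".utf8)
print(matcher.shouldFlag(photo))   // false: not in the known set
```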

The Idea Was Met With Heavy Backlash


Privacy and security researchers were quick to voice their opposition to the proposal, with some pointing out that the screening capability could later be repurposed to search for other kinds of material. The broad consensus was that the technology could quickly become a gateway for law enforcement, and some argued that the mere presence of such scanning capabilities in iOS set a dangerous precedent for broader surveillance abuses.

Although Apple initially pushed back against these complaints, the company eventually relented and, shortly after announcing the new feature, said it would “defer” its rollout to a later date.


As things stand, that day will never come. Alongside a slew of new iCloud privacy improvements announced Wednesday, the company confirmed that it will not move forward with on-device scanning. Apple laid out its new direction in a statement to Wired:

According To Apple

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

Apple’s intentions appear to have been in good faith. The spread of CSAM online is a serious problem that, according to specialists, has worsened in recent years, and it is to the company’s credit that it tried to address it. But the underlying technology Apple proposed was not the right tool for the job, given the surveillance risks it created.
