Technotification

Apple Takes a Big Step Toward Scanning for CSAM

By Ratnesh Shinde · August 6, 2021

Apple is expected to launch a controversial feature as soon as this week: it will begin examining iCloud photographs for child sexual abuse material (CSAM). Whatever the intent, critics argue that Apple's technology and methodology leave considerable room for misuse.

The claims come from Matthew D. Green, an Associate Professor at the Johns Hopkins Information Security Institute. In a series of tweets, Green discussed Apple's plans and what they could mean if implemented.

Multiple sources have confirmed to Green that Apple plans to unveil the new scanning tool as early as tomorrow. The "client-side tool" will scan users' devices for CSAM and report any image it flags to Apple's servers.

    “Your phone will download a database of “fingerprints” for all bad photos (child porn, terrorist recruitment videos, etc.) It will search your phone for matches.”

He says the tool will hunt for CSAM using perceptual hashes. Photos whose hashes match entries in the database will be reported to Apple's servers.
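The idea behind perceptual hashing can be illustrated with a toy "difference hash", a drastically simplified stand-in for Apple's proprietary NeuralHash (the real system is not public): each bit records whether a pixel is brighter than its neighbor, so the fingerprint tracks gradients rather than exact pixel values and survives small global changes.

```python
# Toy "difference hash" (dHash), a simplified stand-in for a perceptual
# hash -- NOT Apple's proprietary NeuralHash. Each bit records whether a
# pixel is brighter than its right-hand neighbor, so the fingerprint
# depends on gradients, not exact pixel values.

def dhash(pixels):
    """Hash a row-major grid of grayscale values into a list of bits."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

original = [[10, 60, 30, 80], [90, 20, 70, 40]]
brightened = [[v + 5 for v in row] for row in original]  # uniformly brighter

# A visually near-identical image keeps the exact same fingerprint.
assert dhash(original) == dhash(brightened)
```

Unlike a cryptographic hash, changing a few pixels does not scramble the output; that robustness is the whole point, and also the source of the mismatch problems discussed below.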

Apple reportedly produced these hashes using "a new and proprietary neural hashing method." Green also says Apple has clearance from the NCMEC (National Center for Missing & Exploited Children) to use it.

    Initially Apple is not going to deploy this system on your encrypted images. They’re going to use it on your phone’s photo library, and only if you have iCloud Backup turned on.

    So in that sense, “phew”: it will only scan data that Apple’s servers already have. No problem right?

    — Matthew Green (@matthew_d_green) August 5, 2021

    This means that, depending on how they work, it might be possible for someone to make problematic images that “match” entirely harmless images. Like political images shared by persecuted groups. These harmless images would be reported to the provider.

    — Matthew Green (@matthew_d_green) August 5, 2021

This technology should help catch, and eventually curb, the production and distribution of child pornography, at least within the iOS ecosystem.

    The enormous implications of such a technology worry Green.

Apple will only scan pictures saved in iCloud, meaning those already shared with Apple. But a tool that can scan the photos on your device does not sound very privacy-friendly.

Green agrees. In his tweets he argues that such a client-side scanning mechanism could be used to search for any image on your device.

The issue is that the user has no idea which hashes or "fingerprints" are being used during scanning. Anyone with control over the system could therefore target a specific set of files in your library.
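A hypothetical sketch of the client-side matching step makes the opacity concern concrete; the fingerprint database, function names, and distance threshold here are all invented for illustration, but on a real device the database and cutoff would be equally invisible to the user.

```python
# Hypothetical client-side matching step. The fingerprint database and
# the distance threshold are invented here; on a real device both would
# be opaque to the user.

def hamming(a, b):
    """Number of bit positions where two equal-length hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def flagged(photo_hash, fingerprint_db, threshold=1):
    """True if the photo's hash is within `threshold` bits of any entry."""
    return any(hamming(photo_hash, fp) <= threshold for fp in fingerprint_db)

db = [[0, 1, 0, 1, 0, 1], [1, 1, 1, 0, 0, 0]]   # opaque "fingerprints"
assert flagged([0, 1, 0, 1, 0, 1], db)           # exact match
assert flagged([0, 1, 1, 1, 0, 1], db)           # near match, 1 bit off
assert not flagged([0, 0, 0, 1, 1, 1], db)       # no match
```

Real systems use far longer hashes, so a small threshold triggers fewer accidental matches, but the principle is the same: whoever supplies the database decides what gets flagged.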

So, how much do you trust Apple? Tomorrow it could use the very same technique to look for something other than CSAM in your photographs.

Even if Apple uses the system correctly, these hashes are prone to mismatches. Green notes that they are deliberately "imprecise" so they can capture near matches, i.e. a picture that has been cropped or otherwise altered.

As a result, the algorithm will sometimes misidentify innocent images as harmful. We have all seen this on Twitter and other social media sites, where content is hidden behind a sensitive-media warning only to turn out, once revealed, to be a dog in a field of flowers.

Aside from Apple abusing the technology, Green is worried about threat actors. If someone can manufacture collisions, i.e. mislead the algorithm into matching unrelated pictures, the system loses its effectiveness.
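With a toy gradient-based hash (again, not Apple's actual NeuralHash, whose collision resistance is unknown from this article's vantage point), it is easy to see how a collision could in principle be crafted: only the ordering of neighboring pixels matters, so a visually unrelated image can reproduce the same bits.

```python
# Crafting a collision against a toy gradient-based hash (NOT Apple's
# NeuralHash). Because only the ordering of neighboring pixels matters,
# a visually unrelated image can reproduce the exact same bits.

def dhash(pixels):
    """One bit per horizontal pixel pair: is the left pixel brighter?"""
    return [1 if l > r else 0 for row in pixels for l, r in zip(row, row[1:])]

photo   = [[10, 60, 30, 80], [90, 20, 70, 40]]    # the "target" image
crafted = [[200, 250, 0, 255], [130, 1, 128, 2]]  # wildly different pixels

assert dhash(photo) == dhash(crafted)  # identical fingerprints: a collision
```

Attacks on a neural hash are much harder than this, but the failure mode is the same: a match proves only that the fingerprints agree, not that the images do.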

The most concerning possibility is that Apple may extend the technology to its encrypted services in the future. Law enforcement authorities around the world are already pushing for closer monitoring of encrypted platforms.

If Apple is later required to apply the technology for such purposes inside its encrypted services, its privacy arguments will be badly weakened.
