
Apple to check iCloud photo uploads for child abuse images

Published 08/05/2021, 03:04 PM
Updated 08/05/2021, 07:55 PM
© Reuters. FILE PHOTO: The Apple logo is seen at an Apple Store, as Apple's new 5G iPhone 12 went on sale in Brooklyn, New York, U.S. October 23, 2020.  REUTERS/Brendan McDermid

By Stephen Nellis

(Reuters) -Apple Inc on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage services to ensure the upload does not match known images of child sexual abuse.

Detection of enough matching child abuse image uploads to guard against false positives will trigger a human review and a report of the user to law enforcement, Apple (NASDAQ:AAPL) said. It said the system is designed to reduce the false-positive rate to one in one trillion.

Apple's new system seeks to address requests from law enforcement to help stem child sexual abuse while also respecting privacy and security practices that are a core tenet of the company's brand. But some privacy advocates said the system could open the door to monitoring of political speech or other content on iPhones.

Most other major technology providers - including Alphabet (NASDAQ:GOOGL) Inc's Google, Facebook Inc (NASDAQ:FB) and Microsoft Corp (NASDAQ:MSFT) - are already checking images against a database of known child sexual abuse imagery.

"With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," John Clark, chief executive of the National Center for Missing & Exploited Children, said in a statement. "The reality is that privacy and child protection can co-exist."

Here is how Apple's system works. Law enforcement officials maintain a database of known child sexual abuse images and translate those images into "hashes" - numerical codes that positively identify an image but cannot be used to reconstruct it.
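The one-way property the article describes can be illustrated with an ordinary cryptographic hash. This is only a sketch of the general idea: Apple's NeuralHash is a perceptual hash, not a cryptographic one, and the function name below is hypothetical.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length hex code identifying these exact bytes.

    The code pins down the input, but cannot be reversed to
    recover the image itself - the hash is one-way.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Different inputs yield unrelated codes of the same fixed length.
a = image_fingerprint(b"image-one")
b = image_fingerprint(b"image-two")
print(a == b)       # False
print(len(a))       # 64 hex characters, regardless of image size
```

Unlike this cryptographic example, a perceptual hash such as NeuralHash is deliberately tolerant of small edits, which is what lets it catch modified copies of a known image.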

Apple has implemented that database using a technology called "NeuralHash", designed to also catch edited images similar to the originals. That database will be stored on iPhones.

When a user uploads an image to Apple's iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database.
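The on-device comparison step can be sketched with a toy perceptual hash. This is an illustrative average-hash with a Hamming-distance match, not Apple's NeuralHash algorithm; all function names and the distance threshold are assumptions made for the example.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: one bit per pixel of a small grayscale
    thumbnail, set when the pixel is brighter than the image's mean.
    Visually similar images produce mostly identical bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(upload_hash: int, known_hashes: set[int],
                     max_distance: int = 2) -> bool:
    """Flag the upload if its hash is within max_distance bits of
    any hash in the known-image database (threshold is illustrative)."""
    return any(hamming_distance(upload_hash, h) <= max_distance
               for h in known_hashes)

# A known image and a slightly edited copy (one pixel value changed)
original = [[200, 10], [10, 200]]
edited   = [[200, 10], [12, 200]]
database = {average_hash(original)}
print(matches_database(average_hash(edited), database))   # True
```

The edited copy still matches because the hash depends on coarse brightness structure rather than exact bytes, which is the property the article attributes to NeuralHash's handling of edited images.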

Photos stored only on the phone are not checked, Apple said, and human review before an account is reported to law enforcement is meant to ensure matches are genuine before the account is suspended.

Apple said users who feel their account was improperly suspended can appeal to have it reinstated.

The Financial Times earlier reported some aspects of the program.

One feature that sets Apple's system apart is that it checks photos stored on phones before they are uploaded, rather than checking the photos after they arrive on the company's servers.

On Twitter, some privacy and security experts expressed concerns the system could eventually be expanded to scan phones more generally for prohibited content or political speech.

Apple has "sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content," Matthew Green, a security researcher at Johns Hopkins University, warned.

"This will break the dam — governments will demand it from everyone."

Other privacy researchers such as India McKinney and Erica Portnoy of the Electronic Frontier Foundation wrote in a blog post that it may be impossible for outside researchers to double check whether Apple keeps its promises to check only a small set of on-device content.

The move is "a shocking about-face for users who have relied on the company’s leadership in privacy and security," the pair wrote.

"At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," McKinney and Portnoy wrote.
