A person holding an Apple iPhone and looking at it.

Halfpoint/Shutterstock (Licensed)

Apple reportedly will scan iPhones for child abuse imagery

The move is controversial because of the potential ramifications.


Andrew Wyrich

Tech

Published Aug 5, 2021   Updated Aug 5, 2021, 5:05 pm CDT

Apple will soon include software on iPhones in the U.S. that will scan the devices for child abuse imagery, according to a new report.

The Financial Times reported the news, citing sources.

Matthew Green, a security expert and associate professor at Johns Hopkins Information Security Institute, also tweeted about Apple’s decision on Wednesday. Green has worked with Apple in the past to patch issues that could have allowed hackers to decrypt photos and videos in iMessage.

Green, also citing sources, said the scanning would be “client-side,” or done on an individual’s iPhone. In a follow-up tweet, Green said the system would be used on a phone’s photo library and only if you have iCloud Backup turned on. So it would “only scan data that Apple’s servers already have.” The system would use a hashing algorithm to match photos on a phone to known child abuse images.

“These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear,” Green tweeted.
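Apple has not published the details of its matching scheme, but the idea of a perceptual hash can be illustrated with a much simpler stand-in: an "average hash," where each bit records whether a pixel is brighter than the image's mean, and two images "match" if their hashes differ in only a few bits. Everything below (the 4×4 images, the threshold, the hash database) is hypothetical and chosen only to show the mechanic; it is not Apple's actual algorithm.

```python
# Illustrative "average hash" sketch -- NOT Apple's actual algorithm.
# Images are hypothetical 4x4 grayscale grids (lists of rows of 0-255 values).

def average_hash(pixels):
    """Hash an image: one bit per pixel, 1 if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes disagree."""
    return sum(a != b for a, b in zip(h1, h2))

# A "known" image and a slightly brightened copy of it.
known = [[10, 200, 10, 200],
         [200, 10, 200, 10],
         [10, 200, 10, 200],
         [200, 10, 200, 10]]
variant = [[min(255, p + 20) for p in row] for row in known]

known_hashes = {average_hash(known)}  # stand-in for a database of known hashes

THRESHOLD = 2  # bits of difference tolerated; arbitrary for this sketch
h = average_hash(variant)
is_match = any(hamming_distance(h, k) <= THRESHOLD for k in known_hashes)
# Brightening every pixel preserves the above/below-mean pattern, so the
# variant still matches -- which is the point of a *perceptual* hash.
```

Because the hash captures coarse structure rather than exact bytes, minor edits (brightness, recompression) do not change it much, which is what lets the system recognize near-copies of known images. It is also what Green's collision concern targets: two visually unrelated images can, in principle, be engineered to produce matching hashes.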

Green notes that someone might be able to make “problematic images that ‘match’ entirely harmless images.”

“Imagine someone sends you a perfectly harmless political media file that you share with a friend. But that file shares a hash with some known child porn file?” Green wrote.

However, Green also noted that eventually the scanning “could be a key ingredient in adding surveillance to encrypted messaging systems,” which is why he argued Apple’s move would be a “really bad idea.”

“I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends,” Green tweeted.

U.S. law enforcement agencies like the FBI and the Department of Justice have long urged tech companies to create backdoors for them to access encrypted messaging, often citing the need to find people who share child abuse material.

Meanwhile, privacy and tech advocates have also long noted that adding a backdoor for authorities could inevitably lead to that backdoor being exploited.

AppleInsider notes that Apple has not confirmed the plan.


*First Published: Aug 5, 2021, 12:30 pm CDT