TL;DR
- A new report says Apple plans to compromise iPhone privacy in the name of preventing child abuse.
- The company reportedly plans to scan users’ photos for evidence of child abuse; flagged images would be pushed to manual review.
- The prospect of Apple employees inadvertently viewing innocent photos of users’ children is certainly worrying.
Update, August 5, 2021 (04:10 PM EST): Shortly after we published the article below, Apple confirmed the existence of its software that scans for child abuse imagery. In a post titled “Expanded Protections for Children,” the company laid out plans to help curb Child Sexual Abuse Material (CSAM).
As part of these plans, Apple will introduce new technology in iOS and iPadOS that “will allow Apple to detect known CSAM images stored in iCloud Photos.” Essentially, all media stored in iCloud Photos will be scanned on the device. If the software flags an image as suspicious, it will be sent to Apple, which can then decrypt and view it. If Apple finds the content is indeed illegal, it will notify the authorities.
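To make that flow concrete, here is a minimal, hypothetical sketch in Swift of how on-device matching against a list of known images could work. This is not Apple’s actual implementation: the function names (fingerprint, shouldFlagForReview) are invented for illustration, and SHA-256 stands in for Apple’s undisclosed perceptual hash, which is designed to survive resizing and re-compression.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only -- not Apple's NeuralMatch/NeuralHash code.
// Idea: fingerprint each photo on the device and compare the fingerprint
// against a database of hashes of known abusive images. Only matches are
// escalated for human review.

/// Computes a fingerprint of the raw photo bytes. A real system would use a
/// perceptual hash that tolerates resizing and re-compression; SHA-256 is
/// used here only to keep the example self-contained.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the photo's fingerprint appears in the known-image
/// database, meaning the photo should be escalated for manual review.
func shouldFlagForReview(_ photoData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: photoData))
}

// Example usage with an empty placeholder database (in a real deployment the
// hashes would ship with the OS), so nothing is ever flagged here.
let knownHashes: Set<String> = []
let sample = Data("example photo bytes".utf8)
print(shouldFlagForReview(sample, knownHashes: knownHashes)) // false
```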
Apple claims the chance of incorrectly flagging a given account is “one in one trillion” per year.
Original article, August 5, 2021 (03:55 PM EST): Over the past few years, Apple has worked hard to cement its reputation as a privacy-focused company. It frequently cites its “walled garden” approach as a boon for privacy and security.
However, a new report from the Financial Times calls that reputation into question. According to the report, Apple is planning to launch a new system that would comb through photos and videos users create on Apple products, including the iPhone. The reason Apple would sacrifice iPhone privacy in this way is to hunt down child abusers.
See also: What you need to know about privacy screen protectors
The system is reportedly called “neuralMatch.” Essentially, it would use software to scan images users create on Apple products. If the software finds any media that could involve child abuse, including child pornography, it would notify a human employee. That person would then review the photos and decide what action to take.
Apple declined to comment on these allegations.
Is iPhone privacy coming to an end?
Obviously, the exploitation of children is a huge problem, and any sane person wants it dealt with swiftly and forcefully. However, the idea of someone at Apple viewing harmless photos of your children that neuralMatch accidentally flagged as illegal seems like a very real problem waiting to happen.
There’s also the idea that software designed to detect child abuse today could be trained to detect something else tomorrow. What if, instead of child abuse, it were drug use, for example? How far is Apple willing to go to help governments and law enforcement agencies catch criminals?
Apple could make this system public within days. We’ll have to wait and see how the public reacts, if and when it happens.