TL;DR
- Apple's CSAM photo scanning feature, announced a month ago, has been postponed.
- Apple says it needs to spend more time "over the coming months" to perfect the feature.
- The feature would use an algorithm to "scan" user photos for evidence of child sexual abuse.
In early August, Apple announced a highly controversial new policy. In an effort to curb the exploitation of children, the company said it would begin securely scanning every photo users upload to iCloud. Although an algorithm performs the scanning, any matches it flags would be followed up by human reviewers.
Obviously, child sexual abuse material (CSAM) is a huge problem that nearly everyone wants to fight. However, Apple's CSAM policy left many people uneasy because it appears to intrude on privacy. Now, the company is delaying the rollout of the feature (via 9to5Mac).
See also: The best privacy web browsers for Android
Apple maintains that the algorithm it would use to scan user data is highly accurate, claiming a "one in one trillion chance per year of incorrectly flagging a given account." That assurance, however, did little to quell the unease. Apple's statement on the delay is quite clear:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
The statement makes clear that Apple won't launch this feature anytime soon. "The coming months" could mean the end of this year, or possibly 2022. It may even be postponed indefinitely.