- Apple's CSAM photo-scanning feature, announced a month ago, is being postponed.
- Apple says it needs "additional time" over the coming months to make improvements.
- The policy would have user photos algorithmically "scanned" for evidence of child sexual abuse material.
In early August, Apple announced a highly controversial new policy. To curb child exploitation, the company said it would scan every photo people upload to iCloud. Although this scanning would be done algorithmically, any image the algorithm flags would be reviewed by a human before action is taken.
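For a rough sense of how such a pipeline works, here is a minimal sketch. It is not Apple's actual implementation (Apple described a perceptual hash called NeuralHash compared against a database of known CSAM fingerprints); the exact hash, the placeholder database, and the function names below are illustrative assumptions that only show the flow of hash, match, and human review.

```python
# Illustrative sketch only -- NOT Apple's code. A plain SHA-256 exact match
# stands in for a perceptual hash to show the overall flow: fingerprint each
# uploaded photo, compare it against a known-bad list, and queue any matches
# for human review rather than acting on them automatically.

import hashlib

# Hypothetical database of fingerprints of known illegal images (placeholders).
KNOWN_BAD_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def photo_fingerprint(photo_bytes: bytes) -> str:
    """Return a fingerprint of the photo (real systems use perceptual hashes)."""
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_upload(photo_bytes: bytes, review_queue: list) -> None:
    """Queue a photo for human review if its fingerprint matches the bad list."""
    digest = photo_fingerprint(photo_bytes)
    if digest in KNOWN_BAD_HASHES:
        # A match alone does not trigger action; a human reviewer decides.
        review_queue.append(digest)

if __name__ == "__main__":
    queue: list = []
    scan_upload(b"example photo bytes", queue)
    print(f"{len(queue)} photo(s) queued for human review")
```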
Obviously, child sexual abuse material (CSAM) is a huge problem that almost everyone wants to fight. However, Apple's CSAM policy has worried many people because of the limits it places on privacy. Now the company is delaying the rollout of the feature (via 9to5Mac).
See also: The best privacy web browsers for Android
Apple promised its scanning algorithm would be incredibly accurate, claiming there was a "one in a trillion chance a year of falsely flagging a particular account." However, that promise couldn't quell the discomfort. Apple's statement about the delay makes this plain:
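A number that small becomes less mysterious once you consider threshold matching: if an account is only flagged after several independent matches, the false-flag probability shrinks very quickly. The figures below are assumed for illustration, not Apple's published parameters, and the single-line estimate ignores details like library size that a full analysis would include.

```python
# Back-of-the-envelope illustration with ASSUMED numbers (not Apple's):
# if one photo falsely matches with probability p, the chance that a given
# set of k photos all falsely match is roughly p ** k, which is how a
# per-account figure can end up astronomically small.

p = 1e-3  # assumed per-photo false-match probability
k = 6     # assumed number of matches required before an account is flagged

false_flag_probability = p ** k
print(f"Approximate false-flag probability: {false_flag_probability:.1e}")  # ~1.0e-18
```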
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
The statement suggests Apple won't be rolling this out anytime soon. "Over the coming months" could mean late this year or well into 2022. The feature could even be postponed indefinitely.