The American technology giant Apple said that iPhone users’ entire photo libraries will be checked for known child abuse images if they are stored in its iCloud online service.
The disclosure came in a series of media briefings in which Apple sought to allay alarm over its announcement last week that it will scan users’ phones, tablets and computers for millions of illegal pictures.
Security experts said they expected governments to seek to force the iPhone maker to expand the system to peer into devices for other material. In a posting to its website, Apple said it would fight any such attempts, which could come through secret court orders.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple wrote.
A day earlier, Apple officials said the company’s system, which will launch this fall with the release of its iOS 15 operating system, will check existing files on a user’s device if those photos are synced to the company’s storage servers.
Julie Cordua, chief executive of Thorn, a group that develops technology to help law enforcement officials detect sex trafficking, said that about half of child sexual abuse material is video. Apple’s system does not check videos before they are uploaded to its cloud, but the company said it plans to expand the system in unspecified ways in the future.
The technology giant has faced growing international pressure over the low number of abuse-material reports it files compared with other providers. Some European jurisdictions are debating legislation to hold platforms more accountable for the spread of such material.
A day earlier, company executives argued that on-device checks preserve privacy better than running checks directly on Apple’s cloud storage. Under the new system’s architecture, Apple learns nothing about a user’s content unless a threshold number of matching images is surpassed, at which point a human review is triggered.
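The threshold mechanism described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple’s implementation: the hash values, function names and threshold here are hypothetical, and the real system uses the NeuralHash perceptual hash with cryptographic threshold secret sharing rather than plain set lookups.

```python
# Illustrative sketch of threshold-gated matching: individual matches
# reveal nothing; only once the match count crosses a threshold is a
# human review triggered. All values below are made up for the example.

KNOWN_HASHES = {"9f2a", "4c77", "b0e1"}  # stand-in database of known-image hashes
THRESHOLD = 2                            # illustrative threshold, not Apple's real value

def count_matches(photo_hashes, known_hashes):
    """Count how many of a user's photo hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def needs_human_review(photo_hashes, known_hashes=KNOWN_HASHES, threshold=THRESHOLD):
    """Trigger review only once the match count reaches the threshold."""
    return count_matches(photo_hashes, known_hashes) >= threshold

print(needs_human_review(["9f2a", "1111"]))          # one match -> False
print(needs_human_review(["9f2a", "4c77", "2222"]))  # two matches -> True
```

The point of the design is visible even in this toy version: the decision to surface anything to a reviewer depends on the aggregate count, not on any single match.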