
Apple's child protection features get delayed after privacy outcry

ANI
Published : Sep 4, 2021, 3:12 pm IST
Updated : Sep 4, 2021, 3:12 pm IST

The company detailed the iCloud Photo scanning system at length to make the case that it didn't weaken user privacy

Apple's original press release about the changes, which were intended to reduce the proliferation of child sexual abuse material (CSAM), had a similar statement at the top of the page. (ANI Photo)

Washington: Apple's child protection features, which the company announced last month, have now been delayed by the tech giant owing to criticism that the changes could diminish user privacy.

According to The Verge, the outcry centered on one of the features, which would scan users' photos for child sexual abuse material (CSAM). The changes had been scheduled to roll out later this year.


In a statement to The Verge, Apple said, "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material."

The statement further added, "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Apple's original press release about the changes, which were intended to reduce the proliferation of child sexual abuse material (CSAM), had a similar statement at the top of the page.


That release detailed three major changes in the works. One change to Search and Siri would point users to resources for preventing CSAM if they searched for related information.

The other two changes came under more significant scrutiny. The first would alert parents when their kids were receiving or sending sexually explicit photos and would blur those images for kids.

The second one would have scanned images stored in a user's iCloud Photos for CSAM and report them to Apple moderators, who could then refer the reports to the National Center for Missing and Exploited Children, or NCMEC.

The company detailed the iCloud Photo scanning system at length to make the case that it didn't weaken user privacy. In short, the system would scan photos stored in iCloud Photos on a user's iOS device and assess those photos against a database of known CSAM image hashes from NCMEC and other child safety organizations.
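As a rough illustration only, hash-based matching of this kind can be sketched as comparing an image's digest against a set of known digests. Note that this is a deliberately simplified assumption: Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic private set intersection, not a plain cryptographic hash lookup as shown here; the function names and placeholder hashes below are hypothetical.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    # Compute a hex digest of the raw image bytes.
    # (A perceptual hash would instead tolerate resizing/re-encoding.)
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known-image hashes (placeholder values,
# standing in for hashes supplied by child safety organizations).
known_hashes = {sha256_of(b"known-image-bytes")}

def matches_known(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    return sha256_of(image_bytes) in known_hashes

print(matches_known(b"known-image-bytes"))  # True
print(matches_known(b"some-other-photo"))   # False
```

In the real system, a match would not be visible on-device in plain form; matches only become readable to reviewers after a threshold number of them accumulate.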


Still, several privacy and security experts heavily criticized Apple for the new system, arguing that it could create an on-device surveillance system and that it violated the trust users had put in Apple to protect on-device privacy.

As per The Verge, in an August 5 statement, the Electronic Frontier Foundation said that the new system, however well-intended, would "break key promises of the messenger's encryption itself and open the door to broader abuses."

Tags: apple, child protection, apple feature, child safety features
Location: United States, Washington