WhatsApp Head Calls Apple’s Child Safety Update a “Surveillance System”


A day after Apple confirmed plans for new software that will allow it to detect images of child abuse in users’ iCloud Photos, the head of Facebook’s WhatsApp says he is “concerned” about the plans.

In a thread on Twitter, Will Cathcart called it “an Apple designed and operated surveillance system that could very easily be used to scan private content for anything they or a government decides to control.” He also raised questions about how such a system could be exploited in China or other countries, or abused by spyware companies.

An Apple spokesperson took issue with Cathcart’s characterization of the software, noting that users can choose to turn off iCloud Photos. Apple also said the system is trained only on a database of “known” images provided by the National Center for Missing and Exploited Children (NCMEC) and other organizations, and that it could not be made region-specific because it is built into iOS itself.

It’s no surprise that Facebook is pushing back against Apple’s plans. Apple has spent years criticizing Facebook over its privacy record, even as the social network has embraced end-to-end encryption. More recently, the two companies have clashed over an iOS privacy update that hampers Facebook’s ability to track users, a change the company says will hurt its ad revenue.
