According to WhatsApp head Will Cathcart, the company will not adopt Apple's new Child Safety measures, which are intended to stop the spread of child abuse imagery. In a Twitter thread, he writes that Apple "has built software that can scan all the private photos on your phone" and argues that the company has taken the wrong approach in trying to improve its response to CSAM (child sexual abuse material).
Apple's system, announced on Thursday, works by comparing hashes of images uploaded to iCloud against hashes of known CSAM images held in a database. Apple says this lets it keep user data encrypted and run the analysis on-device, while still allowing it to report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple's Child Safety strategy can notify parents if their child under 13 years old sends or views photos containing sexually explicit material. An internal Apple memo acknowledged that people would be "worried about the implications" of the systems.
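At its core, this kind of matching amounts to computing a fingerprint for each image and testing it for membership in a set of known fingerprints. The Python sketch below illustrates that shape using an ordinary byte-exact SHA-256 digest; Apple's actual system uses a perceptual hash called NeuralHash plus cryptographic threshold techniques, so visually identical images match even after re-encoding and individual matches are not revealed to Apple. All names and data below are illustrative, not Apple's implementation.

```python
import hashlib


def image_fingerprint(data: bytes) -> str:
    """Fingerprint an image's raw bytes with SHA-256.

    A byte-exact hash like this breaks on any re-encoding or resize;
    Apple's NeuralHash is perceptual, so visually identical images map
    to the same value. SHA-256 is used here only to keep the sketch simple.
    """
    return hashlib.sha256(data).hexdigest()


def check_image(data: bytes, known_hashes: set[str]) -> bool:
    """Return True if the image's fingerprint appears in the known-hash set."""
    return image_fingerprint(data) in known_hashes


if __name__ == "__main__":
    # Hypothetical "known" database built from example bytes; a real
    # system would ship an opaque, pre-computed hash database instead.
    flagged = b"example flagged image bytes"
    known = {image_fingerprint(flagged)}

    print(check_image(flagged, known))                   # True: match
    print(check_image(b"unrelated photo bytes", known))  # False: no match
```

In Apple's design, the membership test runs on the device before upload, and matches are only surfaced once a threshold number of them accumulates; the sketch above omits all of that cryptographic machinery.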
Cathcart calls Apple's approach "very concerning," saying it would allow governments with different ideas about what kinds of images are and are not acceptable to demand that Apple add non-CSAM images to the databases it compares photos against. Cathcart says WhatsApp's system for fighting child exploitation, which partly relies on user reports, preserves encryption like Apple's and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the Center for its CSAM detection efforts.)
Facebook, which owns WhatsApp, has reason to be unhappy with Apple over privacy concerns. Apple's changes to how ad tracking works in iOS 14.5 sparked a feud between the two companies, with Facebook taking out newspaper ads criticizing Apple's privacy changes as harmful to small businesses. Apple fired back, saying the change "simply requires" that users be given a choice about whether they want to be tracked.
Apple's new Child Safety measures have drawn criticism from more than just WhatsApp, however. Edward Snowden, the Electronic Frontier Foundation, academics, and others have voiced concerns. Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature even before it was publicly announced, tweeting about Apple's plans and how the hashing system could be abused by governments and malicious actors.
The Electronic Frontier Foundation (EFF) issued a statement blasting Apple's plan, calling it "a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor." The EFF's press release goes into detail on how it believes governments could abuse Apple's Child Safety measures and how they erode user privacy.
Kendra Albert, an instructor at Harvard's Cyberlaw Clinic, has a thread on the potential dangers to LGBT children and Apple's initial lack of clarity around age ranges for the parental notifications feature. Edward Snowden retweeted a Financial Times article about the system, offering his own characterization of what Apple is doing.