Apple yesterday officially announced a range of new features coming later this year, dubbed Expanded Protections for Children. The new features include protections for sensitive images in iMessage, iCloud Photos scanning for child sexual abuse material (CSAM), and expanded guidance in Siri and Search.
In an internal memo distributed to the teams that worked on this project and obtained by 9to5Mac, Apple acknowledges the “misunderstandings” around the new features, but doubles down on its belief that these features are part of an “important mission” for keeping children safe.
Apple has faced a significant amount of pushback over these features, including from notable sources such as Edward Snowden and the Electronic Frontier Foundation. The criticism centers primarily on Apple’s plan to scan iCloud Photos for matches against a database of known CSAM and the potential implications of such a feature.
The memo, distributed late last night, was written by Sebastien Marineau-Mes, a software VP at Apple. Marineau-Mes says that Apple will continue to “explain and detail the features” included in this suite of Expanded Protections for Children.
Marineau-Mes writes that while Apple has seen “many positive responses” to these new features, it is aware that “some people have misunderstandings” about how the features will work, and that “more than a few are worried about the implications.” Nonetheless, Marineau-Mes reiterates Apple’s belief that these features are necessary to “protect children” while also maintaining Apple’s “deep commitment to user privacy.”