In an internal memo, Apple acknowledges concerns and “misunderstandings” about its new Photo scanning feature but says it is necessary to “protect children” (Chance Miller/9to5Mac)


Expanded Protections for Children is the name of a set of new features that Apple formally announced yesterday. They include safeguards against sensitive images in iMessage, CSAM detection in iCloud Photos, and expanded guidance in Siri and Search.

In an internal document distributed to the teams working on the project, which 9to5Mac obtained, Apple acknowledges the “misunderstandings” around the new capabilities. Even so, Apple maintains that these features serve a “vital purpose” in keeping children safe.

Apple has faced considerable opposition to these capabilities, particularly from prominent critics such as Edward Snowden and the Electronic Frontier Foundation. The criticism centers on Apple’s plan to scan iCloud Photos for matches against a database of child sexual abuse material (CSAM), and on the potential repercussions of such a feature.

The memo, a copy of which 9to5Mac obtained, was written by Sebastien Marineau-Mes, a software VP at Apple, and sent late last night. In it, Marineau-Mes says Apple will keep “explaining and detailing the elements” of the Expanded Protections for Children features.

Marineau-Mes reiterates Apple’s belief that these features are necessary to “protect children” while upholding the company’s “deep commitment to user privacy.” While Apple has received “many positive responses” to the new features, he writes, the company is aware that “some people have misunderstandings” about how the features will work, and that “more than a few are worried about the implications.”

The full memo follows:

Expanded Protections for Children was officially unveiled to the public today, and I wanted to take this opportunity to thank each and every one of you for your dedication over the last several years. Without your never-ending commitment and fortitude, we never would have reached this milestone.

Keeping kids safe is such a crucial goal. Following this path has required a strong cross-functional commitment from Engineering, GA, HI, Legal, Product Marketing, and PR, in typical Apple fashion. The result of this fantastic partnership is what we’re announcing today; it provides tools to safeguard children while also upholding Apple’s steadfast commitment to user privacy.

Today has brought a lot of supportive replies. We are aware that some individuals have misconceptions and that many are concerned about the ramifications, but we will keep describing and detailing the features so that people can grasp what we have created. While there is still more work to be done to deliver the features in the coming months, I wanted to share this message that NCMEC sent us today. I think you will find it very inspiring as well.

Being a part of such a fantastic team at Apple makes me proud. I’m grateful.

The document also includes a letter from the National Center for Missing & Exploited Children (NCMEC), signed by Marita Rodriguez, the organization’s executive director of strategic partnerships. Apple and NCMEC are collaborating closely on the upcoming iCloud scanning functionality.
