More than 90 policy and rights organizations requested that Apple not move forward with its plans to monitor children's iPhones for sexually explicit content, citing concerns about censorship and privacy violations.
Apple hopes to protect minors by implementing software on iPhones that will scan devices for child pornography and warn users that content they are sending or receiving may be sexually explicit. The new system will then inform the organizer of a family account, typically a parent, when someone under 13 years old elects to send or accept graphic content.
In a letter to Apple CEO Tim Cook, dated Thursday, the groups advocating for civil, human and digital rights outlined the dangers that could come from a surveillance system while also acknowledging that the tech manufacturer's intentions may have been pure.
"Algorithms designed to detect sexually explicit material are notoriously unreliable," the letter reads. "They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child."
"Moreover, the system Apple has developed assumes that the 'parent' and 'child' accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship," it continued. "This may not always be the case; an abusive adult may be the organiser [sic] of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing."
The letter went on to note that while Apple currently intends only to flag sexual content found in iMessages, the federal government may later attempt to compel the company to identify non-sexually explicit content that it deems objectionable.
Apple's operating system will also feature a tool that detects child sexual abuse material, using a database provided by the National Center for Missing and Exploited Children. Each image uploaded to iCloud will be checked against that database of known abusive images. If a match is found, the user's account will be disabled and authorities will be alerted.
The letter warns that, once this surveillance system is in effect, Apple could face governmental pressure and legal requirements to monitor devices for content beyond just underage sexual material.
"Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them," the letter says of the content governments could deem objectionable. "And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."
This comes after officials in the Biden administration made efforts to identify individuals who participated in the Jan. 6 Capitol riot by scanning images posted to social media. Last year, Trump administration officials used the same tactic to identify participants in the Black Lives Matter riots that damaged several U.S. cities following the death of George Floyd.