This week Apple announced that, as part of the iOS 15/iPadOS 15 update coming later this year, the company will enable new detection capabilities for known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, along with new warnings in iMessage, in partnership with the National Center for Missing and Exploited Children (NCMEC), an organization that works in collaboration with law enforcement agencies in the United States.
Apple’s opening statement said “At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
The three new features coming to iOS 15
Apple says it is introducing three new child safety features, developed in collaboration with child safety experts:
- The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
- iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
- Updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
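The CSAM detection described above relies on matching hashes of a user's photos against a database of known-CSAM hashes, with no report made until a threshold number of matches is reached. Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic techniques such as private set intersection; the sketch below is a deliberately simplified illustration of the hash-matching idea only, with all names and values hypothetical:

```python
import hashlib

# Hypothetical, highly simplified sketch of hash-based matching.
# Apple's real system uses a perceptual hash ("NeuralHash") and
# threshold secret sharing; here we stand in a plain cryptographic
# hash and an in-memory set instead.

KNOWN_HASHES = {  # stand-in for the NCMEC-provided hash database
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

MATCH_THRESHOLD = 2  # no report is made below a threshold of matches


def hash_image(image_bytes: bytes) -> str:
    """Hash raw image bytes (a real system would use a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(library: list) -> int:
    """Count how many photos in a library match the known-hash set."""
    return sum(1 for img in library if hash_image(img) in KNOWN_HASHES)


def should_report(library: list) -> bool:
    """Only flag an account once matches reach the threshold."""
    return count_matches(library) >= MATCH_THRESHOLD
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is designed to also match visually similar images, which is precisely what makes critics worry about what else such a system could be tuned to find.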
Apple says its new program is ambitious and that “protecting children is an important responsibility.” But perhaps most troubling of all is the last sentence of the opening: “These efforts will evolve and expand over time.”
The Electronic Frontier Foundation said in a response, “Apple is planning to build a backdoor into its data storage system and its messaging system.” The EFF went on to echo our own sentiment about how protecting children is important before stating “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.” – India McKinney and Erica Portnoy at the EFF
Epic Games founder and CEO Tim Sweeney said in response, “I’ve tried hard to see this from Apple’s point of view. But inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to government.”
Apple’s new iMessage feature essentially opens the door to scanning your communications in iMessage and applying a Twitter-style warning label, in an attempt to police your speech. While Apple says this is to protect children, the same mechanism could be used to censor speech, much as Facebook, Google, and Twitter have done in the recent past, under whatever guise the company chooses. The EFF echoed that sentiment: “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
Even Edward Snowden, who famously revealed the extent of the US federal government’s collection of Americans’ digital data, weighed in with a recent tweet: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”
The EFF even gave examples of what it called “mission creep” in action: one of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), troublingly has no external oversight, despite calls from civil society. While it is therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.
The full message from Marita Rodriguez of the NCMEC in support of Apple:
I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.
It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.
I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Our voices will be louder.
Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.
During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.
Thank you for finding a path forward for child protection while preserving privacy.
Clearly emotions are running high, with Marita Rodriguez of the NCMEC unprofessionally dismissing detractors as the “screeching voices of the minority.”
The slippery slope
Almost every argument against Apple’s new features is the “slippery slope” argument, which is perhaps the strongest argument in history, especially when a powerful entity is involved, such as a nation state or a trillion-dollar Goliath like Apple.
History is riddled with cautionary tales of large central authorities being given an inch and taking a mile. The governmental framework of the United States at its founding was deliberately decentralized, with most governmental power residing in the individual states, so that a large centralized federal government would never be needed and would never resemble the monarchies or dictatorships of the past. Yet time and again the government has used fear to gain more power by promising more security, and Apple is now making the same argument: give us more power to protect the children. There are some things we can all agree on: we all want children to be safe, and no, arguing against Apple’s approach does not make someone a supporter of child abuse.
Of course, that argument is too simplistic, and you must always ask “how.” Because Apple is using this argument, the outcome is highly likely to be the same as every other time it has been used: if we give Apple an inch, then just like every other powerful entity, it will take a mile. What option will users have then? Switching to a Google-controlled Android phone? There will be no escape. The mobile duopoly has struck again.