

In the Land of Trillion Dollar Goliaths | You won’t have to surrender your privacy for the children…yet

Recently, Apple announced that, as part of the iOS 15/iPadOS 15 update coming later this year, the company would enable new detection capabilities for known Child Sexual Abuse Material (CSAM) stored in iCloud Photos and iMessage, in partnership with the National Center for Missing and Exploited Children, an organization that works in collaboration with law enforcement agencies in the United States. Now, however, Apple is choosing to delay the feature due to heavy pressure from users and privacy advocates alike.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to The Verge. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s original announcement opened with: “At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).” The three new features Apple wanted to introduce, developed in collaboration with child safety experts, are described below:

  • The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
  • iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos (a rough sketch of this mechanism follows the list).
  • Updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
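To make the second feature more concrete, here is a minimal Swift sketch of the general mechanism Apple described: each photo is fingerprinted on-device, the fingerprint is checked against a database of known CSAM hashes, and an account is only flagged for human review once a threshold of matches is crossed. Every name here (ImageHash, KnownHashDatabase, MatchVoucher) is a hypothetical stand-in rather than Apple’s actual API, and the plain set lookup is a simplification; Apple’s published design uses a perceptual “NeuralHash,” private set intersection, and threshold secret sharing so the device itself never learns whether a given photo matched.

```swift
import Foundation

// Hypothetical sketch only: all names and the plain set lookup are
// illustrative assumptions, not Apple's real implementation.

/// Stand-in for a perceptual fingerprint of an image's visual content.
typealias ImageHash = Data

/// A database of known-CSAM fingerprints shipped to the device.
/// (In Apple's design this database is blinded, so the device cannot
/// read it directly.)
struct KnownHashDatabase {
    private let hashes: Set<ImageHash>
    init(hashes: Set<ImageHash>) { self.hashes = hashes }
    func contains(_ hash: ImageHash) -> Bool { hashes.contains(hash) }
}

/// Result attached to each photo before upload. In Apple's design this
/// is a cryptographic "safety voucher" that the server can only open
/// after the match threshold is crossed.
enum MatchVoucher {
    case noMatch
    case match(hash: ImageHash)
}

/// Produce a voucher for a single photo's fingerprint.
func voucher(for photoHash: ImageHash, against db: KnownHashDatabase) -> MatchVoucher {
    db.contains(photoHash) ? .match(hash: photoHash) : .noMatch
}

/// Only a threshold of matches flags the account for human review;
/// a single match on its own reveals nothing.
func shouldFlagAccount(vouchers: [MatchVoucher], threshold: Int) -> Bool {
    let matchCount = vouchers.reduce(0) { count, v in
        if case .match = v { return count + 1 }
        return count
    }
    return matchCount >= threshold
}

// Example: two matches against a threshold of three does not flag the account.
let db = KnownHashDatabase(hashes: [Data([0x01]), Data([0x02])])
let vouchers = [Data([0x01]), Data([0x02]), Data([0x03])].map { voucher(for: $0, against: db) }
print(shouldFlagAccount(vouchers: vouchers, threshold: 3)) // false
```

The threshold is the crux of the privacy claim: under this design, no single photo can be used to accuse anyone, yet it also means the sensitivity of the whole system comes down to parameters Apple alone controls.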

Apple says its new program is ambitious and that “protecting children is an important responsibility.” But perhaps most troubling of all is the last sentence of the opening: “These efforts will evolve and expand over time.”

The Electronic Frontier Foundation said in response, “Apple is planning to build a backdoor into its data storage system and its messaging system.” The EFF went on to echo our own sentiment that protecting children is important before concluding:

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

India McKinney and Erica Portnoy at the EFF
[Figure: Apple’s CSAM detection flow chart]

The slippery slope

Almost every argument against Apple’s new features is some form of the “slippery slope” argument, which is at its most persuasive when a powerful entity is involved, such as a nation state or a Trillion Dollar Goliath like Apple.

History is riddled with cautionary tales of large central authorities being given an inch and taking a mile. The governmental framework of the United States at its founding was deliberately decentralized, with most governmental power residing in the individual states, so that a large centralized Federal Government would never be needed and would never come to resemble the monarchies or dictatorships of the past. Yet time and again, governments have used fear to gain more power by promising more security. Apple is now making the same argument: give us more power to protect the children. There are some things we can all agree on: we all want children to be safe, and no, arguing against Apple’s approach does not make anyone an advocate for child abuse.

Of course, that argument is too simplistic, and you must always ask “how.” Because Apple is using this argument, the outcome is highly likely to be the same as every other time it has been used: if we give Apple an inch, then just like every other powerful entity, it will take a mile. What option will users have then? A Google-controlled Android phone? There will be no escape. The mobile duopoly has struck again.

A former industry insider and tech industry enthusiast.
