Opinion

Apple Doesn’t Care About Your Kids

Posted: Sep 29, 2021 10:16 AM
The opinions expressed by columnists are their own and do not necessarily represent the views of Townhall.com.

"What happens on your iPhone, stays on your iPhone." In many ways, Apple, like its slogan, appears to be totally divorced from reality. After all, this is a company that prides itself on protecting users' privacy, yet it constantly fails to live up to its promises. Tim Cook styles himself as a more ethical, more amiable version of Steve Jobs, but he's no different from his predecessor. Like Jobs, Cook is clearly an intelligent man; and like Jobs, he oversees a company that capitalizes on our insecurities and private communications.

With its latest rollout, ostensibly designed to protect children, Apple is attempting to trespass even further. In short, the company is trying to build a backdoor into users' iPhones. With 113 million iPhone users in the US, that's a lot of backdoors. 

This rather alarming fact hasn't stopped an increasing number of organizations from calling on Apple to implement its new scanning technologies. Take, for example, the good people at the WeProtect Global Alliance, an agency created to protect children from sexual exploitation and online abuse. They recently waxed lyrical about Apple's "significant new steps to strengthen its response to child sexual exploitation and abuse." With the announcement, they chirped, "there was a genuine sense of hope that this and other newly developed technologies could help to turn the tide in this crisis that affects so many children and families across the world." Now, before I am accused of being a sarcastic monster, let me first defend myself. WeProtect is an important organization, and its employees carry out important work. Although the comments no doubt come from a genuine place, they are misplaced and somewhat delusional.

Yes, online child abuse is very much a problem, but Apple is not a company to be trusted. If in doubt, let me point you in the direction of a recent Tech Transparency report. Apple, according to the authors, "is failing to take even the most basic steps to protect children" in the App Store. The failures are as numerous as they are egregious: lapses in age verification have exposed a number of children "to pornography, gambling, and a host of other supposedly age-limited apps." Apple, a company that wants to save the kids, can't even protect kids in its own App Store. Which brings us back to Apple's new scanning technologies. Although the company recently delayed plans to roll out its new policy, it's important to dwell on the word "delayed." Apple hasn't canceled its plans. Why? Because the rollout is still coming. The execution has simply been rescheduled while the guillotine is being serviced.

Why should we be concerned? After all, who doesn’t want to help innocent children? Only a monster, right?

If Big Tech companies have taught us anything, it's this: their words are so often betrayed by their actions. They behave like pathological liars, and we should view everything that comes out of their collective mouths with a great degree of suspicion. Big Tech companies, Apple included, are not in the business of helping people; they are in the business of collecting data. The more of it, the better. This, not surprisingly, helps the companies and harms us. We the people are little more than dairy cows being milked for data. It doesn't matter if you're a 7-year-old child or a 70-year-old retiree. We are all just fodder for the algorithmic machine. This has always been the case, and there's little reason to believe that things will ever change.

Apple's "let's help the kids" three-point policy deserves a great deal of scrutiny. Unless changes are made, all photos sent or received by someone under 18 will be scanned for inappropriate content. If those at Apple believe the images to be illicit, then law enforcement agencies will likely be notified. One needn't possess more than a handful of neurons to see how a policy like this could metastasize into something truly contemptible. It starts with scanning your photos for unlawful content, but where does it end? It doesn't. In many ways, it's the equivalent of the foot-in-the-door technique. More specifically, the equivalent of the foot-in-the-backdoor technique. As Edward Snowden, a man who knows a thing or two about backdoors, recently warned, "Apple's new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them." Couple Snowden's warning with Apple's shady history of breaching privacy laws, and you have a recipe for outrageous abuses. Apple's new policy is an all-out assault on privacy.

Of course, Mr. Snowden is right. To scan photos, Apple must first breach encryption, the very thing that Apple promised it would never do. Promises, as we all know, are cheap. When Big Tech makes them, they're basically worthless. The importance of encryption can't be emphasized enough, largely because it's the only thing that keeps our information reasonably safe. Encryption is the roof on our informational houses. Without it, sensitive data is exposed, and private conversations become public knowledge. And it's not just individuals who rely on encryption: banks and government agencies couldn't function without it. Strip it away, and they would be just as vulnerable as the average citizen.

Lastly, as researchers at the Electronic Frontier Foundation (EFF), a nonprofit organization dedicated to "defending civil liberties in the digital world," warn, it's literally impossible "to build a client-side scanning system that can only be used for sexually explicit images sent or received by children." In reality, by essentially destroying end-to-end encryption, Apple is exposing its customers to "broader abuses," such as state surveillance. One can put lipstick on a pig, but it's still a pig. As the EFF report puts it, "Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." This backdoor must remain shut. If Apple successfully unlocks it, we all lose. If there were ever a time for the US government to take action, it's now.
