Privacy & trust in the era of tech overlords

Tech companies know all your secrets.

Google, Facebook, Amazon, Microsoft and Apple are studying your every thought, word, and deed. AI analysis has already given them spooky abilities to predict your behavior for advertisers, with enough accuracy to make them the richest companies on the planet. Now they are exploring ways to influence your behavior, which opens up tremendously difficult moral, legal and political questions.

We will be dealing with the implications of this data collection for years. Politicians will attempt to create laws that protect us from bad behavior and overreaching without stifling innovation. Twitter will be lighting up with anecdotes about invasions of privacy. Journalists will struggle to convey nuance in a world that seems only to respect black and white. The tech companies will move forward with their data collection and analysis without fully understanding the societal consequences of what they’re doing.

Let’s see if we can put together a framework that will help you put the policy debates into context. These are the thoughts in my mind at the beginning of this new era of tech overlords.

•  TODAY: Data about you is being gathered by the big tech companies (as well as many other companies) in startling ways. Their ability to analyze that data and predict your behavior is more like magic than technology – and it has the potential to change the world, for better or worse.

•  It is impossible for us to detach from the big tech companies or prevent that data from being collected.

•  Although the scale of data-gathering is unprecedented, there is nothing new about big companies observing our behavior and it is not necessarily an invasion of privacy.

•  Some acts by the tech companies absolutely do invade our privacy; well-considered regulation can help protect our privacy.

•  Our individual decisions about the big tech companies should be driven by trust and transparency, not unfocused angst about invasions of privacy.

 

Data collection has become the central mission of the large tech companies

Data collection and behavioral surplus

When Google started, the focus of its business was creating an index of Internet websites. It helped us find things online, and it sold advertising space related to what we searched for.

In the early days, Facebook gave us a place to post our thoughts and keep track of family and friends, with advertisements related to what we posted.

Quickly, though, Google and Facebook recognized that there was more to learn if they watched us closely enough, and that there was more to do with the fruits of that surveillance than sell advertising. We are constantly creating a cloud of data about who we are and what we’re going to do. When you search on Google, Google knows what words you typed – and it knows what time you typed them, and how long you looked at the page of search results, and which one you clicked on, and how long you stayed on that page, and whether you clicked the Back button to go back to the search results, and what else you browsed online at the same time. Facebook learns not just from what you post but from what you pause to watch, what you scroll past, what you click on, what you ignore, how long you stay online, what device you use, and what you buy from an ad.
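To make the scope of that list concrete, here is a minimal sketch of what a single search-interaction record might look like. Every field name is invented for illustration; this is not any company's actual tracking schema.

```python
from dataclasses import dataclass
import time

@dataclass
class SearchEvent:
    """One hypothetical interaction record. All field names are
    invented for illustration, not taken from a real schema."""
    user_id: str
    query: str                 # the words you typed
    timestamp: float           # what time you typed them
    results_dwell_secs: float  # how long you looked at the results page
    clicked_result: str        # which result you clicked on
    page_dwell_secs: float     # how long you stayed on that page
    used_back_button: bool     # did you return to the results?
    device: str

# A single search can generate a record like this:
event = SearchEvent(
    user_id="u123",
    query="running shoes",
    timestamp=time.time(),
    results_dwell_secs=4.2,
    clicked_result="example.com/shoes",
    page_dwell_secs=38.0,
    used_back_button=False,
    device="phone",
)
print(event.query, event.device)
```

The point of the sketch is how much surrounds the query itself: one search produces half a dozen behavioral signals beyond the words typed.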

As they developed their skills, the big companies’ appetite for data grew. Google, Facebook, Amazon, the phone carriers, the ISPs, all began tracking and trading data about your location, your contacts, your job, your children, your habits, and everything else that can be learned by studying you in minute detail. As The Guardian put it recently, the big companies are dealing in “how far and where your morning run takes you, the conditions of your commute, the contents of your text messages, the words you speak in your own home and your actions beneath all-seeing cameras, the contents of your shopping basket, your impulse purchases, your speculative searches and choices of dates and mates – all recorded, rendered as data, processed, analysed, bought, bundled and resold like sub-prime mortgages.”

If all that data were handed to you in some giant spreadsheet, you wouldn’t be able to discover anything useful. But when your behavior is analyzed by an AI that can compare it to the behavior of millions of other people, the patterns paint a picture of what you are thinking, what you have done, and what you are planning to do.
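As a toy illustration of that pattern-matching idea (not any company's actual method), the sketch below predicts a new user's likely purchase by finding the behaviorally most similar existing profile. All names, numbers, and categories are invented.

```python
import math

# Toy behavior vectors: hours per week spent on a few activity
# categories. All data invented for illustration.
profiles = {
    "alice": {"running": 5, "cooking": 1, "finance": 0},
    "bob":   {"running": 4, "cooking": 0, "finance": 1},
    "carol": {"running": 0, "cooking": 6, "finance": 2},
}
next_purchase = {
    "alice": "running shoes",
    "bob": "fitness tracker",
    "carol": "cookware",
}

def cosine(a, b):
    """Cosine similarity between two sparse behavior vectors."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def predict(new_user):
    """Guess the new user's purchase from the most similar profile."""
    best = max(profiles, key=lambda name: cosine(new_user, profiles[name]))
    return next_purchase[best]

# A new user who runs a lot and cooks a little looks most like alice:
print(predict({"running": 6, "cooking": 1}))  # → running shoes
```

At real scale the comparison is against millions of profiles with thousands of signals each, which is why the predictions feel like magic rather than arithmetic.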

Shoshana Zuboff’s important new book The Age of Surveillance Capitalism calls this data our “behavioral surplus,” and argues that the big tech companies unilaterally began to collect it without any societal responsibility or accountability – and that what they can accomplish with it stands traditional industrial capitalism on its head. In the short run the companies seem only to be trying to serve up more effective ads, but they can and will be incentivized to nudge us towards “better” decisions, and the same tools can be used to identify any signs of deviance, dissent, or radical intent. In Zuboff’s words:

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”

If algorithms and large amounts of data allow human behavior to be perfectly modeled, predicted, and controlled, it would have potentially huge effects on our economics, politics, and worldview. In Homo Deus, the author Yuval Noah Harari makes a persuasive argument that “dataism” could replace humanism as the organizing principle for humankind, effectively a “data religion.”

Those are big questions that I will leave for greater minds. For now, let’s focus on the privacy concerns of individuals in 2019. The important part is that everything about your interactions with devices and technology is being recorded, stored, and analyzed. In a world of omnipresent cameras, with technology woven into virtually every interaction we have, our behavior is being studied constantly by machines that can predict what we will do with startling insight and accuracy.

 

NEXT:

Can we stop the companies from collecting our behavioral data?

Is this an invasion of privacy?
