The nature of the self in the digital age

This is the original English version of an op-ed I wrote for Zeit Online. It is the first in a new series taking a critical look at technology as part of the publication’s 20th-anniversary celebrations. This piece is based on a talk I gave at Bucerius Lab in Hamburg last month, titled Digital Emancipation: Ownership of the Self in the Digital Age.

Mark Zuckerberg walks by an unsuspecting audience plugged into VR headsets.
The future we must avoid.

The nature of modern technology

Your smart television, the watch on your wrist, your kid’s new Barbie doll and the car that you drive (drives you?) all have one thing in common: they all work by gathering data – personal information – about you, your friends, and your family.

While this might sound creepy in and of itself, it is not the real problem.

Modern technology works by collecting swathes of (often personal) data. This is simply a fact of life. We’re not going to change that.

The important question is this: who owns and controls the data about you and the mechanisms by which it is collected, analysed, and transformed into useful services?

If the answer to this question is "I do" then we don’t have a problem. In such a world, technology would work to empower individuals with greater information about themselves and the world around them and translate that information into useful superpowers.

Sadly, we do not live in that world.

Today, the answer to the question is that multinational corporations like Google and Facebook own and control both your personal data and the means of collecting, analysing, and deriving value from it.

Today, corporations – not individuals – own and control our data and the technology that collects it, and by extension make it available to governments. We live in a corporatocracy, not a democracy.

This is a socio-techno-economic state that Shoshana Zuboff of the Harvard Business School calls Surveillance Capitalism.

To understand why Surveillance Capitalism is so problematic, we must first understand two fundamental concepts: the nature of the self and the nature of data in the digital age.

The nature of the self in the digital age

According to Steve Krug, author of Don’t Make Me Think, any well-designed technology should play the role of a butler when interacting with a human being. Say I want to remember something for later and I have my smartphone with me. The conversation between us could go something like this:

Me: Butler, remember this for later.

My phone: Sure, sir, I’ve noted it down in the Notes app for you.

Me: Thank you.

In fact, with technologies like Siri, you can have this exact same conversation today.

This is the mainstream way of viewing our relationship to technology: as a conversation between two actors – in this case, me and my phone. If this is how we see technology, then surveillance is simply the capture of signals between two actors. This is no different from what the Stasi did when they bugged your home and listened in on your conversations. It’s not nice, but it is what surveillance has traditionally been.

But what if this is not the nature of our relationship with technology?

An illustration showing a person having a conversation with their phone.
Is your phone a butler, or is it much more than that?

What if, when I write down a thought on my phone to remember it later, what I am actually doing is extending my mind – and thereby extending my self – using the phone?

Today, we are all cyborgs. This is not to say that we implant ourselves with technology but that we extend our biological capabilities using technology. We are sharded beings, with parts of our selves spread across, and augmented by, our everyday things.

Perhaps it is time to extend the boundaries of the self to include the technologies by which we extend our selves.

An illustration showing a person’s phone within the boundaries of her self.
Extending the boundaries of the self.

If this is how we begin to see our everyday things – not as separate actors but as extensions of our selves – then several things become very clear:

Firstly, surveillance no longer becomes signals capture but a violation of the self. Consider the current Apple vs FBI case where the FBI wants to set a precedent so that they can access anyone’s phone. I’ve heard the request likened to a request by law enforcement to access a safe [NPR]. Nothing could be further from the truth. My iPhone is not like a safe any more than my brain is like a safe. It is a part of my self. In which case, if you want to get into my iPhone, what you really want to do is to violate my self. This is an assault on the self. And we already have a rich body of laws and regulations that protect the sanctity of the self and the rights of human beings.

Surveillance of the self is assault
Surveillance of the self is an assault: a violation of the self.

Secondly, it becomes clear that we don’t need to concoct a new Internet Bill of Rights or a Magna Carta for the Web or any such nonsense: all we need to do is apply the Universal Declaration of Human Rights – the human rights we already have – to the digital era. There isn’t a digital world and a real world. There aren’t human rights and separate “digital rights”. The things we are talking about are one and the same.

And, finally, we can begin to understand the true nature of those who peddle our personal data and start to effectively regulate this egregious practice.

But first, we must also understand the nature of data.

The nature of data

We often hear data referred to as a valuable asset. According to Wired magazine, it is the new oil. It is only because we do not understand the true nature of data that we are not absolutely repulsed by such a comparison.

Let me illustrate:

Say I have a small figurine. If I have enough data about this figurine, I can take a 3D printer and I can create an exact copy of it. Now imagine what I can do if I have enough data about you.

Data about a thing, if you have enough of it, becomes the thing.

Data about you is you.

Personal data isn’t the new oil, personal data is people.

Now this is not to say that Google, Facebook, and the countless other startups in the cult of Silicon Valley want to 3D print you. No, of course not. They simply want to profile you. To simulate you. For profit.

The business model of surveillance capitalism – the business model of Google, Facebook, and countless other Silicon Valley startups – is to monetise human beings. We all know that Facebook and Google operate huge server farms. Have you ever stopped to ask yourself what it is, exactly, that they are farming? Because if you do, you might quickly come to the conclusion that it is us. What are Google and Facebook if not factory farms for human beings?

A server farm
We call them server farms… have you ever stopped to ask yourself what it is, exactly, that they are farming?

If this sounds familiar, it should: we have been practising variations of this business model for a very long time.

We call the very lucrative and yet despicable business of selling people’s bodies "slavery". The business model of mainstream technology today is to monetise everything about you that makes you who you are apart from your body. What should we call this?

An etching showing a despicable scene of the slave trade.
We have a shameful history of selling people. Today, the business model of mainstream technology is to sell everything about you that makes you who you are apart from your physical body. What should we call that?

This isn’t a technology problem…

The modern-day system of colonialism and sharecropping being constructed by the new East India Company that is Silicon Valley isn’t uncouth or stupid enough to put people in physical shackles. It doesn’t want to own your body, it is content with owning your simulation. And yet, as we have already seen, the more data they have about you – the higher the fidelity of your simulation – the closer they are to owning you.

Your simulation is not a static thing, either – it is a living, breathing construct (in algorithms, if not biological cells). It lives in the labs of Google, Inc. and Facebook, Inc., and is constantly subjected to hundreds, if not thousands, of experiments aimed at analysing and better understanding you. These are the sort of experiments that, if they were performed on your captive physical person, would land the executives at these companies in prison for crimes against humanity.

All of this personal information, and the wealth of insight derived from it, belong to the corporations and, by extension (as Edward Snowden has shown us), are shared with governments.

This creates a huge power differential between individuals and corporations and between individuals and their governments.

If I take a camcorder and walk into Google, Inc., I will be arrested. Yet Google records countless homes with its Nest cameras. The world of Surveillance Capitalism is one in which those who have a right to privacy – individuals – do not have it, while those who should be transparent – corporations and democratic governments – do.

When Mark Zuckerberg says "privacy is dead", he’s only talking about your privacy, not his. When he buys a house, he buys the houses on both sides also. His privacy, the privacy of Facebook, Inc., and the privacy of your government are still very much alive and well.

If this doesn’t sound like democracy, it is because it is not. Surveillance Capitalism isn’t compatible with democracy.

The system we live in today can best be described as a corporatocracy; a feudalism of corporations.

Ours is a neo-colonial age of multinational monopolies.

A digital imperialism, if you will.

The rise of corporatocracy is our reward for decades of unchecked neoliberalism and the Californian Ideology. It brings with it an unparalleled level of systemic inequality – one in which 62 people have as much wealth as the poorest half of the world’s population combined (that’s 3.5 billion people). It carries with it the wholesale destruction of our habitat through resource depletion and climate change. It is, to put it bluntly, an existential threat to our species.

This is not a technology problem.

It is a capitalism problem.

And the answer is better, stronger democracy.

Decentralised, zero-knowledge alternative technologies can play an important role in helping us achieve better civil liberties and democracy, but technology is not a silver bullet. Without regulatory and statutory changes, those technologies will simply be deemed illegal, and those of us who build them will become the new Snowdens and Mannings.

Our challenge is great: the alternatives that we create must be convenient and accessible. They must be ethically designed and non-colonial in nature. This is no small task. But neither is it infeasible. I know because I am coding such alternatives myself today (and others are, too).

Ethical design pyramid: products should respect human rights, effort, and experience.
The alternatives must be ethically designed.

The battle for our civil liberties and democracy will be fought with our new everyday things. The outcome will determine whether we remain quantified serfs toiling in a digital feudalism or whether we live as free citizens, empowered by technology that we own and control as individuals to explore the potential of our species in the stars.

I wish and work for the latter future.

I hope you will, also.

For more, see the recording of my talk at Bucerius Lab from last month, titled Digital Emancipation: Ownership of the Self in the Digital Age. You can also read the edited German version on Zeit Online.