Out of the frying pan and into the fire
Mariana Mazzucato¹ has an article in MIT Technology Review titled Let’s make private data into a public good.
Let’s not.
While Mariana’s criticisms of surveillance capitalism are spot on, her proposed remedy is as far from the mark as it possibly could be.
Yes, surveillance capitalism is bad
Mariana starts off by making the case, and rightly so, that surveillance capitalists² like Google or Facebook “are making huge profits from technologies originally created with taxpayer money.”
Google’s algorithm was developed with funding from the National Science Foundation, and the internet came from DARPA funding. The same is true for touch-screen displays, GPS, and Siri. From this the tech giants have created de facto monopolies while evading the type of regulation that would rein in monopolies in any other industry. And their business model is built on taking advantage of the habits and private information of the taxpayers who funded the technologies in the first place.
There’s nothing to argue with here. It’s a succinct summary of the tragedy of the commons that lies at the heart of surveillance capitalism and, indeed, that of neoliberalism itself.
Mariana also accurately describes the business model of these companies, albeit without focusing on the actual mechanism by which the data is gathered to begin with³:
Facebook’s and Google’s business models are built on the commodification of personal data, transforming our friendships, interests, beliefs, and preferences into sellable propositions. … The so-called sharing economy is based on the same idea.
So far, so good.
But then, things quickly take a very wrong turn:
There is indeed no reason why the public’s data should not be owned by a public repository that sells the data to the tech giants, rather than vice versa.
There is every reason why we shouldn’t do this.
Mariana’s analysis is fundamentally flawed in two respects. First, it ignores a core injustice of surveillance capitalism – the violation of privacy – that her recommendation would have the effect of normalising. Second, it perpetuates a fundamental false dichotomy – that there is no other way to design technology than the way Silicon Valley and surveillance capitalists design it – and so it never mentions the true alternatives: free and open, decentralised, interoperable, ethical technologies.
No, we must not normalise violation of privacy
The core injustice that Mariana’s piece ignores is that the business model of surveillance capitalists like Google and Facebook is based on the violation of a fundamental human right. When she says “let’s not forget that a large part of the technology and necessary data was created by all of us” it sounds like we voluntarily got together to create a dataset for the common good by revealing the most intimate details of our lives through having our behaviour tracked and aggregated. In truth, we did no such thing.
We might have resigned ourselves to being farmed by the likes of Google and Facebook because we have no other choice, but that’s not a healthy definition of consent by any standard. If 99.99999% of all investment goes into funding surveillance-based technology (and it does), then people have neither a true choice nor can they be expected to give any meaningful consent to being tracked and profiled. Surveillance capitalism is the norm today. It is mainstream technology. It’s what we funded and what we built.
It is also fundamentally unjust.
There is a very important reason why the public’s data should not be owned by a public repository that sells the data to the tech giants: it’s not the public’s data. It is personal data, and it should never have been collected by a third party to begin with. You might hear the same argument from people who say that we must nationalise Google or Facebook.
No, no, no, no, no, no, no! The answer to the violation of personhood by corporations isn’t violation of personhood by government, it’s not violating personhood to begin with.
That’s not to say that we cannot have a data commons. In fact, we must. But we must learn to make a core distinction between data about people and data about the world around us.
Data about people ≠ data about rocks
Our fundamental error when talking about data is that we use a single term to refer to both information about people and information about things. And yet there is a world of difference between data about a rock and data about a human being. I cannot deprive a rock of its freedom or its life, and I cannot emotionally or physically hurt a rock; yet I can do all of those things to people. When we decide what is permissible to do with data, if we are not specific about whether we are talking about rocks or people, one of those two groups is going to get the short end of the stick – and it’s not going to be the rocks.
Here is a simple rule of thumb:
Data about individuals must belong to the individuals themselves. Data about the commons must belong to the commons.
I implore anyone working in this area – especially professors writing books and looking to shape public policy – to understand and learn this core distinction.
There is an alternative
I mentioned above that the second fundamental flaw in Mariana’s article is that it perpetuates a false dichotomy: that the Silicon Valley/surveillance capitalist model of building modern/digital/networked technology is the only possible way to build such technology and that we must accept it as a given.
This is patently false.
It’s true that all modern technology works by gathering data. That’s not the problem. The core question is “who owns and controls that data and the technology by which it is gathered?” The answer to that question today is “corporations do.” Corporations like Google and Facebook own and control our data not because of some inevitable characteristic of modern technology but because of how they designed their technology in line with the needs of their business model.
Specifically, surveillance capitalists like Google and Facebook design proprietary and centralised technologies to addict people and lock them in. In such systems, your data originates in a place you do not own. On “other people’s computers,” as the Free Software Foundation calls it. Or on “the cloud” as we colloquially reference it.
The crucial point here, however, is that this toxic way of building modern technology is not the only way to design and build modern technology.
We know how to build free and open, decentralised, and interoperable systems where your data originates in a place that you – as an individual – own and control.
In other words, we know how to build technology where the algorithms remain on your own devices and where you are not farmed for personal information to begin with.
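To make the architectural difference concrete, here is a minimal, hypothetical sketch (not taken from Mariana’s article or from any particular project) contrasting the two designs. In the surveillance model, raw behavioural data is shipped off to someone else’s computers for analysis; in the alternative, the same analysis runs entirely on the device the person owns, so there is nothing for a third party to harvest. All of the names in it (recommend_via_cloud, recommend_locally, the api object) are illustrative.

```python
# Hypothetical sketch: where the data and the algorithm live is a design choice,
# not an inevitability of modern technology.
from collections import Counter

# Surveillance model: raw behaviour leaves your device and is analysed
# (and retained, profiled, and resold) on computers you neither own nor control.
def recommend_via_cloud(listening_history: list[str], api) -> list[str]:
    api.upload("/profile/listening-history", listening_history)  # raw data leaves the device
    return api.get("/profile/recommendations")  # results come back from their servers

# Alternative model: the same algorithm runs on the device the person owns.
# Nothing is uploaded, so there is nothing for a third party to harvest.
def recommend_locally(listening_history: list[str], catalogue: dict[str, str]) -> list[str]:
    genre_counts = Counter(catalogue.get(track, "unknown") for track in listening_history)
    favourite_genres = {genre for genre, _ in genre_counts.most_common(3)}
    return [track for track, genre in catalogue.items()
            if genre in favourite_genres and track not in listening_history]

if __name__ == "__main__":
    history = ["Track A", "Track B"]
    catalogue = {"Track A": "punk", "Track B": "punk", "Track C": "punk", "Track D": "jazz"}
    print(recommend_locally(history, catalogue))  # computed entirely on-device: ['Track C']
```

The point is not the toy recommender itself but where it runs: in the second function, the data never originates anywhere other than a place the individual owns and controls.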
To say that we must take as given that some third party will gather our personal data is to capitulate to surveillance capitalism. It is to accept the false dichotomy that either we have surveillance-based technology or we forego modern technology.
This is neither true, nor necessary, nor acceptable.
We can and we must build ethical technology instead.
Regulate and replace
Since I’m increasingly hearing defeatist arguments that accept surveillance as a foregone conclusion of modern technology, I want to reiterate what a true solution looks like.
There are two things we must do to create an ethical alternative to surveillance capitalism:
1. Regulate the shit out of surveillance capitalists.
The goal here is to limit their abuses and harm. This includes limiting their ability to gather, process, and retain data, as well as fining them meaningful amounts and even breaking them up.⁴

2. Fund and build ethical alternatives.
In other words, replace them with ethical alternatives.
Ethical alternatives do exist today but they do so mainly thanks to the extraordinary personal efforts of disjointed bands of so-called DIY rebels.
Whether they are the punk rockers of the tech world or its ragamuffins – and perhaps a little bit of both – what is certain is that they lead a precarious existence on the fringes of mainstream technology. They rely on anything from personal finances to selling the things they make, to crowdfunding and donations – and usually combinations thereof – to eke out an existence that both challenges and hopes to alter the shape of mainstream technology (and thus society) to make it fairer, kinder, and more just.
While they build everything from computers and phones (Puri.sm) to federated social networks (Mastodon) and decentralised alternatives to the centralised Web (DAT), they do so usually with little or no funding whatsoever. And many are a single personal tragedy away from not existing at all.
Meanwhile, we use taxpayer money in the EU to fund surveillance-based startups. These startups, if they succeed, will most likely be bought by larger US-based surveillance capitalists like Google and Facebook. If they fail, on the other hand, the European taxpayer foots the bill. Europe, bamboozled by and living under the digital imperialism of Silicon Valley, has become its unpaid research and development department.
This must change.
Ethical technology does not grow on trees. Venture capitalists will not fund it. Silicon Valley will not build it.
A meaningful counterpoint to surveillance capitalism that protects human rights and democracy will not come from China. If we fail to create one in Europe then I’m afraid that humankind is destined for centuries of feudal strife. If it survives the unsustainable trajectory that this social system has set it upon, that is.
If we want ethical technological infrastructure – and we should, because the future of our human rights, democracy, and quite possibly that of the species depends on it – then we must fund and build it.
The answer to surveillance capitalism isn’t to better distribute the rewards of its injustices or to normalise its practices at the state level.
The answer to surveillance capitalism is a socio-techno-economic system that is just at its core. To create the technological infrastructure for such a system, we must fund independent organisations from the common purse to work for the common good to build ethical technology to protect individual sovereignty and nurture a healthy commons.
1. According to the bio in the article: “Mariana Mazzucato is a professor in the economics of innovation and public value at University College London, where she directs the Institute for Innovation and Public Purpose.” The article I’m referencing is an edited excerpt from her new book The Value of Everything: Making and Taking in the Global Economy.

2. Although she never explicitly uses that term in the article.

3. Centralised architectures based on surveillance.

4. Break them up, by all means. But don’t do anything silly like nationalising them (for all the reasons I mention in this post). Nationalising a surveillance-based corporation would simply shift the surveillance to the state. We must embrace the third alternative: funding and building technology that isn’t based on surveillance to begin with. In other words, free and open, decentralised, interoperable technology.