In her new book, “The Age of Surveillance Capitalism,” Shoshana Zuboff offers a disturbing picture of how Silicon Valley and other corporations are mining users’ information to predict and shape their behavior.
The Gazette recently interviewed Zuboff about her belief that surveillance capitalism, a term she coined in 2014, is undermining personal autonomy and eroding democracy — and the ways she says society can fight back.
Question & Answer
GAZETTE: The digital revolution began with great promise. When did you start worrying that the tech giants driving it were becoming more interested in exploiting us than serving us?
ZUBOFF: In my 2002 book, “The Support Economy,” I looked at the challenges to capitalism in shifting from a mass to an individual-oriented structure of consumption. I discussed how we finally had the technology to align the forces of supply and demand. However, the early indications were that the people framing that first generation of e-commerce were more preoccupied with tracking cookies and attracting eyeballs for advertising than with the historic opportunity they faced.
For a time I thought this was part of the trial and error of a profound structural transformation, but, certainly by 2007, I understood that this was actually a new variant of capitalism that was taking hold of the digital milieu. The opportunities to align supply and demand around the needs of individuals were overtaken by a new economic logic that offered a fast track to monetization.
GAZETTE: What are some of the ways we might not realize that we are losing our autonomy to Facebook, Google, and others?
ZUBOFF: I define surveillance capitalism as the unilateral claiming of private human experience as free raw material for translation into behavioral data. These data are then computed and packaged as prediction products and sold into behavioral futures markets — business customers with a commercial interest in knowing what we will do now, soon, and later. It was Google that first learned how to capture surplus behavioral data, more than what they needed for services, and used it to compute prediction products that they could sell to their business customers, in this case advertisers. But I argue that surveillance capitalism is no more restricted to that initial context than, for example, mass production was restricted to the fabrication of Model T’s.
Right from the start at Google it was understood that users were unlikely to agree to this unilateral claiming of their experience and its translation into behavioral data. It was understood that these methods had to be undetectable. So from the start the logic reflected the social relations of the one-way mirror. They were able to see and to take — and to do this in a way that we could not contest because we had no way to know what was happening.
We rushed to the internet expecting empowerment, the democratization of knowledge, and help with real problems, but surveillance capitalism really was just too lucrative to resist. This economic logic has now spread beyond the tech companies to new surveillance-based ecosystems in virtually every economic sector, from insurance to automobiles to health, education, finance, to every product described as “smart” and every service described as “personalized.” By now it’s very difficult to participate effectively in society without interfacing with these same channels that are supply chains for surveillance capitalism’s data flows. For example, ProPublica recently reported that breathing machines purchased by people with sleep apnea are secretly sending usage data to health insurers, where the information can be used to justify reduced insurance payments.
GAZETTE: Why have we failed even now to take notice of the effects of all this surveillance?
ZUBOFF: There are many reasons. I chronicle 16 explanations as to “how they got away with it.” One big reason is that the audacious, unprecedented quality of surveillance capitalism’s methods and operations has impeded our ability to perceive them and grasp their meaning and consequence.
Another reason is that surveillance capitalism, invented by Google in 2001, benefitted from a couple of important historical windfalls. One is that it arose in the era of a neoliberal consensus around the superiority of self-regulating companies and markets. State-imposed regulation was considered a drag on free enterprise. A second historical windfall is that surveillance capitalism was invented in 2001, the year of 9/11. In the days leading up to that tragedy, there were new legislative initiatives being discussed in Congress around privacy, some of which might well have outlawed practices that became routine operations of surveillance capitalism. Just hours after the World Trade Center towers were hit, the conversation in Washington changed from a concern about privacy to a preoccupation with “total information awareness.” In this new environment, the intelligence agencies and other powerful forces in Washington and other Western governments were more disposed to incubate and nurture the surveillance capabilities coming out of the commercial sector.
A third reason is that these methodologies are designed to keep us ignorant. The rhetoric of the pioneering surveillance capitalists, and just about everyone who has followed, has been a textbook of misdirection, euphemism, and obfuscation. One theme of misdirection has been to sell people on the idea that the new economic practices are an inevitable consequence of digital technology. In America and throughout the West we believe it’s wrong to impede technological progress. So the thought is that if these disturbing practices are the inevitable consequence of the new technologies, we probably just have to live with it. This is a dangerous category error. It’s impossible to imagine surveillance capitalism without the digital, but it’s easy to imagine the digital without surveillance capitalism.
A fourth explanation involves dependency and the foreclosure of alternatives. We now depend upon the internet just to participate effectively in our daily lives. Whether it’s interfacing with the IRS or your health care provider, nearly everything we do now just to fulfill the barest requirements of social participation marches us through the same channels that are surveillance capitalism’s supply chains.
GAZETTE: You warn that our very humanity and our ability to function as a democracy are in some ways at risk.
ZUBOFF: The competitive dynamics of surveillance capitalism have created some really powerful economic imperatives that are driving these firms to produce better and better behavioral-prediction products. Ultimately they’ve discovered that this requires not only amassing huge volumes of data, but actually intervening in our behavior. The shift is from monitoring to what the data scientists call “actuating.” Surveillance capitalists now develop “economies of action,” as they learn to tune, herd, and condition our behavior with subtle and subliminal cues, rewards, and punishments that shunt us toward their most profitable outcomes.
What is abrogated here is our right to the future tense, which is the essence of free will, the idea that I can project myself into the future and thus make it a meaningful aspect of my present. This is the essence of autonomy and human agency. Surveillance capitalism’s “means of behavioral modification” at scale erodes democracy from within because, without autonomy in action and in thought, we have little capacity for the moral judgment and critical thinking necessary for a democratic society. Democracy is also eroded from without, as surveillance capitalism represents an unprecedented concentration of knowledge and the power that accrues to such knowledge. They know everything about us, but we know little about them. They predict our futures, but for the sake of others’ gain. Their knowledge extends far beyond the compilation of the information we gave them. It’s the knowledge that they have produced from that information that constitutes their competitive advantage, and they will never give that up. These knowledge asymmetries introduce wholly new axes of social inequality and injustice.
GAZETTE: So how do we change this dynamic?
ZUBOFF: There are three arenas that must be addressed if we are to end this age of surveillance capitalism, just as we once ended the Gilded Age.
First, we need a sea change in public opinion. This begins with the power of naming. It means awakening to a sense of indignation and outrage. We say, “No.” We say, “This is not OK.”
Second, we need to muster the resources of our democratic institutions in the form of law and regulation. These include, but also move beyond, privacy and antitrust laws. We also need to develop new laws and regulatory institutions that specifically address the mechanisms and imperatives of surveillance capitalism.
A third arena relates to the opportunity for competitive solutions. Every survey of internet users has shown that once people become aware of surveillance capitalists’ backstage practices, they reject them. That points to a disconnect between supply and demand: a market failure. So once again we see a historic opportunity for an alliance of companies to found an alternative ecosystem — one that returns us to the earlier promise of the digital age as an era of empowerment and the democratization of knowledge.
By John Laidler, Harvard Correspondent
Surveillance capitalism works by providing free services that billions of people cheerfully use, enabling the providers of those services to monitor the behaviour of those users in astonishing detail – often without their explicit consent.
“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”
While most of us think that we are dealing merely with algorithmic inscrutability, in fact what confronts us is the latest phase in capitalism’s long evolution – from the making of products, to mass production, to managerial capitalism, to services, to financial capitalism, and now to the exploitation of behavioural predictions covertly derived from the surveillance of users. In that sense, her vast (660-page) book is a continuation of a tradition that includes Adam Smith, Max Weber, Karl Polanyi and – dare I say it – Karl Marx.
The combination of state surveillance and its capitalist counterpart means that digital technology is separating the citizens in all societies into two groups: the watchers (invisible, unknown and unaccountable) and the watched. This has profound consequences for democracy because asymmetry of knowledge translates into asymmetries of power. But whereas most democratic societies have at least some degree of oversight of state surveillance, we currently have almost no regulatory oversight of its privatised counterpart.
If we fail to tame the new capitalist mutant rampaging through our societies then we will only have ourselves to blame, for we can no longer plead ignorance.
Interview Snippets:
John Naughton: At the moment, the world is obsessed with Facebook. But as you tell it, Google was the prime mover.
Shoshana Zuboff: Surveillance capitalism is a human creation. It lives in history, not in technological inevitability. It was pioneered and elaborated through trial and error at Google in much the same way that the Ford Motor Company discovered the new economics of mass production or General Motors discovered the logic of managerial capitalism.
Surveillance capitalism was invented around 2001 as the solution to a financial emergency in the teeth of the dotcom bust, when the fledgling company faced the loss of investor confidence. As investor pressure mounted, Google’s leaders abandoned their declared antipathy toward advertising. Instead they decided to boost ad revenue by using their exclusive access to user data logs (once known as “data exhaust”), in combination with their already substantial analytical capabilities and computational power, to generate predictions of user click-through rates, taken as a signal of an ad’s relevance.
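The mechanism described here — predicting click-through rates from behavioural log data — can be illustrated with a deliberately simple toy model. This is a hypothetical sketch, not Google’s actual system: the features (keyword match, a user’s historical click rate) and the training data are invented, and the model is a minimal logistic regression fitted by gradient descent.

```python
import math

# Invented behavioural-log records, one per ad impression:
# ((query/ad keyword match, user's historical click rate), clicked?)
impressions = [
    ((1.0, 0.30), 1),
    ((1.0, 0.05), 0),
    ((0.0, 0.40), 0),
    ((1.0, 0.50), 1),
    ((0.0, 0.10), 0),
    ((1.0, 0.20), 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit a two-feature logistic regression by plain gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for (x1, x2), clicked in impressions:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - clicked          # gradient of log loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predicted_ctr(match, history):
    """Predicted probability that this user clicks this ad."""
    return sigmoid(w[0] * match + w[1] * history + b)

# A matching ad shown to a habitual clicker scores higher than a
# non-matching ad shown to a rare clicker, so it wins the ad slot.
print(predicted_ctr(1.0, 0.4) > predicted_ctr(0.0, 0.1))
```

The point of the sketch is the economic logic, not the model: whoever holds more behavioural surplus can rank ads by predicted relevance, and the prediction — not the data itself — is what gets sold.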
Operationally this meant that Google would both repurpose its growing cache of behavioural data, now put to work as a behavioural data surplus, and develop methods to aggressively seek new sources of this surplus.
The company developed new methods of secret surplus capture that could uncover data users intentionally opted to keep private, and infer extensive personal information that users did not or would not provide. This surplus would then be analysed for hidden meanings that could predict click-through behaviour. The surplus data became the basis for new prediction markets called targeted advertising.
Here was the origin of surveillance capitalism in an unprecedented and lucrative brew: behavioural surplus, data science, material infrastructure, computational power, algorithmic systems, and automated platforms. As click-through rates skyrocketed, advertising quickly became as important as search. Eventually it became the cornerstone of a new kind of commerce that depended upon online surveillance at scale.
The success of these new mechanisms only became visible when Google went public in 2004. That’s when it finally revealed that between 2001 and its 2004 IPO, revenues increased by 3,590%.
JN: So surveillance capitalism started with advertising, but then became more general?
SZ: Surveillance capitalism is no more limited to advertising than mass production was limited to the fabrication of the Ford Model T. It quickly became the default model for capital accumulation in Silicon Valley, embraced by nearly every startup and app. And it was a Google executive – Sheryl Sandberg – who played the role of Typhoid Mary, bringing surveillance capitalism from Google to Facebook, when she signed on as Mark Zuckerberg’s number two in 2008.
By now it’s no longer restricted to individual companies or even to the internet sector. It has spread across a wide range of products, services, and economic sectors, including insurance, retail, healthcare, finance, entertainment, education, transportation, and more, birthing whole new ecosystems of suppliers, producers, customers, market-makers, and market players.
Nearly every product or service that begins with the word “smart” or “personalised”, every internet-enabled device, every “digital assistant”, is simply a supply-chain interface for the unobstructed flow of behavioural data on its way to predicting our futures in a surveillance economy.
JN: In this story of conquest and appropriation, the term “digital natives” takes on a new meaning…
SZ: Yes, “digital natives” is a tragically ironic phrase. I am fascinated by the structure of colonial conquest, especially the first Spaniards who stumbled into the Caribbean islands. Historians call it the “conquest pattern”, which unfolds in three phases: legalistic measures to provide the invasion with a gloss of justification, a declaration of territorial claims, and the founding of a town to legitimate the declaration. Back then Columbus simply declared the islands as the territory of the Spanish monarchy and the pope.
The sailors could not have imagined that they were writing the first draft of a pattern that would echo across space and time to a digital 21st century. The first surveillance capitalists also conquered by declaration. They simply declared our private experience to be theirs for the taking, for translation into data for their private ownership and their proprietary knowledge. They relied on misdirection and rhetorical camouflage, with secret declarations that we could neither understand nor contest.
Google began by unilaterally declaring that the world wide web was its to take for its search engine. Surveillance capitalism originated in a second declaration that claimed our private experience for its revenues that flow from telling and selling our fortunes to other businesses. In both cases, it took without asking. Page [Larry, Google co-founder] foresaw that surplus operations would move beyond the online milieu to the real world, where data on human experience would be free for the taking. As it turns out his vision perfectly reflected the history of capitalism, marked by taking things that live outside the market sphere and declaring their new life as market commodities.
We were caught off guard by surveillance capitalism because there was no way that we could have imagined its action, any more than the early peoples of the Caribbean could have foreseen the rivers of blood that would flow from their hospitality toward the sailors who appeared out of thin air waving the banner of the Spanish monarchs. Like the Caribbean people, we faced something truly unprecedented.
Once we searched Google, but now Google searches us. Once we thought of digital services as free, but now surveillance capitalists think of us as free.
JN: Where does surveillance capitalism go from here?
SZ: Surveillance capitalism moves from a focus on individual users to a focus on populations, like cities, and eventually on society as a whole. Think of the capital that can be attracted to futures markets in which population predictions evolve to approximate certainty.
This has been a learning curve for surveillance capitalists, driven by competition over prediction products. First they learned that the more surplus the better the prediction, which led to economies of scale in supply efforts. Then they learned that the more varied the surplus the higher its predictive value. This new drive toward economies of scope sent them from the desktop to mobile, out into the world: your drive, run, shopping, search for a parking space, your blood and face, and always… location, location, location.
The evolution did not stop there. Ultimately they understood that the most predictive behavioural data comes from what I call “economies of action”, as systems are designed to intervene in the state of play and actually modify behaviour, shaping it toward desired commercial outcomes. We saw the experimental development of this new “means of behavioural modification” in Facebook’s contagion experiments and the Google-incubated augmented reality game PokĂ©mon Go.
It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”
This power to shape behaviour for others’ profit or power is entirely self-authorising. It has no foundation in democratic or moral legitimacy, as it usurps decision rights and erodes the processes of individual autonomy that are essential to the function of a democratic society. The message here is simple: Once I was mine. Now I am theirs.
JN: What are the implications for democracy?
SZ: During the past two decades surveillance capitalists have had a pretty free run, with hardly any interference from laws and regulations. Democracy has slept while surveillance capitalists amassed unprecedented concentrations of knowledge and power. These dangerous asymmetries are institutionalised in their monopolies of data science, their dominance of machine intelligence, which is surveillance capitalism’s “means of production”, their ecosystems of suppliers and customers, their lucrative prediction markets, their ability to shape the behaviour of individuals and populations, their ownership and control of our channels for social participation, and their vast capital reserves. We enter the 21st century marked by this stark inequality in the division of learning: they know more about us than we know about ourselves or than we know about them. These new forms of social inequality are inherently antidemocratic.
At the same time, surveillance capitalism diverges from the history of market capitalism in key ways, and this has inhibited democracy’s normal response mechanisms. One of these is that surveillance capitalism abandons the organic reciprocities with people that in the past have helped to embed capitalism in society and tether it, however imperfectly, to society’s interests.
First, surveillance capitalists no longer rely on people as consumers. Instead, supply and demand orients the surveillance capitalist firm to businesses intent on anticipating the behaviour of populations, groups and individuals.
Second, by historical standards the large surveillance capitalists employ relatively few people compared with their unprecedented computational resources. General Motors employed more people during the height of the Great Depression than either Google or Facebook employs at their heights of market capitalisation.
Finally, surveillance capitalism depends upon undermining individual self-determination, autonomy and decision rights for the sake of an unobstructed flow of behavioural data to feed markets that are about us but not for us.
This antidemocratic and anti-egalitarian juggernaut is best described as a market-driven coup from above: an overthrow of the people concealed as the technological Trojan horse of digital technology. On the strength of its annexation of human experience, this coup achieves exclusive concentrations of knowledge and power that sustain privileged influence over the division of learning in society. It is a form of tyranny that feeds on people but is not of the people. Paradoxically, this coup is celebrated as “personalisation”, although it defiles, ignores, overrides, and displaces everything about you and me that is personal.
JN: Our societies seem transfixed by all this: we are like rabbits paralysed in the headlights of an oncoming car.
SZ: …We are trapped in an involuntary merger of personal necessity and economic extraction, as the same channels that we rely upon for daily logistics, social interaction, work, education, healthcare, access to products and services, and much more, now double as supply chain operations for surveillance capitalism’s surplus flows. The result is that the choice mechanisms we have traditionally associated with the private realm are eroded or vitiated. There can be no exit from processes that are intentionally designed to bypass individual awareness and produce ignorance, especially when these are the very same processes upon which we must depend for effective daily life. So our participation is best explained in terms of necessity, dependency, the foreclosure of alternatives, and enforced ignorance…
JN: Doesn’t all this mean that regulation that just focuses on the technology is misguided and doomed to fail? What should we be doing to get a grip on this before it’s too late?
SZ: …Despite existing economic, legal and collective-action models such as antitrust, privacy laws and trade unions, surveillance capitalism has had a relatively unimpeded two decades to root and flourish. We need new paradigms born of a close understanding of surveillance capitalism’s economic imperatives and foundational mechanisms.
For example, the idea of “data ownership” is often championed as a solution. But what is the point of owning data that should not exist in the first place? All that does is further institutionalise and legitimate data capture. It’s like negotiating how many hours a day a seven-year-old should be allowed to work, rather than contesting the fundamental legitimacy of child labour. Data ownership also fails to reckon with the realities of behavioural surplus. Surveillance capitalists extract predictive value from the exclamation points in your post, not merely the content of what you write, or from how you walk and not merely where you walk. Users might get “ownership” of the data that they give to surveillance capitalists in the first place, but they will not get ownership of the surplus or the predictions gleaned from it – not without new legal concepts built on an understanding of these operations.
Another example: there may be sound antitrust reasons to break up the largest tech firms, but this alone will not eliminate surveillance capitalism. Instead it will produce smaller surveillance capitalist firms and open the field for more surveillance capitalist competitors…
…My hope is that careful naming will give us all a better understanding of the true nature of this rogue mutation of capitalism and contribute to a sea change in public opinion, most of all among the young.
Surveillance capitalism describes a market-driven process where the commodity for sale is your personal data, and the capture and production of this data relies on mass surveillance of the internet. This activity is often carried out by companies that provide us with free online services, such as search engines (Google) and social media platforms (Facebook).
These companies collect and scrutinise our online behaviours (likes, dislikes, searches, social networks, purchases) to produce data that can be further used for commercial purposes. And it’s often done without us understanding the full extent of the surveillance...
The Big Data Economy
The late 20th century saw our economy move away from mass production lines in factories to become progressively more reliant on knowledge. Surveillance capitalism, by contrast, uses a business model based on the digital world, and relies on “big data” to make money.
The data used in this process is often collected from the same groups of people who will ultimately be its targets. For instance, Google collects personal online data to target us with ads, and Facebook is likely selling our data to organisations that want us to vote for them or to vaccinate our babies.
Third-party data brokers, as opposed to companies that hold the data like Google or Facebook, are also on-selling our data. These companies buy data from a variety of sources, collate information about individuals or groups of individuals, then sell it.
Smaller companies are also cashing in on this. Last year, HealthEngine, a medical appointment booking app, was found to be sharing clients’ personal information with Perth lawyers with a particular interest in workplace injuries and vehicle accidents.