UNPLUGGED

Book Review – The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.

Ella Hafermalz

KIN Research Group, School of Business and Economics, Vrije Universiteit Amsterdam, The Netherlands

 

Citation: M@n@gement 2021: 24(4): 70–75 - http://dx.doi.org/10.37725/mgmt.v24.5519.

Copyright: © 2021 Ella Hafermalz. Published by AIMS, with the support of the Institute for Humanities and Social Sciences (INSHS).
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), permitting all non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Published: 15 December 2021

*Correspondence to: Ella Hafermalz, Email: e.w.hafermalz@vu.nl

 

Abstract

Surveillance capitalists like Google and Amazon will do whatever they can to corner supply routes to data about us and our actions. In Zuboff’s lengthy book The age of surveillance capitalism, we learn about the strategic and often underhand means by which these data are captured, and the ‘instrumentarian’ ideology that provides the logic for this enterprise. Zuboff shows that the aim of advertisers and ‘people analytics’ advocates is to use our personal data to determine our behavior. At stake is free will and our ‘right to the future tense’. In this book review, I reflect on Zuboff’s analysis of how Big Tech, as Big Other, is controlling our lives. I first highlight the prescience of the book’s arguments. I then compare aspects of the book with earlier tomes that were critical of new technology, to argue that taking a deterministic view of people’s relationship with technology may inadvertently support the hyped narrative that data analytics and algorithms are all-powerful.

Keywords: Surveillance capitalism; Zuboff; Book review; Instrumentarianism; Digital technology

 

The opportunity to review Zuboff’s The age of surveillance capitalism came to me through Twitter. I searched for the book on Google and purchased it both on Kindle and on Audible as an audiobook. Only when I later saw the hardcover sitting on a colleague’s bookshelf did I realize what I had signed up for. It is over 650 pages long with 18 chapters. In an era where journal articles are a ‘long read’, The age of surveillance capitalism is an Everest.

This preamble is meant not only to convey the intimidating length of the book but also to illustrate a point relevant to Zuboff’s argument. In the process described above, I engaged with several firms that are or are becoming Surveillance Capitalists: Google, Amazon, and Twitter. Surveillance Capitalists are an inescapable presence in our daily lives. Zuboff says we are in a ‘Faustian compact’ with them. ‘Faustian’ in the sense that ‘it is nearly impossible to tear ourselves away, despite the fact that what we must give in return will destroy life as we have known it’ (Zuboff, 2019, p. 11). This statement sets the scene for The age of surveillance capitalism, which acts as a warning siren to all who have become accustomed to the invasive, extractive actions of technology companies that make money from tracking, capturing and altering our behavior.

The book’s stated aim is to conduct an ‘initial mapping of a terra incognita’ (p. 17) where the concepts and frameworks Zuboff develops will enable others to see the ‘puppet master’ (p. 16) more clearly, preparing the way for further research and action. She identifies in our era an unprecedented ‘sea change’: our personal experiences are now mined for profit, which drives technology companies to compete for data, that is, to invade our private lives and track our activities wherever and whenever possible. Her goal is to ‘isolate the deeper pattern in the welter of technological detail and corporate rhetoric’, so that readers can gain a foothold of understanding in a ‘rapid flow of events that boil around us as Surveillance Capitalism pursues its long game of economic and social domination’ (p. 18). Zuboff argues that as consumers we have been ‘psychically numbed’ by the sophisticated tactics of Surveillance Capitalists. The message that we need to Wake Up! underpins the book.

Professor Shoshana Zuboff is the Charles Edward Wilson Professor Emerita at Harvard Business School. In her book, she describes her approach as combining the methods of ‘a social scientist inclined toward theory, history, philosophy, and qualitative research with those of an essayist’ (pp. 21–22). Ideas from classic texts by Durkheim, Marx, Weber, Arendt, Adorno, Polanyi, Sartre, and Milgram add richness and historical context to many of her arguments. Zuboff’s research spans 7 years, with a particular focus on Facebook and Google. She explains her data as follows: ‘In studying the Surveillance Capitalist practices of Google, Facebook, Microsoft, and other corporations, I have paid close attention to interviews, patents, earnings calls, speeches, conferences, videos, and company programs and policies. In addition, between 2012 and 2015 I interviewed 52 data scientists from 19 different companies with a combined 586 years of experience in high-technology corporations and startups, primarily in Silicon Valley’ (p. 24).1 Although interview material crops up on occasion to illustrate her insights, the majority of the book is based upon a huge amount of secondary data that are carefully stitched together to give a compelling overview: not only of the history of Google and Facebook’s expansion, for example, but also of the red threads that tie together Silicon Valley ideology and the context that made Surveillance Capitalism possible.

Part 1 focuses on Google (now Alphabet). Google started with lofty intentions to ‘organize the world’s information and make it universally accessible and useful’ (p. 59)—but without a clear model for profit. Everything changed when they discovered the value of what Zuboff presciently terms ‘behavioral surplus’: the side effect of our behavior online. Technology companies prefer the term ‘data exhaust’: clicks, views, likes, search terms, profiles, emails, and shopping patterns – everything is collected, tracked, stored, and processed. Although a certain amount of these data is used to improve a company’s services (Zuboff is okay with this, as it represents ‘reciprocity’ between producer and consumer), Google and others now go further, collecting more data than they need and grabbing whatever they can to get a fuller picture of who we are and what we want, in order to influence what we do.

Zuboff argues that companies are becoming better equipped to gather and process ‘behavioral surplus’ and that Surveillance Capitalists are starting to use these data to manipulate users through increasingly accurate inferences and predictions. Content and contextual targeting in advertising, for example via Google’s AdSense, is cast as an ‘unprecedented and lucrative brew: [of] behavioral surplus, data science, material infrastructure, computational power, algorithmic systems, and automated platforms’ (p. 83) that increase click-through rates, the gold standard of advertising revenue. This ‘brew’ is what has institutionalized behavioral surplus ‘as the cornerstone of a new kind of commerce that depended upon online surveillance at scale’ (p. 83). Google’s transition to an advertising business model is in this way framed as the antecedent to its interest and growing expertise in mass surveillance.

Zuboff frequently asserts that the developments she documents and names are unprecedented. For example, the familiar adage ‘if it’s free, you’re the product’ may come to mind when thinking about Google’s business model, but Zuboff claims that this old characterization is inaccurate when it comes to behavioral surplus. We are not the product itself, but rather ‘we are the sources of Surveillance Capitalism’s crucial surplus: the objects of a technologically advanced and increasingly inescapable raw-material-extraction operation’ (p. 10).

A range of other metaphors are used to drive this point home. Behavioral surplus is said to be ‘hunted’, ‘extracted’, ‘captured’, and ‘poached’, leaving us ‘dispossessed’ and ‘exiled’ from our own human experience, which has been ‘kidnapped’ by Surveillance Capitalists for profit. This language crescendos throughout the book and peaks in a comparison to the ivory trade: ‘…Big Other poaches our behavior for surplus and leaves behind all the meaning lodged in our bodies, our brains, and our beating hearts, not unlike the monstrous slaughter of elephants for ivory. Forget the cliché that if it’s free, “You are the product.” You are not the product; you are the abandoned carcass. The “product” derives from the surplus that is ripped from your life’ (p. 377).

But why have consumers and citizens been so willing to accept Surveillance Capitalists, using their products and services with little concern for privacy or for the value we generate for big corporations without remuneration? Zuboff’s explanation is almost entirely top-down on this point. Surveillance Capitalists, with Google at the forefront, have been deliberate and strategic, stealthily and cunningly manipulating consumers and governments into submission. They do so via what she terms the ‘Dispossession Cycle’. The four stages of this cycle are incursion, habituation, adaptation, and redirection (an ongoing analogy to the Spanish colonization of the Caribbean islands in the 15th century is used to develop this argument).

A compelling case study of Google’s Street View is offered to illustrate each of these steps. With some adaptation, this part of the book could make a perfect critical addition to a Digital Innovation teaching case. It could be used to explore how privacy can be slowly eroded in the name of convenience, and the role (or absence) of governance and regulation in this process. Interestingly, Germany’s resistance to and legislation against Google’s Street View stands in stark contrast to US responses to Surveillance Capitalism’s ‘incursions’. Further comparative case studies exploring the impact of national context on Zuboff’s ‘Dispossession Cycle’ would likely be worthwhile.

An important thread throughout Part 1 is the role of US politics in supporting the rise of Surveillance Capitalism: from the ‘state of exception’ politics prompted by 9/11 with regard to citizen privacy, to a neoliberal belief in the power of ‘self-regulation’, to the extensive and systematic lobbying strategies implemented by Big Tech firms. Especially for readers from outside the United States, this narrative gives fascinating and informative context to how and why Surveillance Capitalists have been able to scale their operations so quickly, mostly unfettered. Monopolistic practices are, however, in Zuboff’s view, not the real source of the problem. If we break up Big Tech, the result will just be more Surveillance Capitalists. She also points out that ‘goods and services are merely surveillance-bound supply routes’ (p. 132), and that Surveillance Capitalists are more interested in cornering these supply routes of data than in monopolizing a market. Unlike monopoly strategies, cornering practices are not aimed at driving up prices; rather, they cut off and control access to key resources – in this case, sources of behavioral surplus.

Part 2 shows how competition for behavioral surplus plays out. The Surveillance Capitalism business model is shown to rely upon securing lines of supply of behavioral surplus. Accumulation of behavioral surplus is needed to produce ‘prediction products that approximate certainty’ (p. 19). It is this need, not necessarily the needs of consumers, that drives demand for ‘smart’ products, which introduce data generation and collection into new and ever more intimate areas of our lives, such as the home. Companies are competing not only for scale of data collection but also for scope and depth – they are pursuing, via the Internet of Things, ‘your bloodstream and your bed, your breakfast conversation, your commute, your run, your refrigerator, your parking space, your living room’ as well as supply lines into ‘your personality, moods, and emotions, your lies and vulnerabilities’ (p. 201). Smart technologies are not just creepily listening in; they are also supply lines that ship data back to Surveillance Capitalists to be analyzed for patterns of behavior and associated opportunities for profitable persuasion.

At this point there is a move in the argument that I found difficult to follow and, to some extent, difficult to accept. Zuboff argues that the real profit in Surveillance Capitalism comes not only from predicting our behavior but also from modifying it. She argues that a smart and ‘muscular apparatus is being assembled around us’, where ‘…Surveillance Capitalists make the future for the sake of predicting it’ to the point of offering ‘guaranteed outcomes’ (p. 203). The argument seems to be that the best way to predict behavior is to control it. ‘Under this regime, ubiquitous computing is not just a knowing machine; it is an actuating machine designed to produce more certainty about us and for them’ (p. 203). This is, however, hard to comprehend as a business model. Who is the producer? Who is the consumer? And what role are different intermediaries playing? What is the ‘original’ behavior, versus the ‘modified’ behavior, and which is being sold in the form of prediction products? For example, did I really choose to review this book? Or am I the victim of behavior modification engineered by Twitter, Google, and Amazon on behalf of their clients?

The prospect of behavior modification at the hands of Surveillance Capitalists is not only frightening but also vague and difficult to verify. Examples are given throughout this section (e.g., the now infamous Facebook news feed experiments and manipulations, or Pokémon Go’s use of sponsored locations to ‘herd’ users toward its customers), but at key moments where I wanted to understand what kind of technology and processes of data analysis we are dealing with, and what makes them such a threat, the language becomes figurative rather than technical. For example: ‘From the vantage point of Surveillance Capitalism and its economic imperatives, world, self, and body are reduced to the permanent status of objects as they disappear into the bloodstream of a titanic new conception of markets’ (p. 212). Algorithms are anthropomorphized: one, for example, is described as ‘seizing’ a car (by shutting off the ignition) on behalf of an insurance company. ‘Rendition’ is an interesting and helpful metaphor here for understanding how behavior is made legible, but what technologies are actually used, what they are doing, and, importantly, their fallibilities and limitations remain opaque.

Part 2 relies largely on laying out the worldview of many of the people who are building, selling, and envisioning what have been termed ‘prediction machines’ (Agrawal et al., 2018), but it lacks a de-mystifying explanation of what these systems actually consist of (e.g., sensors and statistics) and a frank and accessible discussion of their technical limitations (see instead, for example, Broussard, 2018). At times, I wished for annotations by a computer scientist, who could shed light, and perhaps even cold water, on some of the claims that Zuboff sources from the publicity materials of vendors. While I find it convincing that advertising today is more persuasive because it is presented in a personalized, targeted way, a more technical analysis is needed before I can accept Zuboff’s argument that the data analytics and targeted advertising services offered by Facebook and Google are fundamentally threatening our ‘right to the future tense’.

Part 3 changes tack away from business models and focuses on diagnosing the ideology behind Surveillance Capitalism, which Zuboff calls instrumentarianism. ‘Instrumentarian power’ is introduced and contrasted with totalitarianism. Drawing on Arendt, Zuboff explores how we are blindsided by new ideologies because we cannot appreciate what is new or different about them. Totalitarianism was experienced as ‘unprecedented’ with disastrous consequences: the inability of governments and populations to perceive its threats resulted in millions of deaths. Zuboff presents instrumentarianism as the new totalitarianism in the sense that we are not equipped to see clearly the danger that it poses.

Totalitarianism pursued the inner realm of the ‘soul’, demanding complete submission to the regime, as explored in Orwell’s 1984. Instrumentarianism, in contrast, is ‘indifferent’: it seeks only compliance observed from an external viewpoint. People are measured and known via their external behavior, as an ‘Other’ (used in a behaviorist, not a psychoanalytic, sense). To show how this works, Zuboff recasts Orwell’s Big Brother in a new role, as Big Other: ‘a ubiquitous sensate, networked, computational infrastructure’ (p. 20). Big Other serves instrumentarian power, which ‘aims to organize, herd, and tune society to achieve a similar social confluence, in which group pressure and computational certainty replace politics and democracy, extinguishing the felt reality and social function of an individualized existence’ (p. 20).

A highlight of this eye-opening section is when Zuboff recounts her first-hand interactions with the radical behaviorist B. F. Skinner, whose thinking is fundamental to instrumentarian ideology. She explains that the conversations she had as a graduate student with Skinner in the Psychology Department at Harvard University left her ‘with an indelible sense of fascination with a way of construing human life that was—and is—fundamentally different from my own’ (p. 361). She gives an empathetic but sharply critical review of his work. His 1971 social philosophy Beyond Freedom and Dignity, in particular, is introduced to make sense of the Utopian views held by instrumentarians. For Skinner, behavior is what can be observed externally, and systems of behavior can be modified by manipulating context in order to serve the common good. Ideals of freedom, individuality, and privacy are myths that get in the way of this Utopia. Nudging (see, for example, Thaler & Sunstein, 2009), a form of behavior modification popular in policy discussions today, is shown to flow from this vein of thinking.

Zuboff presents today’s technical infrastructure (Big Other) as a response to Skinner’s hopes and dreams. Her stated aim in this latter section is ‘to infer the theory behind the practice, as Surveillance Capitalists integrate “society” as a “first class object” for rendition, computation, modification, monetization, and control’ (p. 417). Here, an entire chapter is dedicated to a fairly scathing analysis of the work and thinking of Alex Pentland, director of the MIT Media Lab’s Human Dynamics Lab, whom she refers to as ‘something of a high priest’ of instrumentarian power. Pentland and his students design technologies that sense and track interactions, for example via ‘sociometer’ badges worn by employees in the workplace, perhaps familiar to readers under the label ‘People Analytics’ and through the company Humanyze.

Pentland’s work is then compared with Skinner’s ‘once reviled thinking’ to induce five overarching principles of instrumentarianism: (1) Behavior for the Greater Good (whose greater good is not questioned in instrumentarian interventions), (2) Plans Replace Politics, (3) Social Pressure for Harmony, (4) Applied Utopistics, and (5) The Death of Individuality. Taken together, these principles are presented as undermining long-held values of democratic societies: ‘These new architectures feed on our fellow feeling to exploit and ultimately to suffocate the individually sensed inwardness that is the wellspring of personal autonomy and moral judgment, the first-person voice, the will to will, and the sense of an inalienable right to the future tense’ (p. 444). While the picture that Zuboff paints of the instrumentarian ideology is compelling and recognizable, for example in the way that people analytics products are sold to managers, it is less clear how this ‘suffocation’ plays out in practice. Can employees not rebel and take off or manipulate their badges, particularly in the face of Big Other’s indifference to them as agents?

It is only toward the end of the book that Zuboff discusses what it is like to live our lives with these new kinds of technology, and her assessment is mostly negative. Teenagers in particular are said to live out their social existence through platforms owned by Surveillance Capitalists. They cannot easily step out of this digital world. We have all lost a right to ‘sanctuary’ (p. 475). Sanctuary is put forward in Chapter 17 as a fundamental human need that is worth fighting for: a backstage or a home, a place where we can relax and prepare our public ‘performances’ (Goffman, 1956), is needed for a true sense of self to be secured. Sanctuary is a beautiful metaphor, and it is striking how nostalgic, or privileged, the notion seems today.

The Cambridge Analytica scandal, issues of Fake News, and the difficulties of protesting in a digitally connected world are acknowledged throughout the book and are revisited here. The threats to democracy are addressed in the concluding section, though it is not entirely clear how and whether the use of surveillance technologies to control populations is compatible with the characterization of a disinterested Big Other. The book ends with a call for readers to ‘use our knowledge, to regain our bearings, to stir others to do the same, and to found a new beginning’ (p. 525). I was not sure what action was asked of me, other than to speak out and name Surveillance Capitalism for what it is and to declare: ‘no more’. Data rights, alternative business models (e.g., subscription instead of advertising), encryption, the General Data Protection Regulation (GDPR), and privacy laws are not given a great deal of weight.

The research that has gone into the book, in particular in documenting the rise of Google and Facebook, is immensely useful and important to scholars of business and management. Readers who came to The age of surveillance capitalism via Zuboff’s influential 1988 book In the age of the smart machine may, however, like me, be left craving more primary material to flesh out some of the claims that are made here. In her earlier research, Zuboff used participant observation and interviews over a 5-year period at paper mills, as well as 4 years of visits to DrugCorp, to inform her analysis. The result was a rich analysis of how technology impacts the workplace in practice over time and how knowledge and power are transformed in the process (Zuboff, 1988).

Surveillance Capitalists are clearly not amenable to this level of researcher access, but at times the resultant reliance on rhetoric (e.g., product releases and media reports) leaves the picture skewed either toward a vendor’s hyper-muscular view of what their technology is capable of, or toward media representations that are incentivized to dramatize the threat of technology. The more mundane aspects of working in Google’s advertising divisions, or the technical limitations that computer scientists face when trying to wrangle data sets to make their predictive models more accurate, are, for example, largely missing. So too is the ‘behind the scenes’ vantage point that Zuboff’s earlier work so brilliantly and inspiringly offered scholars of technology and organizing.

Without this ethnographic anchoring, the book reminded me of other texts that put forward a strong warning about the dangers of modern technology-infused regimes without giving much space to the voices of those apparently most severely affected (or supposedly at fault). In particular, I thought of Jacques Ellul’s (1964) The technological society, and Frankfurt School critiques of The culture industry (Adorno, 2005; Adorno & Horkheimer, 1997).

Ellul, for example, outlined the dominance and danger of ‘technique’ as a threat to humanity in the 20th century in a manner that bears similarity to Zuboff’s ‘instrumentarianism’. For Ellul, ‘technological’ does not just refer to machines but to a way of thinking that rationalizes human behavior and places efficiency at the center of all aims. In The technological society’s final pages, Ellul (1964, p. 432) asks the reader: ‘Who is too blind to see that a profound mutation is being advocated here? A new dismembering and a complete reconstitution of the human being so that he can at last become the objective (also the total object) of techniques. Excluding all but the mathematical element, he is indeed a fit end for the means he has constructed. He is also completely dispelled of everything that traditionally constituted his essence’. This message is reminiscent of Zuboff’s warning that by being mined for behavioral surplus in instrumentarian regimes of ‘social physics’, we are being exiled from our own behavior.

Ellul (1964, p. 428) was also concerned with the lack of escape from such cybernetic systems: ‘Enclosed within his artificial creation, man finds that there is “no exit”; that he cannot pierce the shell of technology to find again the ancient milieu to which he was adapted for hundreds of thousands of years’. This echoes Zuboff’s statement that ‘a hive with no exit can never be a home, experience without sanctuary is but a shadow, a life that requires hiding is no life, touch without feel reveals no truth, and freedom from uncertainty is no freedom’ (p. 523). A similar sentiment, that humanity is doomed by ideologies built on new technologies and technologically inspired thinking, surrounds Frankfurt School critiques of mass culture. Popular cultural forms such as television, film, and music were accused of undermining society’s capacity for critique. These historic critical texts have come to be seen as deterministic, offering little space for human agency, creativity, rebellion, and desire. The age of surveillance capitalism treads this territory at times, particularly when it comes to the effects of technology on the free will of consumers.

On the one hand, this at times deterministic tone is surprising because it is at odds with decades of media theory reacting against the treatment of people as ‘cultural dopes’ (Garfinkel, 1967). On the other hand, it is refreshing to see a critique of machine learning and Artificial Intelligence that addresses the heart of their ambitions: the erasure of ambiguity. While data analytics, algorithms, and AI are often criticized on the basis of bias (O’Neil, 2017), Zuboff points out that there is a potentially more nefarious goal in AI’s sights: a fight for a claim to the ‘future tense’.

The modern secular era has celebrated notions of chance, uncertainty, and randomness as conditions of freedom and agency (Bernstein, 1996). While the actuarial sciences have long aimed to reduce uncertainty and thereby risk, such probabilistic methods are now given much greater power with the advent of AI applications that are not limited by human computational abilities or capacities. Zuboff’s is one of the first texts I have read that diagnoses the danger such ambition poses in the hands of powerful corporations. However, while the ideology behind such aims is worth exploring and combatting, the underlying techniques remain far from all-powerful.

For example, the power of push notifications and targeted online advertisements to influence our behavior can be overstated. In 2018, a class action was brought against Facebook by advertisers who alleged that Facebook misled them regarding the reach and efficacy of their advertisements – a ‘former employee’ claims that the ‘Potential Reach’ figure that helps determine the cost of advertising on Facebook is ‘like a made-up PR number’ (AdNews, 2018). Furthermore, the effectiveness of psychographics – the personality-based market segmentation strategies behind Cambridge Analytica’s campaigns – also lacks empirical evaluation (Rokka & Airoldi, 2018). There is a problem with buying into the premise that ‘Big Other’ infrastructure does or can predict and modify behavior to a point where free will is eliminated: this message may even help Surveillance Capitalists to convince advertisers to invest in these companies’ supposed predictive prowess.

As mentioned in my introductory anecdote, the products and services of Surveillance Capitalists are an integral part of our daily lives. While Zuboff has completely convinced me that these companies are problematic, and has reaffirmed my view that greater action needs to be taken to regulate and govern them and to protect privacy and data rights, I am not yet convinced that engaging with them leaves me dispossessed of my own experience. WhatsApp (Facebook/Meta) is how I and many others stay connected every day, with family spread across the world. Google Maps offers me confidence in exploring a foreign city. My entire consumption of Zuboff’s book relied on Amazon’s infrastructure. Such ‘positive’ technology-enriched experiences are not given much weight in Zuboff’s critique of Surveillance Capitalism, and to some extent they go against the idea that Surveillance Capitalism leaves me fundamentally and automatically less myself, less human, and less free.

The age of surveillance capitalism is an important book for management and organizational studies because it represents an extraordinary historical account and critical assessment of the corporate giants of our present era, as well as of the ideology that underpins their operations. In great detail, Zuboff traces how optimism for the revolutionary potential of the world wide web turned sour. Her assessment of Google and Facebook in particular shows readers how goodwill and utopian messages were used to lull populations, and arguably governments, into a false sense of reciprocity, while vast amounts of our data are extracted in increasingly invasive ways. Zuboff’s ‘Dispossession Cycle’ gives us a useful analytic framework for tracing how this all happened, and for anticipating how it will happen again. While I was at times concerned that her ‘theory of change’ could be read by some actors as a how-to guide for Surveillance Capitalist strategy, it will also prompt urgently needed analysis of emerging business models from the intended critical perspective.

Zuboff’s discussion of instrumentarianism in some ways stands on its own. This section could even have been moved to the start of the book, to give ideological context before the details of Surveillance Capitalism are outlined. It is a useful addition to discussions of datafication (Kelly & Noonan, 2017; Van Dijck, 2014) and to analyses of the logic of ‘technique’ and critiques of neoliberalism that are already well established in our discipline. Researchers may be inspired to further empirically examine the claims made by, for example, People Analytics initiatives, to ensure that the voracity of this ideology is not too quickly accepted as proof of an effective set of technologies or regimes of action.

Other reviews have pointed out that The age of surveillance capitalism does not offer very much in the way of a critique of capitalism (Morozov, 2019) or even a substantial engagement with the vast literature on surveillance (Ball, 2019). It does, however, give us new ways to think and talk about how our behavior is translated into data and who is making money from this process, as well as a provocation to debate what the short- and long-term implications might be. That is an enormously important conversation for management and business scholars, and while not everyone will make it to the last page, it is comforting to know that Zuboff has captured the zeitgeist within this tome at a moment when a sense of outrage could easily slip away. Consider this weighty book a warning beacon imploring us to stay alert and alarmed, as Surveillance Capitalism’s reach continues to expand around us.

References

AdNews. (2018). Facebook sued for ‘misleading’ advertisers on potential reach. Retrieved from https://www.adnews.com.au/news/facebook-sued-for-misleading-advertisers-on-potential-reach#7lotKyi4eVyiQvCs.99

Adorno, T. W. (2005). The culture industry: Selected essays on mass culture. Routledge.

Adorno, T. W. & Horkheimer, M. (1997). Dialectic of enlightenment (Vol. 15). Verso.

Agrawal, A., Gans, J. & Goldfarb, A. (2018). Prediction machines: The simple economics of artificial intelligence. Harvard Business Press.

Ball, K. (2019). Review of Zuboff’s The age of surveillance capitalism. Surveillance & Society, 17(1/2), 252–256. doi: 10.24908/ss.v17i1/2.13126

Bernstein, P. L. (1996). Against the gods: The remarkable story of risk. Wiley.

Broussard, M. (2018). Artificial unintelligence: How computers misunderstand the world. The MIT Press.

Ellul, J. (1964). The technological society. Vintage Books.

Garfinkel, H. (1967). Studies in ethnomethodology. Prentice-Hall.

Goffman, E. (1956). The presentation of self in everyday life. Harmondsworth.

Kelly, S. & Noonan, C. (2017). The doing of datafication (and what this doing does): Practices of edification and the enactment of new forms of sociality in the Indian public health service. Journal of the Association for Information Systems, 18(12), 872–899. doi: 10.17705/1jais.00477

Morozov, E. (2019). Capitalism’s new clothes. The Baffler. Retrieved from https://thebaffler.com/latest/capitalisms-new-clothes-morozov

O’Neil, C. (2017). Weapons of math destruction: How big data increases inequality and threatens democracy. Broadway Books.

Rokka, J. & Airoldi, M. (2018). Cambridge Analytica’s ‘secret’ psychographic tool is a ghost from the past. The Conversation. Retrieved from https://theconversation.com/cambridge-analyticas-secret-psychographic-tool-is-a-ghost-from-the-past-94143

Thaler, R. H. & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth, and happiness. Penguin.

Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. doi: 10.24908/ss.v12i2.4776

Zuboff, S. (1988). In the age of the smart machine: The future of work and power. Basic Books.

Footnote

1. At several points, Zuboff describes a lightning fire that destroyed her home and work in progress; it remains somewhat ambiguous what empirical materials were lost in this tragic event and how this affected the form of the book.