Editorial: Smart Surveillance

Introduction
Networked technologies that collect data from the environment and transmit those data to others are increasingly being built into our homes, schools, streets, offices, and shopping centres. Although attempts to construct smart cities have met a number of obstacles, not the least being citizen concerns about surveillance, the data industries behind those attempts have slowly and quietly been building smart tech into the mundane objects that occupy our cities. "Smartness" has accordingly grown from the bottom up rather than the top down.
This issue begins to unpack that bottom-up growth by asking difficult questions about the ways in which data are flowing both between objects and between actors. There is a multiplicity of metaphors from which we could select to describe the current spread of smart things, and it would be easy to choose a neutral one. However, especially in light of the pandemic situation in which we still find ourselves, we draw on John McMurtry's 1999 book The Cancer Stage of Capitalism to consider smartness as an infection, as an infestation. This is useful because, while there are undoubtedly multiple uses for smart things, the inhabitants of our cities are constantly being bombarded with marketing and advertising rhetoric touting the promise of smartness. As smart devices continue to proliferate and connect together, we need to find ways to counter this boosterism and (hyper)normalization; in other words, to make smart things strange and troubling again and thus render them open to critical analysis.

Smartness as Infection
The issue documents the growing evidence of the proliferation of smart things across a variety of contexts and locations. Often the placement of these things is unexpected, at least at first. Take Amazon's smart doorbell, Ring, for example. The simple idea behind the technology is to make it easier for residents to see who is seeking to enter their homes. But once Amazon bought the start-up that developed Ring, it was able to easily link the home's doorbell system with Alexa, its own smart personal assistant. In this way, both Ring and Alexa become vectors of infection that, like mould spores, each alight in a particular location to spread out and join up with other points of infection. Because of this bottom-up, sideways growth, smartness spreads to create something bigger than the sum of its smart parts. For example, Amazon not only uses the data from Ring and Alexa for the obvious "surveillance capitalism" goal of enriching its saleable marketing profiles (Zuboff 2015) and for training its burgeoning AI systems (Eliot and Murakami Wood 2021). It also partners with police departments, enabling policing and security to ride on the Ring infection vector to gain access to places that would otherwise be constitutionally and legally difficult to reach: the inside of private property. But Amazon does not intend to stop there. Multiple Alexa and Ring devices in a neighbourhood can now also be connected to each other, and here multiple sites of infection become a full-blown local infestation. This is something that is only just short of the much more conventional top-down "smart city" concept, something which Amazon (with no apologies to the Google subsidiary of the same name) calls Sidewalk: a seemingly bottom-up, mesh-like urban surveillance network open in various degrees to local people (or at least to Amazon device users), Amazon, and police departments. And it may in the future spread to others; if there is any lesson of surveillance technology procurement by police, it is that the same access is soon granted to multiple state agencies for a plethora of purposes.
We see the same kind of infestation in education, especially given the move to emergency distance learning during the COVID-19 pandemic. Early small signs of infection to enable "online proctoring" created a vector through which companies could find their way into school and university networks. From there, the infection spread onto students' computers and ultimately into their private houses and bedrooms via the mandatory cameras needed, apparently, to check for the "right kind" of behaviour during examinations. Like the pandemic itself, this infection is highly classed and racialized: Black students are more likely to be misidentified, registered incorrectly as absent, and penalized for innocuous infractions. Poor students who don't have private spaces for study are also likely to be called out as in breach of examination regulations because of background activity caught by their own cameras. And indeed, the private space of the household itself becomes infected as the camera is used to assess whether the space and its contents are compliant with the rules set by the algorithms that drive the system. Proctoring companies have also aggressively resisted critical examination, using SLAPP 1 lawsuits to try to silence journalists and academics who have raised legitimate criticisms of their practices and technologies. This is an infection with lawyers.
Thinking of smartness in both these cases as an infestation helps us map how, like a mould, a technology that starts at particular points spreads out and joins together to become a system that can blanket physical and social space. It is important to think this way because, if we start, for example, only with the high-level comprehensive plans and promises for smart cities, we might miss the incremental infiltration of smartness. This approach will also complement and enrich examinations of those at-first-glance more top-down smart initiatives: smart cities or even smart nations. With a Powers of Ten-like shift in scale, one can see that smart cities can be considered as loci of infection at the planetary scale. This is less of a perversion of metaphor than one might at first imagine: Mark Weiser's original vision for computing everywhere spoke of "pervasiveness," and the Internet of Things speaks directly to a network separate from, or even without, humanity. In that sense, a form of life is being built that is not simply an extension of the human, in the McLuhanesque sense of communication technologies as the "extensions of man" [sic], but is potentially beyond the human. The beginnings of this were noted by Kitchin and Dodge (2011) some time ago when they spoke of "the machine-readable world." And, as Benjamin Bratton (2016) has been arguing for some time now, out of capitalism is emerging a planetary-scale infrastructure of computation. The most recent iterations of smart cities, what one of us refers to as platform cities, appear to be developing in this direction. Initiatives like Saudi Arabia's 170 km linear city of NEOM, or Nevada's proposed Special Investment Zones, take the most extremely exclusionary form of capitalist life and, using smart technologies, attempt to secure it and its entrepreneurial inhabitants or operators against the tide of social and ecological breakdown, even as that tide is rising thanks to the same capitalist processes.
In other words, like an infestation, these new platform cities are inimical to human life in the broader sense. This is not just a natural infection; it is biological warfare.
Thinking of smartness in this dramatic analogical way is also important because it completely avoids the obvious boosterish logic of smartness as convenience and efficiency. Seeing each smart device instead as a kind of parasite, or even as a potential bioweapon, enables us to identify when these devices that worm themselves inside our homes (and yes, ultimately inside us) live with us rather than for us, fundamentally not for our benefit at all. To fully unpack what this means for us, we have to go beyond consumer activism to confront the ways in which these devices infect us and others.

Surveillance & Society 19(2) 152
This approach also builds on the insights of the useful but simplistic model of surveillance capitalism as data-gathering for profiling, as well as the more sophisticated version that understands that smartness is a goal in itself: that these data are being gathered as training data for Artificial Intelligence and are therefore self-perpetuating. But it also lets us extend those approaches and ask for whose benefit this is when entire communities are managed by surveillance machines. This is a bio-political economic question, an ecological question, and a question about the future of humanity and human beings: what does this do to us, those of us who are still human? Is there ultimately a place for the majority of humanity in a smart world?
It shouldn't be necessary to say that we can't just let market forces govern this, but it needs saying because there is a particular kind of Silicon Valley political economic logic driving this that doesn't simply reject regulation in favour of the market, but actually tries to appropriate regulation, and perhaps even appropriate government itself. Apple, for example, is essentially rewriting privacy laws through technological development, creating "facts in the technology." This is immensely concerning for consideration of the purpose of government itself in a world of smart tech, especially since Apple is far from the only platform corporation that believes it can do better than government (Rider and Murakami Wood 2019).
When, as in the case of Amazon, we can see direct connections between smart devices and police forces and the military, we also urgently need to ask critical questions about the new military-industrial complex that is being (re)constructed around data gathering and driven by the private sector. Ultimately, corporations may begin to ask whether they might not recreate these entities themselves: the sudden rise of the US vigilante start-up "Citizen," which is creating what is essentially a private police force (Cox 2021), may presage such a direction, but it is already the reality across most of Latin America for those who can afford it. Smart private policing is the essence of the dual biopolitical-necropolitical functioning of surveillance, which sees a cocooning protection for the wealthier and whiter, and exclusionary brutality directed at the poorer, browner, and blacker.
Because of this, we need to be clearly and openly partisan in our critique. A recent Twitter exchange 2 saw a paper by the AI researcher Lindsey Barrett dismissed for possessing a "pre-existing privacy bias" and therefore deemed unworthy of consideration. For many of us, the immediate question was: how could anyone not be biased in favour of human rights? If the advocates of smartness are able to destabilize the general consensus that makes human rights (and indeed the extension of rights to other living beings and ecosystems) a foundational element of human societies, then the results could be not only to fracture (once again) the unity of human rights and to eliminate those rights from particular classes of human, but also to devalue humanity in toto in relation to both smart technology and capitalism.
We have been talking here as if technological capitalism is not human: this is intentional and heuristic. McKenzie Wark's (2019) provocation to think about what comes after capitalism has been answered in various ways, particularly by Jodi Dean's (2020) consideration of a revitalized feudalism on a global scale, but another way might be to think of what happens when capitalism is fully automated, i.e., when it is capitalism, not watches, not homes, not cities, that becomes "smart." In other words, what happens when an entirely self-perpetuating system emergent from human society can function parasitically on/in humanity? The ultimate goal of human rights is to preserve human dignity. As Arne Naess (1990) argued, debates about what it means to be human are about extending the "sphere of moral considerability," not about removing or reducing rights to advance the needs of platform corporations, nation-states, or any other institution. And, as David Livingstone Smith (2011) reminds us, dehumanization is the beginning of genocide. So we must be partisans of humanity to guard against what Jathan Sadowski (2020) defines as the "too smart": maybe not to become the new Luddites that Sadowski would have us be, but at the same time to be very wary of the argument made by Bratton (2021), in his most recent addition to his thoughts on planetary computing, that humanity must harness and shape "planetary sapience." We do not dismiss that as impossible, but we must recognise that to accept that direction would mean the emergence of a very different kind of humanity, one that will have learned to live with and love this disease, this parasitic infection of smartness, 3 and become something less in consequence.