• Digital instrumentarianism
  • Digital ubiquity
  • Radical indifference
  • Instrumentarian power

Digital Instrumentarianism and Radical Indifference

Digital instrumentarianism describes firms influencing our behaviour so that our actions become more predictable. Surveillance capital does not care what we do, who we are or what our problems might be, so long as data can be captured and predictions extracted from it. Zuboff (2019)1 calls this “radical indifference”: surveillance capitalism is indifferent to the content of an individual’s actions, provided predictive data can be gathered from them. Facebook’s Andrew Bosworth described it as:

“…connecting people so deeply that anything that allows us to connect more people more often is de facto good… [not] for ourselves or for our stock price. It is literally just what we do. We connect people” (Andrew Bosworth, 2016)

Digital Ubiquity

Digital ubiquity is the core enabler of surveillance capitalism. Zuboff describes it as an intense, thick surround of digital instrumentarianism that subliminally shapes a user’s behaviour in directions that favour a firm’s commercial outcomes. Surveillance capitalism encourages actions that make a user more predictable. The familiar ‘filter bubble’2, in the hands of surveillance capitalists, does not just increase engagement through an enhanced user experience; its main objective is to make a user’s online behaviour more predictable by altering it. The same devices that allow us to monitor can now be used to actuate, both online and, increasingly, in the physical world. Data scientists call this monitoring and actuation.
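To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python (the `Item` type, topics and click history are invented, and no real recommender works this simply) of a feed ranker that favours topics a user has already clicked on. Each pass narrows what the user sees towards what is already most predictable about them.

```python
from dataclasses import dataclass

@dataclass
class Item:
    topic: str
    text: str

def rank_feed(candidate_items, user_click_history):
    """Order candidate items by how often the user has already clicked
    on each item's topic. Repeated over time, this narrows the feed to
    familiar, highly predictable content -- the 'filter bubble'."""
    def predicted_engagement(item):
        clicks_on_topic = user_click_history.count(item.topic)
        return clicks_on_topic / max(len(user_click_history), 1)

    return sorted(candidate_items, key=predicted_engagement, reverse=True)

# A user who has mostly clicked on 'cats' sees cat content pushed to the
# top, reinforcing the behaviour the ranker already predicts.
history = ["cats", "cats", "politics", "cats"]
feed = rank_feed(
    [Item("politics", "Election update"),
     Item("cats", "Ten new cat videos"),
     Item("science", "New exoplanet found")],
    history,
)
for item in feed:
    print(item.topic, "-", item.text)
```

In a deployed system the engagement score would come from a trained model rather than a simple click count, but the narrowing dynamic is the same.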

Monitoring and Actuation - From the Laptop to the Mobile Phone

One of the best early examples of monitoring and actuation was provided by Kramer et al. (2014)3, who ran a massive-scale experiment in collaboration with the Facebook News Feed team. They induced an “emotional contagion”4 by reducing the volume of positive expressions in a user’s news feed, and observed that the user then produced fewer positive posts of their own (a toy sketch of this design is given below).

To illustrate a real-world context, Zuboff (2019)1 cites the Pokémon Go app. Pokémon Go is a mobile game that uses augmented reality to project Pokémon cartoon characters into physical locations; to progress through the game, the user must visit those locations to find the virtual characters. The game uses the same processes as online targeted advertising, but here business customers pay for future behaviour in the real world: users are steered towards specific restaurants or shops outside of their conscious awareness. Pokémon Go is a good example of monitoring and actuation, and of how surveillance capitalists have moved on from the laptop and now rely on the mobile phone as the chief supply-chain interface for their raw materials.
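As a purely illustrative sketch of the actuation step in the Kramer et al. design (this is not the authors’ code; the posts are invented and a tiny word list stands in for the sentiment measures used in the study), the snippet below withholds a proportion of positive posts from a simulated feed. The monitoring step would then compare the sentiment of what the user posts under each condition.

```python
import random

POSITIVE_WORDS = {"great", "happy", "love", "wonderful", "good"}

def positivity(post):
    """Toy sentiment score: fraction of words that appear in a tiny
    hand-made positive word list."""
    words = post.lower().split()
    return sum(w in POSITIVE_WORDS for w in words) / max(len(words), 1)

def build_feed(candidate_posts, suppress_positive_prob=0.0, seed=0):
    """Return a feed in which each positive post is withheld with the
    given probability -- the 'actuation' step of the experiment."""
    rng = random.Random(seed)
    feed = []
    for post in candidate_posts:
        if positivity(post) > 0 and rng.random() < suppress_positive_prob:
            continue  # withhold this positive post from the user's feed
        feed.append(post)
    return feed

posts = ["I love this wonderful day", "Traffic again", "Great news, so happy"]
print("control feed:", build_feed(posts, suppress_positive_prob=0.0))
print("treated feed:", build_feed(posts, suppress_positive_prob=0.5, seed=42))
# The monitoring step would then compare the sentiment of the user's
# own subsequent posts in the control and treated conditions.
```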

In the next post, I discuss the role of the smartphone as the primary tool for data capture, and in the post after that I discuss Instrumentarian Power.

Next: Smartphones and Privacy


  1. Zuboff, S., 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books. ↩︎

  2. Nguyen, T.T., Hui, P.-M., Harper, F.M., Terveen, L., Konstan, J.A., 2014. Exploring the Filter Bubble: The Effect of Using Recommender Systems on Content Diversity, in: Proceedings of the 23rd International Conference on World Wide Web, WWW ’14. ACM, New York, NY, USA, pp. 677–686. https://doi.org/10.1145/2566486.2568012 ↩︎

  3. Kramer, A.D.I., Guillory, J.E., Hancock, J.T., 2014. Experimental evidence of massive-scale emotional contagion through social networks. Proc. Natl. Acad. Sci. 111, 8788–8790. https://doi.org/10.1073/pnas.1320040111 ↩︎

  4. Hatfield, E., Cacioppo, J.T., Rapson, R.L., 1993. Emotional Contagion. Curr. Dir. Psychol. Sci. 2, 96–100. https://doi.org/10.1111/1467-8721.ep10770953 ↩︎