In the philosophy of technology, substantivism is a critical position opposed to the common-sense philosophy of technology known as “instrumentalism”. Instrumentalists argue that tools have no agency of their own – only tool users do. According to instrumentalism, technology is a mass of instruments whose existence has no special normative implications. Substantivists like Martin Heidegger and Jacques Ellul argue that technology is not a collection of neutral instruments but a way of existing and understanding entities which determines how things and other people are experienced by us. If Heidegger is right, we may control individual devices, but our technological mode of being exerts a decisive grip on us: “man does not have control over unconcealment itself, in which at any given time the real shows itself or withdraws” (Heidegger 1978: 299).
For Ellul, likewise, technology is not a collection of devices or methods which serve human ends, but a nonhuman system that adapts humans to its ends. Ellul does not deny human technical agency but claims that the norms according to which agency is assessed are fixed by the system rather than by human agents. Modern technique, for Ellul, is thus “autonomous” because its principles of action are internal to it (Winner 1977: 16). The content of this prescription can be expressed as the injunction to maximise efficiency – a principle that overrides whatever conceptions of the good are adopted by the human users of technical means.
In Chapter 7 of Posthuman Life, I argue that a condition of technical autonomy – self-augmentation – is in fact incompatible with it. “Self-augmentation” refers to the propensity of modern technique to catalyse the development of further techniques. Thus while technical autonomy is a normative concept, self-augmentation is a dynamical one.
I claim that technical self-augmentation presupposes the independence of techniques from culture, use and place (technical abstraction). However, technical abstraction is incompatible with the technical autonomy implied by traditional substantivism, because where techniques are relatively abstract they cannot be functionally individuated. Self-augmentation can only operate where techniques do not determine how they are used. Thus substantivists like Ellul and Heidegger are wrong to treat technology as a system that subjects humans to its strictures. Self-augmenting Technical Systems (SATS) are not in control because they are not subjects or stand-ins for subjects. However, I argue that there are grounds for claiming that such a system may nonetheless be beyond our capacity to control.
This hypothesis is, admittedly, quite speculative but there are four prima facie grounds for entertaining it:
1. In a planetary SATS, local sites can exert a disproportionate influence on the organisation of the whole but may not “show up” for those lacking “local knowledge”. Thus even encyclopaedic knowledge of current “technical trends” will not be sufficient to identify all future causes of technical change.
2. The categorical porousness of technique adds to this difficulty. The line between the technical and the non-technical is systematically fuzzy (as indicated by the way modern computer languages derive from pure mathematics and logic). If technical abstraction amplifies the potential for “crossings” between technical and extra-technical domains, it must further ramp up uncertainty regarding the sources of future technical change.
3. Given my thesis of Speculative Posthumanism, technical change could engender posthuman life forms that are functionally autonomous and thus withdraw from any form of human control.
4. Any computationally tractable simulation of a SATS would be part of the system it is designed to model. It would consequently be a disseminable, highly abstract part. So multiple variations of the same simulation could be replicated across the SATS, producing a system qualitatively different from the one it was originally designed to simulate. A related idea is examined in the work of Elena Esposito, via the way users of financial instruments employ uncertainty to influence the decisions of others through their market behaviour. Esposito argues that the theories used by economists to predict market behaviour are performative: they influence economic behaviour, though their capacity to predict it is limited by the impossibility of self-modelling (Esposito 2013).
If enough of 1–4 hold, then technology is not in control of anything but is largely out of our control. Yet there remains something right about the substantivist picture, for technology exerts a powerful influence on individuals, society, and culture, even if not an “autonomous” influence. However, since technology is self-augmenting and thus abstract, it is counter-final: it has no ends, and it tends to render human ends contingent by altering the material conditions on which our normative practices depend.
Ellul, J. 1964. The Technological Society, J. Wilkinson (trans.). New York: Vintage.
Esposito, E. 2013. “The Structures of Uncertainty: Performativity and Unpredictability in Economic Operations”. Economy and Society 42(1): 102–129.
Heidegger, M. 1978. “The Question Concerning Technology”. In Basic Writings, D. Farrell Krell (ed.), 283–317. London: Routledge & Kegan Paul.
Roden, D. 2014. Posthuman Life: Philosophy at the Edge of the Human. London: Routledge.
Winner, L. 1977. Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought. Cambridge, MA: MIT Press.