Being a manipulative species, humans have mixed some notions of domination into the concept of control.
Heraclitus is quoted as writing, around 500 B.C., that
The wise is one, knowing the plan by which it steers all things through all.
ἓν τὸ σοφόν· ἐπίστασθαι γνώμην ὅκη κυβερνῆσαι πάντα διὰ πάντων.
— Kahn (1979, 54)
The ‘knowing’ verb, ἐπίστασθαι, usually connotes mastery (literally ‘standing over’). But since the exact form of the ‘steering’ verb κυβερνῆσαι is uncertain (see Kahn 1979, 170), we might also translate the fragment as ‘the thought by which all things steer themselves through all things.’ ‘Steer themselves’ would translate κυβερνᾶται, one of the most plausible readings (according to Kahn): being in the Greek middle voice, neither active nor passive, this reading is compatible with the concepts of autopoiesis, self-organization and enaction.
Various forms of the verb κυβερνάω (Greek root of the English govern) were commonly used in a ‘standard metaphor of cosmic steering’ (Kahn 1979, 272) even before Heraclitus. In the mid-20th century, the same Greek word was used to name the new field of cybernetics.
If Heraclitus did use the middle voice, this would open up a curious connection with another discipline developing in the early 20th century, generally called phenomenology. According to Henry Corbin:
The etymological meaning of the word ‘phenomenon,’ taken in the precise technical sense of phenomenology, is very much the original meaning of the Greek word phainomenon. This is the present participle of a verb in the middle voice; i.e., the subject is manifesting, appearing, and being shown to itself and for itself. It is the middle, the medium, the medial voice of the verb.
— Corbin 1948 (1998, 24)
There could be a more-than-verbal connection between the Greek middle voice and the semiotic concept of mediation (Peirce’s Thirdness), or the ‘middle way’ of Nagarjuna and Mahayana Buddhism. But English and its close linguistic relatives lack a middle voice, so that our notion of ‘appearing’ tends to split into ‘subject’ and ‘object,’ while ‘control’ tends to split into a controller (active) and a controlled (passive). Cybernetics was defined by Norbert Wiener as ‘the science of control and communication, in the animal and the machine’ (Ashby 1956, 1). Some outsiders suspected that it was more about control of the animal (and human) by mechanical means, which was not the intention of the early cyberneticists.
Three centuries earlier, Descartes had used the same nautical metaphor, but without the sense that the living body steers itself; for him the body was merely a passive mechanism, and guidance was done by a separate mind or soul, the spiritual captain of the physical ship. This particular dualism has infected our thinking ever since. Intentionally or not, the word control tends to conjure up the dualistic vision of a controlling agency – captain, director, governor, dictator, boss – and a relatively passive subject (subject in the political sense of one who is governed). Once institutionalized, this becomes a domination system (Borg 2001) rather than a self-guidance system. Jean-Pierre Dupuy criticizes ‘the unfortunate choice of the very name “cybernetics”’ as ‘implying a theory of command, governance, and mastery’ (Petitot et al. 1999, 558). We can avoid some of these misconceptions by using the word guidance rather than control because it seems less suggestive of domination.
Even the notion of self-control may seem to split the whole self into two parts, with one imagined as controlling the other – although in Peirce’s usage, the continuity of semiosis involved in self-control is never in doubt.
Setting aside word choice, though, it seems that cybernetics (at least in its early days) was working with the same central ideas of closure and circularity that later found their way into autopoiesis theory and into this book.
Cybernetics might, in fact, be defined as the study of systems that are open to energy but closed to information and control – systems that are ‘information-tight’ (Ashby 1956, 4).
– or, to put it another way, systems that are self-informed: ‘autonomous agents’ that are not directly controlled by external agencies.
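To make ‘information-tight’ a little more concrete, here is a minimal sketch in Python (purely illustrative; the class name, set-point and numbers are invented, and nothing here is taken from Ashby or Wiener) of a toy homeostat that is open to energy exchange with its surroundings but guided only by what it senses of its own state.

```python
# Illustrative sketch only: a system that is open to energy (heat flows in
# and out) but closed to information and control -- its behaviour is guided
# solely by its own sensing of its own state, never by external commands.
# All names and numbers are invented for this example.

import random


class SelfGuidedHeater:
    """A toy homeostat: keeps its internal temperature near a set-point
    using only information it generates itself (its own sensor reading)."""

    def __init__(self, set_point=20.0):
        self.set_point = set_point      # internal norm, not imposed from outside
        self.temperature = 15.0         # current internal state
        self.heater_on = False

    def sense(self):
        # The only information used for guidance is self-produced.
        return self.temperature

    def act(self):
        # Negative feedback: the system steers itself toward its set-point.
        self.heater_on = self.sense() < self.set_point

    def exchange_energy(self, outside_temp):
        # Open to energy: heat leaks toward the environment; the heater adds heat.
        self.temperature += 0.1 * (outside_temp - self.temperature)
        if self.heater_on:
            self.temperature += 2.0


def run():
    agent = SelfGuidedHeater()
    for _ in range(50):
        agent.act()                       # guidance from within
        agent.exchange_energy(outside_temp=5.0 + random.uniform(-1.0, 1.0))
    print(f"final internal temperature: {agent.temperature:.1f}")


if __name__ == "__main__":
    run()
```

Nothing outside the loop issues commands to this little system; in that limited sense it is self-informed, which is all the sketch is meant to show.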
Introducing an article about her father (Gregory Bateson) and mother (Margaret Mead), Mary Catherine Bateson provides this concise retrospective view of cybernetics:
Both my parents played important roles in the early development of cybernetics, participating for over a decade in the search for ways of thinking about the behavior of systems, their formal similarities and interactions, that could connect biology and the social sciences and inform various kinds of engineering and design … The way an organism adjusts to circumstances has similarities to the way a ‘smart’ missile stays on course, so by the time of my parents’ deaths the term had largely been usurped by engineering and computer science and had become associated in popular usage with mechanical, inhuman constructions.
— M.C. Bateson (2004, 44)
Polanyi (1962) was already associating the term with such mechanistic models. Likewise Robert Rosen (2000, Chapter 19) lumps cybernetics with information theory, ‘bionics’ and ‘artificial intelligence’ as developments of the organism-as-machine metaphor which goes back to Descartes. According to Rosen, ‘mechanical constructions’ were of the essence of these disciplines right from the beginning, because they never treated systems as complex in Rosen’s sense of the word. They were all simple because they were mechanical, whereas for Rosen ‘organism and machine are different in kind’ (2000, 295).
The difference between Rosen’s perspective and Bateson’s on this episode in history can serve to remind us that understanding what is meant by ‘complex’ in any given context is anything but simple. Rosen himself says that his own usage of the term ‘is completely different from that employed heretofore. This is unfortunate, but there were no other terms that could be used. In von Neumann’s terminology, every system is simple in my sense; what he calls complex I would merely call complicated’ (2000, 292). It was von Neumann who developed methods of quantifying ‘complexity,’ says Rosen (2000, 289), ‘and complexity in this sense became an explanatory principle for the characteristics of life’ – all of which kept ‘life’ firmly within the mechanical domain. But Rosen (2000, 303) also observes that the controller of a system’s response to its environment amounts to a model of that environment – which brings us back to the meaning cycle.
These changing conceptions of ‘control’ and ‘complexity’ have taken yet another turn with the advent of infodynamics, linking information and thermodynamics. Salthe defines infodynamics as
the science of information changes in systems, especially in systems that are informed primarily from within. A combination of nonequilibrium thermodynamics and information theory based on the idea that, just as energy transformations lead to an increase in entropy, so do they, at least when viewed from within a system, lead to increases in information.
— Salthe (1993, 315)
The term (first used by David Depew and Bruce Weber in the late 1980s) reflects the twin sources of the concept. It differs from Shannon’s original information theory by drawing on nonequilibrium thermodynamics, a field not yet fully developed in Shannon’s time. While it remains a mathematical model, the ‘view from within’ or ‘internalist perspective’ embodied in infodynamics (Salthe 1993, 2004) aims to model dimensions of meaning not reflected in Shannon’s information theory.
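For readers who want to see the formal bridge that infodynamics exploits, the two standard entropy expressions (the textbook forms, not Salthe’s own notation) share the same mathematical shape up to a constant:

```latex
% Standard textbook expressions, shown only to display the formal analogy
% between information theory and statistical thermodynamics; Salthe's
% infodynamics develops this within a hierarchical, internalist framework.
H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon entropy, bits per symbol)}
S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy of a physical system)}
```

It is this formal analogy that makes it natural to treat energetic and informational changes within one framework; the internalist ‘view from within’ is Salthe’s own development of it.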