Degeneracy, codes and laws

Different symbols can fill the same niche in meaning space, and two different acts of meaning can find expression in the same text. This inherent ambiguity or ‘polyversity’ of language is rooted in our biological heritage as complex adaptive systems.

A clear-cut example is seen in the genetic code. The code is made up of triplets of nucleotide bases, of which there are four kinds: G, C, A, and T. Each triplet, or codon, specifies one of the twenty different amino acids that make up a protein. Since there are sixty-four different possible codons – actually sixty-one, if we leave out three stop codons – which makes a total of more than one per amino acid, the code words are degenerate. For example, the third position of many triplet codons can contain any one of the four letters or bases without changing their coding specificity. If it takes a sequence of three hundred nucleotide bases to specify a sequence of one hundred amino acids in a protein, then a large number of different base sequences in messages (approximately 3¹⁰⁰) can specify the same amino-acid sequence. Despite their different structures at the level of nucleotides, these degenerate messages yield the same protein.

— Edelman (2004, 43-4)
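Edelman's arithmetic is easy to check. A minimal sketch in Python, using the standard genetic code (the codon table below is the canonical one, built from the usual TCAG ordering; the peptide "MENDEL" is just an illustrative choice of mine, not from the text):

```python
# Codon degeneracy in the standard genetic code.
from collections import Counter
from math import prod

bases = "TCAG"
# Amino acids in canonical codon order (TTT, TTC, TTA, TTG, TCT, ...);
# '*' marks the three stop codons.
aa_string = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
codon_table = {
    b1 + b2 + b3: aa_string[16 * i + 4 * j + k]
    for i, b1 in enumerate(bases)
    for j, b2 in enumerate(bases)
    for k, b3 in enumerate(bases)
}

# How many codons encode each amino acid?
synonyms = Counter(aa for aa in codon_table.values() if aa != "*")
print(synonyms["L"], synonyms["M"])  # leucine has six codons, methionine only one

# Degenerate messages: the number of distinct DNA sequences that
# all encode one and the same short peptide.
peptide = "MENDEL"  # hypothetical example sequence
n_messages = prod(synonyms[aa] for aa in peptide)
print(n_messages)  # 1 * 2 * 2 * 2 * 2 * 6 = 96
```

Even a six-residue peptide admits ninety-six structurally different messages; at a hundred residues the count reaches Edelman's astronomical estimate.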

This biological usage of the term degenerate is quite different from the mathematical sense used by Peirce (polyversity strikes again!); here degeneracy refers to the ability of different structures to serve the same systemic function. As Ernst Mayr (1988, 141) points out, this complicates evolutionary theory because it means that mutations consisting of base-pair substitutions can be ‘neutral’ with respect to selection. But this inconvenience is not one we could dispense with, as Edelman goes on to explain:

Degeneracy is a ubiquitous biological property. It requires a certain degree of complexity, not only at the genetic level as I have illustrated above, but also at cellular, organismal, and population levels. Indeed, degeneracy is necessary for natural selection to operate and it is a central feature of immune responses. Even identical twins who have similar immune responses to a foreign agent, for example, do not generally use identical combinations of antibodies to react to that agent. This is because there are many structurally different antibodies with similar specificities that can be selected in the immune response to a given foreign molecule.

What Edelman calls degeneracy is called ‘multiple realizability’ by Deacon (2011, 29), who gives the example of oxygen transport in circulatory systems. This is realized by hemoglobin in humans and other mammals, but by other molecules in (for instance) clams and insects.

For us humans, degeneracy is perhaps most interesting for its role in generating conscious experience. Neural processes related to the experience of having a world can be analyzed in terms of ‘maps,’ and the relations among these maps turn out to be degenerate. Visual experience alone may involve dozens of them, cooperating (in Edelman’s theory) by means of

mutual reentrant interactions that, for a time, link various neuronal groups in each map to those of others to form a functioning circuit.… But in the next time period, different neurons and neuronal groups may form a structurally different circuit, which nevertheless has the same output. And again, in the succeeding time period, a new circuit is formed using some of the same neurons, as well as completely new ones in different groups. These different circuits are degenerate – they are different in structure but they yield similar outputs …

— Edelman (2004, 44-5)

By its very nature, the conscious process embeds representation in a degenerate, context-dependent web: there are many ways in which individual neural circuits, synaptic populations, varying environmental signals, and previous history can lead to the same meaning.

— Edelman (2004, 105)
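The point about degenerate circuits can be made with a deliberately simple toy, my own construction rather than anything in Edelman's theory: two linear "circuits" built from different numbers of units with entirely different weights, yet realizing the identical input-output function.

```python
# Structurally different circuits, same output: a toy analogue of
# Edelman's degenerate neuronal circuits.
def make_circuit(hidden_weights, readout_weights):
    """One-hidden-layer linear circuit: x -> sum_i r_i * (w_i * x)."""
    def run(x):
        hidden = [w * x for w in hidden_weights]
        return sum(r * h for r, h in zip(readout_weights, hidden))
    return run

# Circuit A uses two units; circuit B uses four entirely different ones.
circuit_a = make_circuit([1.0, 1.0], [0.5, 0.5])
circuit_b = make_circuit([2.0, 1.0, 1.0, 4.0], [0.25, 0.125, 0.125, 0.0625])

for x in (0.0, 1.0, -3.5, 7.25):
    assert circuit_a(x) == circuit_b(x)  # same output, different structure
```

Nothing in the outputs betrays which internal structure produced them, which is just what makes the degeneracy invisible from outside the system.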

Even within a given context, there are many ways for implicit guidance to become explicit. So, naturally, different texts can yield the same meaning, and different verbal expressions of belief can yield the same practice.

Another aspect of this degeneracy is that different theories may articulate the same implicit models: for example, Edelman’s ‘theory of neuronal group selection’ appears to have the same significance as Bateson’s theory of ‘the great stochastic processes,’ in that each treats evolution and learning as processes differing only in time scale. ‘In this theory,’ says Edelman,

the variance and individuality of brains are not noise. Instead, they are necessary contributors to neuronal repertoires made up of variant neuronal groups. Spatiotemporal coordination and synchrony are provided by reentrant interactions among these repertoires, the composition of which is determined by developmental and experiential selection.

— Edelman (2004, 114)

The necessity of ‘variance and individuality’ is not confined to brains. ‘The biologist is constantly confronted with a multiplicity of detailed mechanisms for particular functions, some of which are unbelievably simple, but others of which resemble the baroque creations of Rube Goldberg’ (Lewontin 2001, 100). Degeneracy rules – and not only in a figurative sense, for it plays a crucial role in ‘the control hierarchy which is the distinguishing characteristic of life’ (Pattee 1973, 75). This differs from the hierarchy of scale in that it ‘implies an active authority relation of the upper level over the elements of the lower levels’ (75-6). This relation is also known as ‘supervenience’ or ‘downward causality’ (Pattee 1995), which is part of Freeman’s ‘circular causality’, as it ‘amounts to a feedback path between levels’ (Pattee 1973, 77).

The development process in a multicellular organism offers an example. Each cell carries a copy of the entire genome in its nucleus; how does it manage to differentiate into a liver cell, or a blood cell, or a specific type of neuron? It receives ‘chemical messages from the collections of cells that constrain the detailed genetic expression of individual cells that make up the collection.’ Like all messages, these are coded, but the coding/decoding function is not to be found in the structure of the molecules carrying the message, whether they be enzymes, hormones or DNA.

Likewise the control function is not found in any special qualities of those elements of the system which appear to be in ‘control’: rather it is found at ‘the hierarchical interface between levels’ (Pattee 1973, 79). The control function is degenerate in that the choice of particular elements to exercise control is to some degree arbitrary, and a different choice does not make a significant difference in the control itself.
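A toy sketch may help fix the idea; it is my own construction, not Pattee's. A collective constraint acts on a population of "cells," and whichever cell is picked to execute that constraint, the collective outcome is identical:

```python
# Degenerate control: the constraint belongs to the collection, not to
# the element that happens to execute it.
def apply_constraint(cells, executor):
    # The constraint (cap each cell's activity at the group mean) is a
    # property of the whole collection; the executor merely relays it,
    # so its identity never enters the result.
    ceiling = sum(cells) / len(cells)
    return [min(activity, ceiling) for activity in cells]

cells = [3.0, 9.0, 6.0, 2.0]
outcomes = {tuple(apply_constraint(cells, executor))
            for executor in range(len(cells))}
print(outcomes)  # a single outcome, whichever cell exercised the control
```

That the `executor` argument is never used in computing the result is the whole point: the control function is located at the interface between levels, not in the controlling element.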

Of course, this is also the general nature of social control hierarchies. As isolated individuals we behave in certain patterns, but when we live in a group we find that additional constraints are imposed on us as individuals by some ‘authority.’ It may appear that this constraining authority is just one ordinary individual of the group to whom we give a title, such as admiral, president, or policeman, but tracing the origin of this authority reveals that these are more accurately said to be group constraints that are executed by an individual holding an ‘office’ established by a collective hierarchical organization.

— Pattee (1973, 79)

The control function is not a property of, and does not belong to, the individual who executes it. When someone tries to appropriate that function for himself, we call him a tyrant – a person who tries to control others for his own sake instead of serving the higher level of organization.

The polysemy of the term hierarchy is rooted in that of the Greek ἀρχή, which can mean either ‘a beginning, origin’ or ‘power, dominion, command’ (LSG). In English, first has a similar ambiguity: it can denote either one end of a time-ordered series or the ‘top’ spot in a ranking order.

In speaking of ‘control functions,’ we often need to distinguish between two kinds of ‘law’ or ‘rule,’ which we may call logos and nomos. The logos (or ‘logic’) of a system is its self-organizing function, while nomos is ‘assigned’ (LSG) artificially rather than arising naturally. Nomos is the kind of law which is formulated and ‘ordered’ so that it can be obeyed or ‘observed,’ while the ‘laws of nature’ are formulated (by science) in order to explain why the universe does what it is observed to be doing already. The distinction is denied by creationists, for whom nature itself is artificial (having been intentionally designed and manufactured by a God whose existence is prior to it), and perhaps by some who consider every formulation of science to be a disguised assertion of power. And the distinction is indeed problematic, because nomos in Greek can mean ‘usage’ or ‘custom’ as well as ‘law’ and ‘ordinance’ (LSG again). Are the ‘rules’ of a ‘natural’ language nomoi or logoi? I would say that the deepest grammatical rules are examples of logos, while the more ephemeral standards of usage are much more arbitrary, and therefore examples of nomos, even before they are formalized. But the boundary between them is fuzzy. You could put the question this way: How natural is human nature?
