
The Code of Mathematics

 

John von Neumann’s The Computer and the Brain (New Haven/London: Yale University Press, 1958).

 

In: Key Works of Systems Theory, Dirk Baecker (Ed.), Opladen: Westdeutscher Verlag (2004, forthcoming).

 

1. Introduction

 

The Computer and the Brain (1958) is the published version of the Silliman Lectures which John von Neumann was invited to deliver at Yale in 1956. Although they were prepared by March 1956, the lectures were never given, since Von Neumann was by that time already too sick to travel to New Haven. The author worked on the manuscript until his death on February 8, 1957. The manuscript remains unfinished, as his widow Klara von Neumann explains in her preface to the posthumous edition. Nevertheless, the booklet can be read as a complete essay.

 

The 82-page essay is structured in two parts. The first part discusses the computer: its procedures, control mechanisms, and other characteristics. The second part focuses on the brain. The neural system is systematically compared with the computer in terms of the state of the art in the computer sciences at that time. In what seems to have been groundwork for a third part (though it is not organized as a separate part), Von Neumann draws some conclusions from the comparison with respect to the role of code and language. These conclusions are perhaps the most intriguing part of the book, because Von Neumann addresses reflexive issues which had not previously been addressed in the cybernetic tradition (Corning, 2001; cf. Wiener, 1948).

 

2. The computer

 

After stating that he himself is neither a neurologist nor a psychiatrist, but a mathematician, Von Neumann embarks upon the first part of this essay by explaining the components of a computer in the language of a computer scientist. First, the difference between analog and digital computers is explained. This distinction will have some relevance for the discussion of the brain because—as stated in the second part—the brain can prima facie be considered as a digital computer. However, upon further reflection, some elements of analog computing (e.g., the chemistry) will also become relevant in understanding the functioning of the brain.

 

For similar reasons, Von Neumann introduces the difference between serial and parallel computing schemes (on p. 8). As we know, the operation of the brain is massively organized in terms of parallel processing, but certain elements of serial processing cannot be reduced to parallel processing. The author will argue in the second part that these considerations may therefore be important for our understanding of the functioning of the brain.

 

A third distinction of the same kind is that between “plugged control” and “logical tape control.” In the analog machinery, electromechanical relays control the processing at the physical level, while a logical control stored on tape can be superposed on this basic, “fixed connections” control. This allows for variety in the control system, because the latter can be considered as independent of the underlying machinery. Digital machines can recombine the same organs for successive basic operations, while analog machines must in principle contain sufficient organs for each basic operation, depending on the requirements of the problem at hand.

 

The remainder of Part One focuses on these higher-order control modes for digital machines. Two modes are distinguished: control by sequence points and memory-stored control. A branching point can be considered as typical for control by sequence points. For example, the system can be so constructed that it performs differently depending on whether it receives a positive or a negative current. The wiring can be made increasingly complex.
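
To make the logic concrete: a minimal sketch in Python (the function name and the string labels are mine, not the book’s) of how a branching point routes control depending on the sign of an incoming current:

    def branching_point(current: float) -> str:
        """Route control along one of two paths, depending on the sign of the input."""
        if current >= 0:
            return "path A: continue the main sequence"
        return "path B: jump to an alternative subroutine"

    print(branching_point(+1.0))  # path A
    print(branching_point(-1.0))  # path B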

 

Memory-stored control had already replaced control by sequence points to a considerable extent by Von Neumann’s time. While the sequence points were physical objects, the orders in memory-stored control are ideal entities which are attributed to registers of memory. Although Von Neumann recognizes that this is the direction of future developments in computing, he discusses this difference because the brain also has a physical dimension. Thus, one would expect mixed forms of control in the natural case. Some of the functions can be fully expressed in Boolean formats, but there may be advantages in having intermediate steps performed in an analog mode.

 

For example, a density of pulses can be evaluated in analog fashion and then be used for logical control. However, the main advantage of digital computation becomes manifest when one considers the precision of the computation. In the second part of the essay, Von Neumann estimates that the computer of 1956 had already outperformed the natural system of the brain in terms of precision by several orders of magnitude.
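
A hypothetical illustration of such a mixed mode, with all numbers and names invented: the density of pulses is first evaluated as a continuous (analog) quantity and then reduced to a digital decision by a threshold:

    def pulse_density(arrival_times, window):
        """Analog step: average number of pulses per unit of time."""
        return len(arrival_times) / window

    def logical_control(density, threshold=5.0):
        """Digital step: reduce the analog quantity to a yes/no decision."""
        return density >= threshold

    spikes = [0.10, 0.15, 0.40, 0.42, 0.70, 0.71, 0.90]  # arrival times in seconds (invented)
    d = pulse_density(spikes, window=1.0)                # 7.0 pulses per second
    print(d, logical_control(d))                         # 7.0 True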

 

3. The brain

 

The second part of the essay begins with a description of the brain and the neuron in such a way that meaningful comparisons with the computer (as described in Part One) can be made. The perspective of the computer scientist is here deliberately reductionistic. For example, Von Neumann nowhere mentions the organization of the brain in terms of cortical and lower-level functions, although these distinctions are much older.

 

Von Neumann first explains the essentially digital operation of the neuron: it either fires or does not, depending on whether it is sufficiently stimulated. The Boolean operations AND, OR, and NOT can then be constructed from such threshold elements. The author, however, emphasizes that this prima facie account of the operation is too simple. Other factors introduce elements that are not digital, but which are “grounded” in the biological materials and the chemistry of the cells.
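
This construction is the classical threshold idealization (known from McCulloch and Pitts); the following minimal sketch uses textbook weights and thresholds, not anything taken from Von Neumann’s text:

    def neuron(inputs, weights, threshold):
        """Fire (1) iff the weighted sum of the inputs reaches the threshold."""
        return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

    def AND(a, b): return neuron([a, b], [1, 1], threshold=2)
    def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)
    def NOT(a):    return neuron([a], [-1], threshold=0)  # inhibitory input

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    print("NOT 0:", NOT(0), "NOT 1:", NOT(1))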

 

For example, the neuronal cell cannot fire again immediately, since it needs a bit of time for recovery. After firing, the cell can be considered as “fatigued.” While the reaction time of a neuron is somewhere between 10⁻⁴ and 10⁻² seconds, Von Neumann notes that “modern vacuum tubes and transistors can be used in large logical machines at reaction times between 10⁻⁶ and 10⁻⁷ seconds. (…) That is, our artifacts are, in this regard, well ahead of the corresponding natural components, by factors like 10⁴ to 10⁵” (ibid., at p. 47). According to the author, this makes comparisons between the computer and the brain fully relevant given the state of the art in 1955.

 

Computer technology has advanced by further orders of magnitude since those days (e.g., Forester, 1980). When Von Neumann states in a later section that the natural system of the brain beats computer systems in other respects, such as size, one knows that this relation has also changed historically. The major conclusion, however, is that the difference is not to be found in the technical characteristics of computers versus brains. There are differences in the architecture and in the functioning of the components which also have to be taken into account.

 

While the neurons can be considered as the logical organs of the computing system in the brain, more complicated stimulation criteria emerge from its architecture. Neurons are stimulated not only by nerve pulse combinations adding up in terms of their numbers, “but also by virtue of the spatial relations of the synapses to which they arrive” (p. 54). For example, the pulses may be correlated. This would create non-linear dynamics. Von Neumann noted in this context that the concept of threshold “may turn around much more complicated relationships than the mere attainment of a threshold (i.e., of a minimum number of simultaneous stimulations), as discussed above” (p. 55).

 

Furthermore, simple (linear) notions like “summation” and “simultaneity” may hide more complex mechanisms in the time dimension that deserve to be studied experimentally. Von Neumann adds that in some cases it is not the size of the stimulus itself, but the size of its first-order derivative (the change) that may furnish a stimulation criterion. His conclusion is then formulated as follows:
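
A minimal sketch of such a derivative-based stimulation criterion, with the discretization and the threshold chosen for illustration only:

    def fires_on_change(signal, threshold):
        """Fire (1) whenever the stepwise change of the signal reaches the threshold."""
        return [int(abs(b - a) >= threshold) for a, b in zip(signal, signal[1:])]

    level = [1.0, 1.1, 1.2, 3.5, 3.6, 3.6]       # a slow drift with one jump (invented)
    print(fires_on_change(level, threshold=1.0))  # [0, 0, 1, 0, 0]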

 

It should be said, however, that all complications of this type mean, in terms of the counting of basic active organs as we have practiced it so far, that a nerve cell is more than a single basic active organ, and that any significant effort at counting has to recognize this. (…) Thus, all the complexities referred to here may be irrelevant, but they may also endow the system with a (partial) analog character, or with a “mixed” character. (ibid., pp. 59f.)

 

Thus, Von Neumann recognized the non-linear character of the operation of the brain, but he did not proceed by discussing levels of integration as they were familiar to the biology of his time. He preferred to express the non-linearity as a combination of digital and analog constructions of the machinery, so that the computer metaphor could be fully exploited.

 

4. Memory and higher-order functions

 

The final part of the essay (pp. 60-82) can almost be considered as a separate part, but, as noted, Von Neumann did not mark it off from the second part, which focuses on the brain. In this part, the author first turns to the memory function of the brain. In his time, this function had, in his opinion, not been clarified at all: “The only thing we know is that it must be a rather large-capacity memory, and that it is hard to see how a complicated automaton like the human nervous system could do without one” (ibid., at p. 61). The author then estimates (after some computation) the size of memory needed in the brain as 2.8 × 10²⁰ bits. Even relative to today’s computers this would be a very large memory.
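
If I read the text correctly, the arithmetic behind this figure can be reconstructed from the numbers Von Neumann assumes: 10¹⁰ neurons, an input of 14 bits per neuron per second, and a lifetime of 60 years, that is, roughly 2 × 10⁹ seconds:

    neurons      = 10**10      # number of nerve cells assumed in the text
    bits_per_sec = 14          # information received per neuron per second
    lifetime_sec = 2 * 10**9   # roughly 60 years

    print(neurons * bits_per_sec * lifetime_sec)  # 280000000000000000000, i.e., 2.8 x 10**20 bits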

 

The discussion then dwells on where all this memory may be stored, and how. Is there real forgetting in the memory function of the brain? In other words, do memory registers become available again for computation without prior conditioning? I am also not a brain scientist, and I do not know whether and to what extent these questions have been addressed during the last few decades. Let me in this context point to Von Neumann’s conclusion (at p. 67) that the principles of computation can be expected to be “entirely different from the one that underlies the basic active organs.” As noted above, this conclusion was anticipated by drawing attention to the possibility that the brain uses a “mixed mode” of control. The inference leads Von Neumann to a final consideration about the role of codes and languages in different domains of computation.

 

Reference is made to the work of the English logician Alan M. Turing. Turing (1947) had shown that it is possible to develop “short codes.” These codes enable a second machine to imitate the behavior of a fully coded machine. Short codes were further developed because of the desire to be able to code more briefly for a machine than its own natural order system would allow, by “treating it as if it were a different machine with a more convenient, fuller order system which would allow simpler, less circumstantial and more straightforward coding” (ibid., at p. 73). The imitating machine represents the imitated machine in its own domain.
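
In present-day terms, a short code is what we would call an interpreter. A toy sketch, with an invented instruction set, of one machine imitating another, more convenient order system:

    def run_short_code(program, x):
        """Interpret orders written in a convenient, invented instruction set."""
        for order, operand in program:
            if order == "ADD":
                x += operand
            elif order == "MUL":
                x *= operand
            else:
                raise ValueError(f"unknown order: {order}")
        return x

    # (3 + 4) * 2, written in the imitated machine's "fuller order system":
    print(run_short_code([("ADD", 4), ("MUL", 2)], 3))  # 14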

 

Developing his argument that the concepts of machine theory are appropriate for discussing the brain, Von Neumann hypothesizes that the logical structures of the latter may be different from the ones ordinarily used in logics and mathematics. “They are, as pointed out before, characterized by less logical and arithmetical depth than we are used to under otherwise similar circumstances. Thus logics and mathematics in the central nervous system, when viewed as languages, must structurally be essentially different from those languages to which our common experience refers” (ibid., p. 82).

 

The language of the brain can then be considered as a short code. Statistical properties other than the pulse-trains along neurons can be expected to contribute to the transmission of information in what we would nowadays call a non-linear way. These additional channels allow for the bypassing of a lot of computation (cf. Rumelhart et al., 1986). “However, (…) whatever the system is, it cannot fail to differ considerably from what we consciously and explicitly consider as mathematics” (ibid., at p. 82).

 

This final conclusion brings Von Neumann’s focus back to the mathematics. The analogy between the brain and the computer could not be formulated in terms of technical characteristics or structural properties. The two systems under discussion perform computational tasks by different means because they operate in different substances. On the formal side, however, these different substances (“hardware” and “wetware”) may develop analogous solutions within their own domains for evolutionary reasons, i.e., in terms of performance. The formal theorizing encompasses the substantive domains and their respective languages and codes in terms of abstract principles. This structuration can be expected to coevolve with the material dimensions of the systems under study.

 

5. Computation, communication, and control

 

In the four decades that have passed since Von Neumann wrote these lectures, all relevant fields of science and technology have witnessed spectacular developments. With hindsight, the essay under discussion here has primarily historical value. It reveals, among other things, the cautious hesitation of a great computer scientist and mathematician to jump too easily to another level of discourse, notably a biological one. While the biologist attempts to explain a level of integration (e.g., the cell) in terms of the underlying mechanisms, the mathematician has a different agenda: once the mechanisms are understood, to what extent can one reconstruct a natural system? What are the parameters relevant for the reconstruction?

 

The theory of autopoietic systems (Maturana, 1978; Maturana & Varela, 1980; Varela & Goguen, 1978) has since 1956 provided us with a conceptual apparatus of consensual, semantic, and linguistic domains that can be operationally closed. In this tradition, the development of code and language is considered endogenous to a system’s level. The self-organizing cell, for example, has to develop an internal “language” in which molecules can be provided with biological “meaning.” The next-order coding selects upon the previous-order ones and can therefore be expected to be relatively at rest. In other words, it can be considered as “shorter” code that windows (e.g., through function calls) onto domains containing longer code in potentially other languages (Simon, 1969, 1973).

 

Each substance communicates what it communicates. This generates variety. The reflections require, first, an embedded language for sustaining the organization of the system. At the meta-level, a reflexive (i.e., human) language is needed for studying how the variety can be organized and self-organized into higher levels of organization. Von Neumann explored in this essay whether and how these basic principles of operation can be specified in terms of the language of computer science. The abstract mechanisms (e.g., the Boolean algebra) provide an analytical dimension which can be used to reconstruct the processes in the material domains under study.

 

How the two levels (the material and the formal) are matched remains to be studied empirically. Von Neumann emphasizes the word “experimental” in this context. Thus, one would have to ask: which is the relevant laboratory? “Give me a laboratory and I will raise the world” was Latour’s (1983) programmatic title, with a reference to Pasteur’s microbiological revolution in nineteenth-century France. Mathematics provides us with the abstract concepts of operators, and the computer with a domain for analyzing these operators in principle, but the substances under study are also specific.

 

In addition to a mathematical theory (e.g., Shannon’s (1948) mathematical theory of communication; cf. Leydesdorff, 1995), one needs substantive theories that identify the specific dynamics of the systems under study. The mathematical principles abstract from the specifics of the substances and thus sometimes enable us to draw analogies. The mathematically formulated analogy can then be considered as a shorter code or a pathway from one domain of theorizing to another. Its value within the domain under study, however, remains to be investigated experimentally.

 

Von Neumann noted that the mechanism of using shorter code may also work the other way round: the material substance may contain mechanisms which are different from our common mathematics, but which are able to solve problems by using shorter code. This prediction has come true. For example, the so-called “traveling salesman problem,” which is virtually intractable using conventional computation (because it is NP-complete), can be solved by using DNA strands in a laboratory setting, thanks to the properties of this material (Adleman, 1994; Liu et al., 2000; cf. Ball, 2000). The biochemistry of the system must be understood in addition to the mathematical problem. The recombination of formal and material insights also provides us with new mechanisms for the computation of complex problems. Thus, the mathematics can function as a formal bridge between special theories that otherwise remain specific.
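
Reduced to its logical skeleton, Adleman’s strategy is generate-and-filter: the DNA chemistry produces an enormous number of candidate paths in parallel, and the laboratory procedure filters out all those that do not qualify. The following in-silico sketch runs the same logic, serially and therefore slowly, over an invented four-node graph (Adleman’s actual experiment used a seven-node instance):

    from itertools import permutations

    edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)}  # a directed graph, invented

    def hamiltonian_paths(n_nodes, edges):
        """Generate all candidate orderings; keep those whose steps are all edges."""
        for path in permutations(range(n_nodes)):
            if all(step in edges for step in zip(path, path[1:])):
                yield path

    print(list(hamiltonian_paths(4, edges)))  # [(0, 1, 2, 3)]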

 

Reflexively, the example informs us also about the dynamics of science. While the scientific discourses themselves are paradigmatically codified, the next-order codification in the formal language of mathematics sustains the codification (Kuhn, 1977). The sciences codify their subjects of study, and this feedback sustains the self-organization of the historical phenomena into a knowledge-based order (Leydesdorff, 2001).


 

References

 

Adleman, L. M. (1994). Molecular Computation of Solutions to Combinatorial Problems, Science 266 (5187), 1021-1024.

Ball, P. (2000). DNA computer helps travelling salesman, at http://www.nature.com/nsu/000113/000113-10.html

Corning, P. A. (2001). "Control Information": The Missing Element in Norbert Wiener's Cybernetic Paradigm? Kybernetes, 30 (9/10), 1272-1288.

Forester, T. (Ed.). (1980). The Microelectronics Revolution. Oxford: Basil Blackwell.

Kuhn, T. S. (1977). A Function for Thought Experiments, The Essential Tension: Selected Studies in Scientific Tradition and Change (pp. 240-265). Chicago: University of Chicago Press.

Latour, B. (1983). Give Me a Laboratory and I Will Raise the World. In K. D. Knorr-Cetina & M. J. Mulkay (Eds.), Science Observed (pp. 141-170). London: Sage.

Leydesdorff, L. (1995). The Challenge of Scientometrics: the Development, Measurement, and Self-Organization of Scientific Communications. Leiden: DSWO/ Leiden University; at http://www.upublish.com/books/leydesdorff-sci.htm .

Leydesdorff, L. (2001). A Sociological Theory of Communication: The Self-Organization of the Knowledge-Based Society. Parkland, FL: Universal Publishers; at http://www.upublish.com/books/leydesdorff.htm .

Liu, Q., Wang, L., Frutos, A. G., Condon, A. E., Corn, R. M. & Smith, L. M. (2000). DNA computing on surfaces, Nature 403, 175-179.

Maturana, H. R. (1978). Biology of Language: The Epistemology of Reality. In G. A. Miller & E. Lenneberg (Eds.), Psychology and Biology of Language and Thought. Essays in Honor of Eric Lenneberg (pp. 27-63). New York: Academic Press.

Maturana, H. R., & F. J. Varela. (1980). Autopoiesis and Cognition: The Realization of the Living. Dordrecht, etc.: Reidel.

Rumelhart, D. E., J. L. McClelland, & the PDP Research Group. (1986). Parallel Distributed Processing. Cambridge, MA/ London: MIT Press.

Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379-423 and 623-656.

Simon, H. A. (1969). The Sciences of the Artificial. Cambridge, MA/London: MIT Press.

Simon, H. A. (1973). The Organization of Complex Systems. In Howard H. Pattee (Ed.), Hierarchy Theory: The Challenge of Complex Systems (pp. 1-27). New York: George Braziller Inc.

Varela, F. J. & J. A. Goguen (1978). The Arithmetic of Closure, Journal of Cybernetics 8, 291-324.

Von Neumann, J. (1958). The Computer and the Brain. New Haven and London: Yale University Press.

Wiener, N. (1948). Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press.
