er, said Koch. "If you open a book on perception, you see things like 'input buffer,' 'CPU,' and the like, and they use language that suggests you see an object in the real world and open a computer file in the brain to identify it."
Neurophysiologists in particular have preferred to attend to the details and complexities rather than grasp at theories and patterns that sometimes seem to sail off into wonderland. "As a neuroscientist I have to ask," said Koch, "Where is this input buffer? What does it mean to 'open a file in the brain'?"
Biological systems are quintessentially dynamic, under the paradigm of neo-Darwinian evolution. Evolution designed the brain one step at a time, as it were. At each random, morphological change, the test of survival was met not by an overall plan submitted to conscious introspection, but rather by an organism that had evolved a specific set of traits. "Evolution cannot start from scratch," wrote Churchland et al. (1990), "even when the optimal design would require that course. As Jacob has remarked, evolution is a tinkerer, and it fashions its modification out of available materials, limited by earlier decisions. Moreover, any given capacity . . . may look like a wonderfully smart design, but in fact it may not integrate at all well with the wider system and may be incompatible with the general design of the nervous system" (p. 47). Thus, they conclude, in agreement with Adams, "neurobiological constraints have come to be recognized as essential to the project" (p. 47).
If a brain does possess something analogous to computer programs, they are clearly embedded in the scheme of connections and within the billions of cells of which that brain is composed, the same place where its memory must be stored. That place is a terrain that Adams has explored in molecular detail. "I really don't believe that we're dealing with a software-hardware relationship, where the brain possesses a transparent program that will work equally well on any type of computer. There's such an intimate connection," he stressed, "between how the brain does the computation and the computations that it does, that you can't understand one without the other." Almost all modern neuroscientists concur.
By the time Marr died in 1981, artificial intelligence had failed to deliver compelling models of global brain function in general, or of memory in particular, and two essential conclusions were becoming apparent. First, the brain is nothing like a serial computer, in structure or function. Second, since memory could not be as