"The location of nerve cells, the trajectory of nerve fibers and the spatial array of synaptic connections are invariant in all individuals of the same species....In this way, the main neuronal circuits achieve an architecture that is breathtaking in its complexity, but frugal in its variability."
(Jacobson & Hunt, 1965, p. 26)
The obvious presence of systematic organization within the brain encouraged early students of behavior to seek specific relationships between structure and function. The roots of physiological
psychology can be traced rather specifically to a series of 19th century experiments by Flourens
(1824). By observing the effects of surgical damage, Flourens established the general functions of the spinal cord, the cerebellum, the medulla and the hemispheres. At about the same time, the
more detailed physiological studies of Bell and Magendie (ca. 1825) revealed the anatomical and
functional distinctions between sensory and motor nerves. Somewhat later, Fritsch and Hitzig
(1870) used electrical stimulation techniques to show an important principle of topographic organization, the point-to-point mapping of brain cells onto the muscle groups that they serve.
The results of these studies were readily accepted, not so much because they were without
contradiction, but more because of the Zeitgeist for classification schemes and molecular analysis
(cf., the cell theory and developments in chemistry).
All of these experiments were conducted within the framework of the subtractive model of
structure/function relationships. The principal method of testing this model is the ablation
experiment, which attempts to determine missing behavioral capacities following the surgical
removal of specific regions of the nervous system (i.e., the "subtraction" of a portion of the brain
to determine which aspect of behavior is missing).
The common sense appeal of the structure/function notion probably would have been sufficient to sustain the use of the ablation experiment, but a combination of technological advances and key
experimental findings pushed its popularity almost to the point of uncritical acceptance. In 1908,
Horsley and Clarke invented the stereotaxic instrument which made it
possible to direct surgical
damage to specific structures deep within the brain. In 1939, this instrument was modified for use
on the rat (Clarke, 1939), setting the stage for a series of impressive
demonstrations of the
ablation experiment. Hetherington and Ranson (1940) showed that
specific destruction of the
ventromedial nucleus of the hypothalamus reproduced the so-called clinical obesity syndrome
which, previously, had been associated only very loosely with tumors of the hypothalamus or
pituitary gland. Soon it was shown (e.g., Teitelbaum & Stellar,
1954) that specific destruction
of the neighboring lateral hypothalamus (less than a millimeter away) resulted not in obesity, but
in the rejection of food to the point of starvation.
Thus, the stage was set for at least a quarter century of attempts to relate specific areas of the
brain to specific behaviors. During recent years, this method has fallen into disrepute as methods
that appear to be more sophisticated have been developed. The problems are not restricted to the
technical deficiencies of the lesion method, and we may as well face them now before we go into
some of the pharmacological data.
One of the most frequent criticisms of the subtractive logic method is the tendency to ascribe
functions to "holes" in the brain. Technically speaking, a lesion in the auditory cortex does not cause a loss of hearing; rather, the remaining portions of the brain cannot mediate the function of
hearing. This point of logic seems to be at once subtle and obvious, but the fact remains that
errors in interpretation of these experiments are quite likely. The common example of the
naysayers is a hypothetical experiment which would first show that a frog will hop when a loud
noise is made. This behavior would disappear after removal of the legs. The conclusion: Frogs
hear with their legs. The real point to be made here is that this example seems absurd only
because we have so much independent knowledge about frog legs and hearing. We would not
have this luxury if we attempted to discover, for example, the role of the stria terminalis in eidetic imagery.
There are two major reasons why we should not cavalierly abandon the subtractive model of
structure/function relationships and the ablation method of testing this model. The first reason is
the simple assertion that function must have a structural basis, and that this structural basis lies
within the brain. Specific functions may not be closely tied to obvious structural organization, but
in the final analysis, some set of physical structures will be held accountable for behavior.
The second reason for holding onto the subtractive model of determining structure/function
relationships is, again, simple and compelling: subtractive logic is the only method available.
Flourens did not discover the subtractive model and the ablation method; rather, he applied to this specific problem the only tools and logic that the scientific community is willing to use:
(a) an assumption that the universe is lawfully organized (brain structure is related
to brain function),
(b) a belief that proximal cause and effect relationships exist (brain structure
actually accounts for brain function), and
(c) that cause and effect relationships can be determined by manipulation (changes
in structure produce changes in functions).
Subtractive Logic in Pharmacology
It is important to realize that pharmacological experiments are a direct extension of the ablation
method and subtractive logic. The logic being applied is that if a certain brain transmitter is
blocked and certain behaviors change, there is a (chemical) structure/function relationship (refer
to Fig. 2-1). All of the more sophisticated
chemical interventions (e.g., the mimickers, enzyme
blockers, alpha blockers, and others as outlined in chapter 3) are simple variants of this. Some
will argue that stimulation experiments (as opposed to lesion or chemical blockade) circumvent
these logical problems, but the arguments are less than compelling. A brief review of figure 1-10
will demonstrate the complexities that are involved. Stimulation of the parasympathetic nerve, for
example, results in either stimulation or inhibition of the muscle, depending on where one
measures the response. Likewise, chemical stimulation with NE results in either stimulation or
inhibition of the muscle, depending on where one measures the response. Stimulation
experiments are neither simple nor straightforward in their logic, and must be viewed with the
same eye for complexity as other experimental approaches.
Even developmental and comparative studies can be characterized by what might be referred to as a receding subtractive model. The logic is based on the fact that young animals that do not have
fully developed behavior potential show a common emergence of structure and function, including
the emergence of specific pharmacological systems. If time went backward, it would be a lesion
study. In fact, the lesion study does occur in nature as time goes forward, with changing behavior
as a result of changing structures during senescence. The ablation method, subtractive logic, and
the associated problems are pervasive. The effects of alpha-adrenergic blockade, precommissural
fornix transection, or the development of olfactory capabilities are no more or no less "messy"
than those of Lashley's (1929) classical experiments that attempted to
specify the brain
mechanisms of intelligence.
The point of all of this discussion is to encourage you to approach the field of behavioral
pharmacology with the same squinty-eyed skepticism that has been applied to the old-fashioned
lesion methods. It is a relatively new field of inquiry, but only in terms of the procedures. Highly
sophisticated approaches and details of neurochemistry do not alter the simple fact that the
interpretation of these results is subject to the same pitfalls as the interpretation of brain lesion
experiments. And a healthy level of skepticism will lead us to more detailed and accurate
accounts of the relationships between drugs and behavior. Let us examine a set of clinical and
experimental results that encompass a wide range of anatomical, pharmacological and
developmental observations to see how these methods may be applied.
TRAUMA AND GENETIC DISORDERS
One of the most bizarre cases of accidental brain injury is that of Phineas P. Gage, a railroad worker (see Bloom et al., 1985, for a detailed account). In the Fall of
1848, Gage and his crew
were blasting rock. The procedure involved drilling a hole in the rock, then stuffing the hole with
alternate layers of packing material and black powder. The packing material was tamped into
place with a long steel rod. In a moment of carelessness, Gage apparently tried to tamp the
powder layer, and a spark ignited the powder. The resulting explosion transformed the tamping
rod into a four-foot projectile which entered Gage's left cheek, passed through the top of his head,
and landed several feet away (Fig. 2.2).
To the amazement of Gage's crew, he sat up and began talking to them minutes later and, after a short trip by ox cart ambulance, was able to walk up the stairs to his hotel room. The attending
physician found that he could insert his entire index finger into Gage's brain case. He
placed bandages on the wounds, and followed him through an uneventful period of healing.
Within a few weeks, Gage's physical recovery allowed him to return to his job as foreman. He had no apparent intellectual deficits or memory losses. Yet, his return to work quickly revealed
the nature of the deficit that follows massive frontal lobe damage. The formerly mild mannered,
thoughtful and cooperative foreman had been transformed into a cursing, belligerent tyrant. He
lost his job, joined a traveling sideshow for a few years to capitalize (in a small way) on his
misfortune, and died during an epileptic seizure about 13 years later.
Phineas P. Gage's tragic case history has become famous in the annals of abnormal psychology as an example of altered behavior resulting from traumatic brain injury. Thousands of other case
histories involving trauma, cerebrovascular accidents, tumors and the like have provided parallels
to animal experimental data to build up a fairly accurate picture of the functional relationships
between anatomy and behavior. Although the effects of such damage are never as simple as one
would like, there is nonetheless a certain degree of predictability of motor disturbances, sensory
problems, intellectual deficits, emotional disturbances, memory losses, and so forth.
In the same way that Gage's case history serves as a useful parallel to experimental lesion studies,
the disease known as phenylketonuria can serve as a useful parallel to experimental pharmacology
(Kaufman, 1975). Phenylketonuria, usually referred to as PKU, is a genetic disorder that appears to follow the simple laws of Mendelian genetics. It is estimated that about 1 in 50
persons is a carrier of the recessive gene for this disorder. When two such carriers have children,
there is a 1 in 4 chance that the disorder will be manifested (on the average, 2 in 4 will be carriers;
1 in 4 will be normal). As a result, the incidence of the disorder is about 1 in 10,000 births.
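The incidence figure follows directly from the carrier frequency and the Mendelian ratios just cited. A minimal sketch makes the arithmetic explicit (the numbers are taken from the text, not from current epidemiological data):

```python
from fractions import Fraction

# Figure from the text: about 1 in 50 persons carries the recessive gene.
carrier_freq = Fraction(1, 50)

# Probability that both members of a random couple are carriers.
both_carriers = carrier_freq * carrier_freq        # 1/2500

# For two carriers, Mendelian ratios give 1/4 affected, 2/4 carriers, 1/4 normal.
p_affected = Fraction(1, 4)

incidence = both_carriers * p_affected
print(incidence)  # 1/10000, i.e., about 1 in 10,000 births
```

Exact rational arithmetic (`Fraction`) is used here simply so the result prints as 1/10000 rather than a rounded decimal.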
Phenylketonuria appears to result from a disorder of a single gene which controls a single enzyme that transforms the amino acid phenylalanine into another amino acid, tyrosine. When this liver
enzyme is missing or seriously deficient, phenylalanine accumulates in the blood, and related
metabolites called phenylketones begin to appear in the urine (hence, phenylketonuria). In the
absence of treatment, the prognosis is very bleak indeed. The patients typically show severe
mental retardation with an IQ less than 30, abnormalities of posture and reflexes, reduced
pigmentation, and a life span that rarely goes beyond 30 years.
The details of this disorder, which are still unfolding, can provide some valuable lessons in the
study of brain chemistry. In particular, it serves as a model of the complexity that must be
considered in any pharmacological manipulation. The first and perhaps most obvious
manifestations of PKU can be attributed to the effects of neurotoxins. In the absence of the
critical enzyme, phenylalanine levels rise to about 15 times their normal level and allow large
quantities of phenylketones to be produced. These substances may have some direct toxicity to
brain cell physiology, but more importantly, they appear to interfere with chemical transmitter
systems (Fig. 2.3). A series of different
neurochemicals called catecholamines (including
norepinephrine, DOPA and dopamine) share tyrosine as a precursor. In the absence of the
enzyme, there is less conversion of phenylalanine into tyrosine and deficits in the production of
these transmitters may occur. Another way of looking at this is that the PKU deficit makes
tyrosine an essential amino acid, since it can only be obtained from dietary sources.
Ironically, the current evidence suggests that this direct effect may be less important in the
resulting mental retardation than another, indirect effect. Another essential amino acid,
tryptophan, is converted into serotonin (5-HT), an important
neurotransmitter of the central
nervous system. There is some degree of overlap in the biochemical pathways that convert
tyrosine and tryptophan into their respective transmitters. As a result, high concentrations of
phenylalanine may inhibit the formation of serotonin. In untreated cases of PKU, serotonin levels
are dramatically low. The most successful treatment of PKU is the dietary restriction of
phenylalanine which prevents the build up of phenylalanine and related metabolites. This dietary
restriction produces a rapid elevation of serotonin levels and halts the progression of the disorder.
If such treatment is begun at birth, the major signs of retardation are averted, although some
minor difficulties always seem to remain, presumably because of prenatal factors.
The impaired function that results from untreated PKU cases is completely analogous to that of Phineas P. Gage--it is the result of localized brain damage. In the case of Gage's traumatic
lesion, the damage can be localized by an anatomical description. In the case of PKU patients, the
damage is localized by a biochemical description, i.e., a biochemical lesion of the serotonin system
and, probably, the catecholamines. The important point to remember is that drug effects and
transmitter disorders are, logically, no different from physical lesions or electrical stimulation.
ONTOGENY OF BRAIN CHEMISTRY AND BEHAVIOR
Development of Brain Structure
The development of the human brain is nothing less than awe inspiring. Between the time of
conception and the time of birth, the full complement of some 100 billion individual neurons is formed, each neuron migrating to its appropriate place within the brain structure. This number is perhaps a little easier to comprehend when one considers that, on the average, a million new brain cells are formed every four minutes between conception and birth! At the time of birth, all of the cells have
been formed, but the development of synaptic connections continues, perhaps for 30 or more
years in humans. Each neuron communicates with something on the order of 1,000 other neurons
and may have anywhere from 1,000 to 10,000 synaptic connections on its surface: 10³ synapses times 10¹¹ cells equals 10¹⁴ synapses, with an even more staggering number of possible synaptic combinations.
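The back-of-the-envelope figures in this passage are easy to verify. The sketch below assumes a gestation of roughly 270 days, which the text does not state explicitly:

```python
# Rough figures from the text; the ~270-day gestation is an assumption.
neurons = 1e11                      # ~100 billion neurons formed by birth
gestation_minutes = 270 * 24 * 60   # ~388,800 minutes from conception to birth

# Average formation rate: roughly a million new neurons every four minutes.
per_four_minutes = neurons / gestation_minutes * 4
print(f"{per_four_minutes:,.0f} neurons per 4 minutes")

# 10^3 synapses per neuron times 10^11 neurons gives 10^14 synapses.
total_synapses = 1e3 * neurons
print(f"{total_synapses:.0e} synapses")  # 1e+14
```

The rate works out to just over a million per four minutes, which is consistent with the figure quoted in the text.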
The orchestration of all of these connections during the process of development is further
complicated by the presence of numerous different chemical transmitter systems. Obviously, our
meager abilities to analyze the nervous system can hardly scratch the surface of this process, but
we can, nonetheless, gain some general insights into the important events that accompany the
development, maturation and ultimate degeneration of the brain.
In the first section of this chapter, we mentioned in passing that the developing brain could be
compared to a lesion experiment. Prior to the complete development of the brain, any behavior
that occurs must be mediated by the part of the brain that has developed, without the part that is
to be developed--exactly as in the case of Phineas P. Gage, whose behavior depended on the
remaining brain tissue. If the development of the brain were completely random, the study of
developing behavior would provide little information. But to the extent that specific systems may
develop in sequence, we may gain some information by looking for parallels between the onset of
certain behavioral abilities and the appearance of certain brain systems.
It is important to keep in mind the types of changes that we must look for in the developing brain. Even in the case of human brains, which are relatively immature at the time of birth, all of the cells
are present and, to a large extent, in position. This does not mean that the brain is functionally
mature at birth. In some cases, the maturation of the biochemical machinery that is necessary for
chemical neurotransmission is not completed until some time after birth. Furthermore, the
physical complexity of some cells, especially the formation of synapses, may continue for many
years after birth. Thus, there is an interdigitation of both anatomical and chemical development.
In general, the increasing anatomical sophistication of the developing brain follows a plan that has been termed encephalization (see Fig. 2-4). The spinal cord and brain stem regions develop and
mature at an early age, while midbrain and forebrain regions develop later. The cortex, and
particularly the frontal regions, are the last to show fully developed anatomical complexity. These
anatomical changes cannot be ignored, but for the present purposes, we will concentrate on the
development of neurochemical systems.
Emergence of Behavior
and Brain Chemistry
One of the recurrent themes in developmental psychology is that of the emergence of behavior.
The term emergence has been used advisedly, because it brings forth the image of something
popping to the surface of a pool, rather than the piecemeal, brick by brick construction of some
edifice. Although a case can be made for both types of appearances of behavior, the more
remarkable case is that in which the relevant behavior or set of behaviors suddenly appears in
full-blown form. A common example of this is the motor skill of walking in human infants. No
amount of "training" can teach an infant to walk at age six months, and a complete lack of training
does not appreciably retard the age of onset of walking. The "skill" usually emerges within a
week or two following the first tentative steps.
We are currently at a very primitive level of understanding of the emergence of more complex cognitive behaviors. (Unfortunately, the accumulation of this knowledge appears to be more of
the ilk of brick by brick construction, rather than some epiphany popping to the surface.) One of
the most complete stories that we have available at this point is that involving the
neurotransmitter acetylcholine. There are many gaps in the data, but if we are willing to fill these
with a bit of conjecture, we can see a coherent example of the parallel emergence of behavior and brain chemistry.
The story began with a theoretical review by Carlton (1963),
setting forth the notion that
behavioral excitation was under the control of brain systems that use norepinephrine as a
neurotransmitter, whereas behavioral inhibition was under the control of brain systems that
function through acetylcholine. Accordingly, an increase in activity could result either (a)
directly, as a result of stimulation of the norepinephrine system, or (b) indirectly, as a result of
blocking the acetylcholinergic system. (We will learn more about the specific drugs later, but for
the present purposes a general description will suffice.)
Campbell and associates (1969) extended our understanding of
these theoretical notions by
studying the development of these behaviors in young rat pups. When rat pups are about 14 days
of age, their ears have been open for only a few days, their eyes are just opening, their fur just
barely conceals their skin, and they are very cute. Even at this early age, a significant increase in
their activity is produced by drugs that stimulate the catecholamine (norepinephrine and related
compounds) systems. By contrast, attempts to increase activity indirectly by blocking acetylcholine functions were to no avail. By 18 to 21 days of
age, manipulation of the cholinergic
system began to have an effect, and by 28 days of age, the cholinergic blocking effect increased
activity in the same way that it does in adults.
The interpretation of these experiments suggests that the brain systems that use acetylcholine as a neurotransmitter are not yet functional in the 14-day old rat. Obviously, drugs that interfere with
such a system could not be effective, because there would be no substrate for them to work upon.
Once the system has become functional at three to four weeks of age, the drugs have a substrate
to work upon and become effective.
It should be noted here that the results of these experiments using a combination of drugs,
behavior and age point toward a very specific and testable conclusion: Namely, that a detailed
look at the brain should show the maturation of acetylcholinergic cells when rats reach the age of
three to four weeks. The results of neurochemical assays (Matthews et al.) provide confirming evidence for the late development of the acetylcholine pathways. Furthermore, related
experiments that involved both behavioral and neurochemical assays showed the emergence of
brain systems that utilize serotonin as a transmitter at about 10-12 days of age. Thus, we begin to
see a general tendency for the sequential development of systems that can be defined on the basis
of the neurotransmitter.
The behavioral functions that are mediated by emerging cholinergic systems appear to be far more complicated than the locomotor activity that was measured in these early experiments. As
Carlton suggested in his early review (1963), the cholinergic systems
seem to mediate a more
general inhibition of behaviors that are non-rewarded or actually punished. Behavioral studies in
numerous laboratories, including that of the authors (cf., Hamilton &
Timmons, 1979), have
buttressed this view, showing that the ability to perform tasks that involve behavioral inhibition does not reach maturity in the rat until about three to four weeks--the same time that the neurochemical systems mature. The neurochemical systems must be located somewhere, and this location defines the anatomical basis for these same behaviors. Although the neurochemical systems are somewhat dispersed in
the brain, the acetylcholine fibers are heavily concentrated within the limbic system and the
forebrain (cf., Feigley & Hamilton, 1971). Literally hundreds of
experiments have shown that
physical damage to these areas impairs the ability of animals to inhibit behaviors that are punished
or non-rewarded (cf., McCleary, 1966).
In the case of humans, these inhibitory abilities become very sophisticated and involve such
behaviors as inhibiting the attainment of immediate gratification because of complex social
contingencies that are likely to be in effect in the future (e.g., not eating the chocolate cake that
has been prepared for tomorrow night's birthday party). Interestingly, the inability to meet such
exigencies is characteristic of "childish" behavior, and inhibitory control emerges in rudimentary
forms at about 4 to 6 years of age. From that point on, the development of these abilities seems
to be more gradual, perhaps continuing until about 30 years of age. It is perhaps more than
coincidence that the anatomical complexity of the forebrain region shows a concomitant
development, and that many of these fibers utilize acetylcholine as the neurotransmitter.
Let us return for a moment to the case of Phineas P. Gage. Despite massive physical damage to the forebrain, he did not exhibit straightforward deficits in intelligence. Nor did the literally
thousands of patients who underwent frontal lobotomies during the unfortunate period of history
when these were in vogue--IQ scores typically remained the same or actually increased by a few
points (cf., Valenstein, 1973 for related discussion). Thus, man's
proudest possession, the big
frontal lobes, seem to be minimally involved in intelligence scores. But, they are involved in
foresight and the social control of behavior that will have future consequences. This may be the
unique feature of humanity, and it may involve to a large extent, the neurons in the limbic system
and frontal lobes that function via acetylcholine.
If we follow this line of argument and the associated experimental evidence forward into senescence, the story continues to hold together. One of the emerging characteristics of
senility is that of so-called "childish" behavior. A closer examination of these behaviors reveals
that it is the failure to take into consideration the future effects of one's behavior upon others--a
failure to respond to social contingencies that require the inhibition of behaviors. It has been
known for some time that the aging brain shows selective deterioration of small fibers of the
cortex. Evidence from autopsies performed at Johns Hopkins (e.g., Coyle et al., 1983) has shown that the majority of the fibers that deteriorate seem to be those that utilize acetylcholine as
the neurotransmitter, especially in the case of Alzheimer's disease.
Thus, the small cholinergic fibers that are the last to appear are also the first to be lost (see Fig. 2-5). The behavioral capacity for
complex inhibition is the last to appear and the first to be lost. In
the middle, interference with acetylcholine neurochemistry or destruction of the areas where these
fibers are concentrated results in impairment of inhibitory behaviors. All of this may be more than coincidence.
We have seen in this chapter some general approaches to the analysis of brain and behavior. We will go now to some of the details of the methods that are involved, and will return later to show the
application of these details to particular interfaces of behavior and brain chemistry.
2. Certain metabolic disorders may interfere specifically with individual neurotransmitter systems.
3. The major neurotransmitter systems mature at differing rates, with the behaviors that are
controlled by these systems emerging as cells become functional.
4. Senescence may involve the specific decline of some neurotransmitter systems while others
remain more or less intact.