The introduction of a drug into a biological system is far more complicated than
adding a compound into a test tube. The initial dosage of the drug, the route of
administration, the rate of absorption, the rate of elimination and many other
factors enter into this complex equation. Ultimately, the effectiveness of the drug
is usually determined by the concentration of drug molecules in the plasma that are
free to interact with receptor sites (cf., Chapter 3).
The distribution of drug molecules to the receptor sites may be complicated, but it
is only the tip of the iceberg in terms of the organism's overall response to the
drug. Early in the text, a somewhat loose distinction was made between a drug
action (how the drug interacts with a specific receptor, e.g., mimicking
acetylcholine at muscarinic receptors) and a drug effect (the physiological or
behavioral results of this drug action; e.g., a decrease in heart rate or an
increase in arousal level). There is yet another class of drug effects that may or
may not involve the specific receptors that mediate the drug action: The presence
of the drug may trigger any of several different responses that change the reaction
to future encounters with the drug. This altered response to the drug is usually a
decrease (tolerance), although an increased response to the drug (sensitization) can
also occur.
The body has two general methods of increasing its tolerance to a drug. One of
these is to reduce the opportunity of the drug to reach the receptors (i.e., reduce
the drug action); the other is to launch a biological counterattack against the drug
effect (i.e., a compensatory reaction).
There are several different ways in which the receptors can be insulated from the
drug. The entry of free drug molecules into the bloodstream can be reduced by
lowering the rate of absorption from the stomach and intestines. This might be
accomplished mechanically by a change in blood flow or peristaltic action, or
biochemically by a reduction of the transport mechanisms that may be required to
carry the drug molecules across the membranes and into the plasma compartment.
Another possibility is to allow the drug to enter the bloodstream in exactly the
same way, but to reduce the final level that is reached by increasing the rate at
which the drug is eliminated (e.g., by the kidneys or liver). Finally, it may be
possible to increase the binding of drug molecules into complexes with other, larger
molecules to render them inert. Figure 9.1 summarizes these mechanisms of tolerance.
There are also several alternatives through which a compensatory response can be
made to drug effects. One of the most straightforward ways is to increase the
activity of an opposing system. For example, if a sympathetic agonist is increasing
the heart rate, this could be countered by an increase in parasympathetic activity
that reduces the heart rate. Although the contrapuntal relationship between the
sympathetic and parasympathetic systems has been overrated, mutual feedback systems
do tend to modulate and balance the activity of these systems, and some brain
systems may have comparable patterns of organization. The details of this
compensatory response are usually not clear for any individual case, but the general
features probably involve neuromodulation, which has already been discussed in other
contexts. If, for example, a drug reduces the amount of transmitter that is
released, the postsynaptic cells may respond by increasing the number and/or
sensitivity of receptor sites to maximize the effect of the available transmitter
molecules. (This process is basically the same as denervation supersensitivity, a
phenomenon which can occur if the fibers coming into a cellular region have been cut
and allowed to degenerate. Following the degeneration, the cells that have lost
their inputs frequently show an increase in the number of receptor sites and become
supersensitive to even small amounts of the missing transmitter substance.)
Alternatively, agonists may cause the postsynaptic cells to reduce the number of
receptors to prevent excessive levels of stimulation. At a still more complicated
level, behavioral tolerance may enter into the picture, with the organism learning
or otherwise adapting to the effects of the drug, such that the behavior is
normalized in spite of any prevailing physiological changes that the drug may
produce. These compensatory mechanisms are summarized in Figure 9.2 (cf., Ellison
et al., 1978; Lee & Javitz, 1983; Schwartz & Keller, 1983).
The common feature of all of these mechanisms of tolerance is that the response to
subsequent drug administration is changed. Depending upon the nature of the
particular response, the tolerance might be evidenced by a change in the effective
dose, the lethal dose, the time course of the drug effect, the range of effects, or
some combination of these. Furthermore, the changes that occur within one system
can even alter the future responses to drugs that are in a different pharmacological
class (cf., Glowa & Barrett, 1983). We turn now to some specific examples of
tolerance to demonstrate some of these reactions to repeated drug injections.
The term tachyphylaxis literally means rapid protection and is exemplified by the
tolerance that develops to the effects of indirect-acting drugs. One such drug is
ephedrine, which stimulates the sympathetic nervous system indirectly. As shown in Figure
9.3a, a standard dosage of ephedrine produces a rapid and short-lived increase in
blood pressure. If this same dosage is repeated at 10-minute intervals, the effect
becomes smaller and smaller until, after several dosages, there is virtually no
change in blood pressure. How could such a rapid tolerance develop?
The mechanism of this rapid tolerance can be inferred from the time course and from the
effects of other drugs. The tolerance does not represent a permanent change,
because the blood pressure response returns to its original magnitude if an
interval of several hours is allowed between doses. This pattern of rapid tolerance
that goes away quickly could be the result of fatigue of the smooth muscles that
cause the vasoconstriction. However, the effects of epinephrine show that this is
not the case. Repeated dosages of epinephrine continue to produce large elevations
in blood pressure, and a single dosage of epinephrine, given at a time when ephedrine
has no effect, will produce a full-scale change in blood pressure (see Figure 9.3b).
The interaction of these drugs with a third drug, reserpine, provides further
information about the mechanism of tachyphylaxis. Reserpine causes the gradual
depletion of norepinephrine from the sympathetic terminals. This results in a
decline in blood pressure, which can be readily reversed by epinephrine. By
contrast, ephedrine (even the first dosage) has no effect on blood pressure after
the transmitter substance has been depleted (see Fig. 9.3c).
The conclusion is that the tachyphylaxis is the result of a rapid emptying of the
transmitter substance from the synaptic vesicles (see Figure 9.3d). Ephedrine per
se has no direct effect on the smooth muscle receptors that mediate the change in
blood pressure. Rather, the elevated blood pressure is produced indirectly by
stimulating the release of the neurotransmitter from nerve terminals. Repeated
dosages of the drug in rapid succession release the transmitter faster than it can
be replaced, and the effectiveness of the drug declines. These conclusions are
further supported by the observation that norepinephrine administration not only
produces an increase in blood pressure, but it also partially restores the
effectiveness of ephedrine. The restoration occurs because the reuptake process
(cf., Chapter 3) incorporates some of the norepinephrine into the vesicles where it
can be released by the next dosage of ephedrine.
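The depletion account can be captured in a toy numerical sketch. All of the parameters below are hypothetical, chosen only to illustrate the mechanism: each closely spaced dose releases a fixed fraction of a transmitter pool that refills only slowly between doses.

```python
def tachyphylaxis(n_doses, release_fraction=0.6, refill_per_interval=0.05):
    """Amount of transmitter released by each of n_doses closely spaced doses."""
    pool = 1.0                       # normalized releasable transmitter pool
    responses = []
    for _ in range(n_doses):
        released = pool * release_fraction
        responses.append(released)   # pressor response tracks amount released
        pool -= released             # vesicles are emptied faster...
        pool = min(1.0, pool + refill_per_interval)  # ...than synthesis refills
    return responses

responses = tachyphylaxis(6)
# Each successive dose releases less transmitter, so the blood-pressure
# response shrinks even though the drug itself is unchanged; a long
# drug-free interval (many refill steps) would restore pool and response.
```

The declining output of the model mirrors Figure 9.3a: the drug action (triggering release) never changes, but the supply on which that action depends runs out.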
This form of tachyphylaxis is a special case of tolerance that does not involve any
compensatory reaction by the affected systems. It is a simple case of the drug effect
being limited by the capacity of the system to respond. The remaining types of
tolerance that will be discussed involve a much more dynamic and longer lasting
reaction to the effects of drugs.
Changes in Receptor Sensitivity
Tolerance can also be mediated by a change in the sensitivity of the relevant system
to the drug or transmitter. An example of this sort of tolerance can be seen in the
results of an experiment performed by Brodeur and DuBois (1964). They administered
daily dosages of an acetylcholinesterase inhibitor to rats. This blockade of the
inactivation of acetylcholine allows the transmitter substance to accumulate.
Initially, these drug injections produced a variety of parasympathetic symptoms,
including tremor and convulsions. By the end of 60 days, however, tolerance had
developed and none of these effects was observed.
There are several possible ways that such tolerance could be developed. For
example, the drug could lose its ability to block acetylcholinesterase. However,
assays demonstrated that the degree of cholinesterase inhibition remained unchanged
over the 60-day treatment period. This leaves open the possibility that the
acetylcholine levels were brought under control by some other mechanism, but
measures of acetylcholine showed the same high levels were produced by the drug on
Day 60 as on Day 1. How could tolerance develop if the actions of the drug remained
unchanged?
Suppose the observed tolerance occurred because the neural systems had become
refractory to the high levels of acetylcholine. This notion was tested by
administering carbachol, a synthetic drug that acts on receptors for acetylcholine,
but is immune to the inactivating effects of acetylcholinesterase. In animals that
had not received prior drug treatment, the LD-50 was 2 mg/kg. The rats that had
developed tolerance to the cholinesterase inhibitor were twice as resistant to the
effects of carbachol, showing an average lethal dose of 4 mg/kg. In other words,
the drug continued to inhibit the action of acetylcholinesterase, the resulting
increase in acetylcholine levels was maintained, but the tremors and convulsions
disappeared: The drug actions remained constant while the drug effects declined.
The most likely mechanism for this form of tolerance is a change in the sensitivity
of the postsynaptic membrane. This can occur through the process of neuromodulation
(see Figure 9.4). Synaptic activity is a dynamic process, which can be controlled
by either a change in the amount of transmitter substance that is released or a
change in the response to the transmitter. In the example described above, it would
appear that the neural systems responded to the high acetylcholine levels by
reducing the number of receptors (cf., Schwartz & Keller, 1983).
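The receptor-sensitivity account lends itself to a minimal feedback sketch. The model below is purely illustrative (the setpoint, rate constant, and units are hypothetical): postsynaptic activity is treated as transmitter level times receptor density, and the cell nudges its receptor density toward whatever value restores the setpoint.

```python
def receptor_history(days, transmitter, receptors=1.0, setpoint=1.0, rate=0.1):
    """Adjust receptor density a little each day; return its daily values."""
    history = []
    for _ in range(days):
        activity = transmitter * receptors         # net postsynaptic drive
        receptors += rate * (setpoint - activity)  # down- or up-regulate
        history.append(receptors)
    return history

# Cholinesterase inhibitor: chronically doubled transmitter levels
down = receptor_history(60, transmitter=2.0)
# Transmitter-depleting treatment: chronically halved transmitter levels
up = receptor_history(60, transmitter=0.5)

# Receptor density falls toward 0.5 in the first case and climbs toward 2.0
# in the second, so transmitter x receptors returns toward the setpoint in
# both: the drug action persists while the drug effect fades.
```

The same loop, run in opposite directions, captures both tolerance to an agonist (down-regulation) and denervation-style supersensitivity (up-regulation).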
Although it is not necessary to go through the details of a specific example, it
should be pointed out that the same mechanisms can result in tolerance to drugs that
produce a decrease in transmitter substance. The initial effects of transmitter
reduction are typically greater than the chronic effects. This type of tolerance
can be attributed to an increase in the number of postsynaptic receptors. This
effect has been described in earlier discussions as it applies to the phenomenon of
denervation supersensitivity. The increased sensitivity that follows nerve damage
can be viewed as tolerance to the physical damage.
These changes in receptor populations involve some rather major commitments of
cellular metabolism. As such, they have the properties of both inertia and
momentum; it takes some time (perhaps days or weeks) for the tolerance to develop
and perhaps even more time for the system to return to initial levels when the drug
is no longer present. These are very important considerations which will be seen in
more detail in the later discussion of rebound phenomena.
Enzymes are protein molecules that increase the speed of chemical reactions. They
typically have a rather high affinity for a particular chemical structure (the
substrate) and the enzyme-substrate complex proceeds through the chemical reaction
faster than the substrate alone. We already have seen several examples of enzymes
that are involved in neurotransmitter systems (e.g., AChE, MAO, COMT and tyrosine
hydroxylase). The liver has a rather extensive library of enzymes that facilitate
physiological processes (especially digestion) and help to break down toxic
substances from both internal and external sources.
The chemical specificity of enzymes allows for the precise control of chemical
reactions, but it also poses a problem. It would be very inefficient (not to
mention impossible) for the body to produce and store all the enzymes that might be
needed. It would be much simpler to have a way to limit production to those that
actually are needed. This is what happens in the process known as enzyme induction
(see Figure 9.5). When a new foodstuff or drug is encountered, it may induce the
metabolic machinery to produce an enzyme that has the specific ability to break it
down into simpler components that can be used by the body (in the case of
foodstuffs) or inactivated and eliminated (in the case of drugs).
The induction of enzymes involves protein synthesis, a process that may require
several hours or more to take place. What this means in terms of the metabolic fate
of drugs is that the drug molecules from the first injection may induce the
formation of the appropriate enzyme (usually by liver cells), but undergo metabolism
through the existing, sluggish pathways. Thus, the drug may stay in the system and
produce its effects for a long period of time. However, once the liver cells have
begun production of the enzyme, it is more readily available for encounters with the
drug molecules, and the breakdown reactions for subsequent dosages will proceed more
rapidly. It should be noted that this is not an all or none process, but rather one
which can be regulated by the number of times the inducing substance is encountered
and the amount that is presented. In any event, the induction of enzymes can result
in dramatically different rates of drug metabolism that are seen as examples of
tolerance.
The barbiturate drugs provide a good example of tolerance that is at least partially
the result of enzyme induction. Remmer (1962) administered high anesthetic doses of
pentobarbital to rats on three successive days. Rats in a control group received
daily injections of saline. Then all the rats received a lower test dosage to
determine if tolerance had developed. The rats that had been pre-treated with
pentobarbital slept only half as long as the rats in the control group (30 min vs 67
min). This change in sleeping time was paralleled by a change in the rate of
eliminating the drug from the system. The half-life of the drug (the time required
to inactivate half of the injected drug) in the control group was twice as long as
that of rats that had been pre-treated with pentobarbital.
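Assuming simple first-order elimination, the link between half-life and sleeping time can be made concrete. The concentrations and half-lives below are hypothetical (only the ratios matter), and the waking threshold is taken to be the same before and after induction, consistent with the observation that waking levels were comparable.

```python
import math

def sleep_time(c0, wake_level, half_life):
    """Minutes until the plasma level decays from c0 to the waking threshold,
    assuming first-order elimination: C(t) = c0 * 0.5 ** (t / half_life)."""
    return half_life * math.log(c0 / wake_level, 2)

# Before induction: hypothetical 60-min half-life
control = sleep_time(c0=40.0, wake_level=20.0, half_life=60.0)
# After induction of the degrading enzyme: half-life cut in half
induced = sleep_time(c0=40.0, wake_level=20.0, half_life=30.0)

# Halving the half-life halves the time spent above the waking threshold,
# mirroring the roughly twofold drop in sleeping time in Remmer's data.
```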
It could be postulated that the relevant brain cells became less responsive to the
effects of pentobarbital in the same way that the cells became less responsive to
acetylcholine in the previous discussion. The evidence does not support this. The
concentration of pentobarbital in the blood at the time of awakening can be used as
an index of the sensitivity of the cells to the drug. As the concentration
gradually falls, it eventually reaches a level that is low enough to allow the
animal to awaken. The rats that had been pre-treated were at least as sensitive to
the drug as control rats, with waking levels of the drug that were even slightly
lower than those of the control group.
The phenomenon of enzyme induction can produce a dramatic tolerance in terms of the
effective dosage of a drug, but it does not necessarily confer the same degree of
protection against a lethal dosage. In fact, the LD-50 can remain virtually the same,
while tolerance increases the requirements for an effective dose (the ED-50) until
it may be almost identical to the lethal dose. Let us examine this curious
phenomenon more carefully with a hypothetical extension of the pentobarbital
tolerance shown above.
The upper panel of Figure 9.6 shows the normal course of a barbiturate drug. After
injection, the drug is rather quickly absorbed into the plasma compartment. When a
certain concentration is reached, sleep ensues while the drug levels continue to
rise and produce a deeper level of anesthesia. Eventually, the drug will reach its
peak concentration, which is determined by the amount of drug, route of
administration and other factors discussed in Chapter 3. Meanwhile, the drug is
being metabolized and the plasma concentration begins to decline. When it reaches a
certain level, the animal awakens and the drug metabolism continues until the drug
has been eliminated from the system.
After several exposures to the drug, the enzyme that degrades the drug has been
induced, and the drug is removed from the system more rapidly (Figure 9.6b). This
shortens the sleeping time by allowing the animal to awaken more quickly. However,
the onset of sleep and the peak concentration of the drug in the plasma may show
little or no change if the absorption of the drug is fast relative to the rate of
drug metabolism.
Now, suppose an attempt is made to duplicate the original drug effect (60 min of
anesthesia) by increasing the drug dosage. The faster rate of drug metabolism
requires a very high dosage to forestall awakening for the full hour. In this
hypothetical case (see Figure 9.6c), the peak plasma levels are very near the lethal
level.
With the margin of safety (the ratio of the LD-50 to ED-50) reduced by tolerance, it
may be advisable to administer multiple doses over time (e.g., a supplemental dosage
every 20 min) to attain the same duration of action (see Figure 9.6d). These
effects demonstrate that tolerance can render a drug considerably more dangerous, a
finding that has important implications in the clinic, the laboratory, and on the
street.
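A hypothetical numerical sketch (arbitrary units, instantaneous absorption, first-order elimination) shows why divided doses can buy the same duration of action with a much lower, and therefore safer, peak concentration.

```python
def concentration(doses, t, half_life=30.0):
    """Plasma level at time t (min) from a list of (dose_time, amount) pairs,
    assuming instantaneous absorption and first-order elimination."""
    return sum(amount * 0.5 ** ((t - when) / half_life)
               for when, amount in doses if when <= t)

single  = [(0, 80.0)]                            # one large bolus
divided = [(0, 30.0), (20, 15.0), (40, 15.0)]    # supplements every 20 min

times = range(0, 61, 5)
peak_single  = max(concentration(single, t) for t in times)
peak_divided = max(concentration(divided, t) for t in times)

# Both schedules keep the plasma level above a hypothetical anesthetic
# threshold of 15 units for the full hour, but the single bolus peaks at
# 80 units while the divided schedule never exceeds about half of that.
```

With the short post-induction half-life, the single dose must start near-lethal to last the hour, whereas the divided schedule hovers safely above the threshold.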
The various types of tolerance are, in some sense, an extension of the concept of
homeostasis. The physiology of the organism reacts to a challenge by attempting to
return the system back to normal. In the case of drugs, this can have important
consequences not only for the changes in the effectiveness of the drug, but also for
the rebound changes that occur when the drug is no longer present.
The rebound phenomena can be observed readily in the case of nicotine and caffeine.
On the surface, it would seem that individuals who use these relatively mild CNS
stimulants should be easily identifiable. They should, perhaps, have faster
reflexes, be more vigilant, require less sleep, or be more aware of the environment.
Or perhaps they should be more irritable, anxious, or jumpy. None of these effects,
positive or negative, is observed. Virtually every attempt to extract an
identifiable difference in physiology or personality between smokers, coffee
drinkers, and nonusers has failed. The differences are revealed when these groups
are compared without drug. The details of the effects are complicated, but nearly
everyone has seen or been either a coffee drinker before the first cup in the
morning or a smoker trying to quit. The major effect of these stimulant drugs is
not to produce an average state of arousal that otherwise could not be attained;
rather, they come to prevent the rebound effects that would occur without the drug
(sleepiness, lack of energy, dysphoria, etc.). The mechanisms of tolerance actively
counterbalance the effects of the drug, and this balance can be unmasked by removing
the drug from the system (see Figure 9.7).
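This unmasking can be sketched with a simple opponent-process model (all constants are hypothetical): an adaptive counter-response grows while the drug is present and decays slowly after it is withdrawn, and the net state is the drug effect minus the counter-response.

```python
def run(days_on, days_off, drug_effect=1.0, build=0.3, decay=0.1):
    """Return the daily deviation from baseline during and after drug use."""
    counter, states = 0.0, []
    for day in range(days_on + days_off):
        drug = drug_effect if day < days_on else 0.0
        if drug:
            counter += build * (drug - counter)   # counter-response builds up
        else:
            counter -= decay * counter            # ...and decays only slowly
        states.append(drug - counter)             # net deviation from baseline
    return states

states = run(days_on=20, days_off=10)
# While the drug is taken, the net effect shrinks toward zero (tolerance);
# when it is withdrawn, the lingering counter-response drives the state
# below baseline (rebound) until it, too, decays away.
```

The flat middle of the trajectory is the "no identifiable difference" seen in habitual users; the dip after withdrawal is the morning-before-coffee state.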
The rebound effects can be considerably more serious than a little early morning
grumpiness. The chronic, heavy use of CNS depressant drugs such as barbiturates or
alcohol can set up dangerous counter-effects. When these effects are released by
the abrupt withdrawal from the drug, hyperexcitability occurs, which in severe cases
can lead to convulsions and death. With chronic, heavy use of alcohol, this rebound
hyperexcitability may be seen as largely irreversible motor tremors, especially of
the hands (the DT's or delirium tremens). These rebound effects are most pronounced
when the drugs are withdrawn abruptly after a period of sustained high dosages, but
this is not a prerequisite. So swift is the body's ability to counter these drugs
that a single dosage can set up rebound effects. An overdose of alcohol or
barbiturate first presents the danger of death through its depressant effects,
followed by the susceptibility to seizure activity that may be equally dangerous.
It is for this reason that the most superficially obvious treatment of barbiturate
overdose-- the administration of stimulant drugs such as strychnine or picrotoxin--
is itself a dangerous procedure.
Rebound effects are also major factors in the use of amphetamine and related drugs.
The actions of amphetamine are complicated and include both direct effects on the
postsynaptic receptors and the indirect release of the neurotransmitter substance
from the presynaptic vesicles. The behavioral effects include increased arousal,
greater physical energy, a heightened sense of well being, and even euphoria.
Tolerance to these effects occurs readily and probably involves all of the mechanisms
(transmitter depletion, receptor changes, and enzyme induction) that were discussed
above. A common sequel to the stimulant properties of amphetamine is a profound
depression, the depth of which is related to the amount and duration of the drug
administration. In practice, it is almost impossible to avoid the rebound
phenomena. In part because of the indirect actions of the drug, the effects tend to
be self-limiting, with the depletion of transmitter rendering the drug ineffective
until a period of time has been allowed to restore the system toward its previous
state.
B. BEHAVIORAL CONTRIBUTIONS
In studying drug actions and drug effects, it is sometimes easy to forget a basic
fact about the physiology of the nervous system: It was not designed for the
purpose of responding to drugs invented or harvested by man. The actions of drugs
can magnify, interrupt, speed up and delay, but the processes are those that are
inherent to normal functions and the maintenance of the brain and behavior.
Accordingly, we must not limit the phenomena of tolerance and withdrawal to the
realm of drug use, but must search out the relevance to normal functions.
Neurotransmitters can be viewed as endogenous drugs and they surely trigger their
own neuromodulatory and rebound effects. We turn now to some situations in which
behavioral processes impose dramatic limits on the effects of both drugs and
transmitters.
The phenomenon of tolerance can be no simpler than the actions of the drug to which
tolerance is developing. One of the fundamental principles of pharmacology is that
no drug has a single action (cf., side effects in Chapter 3). The extended
implication of this is that no drug can induce a single type of tolerance. If a
drug has a major effect and two distinct side effects, then it is very likely that
tolerance can develop to each of these independently. In some cases, this can be
the determining factor in the development of a drug for clinical use. For example,
if tolerance develops to a troublesome side effect within a few days or weeks, then
the patient may be able to benefit from the long term use of the drug. On the other
hand, if tolerance develops to the main effect, but not the side effect, then the
drug becomes less and less useful over time.
The development of tolerance to specific aspects of drug action may present some
difficulties, but the picture becomes still more complex when behavior is
considered. In the previous section, we developed the notion that behavior per se
could be likened to a drug, triggering rebound effects comparable to those occurring
in response to the administration of drugs. We turn now to a consideration of
behavioral contributions to drug tolerance and present some of the most intriguing
findings in the pharmacological literature.
The development of tolerance to the effects of a drug poses some of the same
problems of interpretation that are encountered in the recovery from brain damage.
In both instances, the initial effects frequently are more pronounced than the long
term effects. In the case of brain damage, the problems of interpretation are
particularly intractable. Does the recovery of some of the lost function reflect
the "take-over" by a related area of the brain? Is it due to some neuromodulatory
effect such as denervation supersensitivity? Or, is it the result of learning to
accomplish the same behavioral goal in a different fashion? There is evidence to
support each of these possibilities, but the conclusions remain tentative because of
the permanence of brain damage.
As researchers began to look more and more carefully at both tolerance and recovery,
some confusing observations began to appear. In some cases, clear recovery of
function following brain damage could be observed on one task but not another.
Likewise, drug tolerance could sometimes be observed in one measure but not another.
There was something missing in the interpretation of these effects. Fortunately,
most drug effects are considerably less permanent than brain damage (although some
of the tolerance and rebound effects may be very long lasting) and can be
administered repeatedly. This opens the door for experimental approaches that can
better answer some of the questions posed above.
One of these approaches, known as the pre-post design, can be used to demonstrate
the phenomenon known as behavioral tolerance. A series of experiments by Carlton
and Wolgin (1971) exemplifies this approach. The rats in these experiments received a
restricted diet of food pellets and were given a daily period of access to a
drinking tube that contained sweetened condensed milk. After several days, the
amount of milk that was consumed during this period stabilized and served as a
baseline for the drug and drug tolerance effects. If amphetamine was injected a few
minutes before the daily test session, the milk consumption was greatly reduced.
However, if the rats received amphetamine injections before each daily session, they
drank a little more milk each day until, by the end of the two-week experiment, the
rats that received amphetamine were drinking as much milk as the control rats that
received saline injections. Thus, tolerance had developed (see Figure 9.8).
The most obvious explanation of the increase in milk consumption is that one of the
pharmacological tolerance mechanisms described above had taken place to reduce the
effectiveness of the drug. Another possibility, however, is that the drug actions
remained essentially the same, but that the animals made a behavioral compensation
that allowed them to drink the milk despite the drug effects. The pre-post design
allowed a test of these alternatives by comparing the milk consumption of the
following treatment groups:
Group CON: Saline injections before each session
Group PRE: Amphetamine injections before each session
Group POST: Amphetamine injections after each session
The critical treatment group is the one that received the amphetamine injections
after each session. Obviously, the drug cannot be influencing the milk consumption
before it is given. (It could influence the consumption during the next session, 23
hr later, through residual effects of the drug or through conditioned aversion
effects, but these possibilities were controlled for in the complete design of the
study.) Even though the measure of milk consumption was taken deliberately in the
"wrong" place, one would still expect that pharmacological tolerance would be
developing to the amphetamine. Enzyme induction could occur, the number of
receptors could be changed, the amount of neurotransmitters could be changed, or
some combination of these could occur. But the measure of milk consumption would be
blind to these effects because it was taken long after the drug was given (23 hr).
The critical test occurred after these mechanisms of tolerance were given an
opportunity to develop. The rats that had been in Group POST now were given a test
dosage of amphetamine BEFORE the milk consumption. If tolerance to the amphetamine
had been developing, as it had in the rats in Group PRE, then the milk consumption
should have remained at the baseline level (see Figure 9.8). Instead, the milk
intake was substantially reduced! How could tolerance to amphetamine develop for
one group of rats but not the other? The answer (although it does not specify a
mechanism) is behavioral tolerance: the rats in Group PRE became tolerant to the
effects of amphetamine on milk consumption. Although pharmacological tolerance may
have developed over the treatment period, it was not sufficient to block the drug's
effects on milk consumption. The necessary component was behavioral experience
while the drug was in effect. The rats, in some sense, had learned to consume milk
despite the effects of amphetamine.
A comparable experiment has been done by Campbell and Seiden (1973) using
performance of a DRL (differential reinforcement of low rates) task to assess the
effects of amphetamine. The performance of this task is severely impaired by the
effects of amphetamine. However, tolerance develops with repeated injections, and
the behavior returns to the normal baseline that was obtained without drug. Again,
the pre-post test design showed that this return of normal behavior could not be
attributed to pharmacological tolerance. Rats that had received repeated injections
of amphetamine, but without the opportunity to perform the DRL task while under the
influence of the drug, still showed serious impairment when the drug was given
before the DRL test session.
The results of these experiments suggest some clinical considerations that probably
have not received the attention they deserve. The amphetamines and related
compounds are widely used by dieters in both prescription and over-the-counter
formulations. The long term effectiveness of this therapy is marginal at best. It
seems likely that humans, as well as rats, develop behavioral tolerance and learn to
consume the good things in life (including sweetened condensed milk) despite the
effects of the drug.
One possible explanation of behavioral tolerance is that it somehow blocks out the
ability to perceive the effects of the drug. This is probably not the case. Bueno
and Carlini (1972) showed that the ability of rats to climb a rope was impaired by
THC (the active component of marijuana). After tolerance developed, the rats were
able to climb the rope as well as control animals, but were nonetheless capable of
discriminating (in a different task) the presence or absence of THC.
Environment and ritual.
One of the most dramatic demonstrations of the power of behavioral tolerance has
been provided by Siegel and coworkers (1982). The repeated administration of
opiate drugs produces a remarkable degree of tolerance. In order to maintain the
same level of analgesia over a period of days, the drug must be administered in ever
increasing dosages and can reach levels that may be several times higher than the
LD-50 established for naive animals. These investigators administered a schedule of
increasing dosages of heroin, working the rats up to a dose that they could not have
tolerated initially. Half of the rats received this series of injections in the
colony room. The other half were removed from the colony to a test room, which
differed from the colony both visually and by the presence of 60-dB white noise, and
there they received their injections of heroin. As expected, all of the rats withstood the
increasing dosages of heroin.
After completing this series of increasing dosages, the rats were given a single
test dosage of 15 mg/kg of heroin. This is a very large dosage of the drug, being
close to the LD-100 (96% mortality) for rats that have had no experience with the
drug. Rats that had received the series of heroin injections showed a substantial
increase in the ability to withstand the drug, with only a 32% mortality rate.
However, a large portion of this protective effect was attributable to behavioral
rather than pharmacological tolerance. If the rats received the same injection, but
simply in a different room (colony rats in noise room; noise room rats in colony),
they were twice as likely to die (64%). The association of a particular environment
with the administration of a drug adds to the ability to compensate for the effects
of the drug. In this case, the learned aspects of the tolerance can literally mean
the difference between life and death.
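Treating the mortality rates reported above as given, the protective effect of prior heroin experience can be split into a pharmacological component (retained even in the novel room) and an associative component (lost when the environmental cues change). The decomposition below is an interpretive bookkeeping sketch, not an analysis from the original paper:

```python
# Mortality rates as reported in the text (Siegel et al., 1982).
naive     = 0.96  # drug-naive rats given the 15 mg/kg test dose (~LD-100)
same_room = 0.32  # heroin-experienced rats tested in their usual environment
diff_room = 0.64  # heroin-experienced rats tested in a novel environment

# Protection relative to naive animals, split into two components
# (an illustrative decomposition, assuming the effects are additive):
pharmacological_protection = naive - diff_room      # survives the room change
associative_protection     = diff_room - same_room  # lost when the cues change
```

On this arithmetic, the two components happen to be equal in size (0.32 each), which is why changing rooms doubled the death rate.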
Siegel proposes that many of the deaths that occur through drug overdose have a
behavioral component. Addicts frequently develop ritualistic behavior associated
with the administration of a drug (same place, same people, etc.). When this ritual
is changed, for example, by purchasing the drugs on the streets of another city, the
likelihood of death through overdose is increased. The common explanation that the
drug obtained was more potent than that usually used may be true in many cases, but
the behavioral component may be a major factor as well.
How is it possible for an animal to behaviorally reduce a drug effect? Some of
Siegel's earlier work provides a possible answer to this question (Siegel, 1975).
The work centered on the possibility of the Pavlovian conditioning of drug effects.
Suppose, for example, that the injection procedure (the room, the handling, the
insertion of the needle) is always conducted in the same manner. This set of
stimuli could serve as a conditioned stimulus (CS) to predict the physiological
changes (UR) that would follow as a result of the drug injection (the unconditioned
stimulus, or US). What would happen after a number of such pairings if saline were
substituted for the drug? The CS (injection procedure) would be the same as always,
but would a conditioned response (the physiological change) be observed?
The drug under investigation was insulin, which lowers the blood sugar levels. If
insulin injections are given in the same manner, is it possible to get a conditioned
change in blood sugar levels that parallels the conditioned salivation that occurs
when a bell has signaled the presentation of food powder? The answer is yes, but
the direction of the effect is opposite to what one might first expect. Instead
of getting a conditioned lowering of blood sugar, Siegel observed a conditioned
increase in blood sugar. This makes perfect sense if the conditioned response is
viewed as an attempt to compensate for a predicted change in the internal environment.
Normally, the amount of insulin produced by the animal is controlled within rather
narrow limits to regulate the level of glucose utilization by the cells, prepare for
digestive loads, etc. The injection of an outside source of insulin disturbs this
balance, and the animal must reduce its own production of insulin in an attempt to
counteract this effect. According to Siegel's results, this compensatory response
can be learned, and a sham injection procedure causes a reduction in insulin and a
corresponding increase in blood sugar. Of course, little or none of this is
cognitive learning (try to imagine how you would voluntarily reduce your own insulin
levels), but the mere fact of association is sufficient to trigger these processes
according to the laws of Pavlovian conditioning.
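The compensatory logic of Siegel's account can be caricatured with a simple delta-rule model. Everything here is an illustrative assumption (the baseline, the learning rate, the size of the insulin effect); the point is only that a response learned to *oppose* the US produces tolerance on drug trials and a paradoxical rise in blood sugar on saline trials:

```python
# Hypothetical delta-rule sketch of compensatory conditioning.
# The US (insulin) lowers blood glucose by `drug_effect`; the learned CR
# grows toward the opposite of that effect.
BASELINE = 100.0  # resting blood glucose, arbitrary illustrative units

def train_cr(trials=30, lr=0.2, drug_effect=-30.0):
    """Learn a compensatory CR that approaches the negative of the US effect."""
    cr = 0.0
    for _ in range(trials):
        cr += lr * (-drug_effect - cr)  # CR climbs toward +30, opposing the drug
    return cr

cr = train_cr()
drug_trial   = BASELINE + (-30.0) + cr  # tolerance: the drop is nearly cancelled
saline_trial = BASELINE + cr            # CS alone: conditioned *hyperglycemia*
```

After training, the drug trial ends near baseline (tolerance), while the sham-injection trial overshoots it, which is the direction of Siegel's observed conditioned response.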
Once again, the importance of these phenomena in the clinic should not be
overlooked. When drugs that cause pronounced physiological effects are given over
long periods of time, there is a very real possibility that Pavlovian learning
processes may take place to counteract the effects of the drug.
Opponent Process Theory
Solomon and Corbit's theory of opponent processes is modeled after well-established
events that occur in the sensory systems. The most familiar of these are the
negative after images that occur in the visual system. If an individual stares
steadily at a relatively bright object, say a television screen in a dimly lighted
room, the absence of that stimulus produces a curious illusion. When the gaze is
shifted to a neutral part of the room, a ghostlike image of the screen is projected
onto the surface and this image is dark rather than light. Hence, the term negative
after image. The negative after image is also familiar in the case of indoor
photography. The bright flash of light is followed by after images (usually
negative, but sometimes alternating positive and negative) of small dots that are
projected onto the "real" visual world. These after images even extend into the
realm of color vision, with the after images being of the complementary color (red
objects produce a green after image, blue objects produce yellow, and vice versa).
Comparable illusions can appear with the motor system, as evidenced by the "light-footed" feeling that occurs when a pair of heavy boots or roller skates is removed.
But does it make sense to apply these principles to something as complicated as
emotions? Probably yes.
There are three major components to the opponent process theory:
Affective contrast is the most fundamental of these, and closely parallels the
response of the visual system to light. The presentation of a bright light produces
a peak response followed by rapid adaptation to a stable level (the A-response in
Figure 9.9). When the light is turned off, a negative after image occurs and
gradually dissipates with time. The magnitude of these effects is related to the
intensity of the stimulus. Several observations can be cited to relate this to
emotions. An infant may be lying quietly in its crib, exhibiting no particular
emotion. If a nipple containing a sugar solution is offered, a positive response is
obtained (the A-response). Withdrawal of the nipple results in vigorous crying (the
B-response), an effect which would not have been observed if the positive stimulus
had not been presented. Comparable effects can be observed in the case of initially
negative stimuli. Electric shock administered to dogs can produce an increase in
heart rate. When the shock is terminated, there is a dramatic decline in heart rate
and the dogs may show behavioral excitement. This is very likely the laboratory
equivalent of the frenzied play activity that sometimes follows the administration
of a bath to a dog (or a child, for that matter!). A more familiar example to
students may be the excited chatter that frequently fills the hallways after a major
exam.
The story gets more complicated with affective habituation. If a bright light is
presented for a long period of time, habituation occurs and the perceived brightness
is greatly diminished (the A'-response in Figure 9.9). But when the light is
terminated, the negative after image is both stronger and more enduring than it was
following a brief, initial exposure (the B'-response). In the laboratory, this can
be observed with repeated presentation of shock to dogs. After a time, the shock no
longer produces a change in the heart rate, but the "after-image" (the decrease in
heart rate) becomes very pronounced. Again, the same pattern emerges in the case of
human emotions: The heart throbbing, adrenergic effects of a new amour might well
become a health hazard if they continued; but in the words of the songwriter,
"...after 16 years of marriage, the fires don't burn so hot!" (Harry Chapin).
Returning to the theory, the A'-response takes over, but the stage is set for a
tremendous B'-response if the stimulus should be terminated, e.g., the grief response
that follows the loss of a loved one.
The third aspect of the theory, affective withdrawal, is really just a sharpening of
the concepts described by the first two. We will describe two of the numerous
examples put forth by the theorists. One of these involves the sport (?) of
skydiving. For the naive jumper, the period before the jump is filled with anxiety.
This anxiety is galvanized into terror with the actual jump, and relief follows a
safe landing. Should the individual continue this pastime, the emotions that color
the experience undergo the pattern of affective habituation and contrast described
above. Anxiety is replaced by eagerness, the terror is downgraded to a thrill, and
relief is transformed into intense exhilaration--the raison d'être for what would
otherwise be a silly thing to do. A similar pattern can be applied to the abuse of
a drug, for example, heroin. The initial presentation is preceded by a state of
rest, the drug's actions produce a "rush", and the aftereffect is one of craving.
With veteran users, there is a shift in the emotions, and the drug's actions produce
a state of contentment (rather than a rush). This contentment (the A'-response) is
followed by abstinence agony (the B'-response), which turns into an intense craving
for the drug, and the drug now has only the capacity to relieve the craving rather
than reproducing the initial rush-- and the circle continues (see Figure 9.10).
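The a- and b-processes described above lend themselves to a toy simulation. The sketch below is a hypothetical leaky-integrator rendering of Solomon and Corbit's idea, not their formal model: the a-process tracks the stimulus quickly, the slower b-process opposes it, and strengthening the b-process (as in a veteran user) flattens the A'-response and deepens the B'-after-reaction. All time constants and gains are invented for illustration:

```python
# Toy opponent-process dynamics: net affect = a - b.
def simulate(stim, tau_a=2.0, tau_b=20.0, b_gain=0.5, dt=1.0):
    """Return the net affective response over time for a stimulus train."""
    a = b = 0.0
    net = []
    for s in stim:
        a += (s - a) * dt / tau_a           # fast primary (a) process
        b += (b_gain * a - b) * dt / tau_b  # slow opponent (b) process
        net.append(a - b)
    return net

stimulus = [1.0] * 100 + [0.0] * 100        # stimulus (drug) on, then off

naive   = simulate(stimulus, b_gain=0.5)    # weak b: big A-peak, shallow B-dip
veteran = simulate(stimulus, b_gain=0.9)    # strong b: blunted A', deep B'
```

In the naive trace, the response peaks (A), adapts to a plateau, and dips below baseline at offset (B); increasing the opponent gain reproduces the veteran pattern of a muted A'-response followed by a much deeper B'-withdrawal.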
The opponent process model is also relevant to many of the paradoxical effects that
accompany goal attainment. Reinforcement in an operant schedule does not
necessarily spur the pigeon into immediate further action, but rather may be
followed by a post-reinforcement pause. The attainment of a long-sought goal such
as a college degree is frequently followed by a bout of depression, and the
postpartum blues are almost unavoidable. Solomon and Corbit emphasize the view that
all of these effects are noncognitive in nature. That is, they are not the result
of a logical, cognitive analysis of the present environment, but rather are the
result of a previous environment that no longer applies.
These changes may well be noncognitive, but they cannot-- if we continue our
attempts to view the brain and behavior in a lawful relationship-- be nonphysical.
Powerful stimuli produce powerful changes in the neurotransmitter systems, and these
in turn trigger the processes of neuromodulation, changes in receptor sensitivity,
and even transmitter depletion. These reactions, like tolerance to a drug, alter
the responses to standard stimuli and set the stage for withdrawal reactions.
Although the opponent process theory has not gained universal acceptance, we shall
risk pushing it one step further in terms of the noncognitive aspects of emotions.
There will not be many students of this book who will recall the Mary Tyler Moore
show, but one of the episodes provided a poignant example of opponent processes in
action. A dear friend of Mary's, Chuckles the Clown, died. Of course, she was
stricken with grief, and she decried all references to the lighter side of his life
and career. At the funeral, however, she was overcome by an uncontrollable urge to
laugh; not hysterical, unfeeling laughter, but true, euphoric, high spirited
laughter. Why? The grief reaction is understandable in the framework presented
above. However, the grief itself is a powerful stimulus that can set up its own
opponent processes, and as grief subsides, periods of unexplainable high spirits may
penetrate the prevailing negative mood (bringing with it a certain burden of guilt).
The point of all this is that the brain is a dynamic system that can respond quickly
in terms of neuronal action potentials, but more slowly in terms of the chemical
adjustment of the overall tonus of a transmitter system. These changes occur in
direct response to the changes in the environment, but the properties of inertia and
momentum do not always allow the changes to reflect, in a veridical manner, what is
happening at a particular moment. An appreciation of these facts can make the
emotional responses to a loss (or for that matter, a gain) a lot more
understandable. Clinicians now expect a recurring cycle of mood changes following a
loss such as a serious knee injury in an athlete. The first phase is one of denial
that the injury is serious or that the loss will have a major impact on the
individual's life. This is followed by anger. The anger is followed by depression.
The depression, in turn, may be followed by denial that may even take on a flavor of
a high spirited, can-do attitude about coping with the injury. Most of these
changes can be characterized as noncognitive in that they bear little relationship
to the current changes in the outside environment. They are, almost literally, drug
withdrawal reactions.
C. FOUNDATIONS OF ABUSE
The terms drug addiction and drug abuse once seemed like eminently reasonable
descriptive terms. The use of certain drugs produced a physical dependence,
creating a situation in which the body required the presence of the drug to maintain
normal physiological functions. This physical dependence on the drug was the basis
for the individual's profound need for the drug, the addiction. These definitions
worked fairly well for certain classes of drugs and drug users, but there was a
growing list of instances in which the definitions seemed inappropriate. Tobacco
use, for example, certainly involves craving, but the degree of actual dependence
(i.e., physiological need) is much less dramatic than in the case of morphine or
barbiturates. There is virtually no danger of death or severe symptoms of
withdrawal even with complete abstinence. A distinction has sometimes been made
between a drug habit and a drug addiction to reflect, in a rather rough manner, the
differing physiological bases that control the use of the drug. These distinctions
blur, however, with differing patterns of use, and the three-pack-a-day smoker may
well have a greater physiological need than the individual who manages to limit the
use of morphine. The distinction is equally blurry when one tries to draw the lines
between moderate drinking, heavy drinking, and alcohol addiction.
As it became apparent that the physiological measures of dependence or the amount of
drug used could not clearly define addiction, new terminology began to arise. The
term addiction began to give way to the term drug abuse, which suggests a greater
behavioral contribution. If an individual's use of a drug is extensive enough to
interfere with work, family, or lifestyle, then the drug is being abused. There are
still fuzzy edges in this definition, but the term is somewhat more realistic than
the term addiction, because it reflects the pattern of use as well as the amount of
drug used.
There also can be some argument concerning the term drug. Almost everyone will
agree that morphine is a drug, but some will balk at considering coffee as a drug,
and consensus becomes even more difficult in the case of chocolate bars, nutmeg or
peanut butter. This dilemma was met with yet another evolution of the terminology,
and researchers now speak of substance abuse. This too shall pass: There is a
growing recognition (especially with the burgeoning business of state lotteries)
that behavior itself can be the object of abuse. There is a commonality among the
heroin junkies, smokers, coffee drinkers, beer drinkers, gamblers, overeaters,
workaholics, and maybe even runners. They are neither inherently evil nor
necessarily burdens on society, but they are all caught, to some degree, in a
behavioral and pharmacological trap. We turn now to an examination of this trap.
Although physiological dependence may not be essential for addiction or abuse (cf.,
Bozarth & Wise, 1984), it certainly can be an important contributor. This is most
easily seen in laboratory models of addiction in which animals are given the
opportunity to self administer drugs. In some cases, the animals may simply be
given access to a solution that contains the drug and allowed to freely ingest the
substance. More commonly, the drug is used as a reinforcer in an operant
conditioning situation as shown in Figure 9.11. A catheter may be permanently
implanted into a blood vessel (e.g., the jugular vein or carotid artery), with
provision made to connect the catheter to an outside source via a small tube. When
the animal has fulfilled the requirements of the schedule of reinforcement, a small
amount of drug is injected as a reinforcer.
In general, there is a fairly close correspondence between the list of drugs that
are abused by humans and the list of drugs that animals will self administer in the
laboratory. Among these are the opiates, the barbiturates, amphetamines, and some
hallucinogens. The drugs that can be used as reinforcers appear to have three
characteristics in common:
The administration of morphine under laboratory settings not only parallels some of
the features of human drug use, but also parallels some of the aspects of
conventional drives and reinforcers. As in the case of food and water, the drug can
serve as a reinforcer for operant behavior. If the reinforcer is given outside of
the operant setting, there will be a corresponding decrease in lever pressing, while
withdrawal from the reinforcer will result in higher rates of responding when the
subject is returned to the operant chamber. But there is an important difference.
The drug not only acts as a reinforcer, but sets the stage for the development of
the motivation to obtain the drug. Presumably, the rat has not had a lifelong
yearning to obtain morphine, nor can we attribute the initial lever pressing to peer
pressure or the ills of society. The first dosage of morphine appears to have some
immediate reinforcing value, but more importantly, it initiates a chain of
physiological events that now result in a deprivation state that was not there
before: The absence of morphine is aversive. Eventually, the positive rewarding
effects of morphine may pale in comparison to the aversive effects of not having the
drug, and the resulting behavior may be more akin to avoidance behavior than
responding for reward.
Not all drugs that have abuse potential show such close parallels between human
usage and laboratory models. Researchers have found that it is almost
embarrassingly difficult to get laboratory animals to consume alcohol. The taste is
sufficiently aversive to the naive palate to prevent consumption in amounts that
lead to tolerance, rebound effects, etc. It is usually necessary to coerce the
animals to consume the alcohol by making it a part of their required food or water
supply. However, once the alcohol consumption has been established, the animals
will readily and voluntarily maintain the "habit". But why should it be so
difficult to establish alcohol abuse in the rat while it is so difficult to prevent
it in man? There is no simple answer to this question, but a more careful analysis
of the drug administration procedures may provide some important clues.
Goldberg and associates (1981) performed an experiment in which monkeys were given
the opportunity to press a lever to obtain a small intravenous injection of
nicotine. Although the monkeys pressed the lever enough to receive a few injections
(thereby having the opportunity to experience the drug effects), the rate of
pressing did not increase, but rather remained at the low level that was shown by a
control group that received saline injections. Again, the failure to demonstrate
self administration was curious in view of the ability of nicotine to reinforce
behavior in humans. These investigators made a clever extension of their results in
a second experiment: Whenever the monkeys earned a reinforcement, it resulted in both
the drug injection and the change of a green light to amber as the subjects
entered a 3-min period of darkness during which the drug effect developed.
This additional stimulus had no effect on animals that were receiving saline
injections, but greatly enhanced the self administration of nicotine.
Why should the addition of an external stimulus aid the establishment of a nicotine
habit? And even if it works, does it not belittle the results somewhat to have to
resort to this sort of a crutch to demonstrate the drug administration? The answer
to both questions may be found in a series of experiments performed by Snowden
(1969). There was some controversy about whether the amount of food eaten is
regulated by the acts of chewing, tasting, and swallowing, or by monitoring the
caloric feedback from the food in the stomach. One way of testing
these alternatives was to place the rat in a situation in which all nutrients were
obtained by pressing a lever to inject liquid diet directly into the stomach. This
procedure is directly comparable to that used for the self administration of drugs,
and most experiments demonstrate that the rats are remarkably accurate in
controlling the overall calories that are ingested in this manner. But Snowden
showed that the results were not as clear-cut as they seemed. The liquid diet is
prone to spoilage in these long-term experiments, but the problem can easily be
avoided by keeping the reservoir in an ice bath. This prevents the spoilage, but
when the rat earns a reinforcement, it receives not only a small amount of diet in
the stomach, but also experiences a cool tactile sensation as the liquid passes
through the tube under the skin of the head and neck en route to the stomach. When
Snowden warmed the liquid to body temperature before it reached the skin, the
ability of the liquid diet injections to serve as a reward was greatly diminished.
Why should this happen?
In both cases, the reinforcing value of a substance was enhanced by the addition of
some external stimulus. The most likely explanation of these results is that the
external stimulus helps to bridge the gap in time between the physical delivery of
the reinforcer and the actual physiological change that results. In the real world
and most laboratory situations, these external mediators are the rule
rather than the exception. The sight, smell, taste, and texture of food are all
powerful reinforcers that signal the ultimate physiological reward, caloric energy.
In terms of the immediate ability to control behavior, these harbingers of
physiological change are more important than the real change. When the situation is
so tightly controlled that only the physiological change can be experienced, the
reinforcing value is greatly diminished. The situation becomes, in a sense, a
Pavlovian delayed conditioning procedure which is successful only after many trials,
if at all.
The picture that emerges is that environmental cues and behavior are an inextricable
part of drug effects and of drug abuse. These behaviors become an important part of
the overall pattern of abuse, even when the drug action is so fast that an external
stimulus is not essential to bridge the gap. Consider, for example, the
administration of nicotine by cigarette smoking. How does a drug that requires an
environmental bridge in the laboratory gain such control over so many people in the
natural environment? One reason is that the route of administration is ideal in
terms of the speed of the effect. The inhalation of nicotine in tobacco smoke
produces very rapid effects, reaching the brain within 8 seconds (Jaffe, 1980).
This is even faster than an intravenous injection into the arm, and is fast enough
that each puff of the cigarette can produce a discrete, detectable drug effect!
This rapid delivery of distinct reinforcements serves not only to maintain the
behavior, but also provides an excellent environment for the development of
secondary reinforcers of associated behaviors such as manipulation of the
cigarettes, oral contact, the smell of the smoke, and specific times and places
(e.g., after meals, while driving, while reading the paper, etc.).
Environmental and behavioral cues are not only important contributors to the
rewarding effects of drugs, but also to the motivational states that direct the
organism toward specific drug effects. Certainly, a major aspect of these
motivational states can be attributed directly to the physiological actions of the
drug. The effects of enzyme induction, neuromodulation and rebound phenomena all
contribute to an internal environment that can be "corrected" by an additional
dosage of the drug. But certain aspects of these physiological changes can be
influenced by learning, as we have already seen in the cases of insulin or morphine
injections. Stressful environments may be especially potent in this regard because
of previous situations in which engaging in the rewarded behavior (e.g., smoking a
cigarette) has led to a rapid, rewarding effect. The rewarding effects may be even
more pronounced with a drug such as alcohol, which has some inherent properties of
stress reduction.
It is even possible to go a step further in the analysis of environmental cues and
show that these are important not only in helping to mediate the motivational state
and the rewarding effects, but also in contributing to the behavioral outcome of the
drug use. A particularly intriguing example of this has been shown in a clever
experimental design that was developed by John Carpenter (cf., Marlatt & Rohsenow,
1981). This design unveiled the phenomenon that has come to be known as the
Think-Drink effect. The critical feature of the design was the development of a
cocktail that tasted the same with or without alcohol. The recipe was four parts
tonic water, one part vodka, and one part lime juice. The nonalcoholic version of
this was simply five parts tonic water and one part lime juice. Preliminary tests
showed that the identification of these two recipes was at chance levels, the
protestations of the seasoned drinkers' palates notwithstanding. The design of the
experiment, shown in Figure 9.12, was a 2 X 2 design in which subjects either
received alcohol or not, and were told that they were receiving alcohol or not.
Thus, some of the subjects expected the effects of alcohol when it was not present,
while others were not expecting the effects of alcohol when it was present. The
behavioral measures (including social aggressiveness, talkativeness, motor
coordination, and others) showed that the behavior was more closely related to what
the participants thought they were drinking than to what they actually were
drinking! Obviously, alcohol is a real drug, and with large dosages there is no way
to think one's way to normalcy. However, the results of these experiments suggest
that much of the stereotyped behavior associated with alcohol use may occur before
the physiological effects are present, or at doses that would not be sufficient to
produce the behavior directly. There is also evidence for behavioral tolerance
when specific behaviors are practiced under the influence of the drug (e.g., Wenger
et al., 1981).
The environmental factors become especially important when a distinction is made
between use and abuse of drugs. Alcoholism is certainly one of the most costly of
society's ills. The obvious solutions of voluntary abstinence or legal prohibition
seem not to work. Accordingly, many researchers have turned their attention to the
causes of alcoholism. One of the most interesting sets of findings is that there are
some subcultures that use a fairly large amount of alcohol, but have very low
incidences of abuse (e.g., Aronow, 1980). Several features of alcohol use seem to
be common among these groups. Children are exposed to alcohol and use alcohol at an
early age, usually in the form of wine or beer as part of the meal. The parents do
not become inebriated and there are strong sanctions against those who do.
Inebriation is never viewed as something humorous or daring. The use of alcohol in
moderation is simply taken for granted, with neither positive nor negative
attributes attached. A glass of wine can be accepted or refused with the same
impunity as one accepts or refuses a slice of bread. This contrasts sharply with
the more traditional Middle America pattern that prohibits the use of alcohol in the
young, while viewing inebriation as a source of humor ("Did you hear the one about
the drunk who...") and the ability to drink as a sign of adulthood, authority and
power. As children approach adulthood (or as they want to approach adulthood), they
surreptitiously obtain and consume alcohol, usually in excess and almost always
under conditions of stress--the perfect conditions for establishing the use of the
drug for the purposes of aggrandizement and stress reduction.
Breaking the Cycle
One of the greatest ironies of humanity is that almost everyone assumes free will
and control over behavior while, at the same time, ruing the fact that they cannot
stop smoking, overeating, drinking coffee, gambling, drinking alcohol, taking
tranquilizers, or biting fingernails. Breaking these so-called habits is one of the
most difficult areas of behavior. There are those who claim it is simply a matter
of will power and that they could stop at any time they really wanted to. Indeed,
some do, but the rate of recidivism is high. One of the main reasons for the high
rate of returning to the habit is that it has been linked to stressful situations.
Abstinence is itself a stressful event, and only serves to increase the likelihood
of the behavior, especially during the early stages. Schachter (1982) has claimed,
on the basis of informal surveys, that the statistics are unnecessarily pessimistic
because they are based upon individuals who seek professional help. His
observations suggest that there are many individuals who lose weight or give up
smoking without professional help and with considerably lower chances for returning
to the original patterns of behavior. Whether or not these observations will hold
up under more rigorous scrutiny, it is clear that there are many cases in which the
behavior is especially intractable. There is no clear formulation that can
guarantee success in the attempt to break drug abuse patterns, but several
suggestions can be made, based upon the way in which the abuse pattern has been
established and maintained.
1. Change the US effects of the drug
If drug use is viewed as a straightforward example of conditioning, then the
rewarding drug effects can be considered as the unconditioned stimulus or US. One
of the more obvious ways to interfere with this chain of events is to change the
effects of the drug. Perhaps the best known example of this is the drug known as
Antabuse, which interferes with the metabolism of alcohol. The ingestion of alcohol
causes severe gastrointestinal illness when this drug is present, and it is
necessary to refrain from taking Antabuse for about three days before alcohol can be
consumed without experiencing these ill effects. This drug has been used with some
success in clinical settings, but one of the obvious drawbacks is that the
individual must take the Antabuse on a regular basis.
Another example of interference with the US effect is the substitution of methadone
for heroin. Methadone is not without its own potential for abuse, but
the cravings for the drug and the withdrawal effects appear to be somewhat less
severe and (perhaps most importantly) it is usually prescribed under careful
conditions in known quantities and purity.
A third example is somewhat akin to fighting fire with fire. One of the ways in
which smokers have been aided in breaking the cycle of smoking cigarettes is to
produce the drug effects via a different route. Chewing gum that contains nicotine
can produce the drug effects without engaging in the sequence of behaviors that has
been established by smoking. Others may turn to what is by most standards (baseball
players excepted) the even less socially accepted habit of taking snuff or chewing
tobacco (Will restaurants adopt spitting and non-spitting sections?). Social customs
aside, these are valid methods of interrupting the cycle, because they provide, for
the first time, a separation between the behavioral patterns and the drug effect.
If it is properly guided (and there is always the danger of substituting one habit
for another, or even adding one habit to the other), the individual can eliminate
many of the behavior patterns without suffering the physical symptoms that might
otherwise occur.
2. Change the reward structure
To some extent, this category overlaps with the previous one, but there are ways in
which the reward structure can be changed to aid in reducing some of the behavioral
components of the pattern. One method that has been moderately successful (although
care must be taken to avoid nicotine poisoning) is forced smoking. The individual
is required to smoke one cigarette after another rapidly, rather than leisurely
puffing away in the normal manner. This has two consequences: It usually causes
some degree of discomfort due to the rapid effects of the high dosage (thereby
associating aversive consequences with the behavior of smoking), and it provides
a way in which the drug effects can be obtained under unusual circumstances.
This general type of technique has also been used in the case of food abuse
(overeating) by attempting to limit eating strictly to mealtimes under rigidly
controlled conditions.
3. Change the environmental cues
One of the hallmarks of substance abuse (it is unfortunate that the phrase abusive
behavior has the wrong connotation) is that it involves a high degree of
ritualization: the cigarette after breakfast, the drink or two after work, the
pretzels while watching TV. An important part of any program to eliminate habitual
behaviors is to change these environmental cues whenever possible. This may involve
a new schedule, such as skipping breakfast or eating breakfast at a later hour,
avoiding certain locations, moving the TV to a different room, changing a work
schedule, etc. Usually, this is not easy. There are too many restrictions in most
lifestyles to allow very major changes. One of the major advantages of a formal
clinical setting is also one of the major disadvantages: On the one hand, the new,
controlled environment is a tremendous aid in interrupting the patterns of
behavior that have maintained the abuse. On the other hand, once the
behavior has been changed and the patient leaves, it is very likely that the return
to the previous environment will re-trigger all sorts of cues that have supported
the habit in the past. Clinical psychology (and medical practice, for that matter)
would be much simpler if the patients did not have to return to the causes of their
disorders when they left the couch.
4. Avoid using the drug
This is an obvious truism, but it is mentioned here because of some recent
controversy concerning alcohol consumption. Several different support groups, most
notably Alcoholics Anonymous, have long advocated complete abstinence once the
drinking behavior has been disrupted. Their view is that there is never a cure, and
that any drinking will reestablish all of the behavioral patterns of abuse. It has
been suggested that this requirement may be a bit too Spartan, and that controlled
drinking might be allowable (cf., Sobell & Sobell, 1978). This seems not to be the
case: In a follow-up study of 20 alcoholics who participated in the controlled
drinking experiment (Pendery et al., 1982), the following dismal results were found:
1 was successful, 8 continued to have drinking problems, 6 returned to abstinence
voluntarily, 4 died, and 1 was missing.
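As a quick arithmetic check, the outcome counts reported for the Pendery et al. (1982) follow-up can be tallied to confirm that every one of the 20 participants is accounted for (the category labels below are paraphrases of the text, not the study's own terminology):

```python
# Outcome counts for the 20 controlled-drinking participants,
# as reported in the text's summary of Pendery et al. (1982).
outcomes = {
    "successful controlled drinking": 1,
    "continued drinking problems": 8,
    "returned to abstinence voluntarily": 6,
    "died": 4,
    "missing at follow-up": 1,
}

total = sum(outcomes.values())
print(total)  # 20 -- all participants accounted for
```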
The presence of the drug in the body not only serves to reactivate some of the
metabolic systems that were changed through previous exposure, but also recreates
internal conditions that have been strongly associated with the behavior
patterns that supported the abuse. Apparently, truisms prevail.
Summary
1. The effect of a drug can be decreased or increased by changing the access of the
drug molecules to the receptors, or by setting up a physiological opposition to the
drug effect.
2. The mechanisms of tolerance (or sensitization) can change nearly any feature of
the drug's actions.
3. Indirect acting drugs may lose their effectiveness rapidly through depletion of
the transmitter stores.
4. As neuromodulatory processes take place, the changes in receptor number or
sensitivity are reflected by a gradual change in response to the drug.
5. The presence of a drug may induce the formation of enzymes that will inactivate
the drug more quickly when it is administered in the future.
6. If a drug that has been present for some time is abruptly withdrawn from the
system, compensatory reactions are unmasked and opposite, rebound effects may be
observed.
7. The opponent process theory applies many features of tolerance to emotional
responses and to some of the phenomena of addictive behaviors.
8. In some cases, the development of tolerance requires that specific behaviors
occur while the drug is in effect, a phenomenon known as behavioral tolerance.
9. The pre-post design has been used to separate the effects of pharmacological
tolerance from behavioral tolerance.
10. Some aspects of behavioral tolerance may be the result of the Pavlovian
conditioning of compensatory responses.
11. Drug addiction, drug abuse, and substance abuse are all terms that apply to
behavior that is maintained by acquired motives.
12. Self-administration procedures are used as animal models of drug abuse.
13. Environmental stimuli associated with drug use may serve as important bridges
for the development and maintenance of habitual drug use.
14. The effectiveness of a drug may be significantly changed by the user's
expectations, as in the think-drink effect.
15. The development of substance abuse may interact with normally occurring states,
especially those involving stress.
16. The drug abuse cycle may be interrupted at several points that are specified by
the laws of learning.