Module 01: What is Learning?
Part 1: Defining Learning
Looking Back
This is the first module in your study of the psychology of learning—a journey into understanding one of the most fundamental processes in psychology. Before we can explore the mechanisms and principles of learning, we must establish a clear, scientific definition of what learning actually is. You might think you already know what learning means—after all, you’ve been doing it your entire life—but as we’ll discover, psychologists use a very specific definition that helps distinguish learning from other ways behavior can change (Domjan, 2015; Mazur, 2017).
What is Learning?
Think about all the ways your behavior has changed throughout your life. You can now read, write, ride a bike, use a smartphone, and navigate social situations. Some of these changes came from learning, but not all behavioral changes count as learning.
Consider these examples:
- A sunflower turns toward the sun throughout the day
- A baby reflexively grasps your finger when you touch their palm
- You feel drowsy after taking cold medicine
- You get better at playing guitar after practicing for months
- Your pupil dilates when you enter a dark room
Which of these involve learning? The answer might surprise you. The sunflower’s movement is a growth response called heliotropism. The baby’s grasp is an innate reflex. Your drowsiness from medicine is a temporary drug effect. Your pupil dilation is an automatic physiological response. Only getting better at guitar through practice truly involves learning (Tarpy, 1997).
The Definition of Learning
Learning is an inferred change in an organism’s mental state that results from experience and influences what the organism can do (Kimble, 1961; Rescorla, 1988). Each part of this definition matters, so we’ll work through them one at a time.

Learning as a Hypothetical Construct
Learning is what psychologists call a hypothetical construct—a theoretical concept that cannot be directly observed or measured but is inferred from observable behaviors (MacCorquodale & Meehl, 1948). We can’t look inside someone’s brain and see “learning” happening in the way we might see a neuron firing or blood flowing. Instead, we must infer that learning has occurred by observing changes in performance or capability.
This is similar to other psychological constructs like intelligence, motivation, or memory. We can’t directly measure intelligence itself, but we can measure performance on tasks that require intelligence. We can’t see motivation, but we can observe goal-directed behaviors that suggest motivation. Similarly, we infer learning from changes in what an organism can do (Tolman, 1932).
If you study for an exam and then answer questions correctly, we infer that learning took place during your study session. If a rat presses a lever more frequently after receiving food for doing so, we infer that the rat learned the relationship between lever-pressing and food. The learning itself—the actual neural and cognitive changes—remains hidden from direct observation.
Learning is an Inferred Change in Mental State
Because learning is a hypothetical construct, we say it involves a change in the organism’s mental state rather than simply a change in behavior. The actual learning—the formation of new memories, neural connections, and knowledge—happens internally. We can only know that it happened by observing its effects on behavior or performance.
This distinction becomes especially important when we consider that learned capabilities don’t always show up immediately in behavior. An organism may learn something without immediately demonstrating that learning through performance (Tolman, 1948). We’ll explore this learning-performance distinction in detail shortly.
Learning Results from Experience
This component of the definition is crucial because it separates learning from other types of behavioral change (Domjan, 2015). Learning must come from interaction with the environment—from experience. This rules out several types of changes:
Maturation refers to behavioral changes that occur due to biological development and aging. A toddler’s first steps reflect maturation of the nervous system and muscles rather than learning. Similarly, the physical and hormonal changes of puberty are maturational, not learned (Gesell, 1929). While maturation sets the stage for learning, the two processes are distinct.
Evolutionary adaptations are changes in behavior that occur across generations through natural selection. Birds don’t learn to fly—flying is an evolved adaptation that develops through maturation. However, birds may learn specific flying techniques or routes through experience (Gallistel, 1990).
Fatigue and temporary states don’t involve learning. If you perform poorly on a test because you’re tired, hungry, or sick, that’s not a failure of learning—it’s a temporary state affecting your performance. Once you’re rested and healthy, your learned knowledge is still available.
Learning Influences What the Organism Can Do
The final part of our definition emphasizes that learning creates the potential for behavioral change. Notice we say “what the organism can do” rather than “what the organism does do.” This distinction is critical (Tolman, 1932).
You’ve learned many things that you’re not currently doing. You know how to tie your shoes, but you’re probably not doing it right now. You’ve learned your phone number, but you’re not reciting it. Learning creates capabilities—it expands your behavioral repertoire—even when those capabilities aren’t actively being used.
Learning Versus Performance
This brings us to one of the most important distinctions in learning psychology: the difference between learning and performance. Learning refers to the internal change in knowledge or capability—the hypothetical construct we’ve been discussing. Performance refers to the observable behavior at any given moment (Rescorla, 1988; Spence, 1956).
Why does this matter? Because learning and performance don’t always match up. You might learn something without immediately showing it in your behavior. Or you might fail to perform something you’ve actually learned.
Consider these scenarios:
Latent Learning: In classic experiments by Tolman and Honzik (1930), rats explored a maze with no reward, seemingly not learning anything. But when food was later placed in the maze, the rats immediately ran to it efficiently, taking the shortest route. The rats were learning during their explorations—building a cognitive map of the maze—but not performing that knowledge until it became useful. This demonstrated that learning can occur without being expressed in performance.
Performance Deficits: A student might thoroughly learn material for an exam but perform poorly due to test anxiety, lack of sleep, or illness. The learning occurred and the knowledge is stored, but temporary factors prevented its expression in performance (Yerkes & Dodson, 1908).
Motivation Effects: You know how to clean your room (learning), but whether you actually do it (performance) depends on motivation. The learning was always there; performance is situational and depends on factors beyond just what has been learned.

Clark Hull formalized this relationship in his mathematico-deductive theory of behavior (Hull, 1943). Hull argued that learning must combine with motivation for behavior to be exhibited—learning alone is insufficient. His theory identified multiple factors that influence whether learned capabilities translate into actual performance, including drive (motivation), incentive (properties of the reinforcer that increase or decrease its effectiveness), and emotion (physiological and psychological states that affect behavioral expression). Hull’s approach reminds us that performance is multiply determined; understanding why someone does or doesn’t perform a learned behavior requires considering factors well beyond the learning itself.
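A common textbook rendering of Hull’s core equation (a sketch of the standard notation, which may differ slightly from the symbols used in the accompanying figures) expresses reaction potential, the tendency to perform a learned behavior, as the multiplicative product of learning and motivational factors:

\[
{}_{S}E_{R} = {}_{S}H_{R} \times D \times K
\]

Here \({}_{S}H_{R}\) is habit strength (the learned association), \(D\) is drive, and \(K\) is incentive motivation. Because the terms multiply, even a strongly learned habit produces no performance when drive or incentive is zero, which is exactly the point of the learning-performance distinction. Hull’s fuller formulation also subtracts inhibitory factors (for example, fatigue) to yield a momentary effective reaction potential.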


Distinguishing Learning from Other Changes
To sharpen the distinction, consider what doesn’t count as learning:
Reflexes are automatic, involuntary responses to specific stimuli. When your doctor taps your knee and your leg kicks, that’s a reflex, not learned behavior. Reflexes are innate—you’re born with them—and they don’t require experience to develop (Sherrington, 1906). However, reflexes can be modified through learning processes like classical conditioning, as we’ll explore later in the course.
Instincts are complex, species-typical behavior patterns that occur without prior experience. A spider spinning its first web doesn’t learn how; the behavior is instinctive (Tinbergen, 1951). While instincts can be modified by experience, the basic pattern exists without learning.
Drug Effects: If you take caffeine and become more alert, your behavioral change resulted from a chemical substance, not from experience. Once the drug wears off, you return to your baseline state. No learning occurred.
Sensory Adaptation: When you first enter a room with a strong smell, you notice it immediately. After a few minutes, you stop noticing the smell. This is sensory adaptation—your sensory receptors becoming less responsive to constant stimulation (Helson, 1964). It’s a temporary physiological change, not learning.
Temporary Versus Relatively Permanent Changes
Learning involves relatively permanent changes. This doesn’t mean learned information can never be forgotten, but it does mean that true learning creates lasting changes in the nervous system (Kandel, 2001).
If you study Spanish for several years and become fluent, that learning persists even if you don’t use Spanish for a while. You might get rusty, but you’ll relearn it much faster than you initially learned it—a phenomenon called savings (Ebbinghaus, 1885/1913). This shows that the original learning created lasting neural changes.
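Savings can be quantified as the percentage of the original learning effort spared at relearning (an illustrative measure in the spirit of Ebbinghaus’s savings score; the trial counts below are hypothetical):

\[
\text{Savings} = \frac{T_{\text{original}} - T_{\text{relearning}}}{T_{\text{original}}} \times 100\%
\]

For example, if a Spanish vocabulary list originally took 20 study trials to master and relearning it after a long break took only 5 trials, savings would be \((20 - 5)/20 \times 100\% = 75\%\), evidence that much of the original learning persisted.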
In contrast, if you’re asked to remember a phone number for 30 seconds, you might successfully do so through rehearsal. But if you don’t actually learn the number (transfer it to long-term memory), it will disappear once you stop rehearsing. Temporary retention through rehearsal isn’t the same as learning.
The Role of Practice & Repetition
Most learning involves repeated experiences. While you might learn to avoid a hot stove after touching it just once (Garcia & Koelling, 1966), most learning is gradual and improves with practice. This is because learning involves physical changes in the brain—the formation of new neural connections and the strengthening of existing ones (Hebb, 1949).
Practice doesn’t just mean repetition. Effective practice involves active engagement with material, getting feedback, and adjusting your approach (Ericsson, Krampe, & Tesch-Römer, 1993). When you practice a musical instrument, you’re not just repeating the same motions; you’re refining your technique, listening to the results, and making adjustments.
Why This Definition Matters
Why bother with such precision? Because a precise definition lets us study learning scientifically. It allows us to:
- Distinguish learning from other behavioral changes
- Design experiments that actually test learning rather than performance
- Develop theories about how learning works
- Create effective teaching and training methods
- Understand when learning failures occur and why
For example, if a student does poorly on an exam, our definition helps us ask the right questions: Was the learning inadequate? Was it a performance problem due to anxiety or fatigue? Was there insufficient practice? Was the material tested in a way that didn’t match how it was learned (transfer-appropriate processing; Morris, Bransford, & Franks, 1977)? Each of these possibilities suggests different solutions.
The Neural Basis of Learning: Long-Term Potentiation
While learning is a hypothetical construct that we infer from behavior, neuroscientists have identified the neural mechanisms that underlie it. The most important of these is long-term potentiation (LTP)—the strengthening of synaptic connections between neurons that occurs when they are repeatedly activated together (Bliss & Lømo, 1973).
LTP provides the biological basis for Donald Hebb’s famous principle, often summarized as ‘neurons that fire together, wire together’ (Hebb, 1949). When two neurons are stimulated at the same time repeatedly, the connection between them strengthens—they become more likely to activate each other in the future. This is Hebb's Law, and LTP is its neural mechanism.
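In its simplest mathematical form (a standard simplified statement of the Hebbian rule, offered here for illustration rather than as part of Hebb’s original text), the change in the strength of a connection is proportional to the product of the two neurons’ activities:

\[
\Delta w_{ij} = \eta \, a_i \, a_j
\]

where \(w_{ij}\) is the strength of the synapse linking neurons \(i\) and \(j\), \(a_i\) and \(a_j\) are their activity levels, and \(\eta\) is a small learning rate. When both neurons are active at the same time the product is large and the connection strengthens; when either is silent, nothing changes.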
Consider what happens when you learn a new association—say, learning that a particular ringtone means your phone is ringing. The neurons that process that sound become repeatedly active at the same time as the neurons involved in reaching for your phone. Through LTP, these neural pathways strengthen, creating a lasting change in your brain that allows you to respond automatically to the ringtone in the future. This is learning at the neural level—relatively permanent changes in synaptic strength that result from experience and influence future behavior (Kandel, 2001).
Looking Forward

Now that we understand what learning is (and isn’t) and why it’s considered a hypothetical construct, we’re ready to explore the biological foundations of learning in the next part of this module. We’ll examine how evolution has shaped different organisms’ abilities to learn, introducing the concepts of prepared, unprepared, and contraprepared learning—discovering that not all learning is equally easy: some things organisms are biologically prepared to learn quickly, while others are nearly impossible for them to learn.
Everything we study will come back to this fundamental definition: Learning is a hypothetical construct—an inferred change in an organism’s mental state that results from experience and influences what the organism can do.
Media Attributions
- Child-kid-play-study © picjumbo is licensed under a Public Domain license
- Adapting © Jay Brown is licensed under a CC0 (Creative Commons Zero) license
- Learning and Motivation Interact © Jay Brown is licensed under a CC0 (Creative Commons Zero) license
- Reaction Potential © Jay Brown is licensed under a CC0 (Creative Commons Zero) license
- Momentary Effective Reaction Potential © Jay Brown is licensed under a CC0 (Creative Commons Zero) license
- Learning vs. Behavior © Jay Brown is licensed under a CC0 (Creative Commons Zero) license
Key Terms
- Heliotropism: A growth response in plants toward or away from a stimulus, such as a sunflower turning toward the sun.
- Reflexes: Automatic, involuntary responses to specific stimuli; innate responses present from birth that don’t require experience to develop, though they can be modified through learning processes like classical conditioning.
- Learning: An inferred change in an organism’s mental state that results from experience and influences what the organism can do; a relatively permanent change in behavior potential due to experience.
- Hypothetical construct: A theoretical concept that cannot be directly observed or measured but is inferred from observable behaviors; examples include learning, intelligence, motivation, and memory.
- Performance: Observable behavior at any given moment; distinct from learning (internal capability); performance can be affected by factors like motivation, fatigue, and emotion that don’t reflect the underlying learning.
- Learning-performance distinction: The recognition that learning (internal capability) and performance (observable behavior) are not the same and don’t always match.
- Maturation: Behavioral changes that occur due to biological development and aging rather than experience; not considered learning because it doesn’t result from interaction with the environment.
- Evolutionary adaptations: Changes in behavior that occur across generations through natural selection rather than individual learning.
- Fatigue: A temporary state affecting performance that does not involve learning.
- Behavioral repertoire: The range of behaviors an organism is capable of performing based on what it has learned.
- Latent learning: Learning that occurs without being immediately expressed in performance, demonstrated when the learned behavior becomes useful.
- Cognitive map: A mental representation of spatial relationships in an environment, as demonstrated in Tolman’s maze experiments.
- Mathematico-deductive theory: Hull’s formal theory proposing that learning must combine with motivation and other factors for behavior to be exhibited.
- Drive: In Hull’s theory, the motivational state (such as hunger or thirst) that energizes behavior.
- Incentive: A property of reinforcement that increases or decreases its reinforcing effectiveness.
- Emotion: Physiological changes and conscious feelings of pleasantness or unpleasantness, aroused by external or internal stimuli, that lead to behavioral reactions.
- Instincts: Complex, species-typical behavior patterns that occur without prior experience; innate behaviors that don’t require learning, though they can be modified by experience.
- Sensory adaptation: A temporary decrease in sensory receptor responsiveness due to constant stimulation; not a form of learning.
- Savings: The phenomenon where relearning previously learned material is faster than original learning, indicating lasting neural changes.
- Transfer-appropriate processing: The principle that memory performance is best when the processes used during retrieval match those used during encoding.
- Long-term potentiation (LTP): The strengthening of synaptic connections between neurons that occurs when they are repeatedly activated together; the neural mechanism underlying learning and memory formation.
- Hebb’s Law: The principle that neurons which are repeatedly activated at the same time will develop stronger connections; often summarized as “neurons that fire together, wire together.”