The dogs, the cats, and the pigeons
1. Pavlov’s dogs
Ivan Pavlov, a Russian physiologist, conducted experiments that laid the groundwork for what would become known as classical conditioning. His most famous experiment involved dogs and their salivary response to food.
In his initial studies, Pavlov investigated digestion. He noticed that dogs began to salivate not only when they saw or smelled food but also when they heard the footsteps of the person who usually fed them. Intrigued by this anticipatory response, Pavlov designed an experiment to understand it better.
He introduced a neutral stimulus—a bell ringing—that alone did not cause salivation. Then, he paired the sound of the bell (the conditioned stimulus) with the presentation of food (the unconditioned stimulus) that naturally triggered salivation (the unconditioned response). After several pairings, the dogs began to salivate at the sound of the bell alone, even before the food was presented. The salivation in response to the bell became the conditioned response.
Pavlov’s conclusion from these experiments was that through repeated pairings, a neutral stimulus could become a conditioned stimulus capable of triggering a response that was originally elicited by a different, unconditioned stimulus. This principle of classical conditioning has been foundational in the field of psychology, particularly in the study of learning and behavior.
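The pairing process Pavlov described can be sketched as a simple saturating association between the bell and the salivary response. This is a toy model, not Pavlov's own formalism: the learning rate of 0.3 and the response threshold of 0.5 are illustrative assumptions (the update rule follows the general shape of Rescorla-Wagner-style models).

```python
# Toy model of classical conditioning: repeated bell-food pairings
# strengthen an association until the bell alone triggers salivation.
# Learning rate and threshold values are illustrative assumptions.

def condition(pairings: int, learning_rate: float = 0.3) -> float:
    """Return the association strength after repeated bell-food pairings.

    Each pairing closes a fraction of the gap between the current
    strength and its maximum of 1.0, so learning is fast at first
    and levels off with experience.
    """
    strength = 0.0
    for _ in range(pairings):
        strength += learning_rate * (1.0 - strength)
    return strength

def salivates_to_bell(strength: float, threshold: float = 0.5) -> bool:
    """The bell alone elicits the conditioned response once the
    association is strong enough."""
    return strength >= threshold

# Before any pairings the bell is a neutral stimulus; after several it is not.
print(salivates_to_bell(condition(0)))  # False
print(salivates_to_bell(condition(5)))  # True
```

Note how the model captures the key idea: nothing about the bell changed, only its learned association with food.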
2. Thorndike’s cats
Edward L. Thorndike, an American psychologist, is well-known for his work on learning theory, which led to the development of connectionism and operant conditioning. One of his most famous experiments involves the use of puzzle boxes, often referred to as “learning boxes” or “Thorndike boxes,” to study animal learning.
In his experiments, Thorndike placed cats inside the puzzle boxes, which were designed in such a way that the cats could escape only by performing certain actions, such as pulling a lever or pushing a button. Initially, the cats would exhibit random movements until they accidentally performed the action that opened the door. Once free, they could access food placed outside the box.
Thorndike observed that over multiple trials, the cats became faster at escaping the boxes. He concluded that the cats were learning through a process he termed “trial and error.” He theorized that behaviors that led to favorable outcomes were more likely to be repeated, while those that led to unfavorable outcomes were less likely to be repeated. This concept is encapsulated in his Law of Effect, which states that behaviors followed by positive consequences are strengthened, whereas those followed by negative consequences are weakened.
From these experiments, Thorndike formulated several key principles:
The Law of Effect: Behaviors that produce satisfying effects in a particular situation are more likely to occur again in that situation.
The Law of Readiness: Responses are facilitated when the organism is ready to perform them.
The Law of Exercise: Practice strengthens connections between stimuli and responses.
These findings contributed significantly to the understanding of how learning occurs and paved the way for further developments in behaviorist psychology, notably by influencing the work of B.F. Skinner and his concept of operant conditioning.
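Thorndike's trial-and-error process and his Law of Effect can be illustrated with a small simulation of the puzzle box. The action names, initial weights, and reinforcement increment below are illustrative assumptions, not values from Thorndike's experiments; the point is only that rewarding the successful action makes later escapes faster.

```python
import random

# Toy simulation of Thorndike's puzzle box. The cat tries actions at
# random; the one that opens the door is "stamped in" (its weight grows),
# so it is chosen more readily on later trials -- the Law of Effect.

ACTIONS = ["scratch", "meow", "push_wall", "pull_lever"]
ESCAPE_ACTION = "pull_lever"

def run_trial(weights: dict, reinforcement: float = 2.0) -> int:
    """Run one trial: pick actions by weight until the cat escapes.

    Returns the number of actions taken, and strengthens the
    successful action's weight (a satisfying consequence).
    """
    steps = 0
    while True:
        steps += 1
        action = random.choices(ACTIONS, weights=[weights[a] for a in ACTIONS])[0]
        if action == ESCAPE_ACTION:
            weights[action] += reinforcement
            return steps

random.seed(1)
weights = {a: 1.0 for a in ACTIONS}  # all actions equally likely at first
escape_times = [run_trial(weights) for _ in range(30)]
# Early trials take many random attempts; by the end, pulling the lever
# dominates the weights and escapes happen almost immediately.
print(escape_times)
```

After 30 trials the lever's weight dwarfs the others, mirroring the steadily shrinking escape times Thorndike recorded across trials.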
3. Skinner’s pigeons
B.F. Skinner, a prominent American psychologist, developed the concept of operant conditioning through his experiments using what came to be known as the “Skinner Box.” This apparatus was designed to study voluntary behaviors and how they are influenced by their consequences.

In the Skinner Box, an animal, often a rat or a pigeon, was placed in an enclosure containing a mechanism for delivering rewards or punishments. The animal could press a lever or peck a button to receive a reward, such as food or water, and the box was equipped with devices to record the animal’s behavior.

Skinner’s experiments revealed that behaviors could be shaped by their consequences. If a behavior was followed by a reward, the animal was more likely to repeat it; if it was followed by punishment or by no consequence, the frequency of that behavior would decrease or fail to increase. This extended Thorndike’s Law of Effect into what Skinner called reinforcement. Skinner also discovered that continuous reinforcement (rewarding every single desired behavior) was effective for teaching new behaviors, but intermittent reinforcement (where rewards are given unpredictably) resulted in a stronger and more persistent behavior pattern. This finding explains why behaviors learned through intermittent reinforcement are harder to extinguish.
From these experiments, Skinner formulated several key conclusions about operant conditioning:
Reinforcement Schedules Dictate Behavior Frequency and Persistence: Skinner found that the timing and consistency of rewards significantly influence how frequently and consistently a behavior is exhibited. Continuous reinforcement is effective for introducing new behaviors, but variable reinforcement schedules, where rewards are given intermittently and unpredictably, create behaviors that are more resistant to extinction and are thus more durable.
Punishment Does Not Necessarily Foster Learning: Although punishment can diminish the occurrence of unwanted behaviors, it does not inherently teach alternative, desirable behaviors. This insight highlights the distinction between merely suppressing an action and actually guiding an individual toward more constructive behaviors.
Shaping Complex Behaviors Through Successive Approximations: Skinner introduced the concept of shaping, which involves reinforcing progressively closer approximations of a target behavior. This method effectively teaches intricate behaviors by breaking them down into manageable steps, each of which is rewarded, leading to the eventual mastery of the complete behavior. Skinner’s work on operant conditioning had significant implications for understanding behavior modification and learning in both animals and humans, and it has been applied in various fields, including education, behavior therapy, and animal training.
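The first conclusion, that intermittent reinforcement resists extinction, can be sketched with a toy model based on the "discrimination hypothesis": an animal trained on an unpredictable schedule has already sat through long runs of unrewarded presses, so a dry spell during extinction looks normal and it keeps responding. The tolerance rule below is an illustrative assumption, not Skinner's own formulation.

```python
import random

# Toy model of extinction resistance. Training records the longest run
# of unrewarded presses the animal experienced; during extinction, the
# animal quits once the dry spell clearly exceeds anything seen before.

def train(reward_probability: float, presses: int = 200, seed: int = 0) -> int:
    """Simulate training and return the longest run of unrewarded
    presses (the animal's learned 'tolerance' for non-reward)."""
    rng = random.Random(seed)
    longest = run = 0
    for _ in range(presses):
        if rng.random() < reward_probability:
            run = 0  # rewarded press resets the dry spell
        else:
            run += 1
            longest = max(longest, run)
    return longest

def presses_before_giving_up(tolerance: int) -> int:
    """During extinction no press is ever rewarded; the animal stops
    once the unrewarded run exceeds its trained tolerance."""
    return tolerance + 1

continuous = train(reward_probability=1.0)     # every press rewarded
intermittent = train(reward_probability=0.25)  # rewards unpredictable
print(presses_before_giving_up(continuous))    # gives up almost at once
print(presses_before_giving_up(intermittent))  # persists much longer
```

Under continuous reinforcement a single unrewarded press is already abnormal, so extinction is immediate; under the intermittent schedule the same silence is indistinguishable from ordinary training, which is the durability Skinner observed.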
Having read this far, you may have noticed that behaviorism’s focus on observable behaviors lends itself to precise and objective measurement. This emphasis on empirically verifiable phenomena makes the discipline highly scientific. However, in contrast to the importance of cognition in play highlighted earlier, behaviorism sets aside mental processes such as thinking, perceiving, and feeling. Though our ensuing discussions will revolve around behaviorism, it is worth noting that, in actual research and design practice, we often integrate insights from diverse psychological perspectives to achieve a more comprehensive understanding.
Game Psychology: Understanding Player Mentality and Game Design