I was in Las Vegas a couple of months ago to attend a family wedding. Although it was my first time there, I pretty much knew what to expect. Since there was a huge casino right in my downtown hotel, just getting to the wedding chapel involved running the gauntlet past rows of slot machines and various other gambling temptations. To the astonishment of everyone who asked afterward (including the border guard when I returned to Canada), I didn't do any gambling while I was there.
I can give you all sorts of reasons why I don't like gambling casinos: the poor odds of winning, the tremendous profits that casinos make off the countless gamblers who think they can "break the bank", the social and economic costs of chronic gambling, etc. But my real reason for not gambling stems from the many hours that I used to spend with a little white rat back in the day...
As a psychology undergraduate, one of my mandatory courses dealt with operant conditioning, i.e., the modification of behaviour using positive and negative reinforcement. First introduced by the legendary B.F. Skinner, this experimental approach to behaviourism (which he named the experimental analysis of behavior) introduced a range of new terms that have befuddled psychology students ever since. Positive reinforcement of an observed behaviour (also known as an "operant") involves the use of rewards to increase the probability of that behaviour occurring in future. Negative reinforcement (not the same thing as punishment) involves the removal or avoidance of a negative stimulus to strengthen recurrence. While punishment and extinction are also part of Skinner's operant model, I'll be focusing on positive reinforcement alone at this point.
Even though Skinner wasn't the first behaviourist, he rejected existing behavioural models with his Radical Behaviorism, stressing the non-theoretical nature of his approach. Observed behaviour was the only focus of research, with theories of unconscious determinism being totally discarded (sorry, Dr. Freud). Skinner also distinguished between primary and secondary reinforcers, with primary reinforcers stemming from biological needs, e.g., food, water, sex, or sleep. Secondary reinforcers are stimuli that have a learned association with primary reinforcers (money is a good example).
To be an effective positive reinforcer, the given stimulus needed to be immediate (or as close to immediate as could be managed) and needed to be contingent on the behaviour of the organism being conditioned. As a practical test of operant conditioning, Skinner developed the operant conditioning chamber (or the "Skinner Box" as it's better known).
Which brings me back to my long hours with that white rat...
Basically, the Skinner box that I used as an undergraduate was a long transparent chamber holding the rat reserved for my use. The box contained one lever that the rat could push and a narrow slot next to the lever where a food pellet came out. The point of the box was to test different reinforcement strategies to see which one worked best in training the rat to press the bar to receive food.
While continuous reinforcement (a pellet each time the bar was pressed) worked the fastest in terms of the rat learning to press the bar for food, it was also the most vulnerable to extinction. In other words, the rat quickly stopped pressing the bar when the food pellets stopped coming. Variable reinforcement, on the other hand, proved far more durable. Instead of giving the rat a food pellet every time the bar was pushed, the reward was provided on a variable reinforcement schedule, so that the rat was unable to anticipate when the pellet would appear. It took a little longer for the rat to learn to press the bar, but it also took much longer for the rat to stop pressing once the rewards ceased. In other words, operant conditioning using a variable reinforcement schedule meant that the learned behaviour persisted over time whether or not there was a reward. The long hours that I spent with my rat learning about operant conditioning pretty well ensured that this particular lesson was drilled into my poor head.
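One intuition for why variable schedules resist extinction can be sketched as a toy "surprise" model (my own illustration, not Skinner's analysis, and the function name and threshold are assumptions for the sketch): a rat trained on continuous reinforcement finds even a single unrewarded press wildly out of line with what it has learned, while a rat trained on a sparse variable schedule needs a long run of misses before the dry spell becomes statistically surprising.

```python
import math

def presses_to_extinction(reward_prob, alpha=0.001):
    """Toy model: the animal has learned that a press pays off with
    probability reward_prob. It keeps pressing until a run of k
    unrewarded presses becomes too improbable under that learned rate,
    i.e. until (1 - reward_prob) ** k < alpha."""
    if reward_prob >= 1.0:
        # Under continuous reinforcement, a single miss is already
        # "impossible", so extinction is detected almost immediately.
        return 1
    return math.ceil(math.log(alpha) / math.log(1.0 - reward_prob))

for p in (1.0, 0.5, 0.1):
    print(f"reward probability {p}: ~{presses_to_extinction(p)} "
          "unrewarded presses before giving up")
```

Under these made-up numbers, the continuously reinforced rat quits after a single miss, while a rat rewarded only one press in ten keeps pulling dozens of times with nothing to show for it, which is roughly the pattern I watched play out in the lab.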
And now we come back to Las Vegas...
Although the venerable slot machine long predates the Skinner box (the first mechanical slot machine was built in 1895), the underlying similarities are striking. Standing in the casino and looking down row after row of slot machines, it was just about impossible not to flash back to that poor white rat and our training sessions. Slot machines work by rewarding lever-pulling behaviour with a secondary reinforcer (the payout the machine delivers) on a variable reinforcement schedule. The underlying principles seem guaranteed to ensure that people keep pulling that lever.
Looking around the casino, it's also hard not to notice that there are no external time cues to be found. No clocks, no windows showing time of day, no hint of time passing (it's a wonder that casinos don't ban wristwatches). While hunger and thirst are potential cues to time passing, the casino staff thoughtfully provides food and drink on request. Washrooms are available when needed and, as for sleep, did I mention that this was in a hotel? The gambling only stops when the gambler runs out of money (and helpful ATMs providing cash advances are all over the place).
An added factor at work in the casino setting is vicarious reinforcement, i.e., reinforcement that occurs in social settings when we see someone else getting a reward. Actually, it was Albert Bandura who first discussed vicarious reinforcement as part of his social learning model but casinos have been using it to influence gamblers for decades. News about "big wins" provides good publicity for casinos and keeps the gamblers coming. The driver of the shuttle bus that brought us from the airport to the hotel was kind enough to tell us about a recent big winner and cheerfully suggested that we try our luck in the casinos as well.
Watching people pulling the levers, it's also hard not to be struck by examples of superstitious behaviour. This is another of Skinner's terms, referring to the accidental reinforcement of a given behaviour, which can occur in animals as well as people. Once an unintended reinforcer becomes associated with a certain behaviour, unlearning it takes time and effort. Watching people wander from machine to machine hoping to find one that will "change their luck" seems common enough, and the fact that various slot machines are decorated with different colourful themes and music helps this along.
Considering the lucrative business that casinos represent, is it any surprise that cities across North America are pushing to get in on the action? Not only have casinos sprung up in tourist spots around the world, but online versions are becoming increasingly popular (I don't even want to get into lotteries or other kinds of gambling). While problem gambling (a.k.a. ludomania) isn't recognized as a mental disorder unless taken to extremes, the potential life-damaging consequences for people who can't keep their gambling under control can be severe. I've never been a big fan of current disease models of pathological gambling (since it's not a disease), and the various treatment approaches tend to have only modest success.
People still keep pouring in hoping to win big but, if I had to lay odds, it would be on the casinos.