A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future. 


What if you don't give the rat any more pellets? Apparently, he's no fool, and after a few futile attempts, he stops his bar-pressing behavior. This is called extinction of the operant behavior.

A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.

Now, if you were to turn the pellet machine back on, so that pressing the bar again provides the rat with pellets, the behavior of bar-pushing will "pop" right back into existence, much more quickly than it took for the rat to learn the behavior the first time. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the rat was reinforced for pushing on the bar.
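The three regularities just described — reinforcement raises the probability of the behavior, extinction lowers it, and reacquisition is faster thanks to the accumulated reinforcement history — can be caricatured in a few lines of code. This is purely an illustrative toy model; the update rule and its parameters are my own assumptions, not anything from Skinner:

```python
def step(p, history, reinforced, base_lr=0.2):
    """Toy model: nudge response probability p toward 1 after a
    reinforcer, toward 0 otherwise. Accumulated reinforcement
    history speeds up learning, so reacquisition beats acquisition.
    (Hypothetical update rule, for illustration only.)"""
    lr = min(0.9, base_lr * (1 + 0.1 * history))
    target = 1.0 if reinforced else 0.0
    if reinforced:
        history += 1
    return p + lr * (target - p), history

# acquisition: count reinforced trials until responding is likely
p, history, first = 0.05, 0, 0
while p < 0.7:
    p, history = step(p, history, True)
    first += 1

# extinction: the reinforcer is withheld, responding fades
for _ in range(10):
    p, history = step(p, history, False)

# reacquisition: fewer trials needed the second time around
second = 0
while p < 0.7:
    p, history = step(p, history, True)
    second += 1
```

With these arbitrary numbers, reacquisition crosses the same threshold in fewer reinforced trials than the original acquisition did, mirroring the way the behavior "pops" right back.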

Schedules of reinforcement

Skinner likes to tell about how he "accidentally" — i.e. operantly — came across his various discoveries. For example, he talks about running low on food pellets in the middle of a study. Now, these were the days before "Purina rat chow" and the like, so Skinner had to make his own rat pellets, a slow and tedious task. So he decided to reduce the number of reinforcements he gave his rats for whatever behavior he was trying to condition, and, lo and behold, the rats kept up their operant behaviors, and at a stable rate, no less. This is how Skinner discovered schedules of reinforcement.

Continuous reinforcement is the original scenario: Every time that the rat does the behavior (such as pedal-pushing), he gets a rat goodie.

The fixed ratio schedule was the first one Skinner discovered: If the rat presses the pedal three times, say, he gets a goodie. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc. This is a little like "piece rate" in the clothing manufacturing industry: You get paid so much for so many shirts.

The fixed interval schedule uses a timing device of some sort. If the rat presses the bar at least once during a particular stretch of time (say 20 seconds), then he gets a goodie. If he fails to do so, he doesn't get a goodie. But even if he hits that bar a hundred times during that 20 seconds, he still only gets one goodie. One strange thing that happens is that the rats tend to "pace" themselves: They slow down the rate of their behavior right after the reinforcer, and speed up when the time for it gets close.

Skinner also looked at variable schedules. Variable ratio means you change the "x" each time — first it takes 3 presses to get a goodie, then 10, then 1, then 7, and so on. Variable interval means you keep changing the time period — first 20 seconds, then 5, then 35, then 10, and so on.
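The two ratio schedules are mechanical enough to sketch in code. This is a hypothetical illustration — the function names, class, and parameters are my own, not anything from Skinner's apparatus:

```python
import random

def fixed_ratio(presses, n=5):
    """Fixed ratio: reinforce every n-th response.
    `presses` is the running press count, starting at 1."""
    return presses % n == 0

class VariableRatio:
    """Variable ratio: reinforce after a randomly varying number of
    responses, drawn uniformly so it averages out to mean_ratio."""
    def __init__(self, mean_ratio=5, seed=None):
        self.rng = random.Random(seed)
        self.mean_ratio = mean_ratio
        self._remaining = self._draw()

    def _draw(self):
        # uniform on 1 .. 2*mean-1, so the average gap is mean_ratio
        return self.rng.randint(1, 2 * self.mean_ratio - 1)

    def press(self):
        """One bar press; returns True when a goodie is delivered."""
        self._remaining -= 1
        if self._remaining == 0:
            self._remaining = self._draw()
            return True
        return False
```

The interval schedules would work the same way, except the delivery test consults a clock instead of a response counter. The key property of the variable version is that the rat (or gambler) can never tell from the count alone how close the next reinforcer is.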

In both cases, it keeps the rats on their rat toes. With the variable interval schedule, they no longer "pace" themselves, because they can no longer establish a "rhythm" between behavior and reward. Most importantly, these schedules are very resistant to extinction. It makes sense, if you think about it. If you haven't gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad" ratio or interval. Just one more bar press; maybe this'll be the one.

This, according to Skinner, is the mechanism of gambling. You may not win very often, but you never know whether and when you'll win again. It could be the very next time, and if you don't roll them dice, or play that hand, or bet on that number this once, you'll miss out on the score of the century.

Shaping

A question Skinner had to deal with was how we get to more complex sorts of behaviors. He responded with the idea of shaping, or "the method of successive approximations": you begin by reinforcing a behavior only vaguely similar to the one you want, then require closer and closer approximations before handing over the reinforcer. Shaping underlies behavior modification — b-mod for short — which has been used over the years to condition people with severe disabilities to behave themselves in fairly normal ways, such as eating with a knife and fork, taking care of their own hygiene needs, dressing themselves, and so on.

There is an offshoot of b-mod called the token economy. This is used primarily in institutions such as psychiatric hospitals, juvenile halls, and prisons. Certain rules are made explicit in the institution, and behaving yourself appropriately is rewarded with tokens — poker chips, tickets, funny money, recorded notes, etc. Certain poor behavior is also often followed by a withdrawal of these tokens. The tokens can be traded in for desirable things such as candy, cigarettes, games, movies, time out of the institution, and so on. This has been found to be very effective in maintaining order in these often difficult institutions.

There is a drawback to the token economy: When an "inmate" of one of these institutions leaves, they return to an environment that reinforces the kinds of behaviors that got them into the institution in the first place. The psychotic's family may be thoroughly dysfunctional. The juvenile offender may go right back to "the 'hood." No one is giving them tokens for eating politely. The only reinforcements may be attention for "acting out," or some gang glory for robbing a Seven-Eleven. In other words, the environment doesn't travel well.

Skinner was to enjoy considerable popularity during the 1960s and even into the '70s. But both the humanistic movement in the clinical world and the cognitive movement in the experimental world were quickly moving in on his beloved behaviorism. Before his death, he publicly lamented the fact that the world had failed to learn from him.


