OpenStudy (anonymous):

You are watching a rat pressing a lever in a Skinner box to obtain food pellets. The rat is pressing the lever at a slow, steady rate, but it does not stop, even when a food pellet is delivered. In this example, the reinforcement schedule that is in place is most likely

OpenStudy (anonymous):

This is the same kind of thing we see at casinos, specifically on slot machines. People keep pulling the lever, and they don't stop even when they win money, because they're hoping to win more. The reason is that they have no idea how many pulls it will take before the next payout; it's random every time.

For example, you might win on your third pull, but who knows when you'll win again? It could be ten tries from now, or it could be on the very next pull. That's part of why gambling is so addictive.

This is what we call a "variable-ratio" reinforcement schedule: the number of responses required before the next reward varies unpredictably. The people at the slot machines have no idea how many times they need to play before they win again, and the rat has no idea how many times it needs to press the lever before it gets another food pellet.
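The unpredictability described above can be sketched as a small simulation. This is a minimal illustration, not anything from the original question: the function name, the VR-5 average, and the seed are all assumptions made up for the example.

```python
import random

def variable_ratio_session(mean_ratio=5, presses=1000, seed=42):
    """Simulate lever presses under a variable-ratio schedule:
    each reward arrives after a random number of presses averaging
    `mean_ratio`, so no individual press can be predicted to pay off.
    (Hypothetical example; VR-5 and the seed are arbitrary choices.)"""
    rng = random.Random(seed)
    rewards = 0
    # Presses required until the next reward, drawn at random each time.
    threshold = rng.randint(1, 2 * mean_ratio - 1)
    count = 0
    for _ in range(presses):
        count += 1
        if count >= threshold:
            rewards += 1
            count = 0
            threshold = rng.randint(1, 2 * mean_ratio - 1)
    return rewards

# Over many presses the rewards average roughly presses / mean_ratio,
# but any single press may or may not be rewarded.
print(variable_ratio_session())
```

Because the next reward could come on any press, there is never a "safe" moment to stop responding, which is the steady-pressing pattern the question describes.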
