Psychology Quiz 6.2


The ability of a stimulus to encourage some responses and discourage others is known as
Answer: stimulus control.

Suppose we want to find an event that will serve as an effective reinforcer for a given person. According to the disequilibrium principle, we should begin by determining
Answer: which behaviors the person has recently not had an opportunity to do.

To use shaping to train a rat to press a lever, what will the experimenter do first?
Answer: Reinforce the rat for something simple, like sitting up.

What do positive reinforcement and negative reinforcement have in common with each other?
Answer: They both strengthen a behavior.

Which of the following is an example of a primary reinforcer?
Answer: food

What procedure produces extinction in operant conditioning?
Answer: Stop providing positive reinforcement.

If you want to determine whether some example of learning qualifies as classical conditioning or operant conditioning, which question should you ask?
Answer: Did the individual’s responses control the outcomes?

In operant conditioning, a continuous reinforcement schedule is one in which
Answer: every correct response is reinforced.

You attend every new movie that appears at your local theater. You find that most of them are dull (not reinforcing), but you really enjoy about one-fourth of them. This is an example of a __________ schedule of reinforcement.
Answer: variable-ratio

In the presence of a light, an animal makes a response that is followed by food. The food is given only when the animal makes the response. Which type of learning is this?
Answer: operant conditioning

Under a continuous schedule of reinforcement, when is the animal reinforced?
Answer: after every response.

Thorndike’s cats improved their ability to escape his puzzle boxes gradually, not suddenly. What conclusion did he draw from this observation?
Answer: Learning is based on strengthening responses, not on insights.

Which of the following is an example of a secondary reinforcer?
Answer: money

Children who are frequently spanked tend to be ill-behaved. What conclusion (if any) can we draw from this result?
Answer: We can draw none of these conclusions.

A little boy has learned to get candy by crying. What procedure would lead to extinction of this response?
Answer: Listen to the crying without responding.

The more lottery tickets you buy, the greater your chances of winning. However, you have no way of knowing how many tickets you will have to buy before you win. It might be fewer than ten; it might be more than a million. This is an example of which type of schedule of reinforcement?
Answer: variable ratio

The main difference between classical and operant conditioning is that in operant conditioning
Answer: the animal’s behavior controls the outcomes (including reinforcers).

The threat of cancer is not always effective in discouraging people from smoking. Why not?
Answer: The punishment is slow and unpredictable.

Which of these states the disequilibrium principle?
Answer: You can be reinforced by a chance to do something that you have not been able to do recently.

If you learn to turn off a dripping faucet to end the "drip, drip, drip" sound, your behavior was changed through
Answer: negative reinforcement.

What was Edward Thorndike’s research goal?
Answer: to find a simple behavioristic explanation of learning

Shaping (in the context of operant conditioning) means
Answer: reinforcing successive approximations to a desired behavior.

Which of the following is an example of discrimination in operant conditioning?
Answer: You have learned to buy Ida brand potato chips, which are always fresh, and to avoid Hoe brand, which are usually stale.

Responses that are followed by satisfaction to the animal will be more firmly connected with the situation, so that they will be more likely to recur in the future. This is a brief statement of the
Answer: law of effect.

Negative reinforcement is also known as
Answer: escape or avoidance learning.

An individual receives a reinforcement for the first response after a 1-minute interval, but not again until the next 1-minute interval has passed. This is an example of which type of schedule of reinforcement?
Answer: fixed interval

What is a secondary reinforcer?
Answer: an event that became reinforcing as a result of previous experience

An individual receives a reinforcement for the first response after some period of time, but the amount of time changes. Sometimes the individual has to wait as much as 3 minutes, sometimes as little as 10 seconds. This is an example of which type of schedule of reinforcement?
Answer: variable interval

A professor gives unannounced quizzes at unpredictable times. Therefore, students must study steadily every night. Which type of schedule of reinforcement is this?
Answer: variable interval

You scan the night sky looking for meteors. Sometimes there is only a brief period between meteors, but sometimes you have to wait a long time after seeing a meteor until another one appears. This is an example of which type of schedule of reinforcement?
Answer: variable interval

What did B. F. Skinner mean by "shaping"?
Answer: reinforcing successive approximations to a behavior.

When an animal hears a bell, it sits up on its hind legs and drools. Then it receives food. What kind of conditioning is this?
Answer: We don’t have enough information to answer the question. It depends on whether the food always occurs after the bell, or only if the animal sits up.

You enjoy getting e-mail messages, so you occasionally check your e-mail to find out if you have any new messages. Which schedule of reinforcement is present in this case?
Answer: variable interval

Skinner trained a rat to raise a flag and salute during the playing of the "Star-Spangled Banner." The training involved a combination of __________ and __________.
Answer: shaping and chaining

Reinforcement on which schedule produces a slow but steady rate of responding?
Answer: variable interval

An operant conditioner such as B. F. Skinner might provide you with a reinforcer after you make a sound, then after a louder sound, then after a more pleasant sound, and so forth until you are singing. This procedure would be an example of
Answer: shaping.

A rat learns to climb a ladder to a platform where it can pull a string to raise the ladder and then climb the ladder again. The reinforcement for each response is the opportunity to perform the next response. This procedure is known as
Answer: chaining.

The main difference between classical and operant conditioning is that in classical conditioning
Answer: the animal’s responses do not control the reinforcements.

When Thorndike found that cats gradually improved their performance in a puzzle box, without any point of sudden improvement, what did he conclude?
Answer: Cats don’t solve the problem by understanding.

If you like to go fishing, and the fish are biting on some days and not others, you are reinforced on which schedule?
Answer: variable-interval

Your boss provides free coffee and donuts daily at 10:30 a.m. Showing up at the right time and place is reinforced on which schedule?
Answer: fixed interval

Fixed ratio is a "count"-based schedule of reinforcement.
Answer: True

Positive punishment refers to adding something undesirable to an organism’s environment.
Answer: True

Variable interval is a "time"-based schedule of reinforcement.
Answer: True

Positive reinforcement refers to adding something desirable to an organism’s environment.
Answer: True

Operant conditioning uses a "schedule of reinforcement" to train animals and people.
Answer: True

The term negative means removed.
Answer: True

Classical conditioning primarily works with the skeletal muscles.
Answer: False

The term positive means added.
Answer: True

The term operant is derived from the word operate.
Answer: True
