Psych 101 Chapter 5 – Test Bank

1. The process by which experience or practice results in a relatively permanent change in behavior or potential behavior is known as __________.
a. learning
b. intelligence formation
c. imprinting
d. cognition

A

2. Learning is a process by which experience results in __________.
a. acquisition of motivation
b. relatively permanent behavior change
c. amplification of sensory stimuli
d. delayed genetic behavioral contributions

B

3. Learning is a process by which experience results in:
a. acquisition of motivation.
b. relatively permanent behavior change.
c. delayed genetic behavioral contributions.
d. amplification of sensory stimuli.

B

4. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, the presentation of the meat was the:
a. unconditioned stimulus.
b. unconditioned response.
c. conditioned stimulus.
d. conditioned response.

A

5. An experimenter finds that a certain male subject always has an increased heartbeat when he hears a certain piece of music. The experimenter sounds a buzzer and then plays the piece of music. The experimenter repeats this procedure until the man responds with an increased heartbeat to the sound of the buzzer alone. In this situation the UNCONDITIONED response is the:
a. increased heartbeat.
b. piece of music.
c. sound of the buzzer.
d. listening to the music.

A

6. Many individuals decide that they feel hungry and eat lunch when they see both hands of the clock on the 12, indicating that it is noontime. This may occur regardless of how recently they ate breakfast. In this example, the conditioned response is:
a. the act of eating breakfast.
b. the act of eating lunch.
c. the counting of the number of hours since breakfast.
d. the watching of the hands of the clock.

B

7. Some of the simplest and most basic learning, which involves the acquisition of fairly specific patterns of behavior in the presence of well-defined stimuli, is:
a. motivation.
b. cognitive dissonance.
c. integration.
d. conditioning.

D

8. Learning that is not manifested until some later time is called _________.
a. manifest content
b. latent content
c. latent learning
d. manifest learning

C

9. Which of the following statements about learning is TRUE?
a. Learning can be directly observed and measured.
b. Learning cannot be directly observed or measured, so performance is observed and learning is inferred based on what the person is able to do.
c. The results of learning must immediately change behavior.
d. none of the above

B

11. Which of the following is an example of learning?
a. The human brain continues to grow and develop after birth.
b. A human male develops the capacity to produce sperm cells at puberty.
c. Drinking coffee makes a person more aroused.
d. A student does not swat at a wasp buzzing around her head.

D

12. Thorndike was known for his work with __________.
a. a puzzle box
b. modeling
c. monkeys
d. a Skinner box

A

13. In the experiment with Little Albert, the conditioned stimulus was __________.
a. Albert
b. the rat
c. the loud noise
d. the laboratory room

B

14. Ivan Pavlov is most closely associated with __________.
a. vicarious learning
b. the Law of Effect
c. operant conditioning
d. classical conditioning

D

15. Who was Little Albert?
a. developer of the concept of classical conditioning
b. an animal trained by using operant conditioning procedures
c. creator of methods for teaching children
d. a child who developed a fear as part of a demonstration of classical conditioning

D

16. The person MOST closely associated with the law of effect is __________.
a. Thorndike
b. Pavlov
c. Watson
d. Skinner

A

17. The person most closely associated with the law of effect is ________ .
a. Watson
b. Pavlov
c. Skinner
d. Thorndike

D

18. The person most directly associated with operant conditioning is ______.
a. Pavlov
b. Watson
c. Thorndike
d. Skinner

D

19. Who formulated the law of effect?
a. Pavlov
b. Skinner
c. Thorndike
d. Watson

C

20. Classical is to _____ as operant is to _____.
a. Pavlov; Skinner
b. Skinner; Pavlov
c. Pavlov; Watson
d. Watson; Pavlov

A

21. Thorndike is to _______ as Skinner is to _______.
a. reinforcement; Law of Effect
b. Law of Effect; reinforcement
c. reinforcement; punishment
d. positive reinforcement; negative reinforcement

B

22. The Law of Effect was proposed by:
a. Titchener.
b. Watson.
c. Skinner.
d. Thorndike.

D

23. Classical conditioning was discovered by:
a. Pavlov.
b. Watson.
c. Thorndike.
d. Skinner.

A

24. We associate the name of _______ most closely with classical conditioning.
a. B. F. Skinner
b. Robert Rescorla
c. Albert Bandura
d. Ivan Pavlov

D

25. Thorndike conducted research on:
a. operant conditioning.
b. classical conditioning.
c. shaping.
d. higher-order conditioning.

A

26. Thorndike’s main apparatus in his operant conditioning research was:
a. a wire monkey.
b. a cognitive map.
c. a puzzle box.
d. a buzzer.

C

27. The "law of effect" was first proposed in the modern scientific community by:
a. James.
b. Skinner.
c. Thorndike.
d. Pavlov.

C

28. The person associated with the law of effect is _______.
a. Watson
b. Pavlov
c. Skinner
d. Thorndike

D

29. B. F. Skinner is known for his theory of:
a. cognitive learning.
b. intelligence.
c. classical conditioning.
d. operant conditioning.

D

30. The learning process studied in the Skinner box is known as:
a. social learning.
b. higher-order conditioning.
c. cognitive learning.
d. operant conditioning.

D

31. The apparatus that has come to symbolize the theory of operant conditioning is the:
a. Rubik’s cube.
b. Skinner box.
c. Pavlov bell.
d. Thorndike puzzle.

B

32. Classical is to _______ as operant is to _______.
a. Pavlov; Skinner
b. Skinner; Pavlov
c. Pavlov; Watson
d. Watson; Pavlov

A

33. Classical conditioning was discovered by _______.
a. Pavlov
b. Watson
c. Thorndike
d. Skinner

A

34. What must be paired together for classical conditioning to occur?
a. unconditioned stimulus and unconditioned response
b. conditioned response and unconditioned response
c. neutral stimulus and unconditioned stimulus
d. neutral stimulus and conditioned stimulus

C

35. Taste aversions seem to be specific examples of what type of learning?
a. classical conditioning
b. insight learning
c. vicarious learning
d. operant conditioning

A

36. When Ivan Pavlov presented meat powder, the dog salivated. The meat powder was the ________ and salivation was the ________.
a. UCR, UCS
b. UCS, UCR
c. CS, CR
d. CR, CS

B

37. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, the ringing of the bell was the __________.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

C

38. Rachel has found that when she opens the cupboard door to get the cat food, the cats come running to the kitchen. Rachel knows that this is classical conditioning and that the conditioned stimulus is the __________.
a. cat food
b. cat
c. running of the cats
d. cupboard door opening

D

39. Which of the following illustrates an unconditioned stimulus (UCS)?
a. blinking when air is blown into your eye
b. blinking when you hear your favorite song
c. your favorite song
d. a puff of air to your eye

D

40. A research participant hears a tone followed by a puff of air directed toward his eye. Later, he blinks when he hears the tone. Before ending the experiment, what could the researcher do in order to extinguish the blinking to that tone?
a. present the tone alone repeatedly
b. present the puff of air alone repeatedly
c. increase the loudness of the tone
d. increase the amount of air that is directed toward the eye

A

41. When Casey opens the closet door to get some dog food, her dog salivates. What is the conditioned stimulus in this example?
a. dog food
b. the dog salivating
c. the sound of the closet door opening
d. the dog

C

42. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, the presentation of the meat was the __________.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

A

43. Rachel has found that when she opens the cupboard door to get the cat food, the cats come running to the kitchen. Rachel knows that this is classical conditioning and that the unconditioned stimulus is the __________.
a. cat food
b. cat
c. running of the cats
d. cupboard door opening

A

44. A kind of therapy closely related to classical conditioning is known as __________ therapy.
a. desensitization
b. conditioned
c. psychoanalytic
d. response

A

45. Which of the following statements about classical conditioning is true?
a. Most classical conditioning requires repeated trials.
b. One trial is usually enough for conditioning to occur.
c. Learning will continue to increase indefinitely.
d. Learning is more effective if trials follow each other very quickly.

A

46. New learning that works in the opposite direction from the original learning results in ________ .
a. shaping
b. generalization
c. spontaneous recovery
d. extinction

D

47. Instinctive or involuntary behavior would probably be BEST modified by ___________ .
a. operant conditioning
b. trial and error
c. classical conditioning
d. shaping

C

48. We associate the name of ___________ most closely with classical conditioning.
a. B. F. Skinner
b. Robert Rescorla
c. Albert Bandura
d. Ivan Pavlov

D

49. Rachel has found that when she opens up the cupboard door to get the cat food, the cats come running to the kitchen. Rachel knows that this is classical conditioning, that the unconditioned stimulus is the __________ and that the conditioned stimulus is the ___________.
a. cat food; cupboard door opening
b. kitchen; cat food
c. cupboard door opening; cat food
d. cat food; kitchen

A

50. In classical conditioning, the interstimulus interval refers to the amount of time between ________.
a. learning trials
b. extinction trials
c. presentation of the conditioned stimulus and presentation of the unconditioned stimulus
d. experimental sessions

C

51. Presenting the unconditioned stimulus before the conditioned stimulus is known as _________ conditioning.
a. classical
b. operant
c. backward
d. aversive

C

52. Little Albert (Watson & Rayner, 1920) learned through classical conditioning to fear ______.
a. brown cats
b. black dogs
c. white rats
d. his mother

C

53. An automatic, innate, and involuntary response to an environmental event is a(n) ________.
a. UR
b. reflexive response
c. unconditioned response
d. all of the above

D

54. In classical conditioning, when a neutral stimulus is paired with a stimulus that naturally elicits a response, the neutral stimulus eventually elicits a similar response and becomes a(n) _______ stimulus.
a. conditioned
b. discriminative
c. higher-order
d. unconditioned

A

55. How does one know he/she has classically conditioned a person or an animal?
a. The unconditioned stimulus all by itself elicits the unconditioned response.
b. The unconditioned stimulus all by itself elicits the conditioned response.
c. The conditioned stimulus all by itself elicits the conditioned response.
d. The unconditioned response all by itself elicits the conditioned response.

C

56. Which of the following is an example of classical conditioning?
a. A child learns to blink her eyes to a bell because the ringing of the bell has been followed by a puff of air to the eye.
b. A pigeon learns to peck at a disk in a Skinner box to get food.
c. Rich saw that when Donna banged her fist against a particular vending machine, she got a free soft drink, so now he bangs his fist against that machine when he wants a free soft drink.
d. A monkey learns to escape from a cage.

A

57. Most young children put their hands over their ears when they hear the loud boom of firecrackers at a Fourth of July festival, but at first pay just scant attention to the person lighting the firecrackers. However, after just a few firecrackers have been exploded, some of the children put their hands over their ears as soon as they see the person approach the firecracker with a match! What is the unconditioned stimulus?
a. the person lighting the firecrackers
b. the loud booming sound made by the firecrackers
c. the children putting their hands over their ears when they see the person about to light the firecrackers
d. the children putting their hands over their ears when the firecrackers explode

B

58. Most young children put their hands over their ears when they hear the loud boom of firecrackers at a Fourth of July festival, but at first pay scant attention to the person lighting the firecrackers. However, after just a few firecrackers have been exploded, some of the children put their hands over their ears as soon as they see the person approach the firecracker with a match! What is the conditioned stimulus?
a. the person lighting the firecrackers
b. the loud booming sound made by the firecrackers
c. the children putting their hands over their ears when they see the person about to light the firecrackers
d. the children putting their hands over their ears when the firecrackers explode

A

59. Most young children put their hands over their ears when they hear the loud boom of firecrackers at a Fourth of July festival, but at first pay scant attention to the person lighting the firecrackers. However, after just a few firecrackers have been exploded, some of the children put their hands over their ears as soon as they see the person approach the firecracker with a match! What is the unconditioned response?
a. the person lighting the firecrackers
b. the loud booming sound made by the firecrackers
c. the children putting their hands over their ears when they see the person about to light the firecrackers
d. the children putting their hands over their ears when the firecrackers explode

D

60. Classical conditioning:
a. is primarily concerned with reflexes.
b. is primarily concerned with involuntary responses.
c. is passive.
d. all of the above

D

61. Bobby and Sue were parked at Lover’s Lane. When Bobby kissed Sue, his breathing accelerated. Sue always wore Chanel #5 when she went out with Bobby. Whenever Bobby smelled Chanel #5, he began to breathe faster. Sue’s kiss was the:
a. UCS.
b. UCR.
c. CS.
d. CR.

A

62. Bobby and Sue were parked at Lover’s Lane. When Bobby kissed Sue, his breathing accelerated. Sue always wore Chanel #5 when she went out with Bobby. Bobby’s accelerated breathing when he and Sue kissed is the:
a. UCS.
b. UCR.
c. CS.
d. CR.

B

63. Bobby and Sue were parked at Lover’s Lane. When Bobby kissed Sue, his breathing accelerated. Sue always wore Chanel #5 when she went out with Bobby. Chanel #5 is the:
a. UCS.
b. UCR.
c. CS.
d. CR.

C

64. Bobby and Sue were parked at Lover’s Lane. When Bobby kissed Sue, his breathing accelerated. Sue always wore Chanel #5 when she went out with Bobby. Bobby’s faster breathing rate when he smells Chanel #5 is the:
a. UCS.
b. UCR.
c. CS.
d. CR.

D

65. In classical conditioning, one must pair the _______ before conditioning can occur.
a. UCS and CR
b. UCS and CS
c. CR and CS
d. UCR and CR

B

66. Of the four basic elements of classical conditioning, the one the organism learns to respond to is the:
a. UCS.
b. UCR.
c. CS.
d. CR.

C

67. As she walked through her neighborhood, Jodie, a 6-year-old girl, frequently saw a large brown dog. She repeatedly walked to the dog to pet it, but as her hand approached the animal, it barked and bit her. The bite was painful and caused her to cry. Now Jodie cries when she sees dogs of any color or size. In the example, the dog’s bark and bite is the:
a. UCS.
b. CS.
c. UCR.
d. CR.

A

68. As she walked through her neighborhood, Jodie, a 6-year-old girl, frequently saw a large brown dog. She repeatedly walked to the dog to pet it, but as her hand approached the animal, it barked and bit her. The bite was painful and caused her to cry. Now Jodie cries when she sees dogs of any color or size. Jodie’s crying when she sees dogs is the:
a. UCS.
b. CS.
c. UCR.
d. CR.

D

69. As she walked through her neighborhood, Jodie, a 6-year-old girl, frequently saw a large brown dog. She repeatedly walked to the dog to pet it, but as her hand approached the animal, it barked and bit her. The bite was painful and caused her to cry. Now Jodie cries when she sees dogs of any color or size. The sight of dogs is the:
a. UCS.
b. CS.
c. UCR.
d. CR.

B

70. Paul is coming down with the flu, but he eats spaghetti anyway and subsequently becomes violently ill. A month later he sees that spaghetti is being served in the dining hall and is overcome by nausea. What type of learning is illustrated by this episode?
a. operant conditioning
b. cognitive learning
c. latent learning
d. classical conditioning

D

71. In classical conditioning the stimulus that normally evokes an automatic response even without new learning is called the:
a. conditioned stimulus.
b. reflexive stimulus.
c. unconditioned stimulus.
d. orienting stimulus.

C

72. When Luke kissed Laura, her heart rate increased. Luke always wore Old Spice After Shave. Whenever Laura smelled Old Spice, her heart raced. Luke’s kiss was the:
a. unconditioned stimulus.
b. unconditioned response.
c. conditioned stimulus.
d. conditioned response.

A

73. When Luke kissed Laura, her heart rate increased. Luke always wore Old Spice After Shave. Whenever Laura smelled Old Spice, her heart raced. Laura’s increased heart rate when Luke kissed her was the:
a. unconditioned stimulus.
b. unconditioned response.
c. conditioned stimulus.
d. conditioned response.

B

74. When Luke kissed Laura, her heart rate increased. Luke always wore Old Spice After Shave. Whenever Laura smelled Old Spice thereafter, her heart raced. Old Spice After Shave was the:
a. unconditioned stimulus.
b. unconditioned response.
c. conditioned stimulus.
d. conditioned response.

C

75. When Luke kissed Laura, her heart rate increased. Luke always wore Old Spice After Shave. Whenever Laura smelled Old Spice, her heart would race. Laura’s increased heart rate when she smelled Old Spice was the:
a. unconditioned stimulus.
b. unconditioned response.
c. conditioned stimulus.
d. conditioned response.

D

76. In classical conditioning, one must be sure to pair the:
a. US and CS.
b. US and UR.
c. CS and CR.
d. CS and UR.

A

77. Pairing the US and CS is essential for _______ to occur.
a. extinction
b. classical conditioning
c. operant conditioning
d. shaping

B

78. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, the presentation of the meat was the _______.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

A

79. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, the ringing of the bell was the _______.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

C

80. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, salivation to the meat was the _______.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

B

81. By pairing the ringing of a bell with the presentation of meat, Pavlov trained dogs to salivate to the sound of a bell even when no meat was presented. In this experiment, salivation to the sound of the bell was the _______.
a. unconditioned stimulus
b. unconditioned response
c. conditioned stimulus
d. conditioned response

D

82. Rachel has found that when she opens up the cupboard door to get the cat food, the cats come running to the kitchen. Rachel knows that this is classical conditioning and that the unconditioned stimulus is the _______.
a. cat food
b. cats
c. running of the cats
d. cupboard door opening

A

83. An experimenter finds that a certain male subject always has an increased heartbeat when he sees a picture of a nude female. The experimenter sounds a buzzer and then presents such a picture. The experimenter repeats this procedure until the man responds with an increased heartbeat to the sound of the buzzer alone. In this situation the UNCONDITIONED response is the _______.
a. increased heartbeat
b. female’s picture
c. sounds of the buzzer
d. viewing of the picture

A

84. An experimenter finds that a certain male subject always has an increased heartbeat when he sees a picture of a nude female. The experimenter sounds a buzzer and then presents such a picture. The experimenter repeats this procedure until the man responds with an increased heartbeat to the sound of the buzzer alone. In this situation the CONDITIONED response is the ________.
a. increased heartbeat
b. nude female’s picture
c. sounds of the buzzer
d. viewing of the picture

A

85. In the experiment with Little Albert, the unconditioned stimulus was _______.
a. the experimenter
b. the laboratory
c. the loud noise
d. the rat

C

86. In the experiment with Little Albert, the conditioned stimulus was _______.
a. the experimenter
b. the laboratory
c. the loud noise
d. the rat

D

87. In the experiment with Little Albert, the unconditioned response was _______.
a. fear of the loud noise
b. fear of the rat
c. fear of the experimenter
d. fear of the laboratory

A

88. One of the best known examples of classical conditioning in humans was the Little Albert study, conducted by _______.
a. Pavlov
b. Freud
c. Watson
d. Skinner

C

89. In the classic study of fear conditioning in a human infant (the "Albert" experiment), what was the CS?
a. the rat
b. the rabbit
c. the loud noise
d. the crying response

A

90. In the classic study of fear conditioning in a human infant (the "Albert" experiment), what was the UCS?
a. the rat
b. the rabbit
c. the loud noise
d. the crying response

C

91. When a stimulus similar to the CS also elicits the CR, the phenomenon is called _______.
a. stimulus discrimination
b. stimulus generalization
c. spontaneous recovery
d. 2nd order conditioning

B

92. If a researcher presents the US first, then presents the CS, the pairing method used is _______.
a. trace
b. delay
c. simultaneous
d. backward

D

93. Repeatedly presenting a CS by itself will result in ________.
a. extinction
b. spontaneous recovery
c. stimulus discrimination
d. stimulus generalization

A

94. As she walked through her neighborhood, Jodie, a 6-year-old girl, frequently saw a large brown dog. She repeatedly walked to the dog to pet it, but as her hand approached the animal, it barked and bit her. The bite was painful and caused her to cry. Now Jodie cries when she sees dogs of any color or size. Jodie’s crying when she sees dogs is the ________.
a. US
b. CS
c. UR
d. CR

D

95. A grandmother gives her grandchild a cookie because the child cleaned up her room. What is the cookie in this example?
a. conditioned response
b. punisher
c. positive reinforcer
d. negative reinforcer

C

96. A negative reinforcer is a stimulus that is ________ and thus ________ the probability of a response.
a. removed; increases
b. presented; decreases
c. removed; decreases
d. presented; increases

A

97. Which of the following is an example of punishment?
a. taking away a child’s favorite toy for hitting another child
b. removing a penalty you imposed on a child after he began acting better
c. giving a child a star for telling a lie
d. giving a child a cookie for cleaning her room

A

98. When you were first learning to make your bed, your parents told you that you did a good job when you got the bedspread pulled up, even though the bed was still a little messy. For the next week they showed you how to be a little neater each time you made the bed. What operant conditioning procedure did your parents use?
a. generalization
b. extinction
c. shaping
d. punishment

C

99. A child is praised for using his fork instead of his fingers to eat some spaghetti. This is an example of __________ reinforcement.
a. positive
b. extrinsic
c. higher-order
d. secondary

A

100. A reinforcer that removes something unpleasant from a situation is a __________.
a. primary reinforcer
b. positive reinforcer
c. negative reinforcer
d. secondary reinforcer

C

101. On a variable-interval schedule, reinforcement is given for the __________.
a. first correct response after a fixed amount of time has passed
b. first correct response after varying amounts of time have passed
c. next correct response after a fixed number of responses have occurred
d. next correct response after a varying number of responses have occurred

B

102. Elizabeth was given a $1000 raise after her last performance evaluation. Her raise is a:
a. primary reinforcer.
b. punisher.
c. negative reinforcer.
d. secondary reinforcer.

D

103. What has occurred when there is a decrease in the likelihood or rate of a target response?
a. positive reinforcement and negative reinforcement
b. negative reinforcement
c. punishment
d. positive reinforcement

C

104. A positive reinforcer is a stimulus that is ________ and thus ________ the probability of a response.
a. removed; decreases
b. presented; increases
c. presented; decreases
d. removed; increases

B

105. Any event whose presence decreases the likelihood that ongoing behavior will recur is __________.
a. a secondary reinforcer
b. an aversive stimulus
c. punishment
d. negative reinforcement

C

106. Which of the following is a secondary reinforcer?
a. a bar of candy
b. warm, physical contact
c. money
d. a drink of water

C

107. Nagging someone to do something until they do it is an example of __________.
a. negative reinforcement
b. aversive conditioning
c. punishment
d. positive reinforcement

A

108. A reinforcer that adds something rewarding to a situation is called a(n) __________ reinforcer.
a. positive
b. additive
c. primary
d. secondary

A

109. On a fixed-ratio schedule, reinforcement is given ________.
a. for the first correct response after randomly varying amounts of time have passed
b. for the next correct response after a fixed number of responses have been made
c. for the first correct response after a fixed amount of time has passed
d. for the next correct response after a varying number of responses have been made

B

110. On a fixed-interval schedule, reinforcement is given _______.
a. for the first correct response after a fixed amount of time has passed
b. for the first correct response after randomly varying amounts of time have passed
c. for the next correct response after a fixed number of responses have been made
d. for the next correct response after a varying number of responses have been made

A

111. On a variable-interval schedule, reinforcement is given _________ .
a. for the first correct response after a fixed amount of time has passed
b. for the first correct response after varying amounts of time have passed
c. for the next correct response after a fixed number of responses have been made
d. for the next correct response after a varying number of responses have been made

B

112. When someone uses negative reinforcement to change a behavior, the behavior is likely to __________.
a. occur less frequently
b. occur more frequently
c. occur at the same rate
d. completely stop

B

113. A camp leader repeatedly hugs a camper after she helps her friend. Each time, the camper is embarrassed and shies away from future acts of assistance. In the example, "Hugging the camper" is _______.
a. a positive reinforcer
b. a primary reinforcer
c. a punishment
d. none of the above

C

114. Mary arrives home to find her son washing the dirty dishes left from his party the night before. When she discovers his first-semester grade report on the table and sees that he got straight A’s, Mary rewards him by relieving him of the unpleasant task of finishing the dishes. Which operant process does the example illustrate?
a. positive reinforcement
b. negative reinforcement
c. extinction
d. punishment

B

115. Which of the following is an example of a primary reinforcer?
a. water
b. a thank-you letter
c. a smile from a loved one
d. money

A

116. Wearing sunglasses ALL THE TIME because people tell you they make you look "irresistible" is an example of which type of punishment or reinforcement?
a. aversive punishment
b. negative reinforcement
c. positive reinforcement
d. response cost

C

117. Negative reinforcement is best thought of as:
a. reinforcement for an undesirable activity.
b. punishment.
c. something that was predicted to serve as reinforcement but did not do so.
d. stimuli whose termination or removal increases behavior.

D

118. Putting on sunglasses to relieve glare is an example of which type of punishment or reinforcement?
a. aversive punishment
b. negative reinforcement
c. positive reinforcement
d. response cost

B

119. To avoid getting a headache, Lory always lets her dog outside when it sits by the door and howls. This is an example of which type of punishment or reinforcement?
a. aversive punishment
b. negative reinforcement
c. positive reinforcement
d. response cost

B

120. Training a rat to push a lever to escape from an electric shock is an example of:
a. aversive punishment.
b. negative reinforcement.
c. positive reinforcement.
d. response cost.

B

121. Positive reinforcers:
a. weaken behaviors they follow.
b. are always learned.
c. strengthen behaviors they follow.
d. are always unlearned.

C

122. Which of the following statements about positive reinforcers is accurate?
a. They are used in negative reinforcement.
b. They weaken behaviors that they follow.
c. They strengthen behaviors that they follow.
d. They strengthen behaviors that lead to their removal.

C

123. If a POSITIVE REINFORCER is added after a behavior and the behavior is strengthened/increased, the process used is called:
a. negative reinforcement.
b. positive reinforcement.
c. extinction.
d. punishment.

B

124. Mom and Dad think it is really funny and laugh when their 2-year-old, Bruce, says dirty words. When Bruce is sent home from kindergarten because of swearing, they don’t understand why he cusses. Now when he cusses at home, they ignore the cussing (they don’t think it’s cute anymore). Laughing in this example is:
a. a positive reinforcer.
b. a negative reinforcer.
c. a primary reinforcer.
d. a neutral stimulus.

A

125. Which of the following is NOT a negative reinforcer?
a. turning off an electric shock
b. giving a spanking
c. removing a noxious odor
d. silencing a banging door

B

126. Animals exposed to unavoidable, uncontrollable aversive stimulation exhibit _______ when later trained in an avoidance procedure.
a. experimental neurosis
b. better learning
c. learned helplessness
d. enhanced performance

C

127. Which of the following is a primary reinforcer?
a. grades
b. water
c. money
d. recognition

B

128. Which of the following is a secondary reinforcer?
a. water
b. food
c. grades
d. physical support

C

129. At the National Zoological Park in Washington, D.C., a polar bear suffered a broken tooth, and keepers needed a safe way of treating the problem. The bear was rewarded first for sticking its nose through a slot in the cage door, then for allowing a keeper to lift its lip and touch its teeth. Finally, a veterinarian was able to treat the damaged tooth while the bear waited placidly for its familiar reward. This is an example of _______.
a. modeling
b. shaping
c. negative reinforcement
d. secondary learning

B

130. Anything that increases the likelihood that a behavior will recur is called a(n) _______.
a. aversive control
b. punishment
c. antecedent
d. reinforcer

D

131. When someone uses negative reinforcement to change a behavior, the behavior is likely to ______.
a. decrease
b. increase
c. remain the same
d. completely stop

B

132. When someone uses punishment to change a behavior, the behavior is likely to ______.
a. decrease
b. increase
c. remain the same
d. generalize

A

133. Which of the following statements is true?
a. Punishment does not always work.
b. The effectiveness of punishment depends solely on its force.
c. Punishment should be applied intermittently.
d. Punishment usually enhances the learning process.

A

134. Which of the following statements about punishment is NOT true?
a. Punishment does not always work.
b. Rewards should always immediately follow punishments.
c. Effective punishment is consistent punishment.
d. In itself, punishment serves to inhibit responses.

B

135. A reinforcer that adds something rewarding to a situation is called a ________ reinforcer.
a. positive
b. negative
c. primary
d. secondary

A

136. A reinforcer that removes something unpleasant from a situation is called a ________ reinforcer.
a. positive
b. negative
c. primary
d. secondary

B

137. The 5-year-old of two very busy parents has been throwing tantrums. Whenever the child goes off the deep end, one or both of his parents immediately come to his side and fuss over and cajole him. Nevertheless, his tantrums do not diminish; they even seem to increase. We may assume that his parents’ fussing over him serves as a _______.
a. negative reinforcer
b. punisher
c. positive reinforcer
d. model

C

138. A child is scolded for using his fingers instead of his fork to eat some spaghetti. The scolding stops when he picks up his fork. This is an example of _______ reinforcement.
a. positive
b. negative
c. tertiary
d. secondary

B

139. Which of the following is a primary reinforcer?
a. money
b. a bar of candy
c. a buzzer
d. poker chips

B

140. Which of the following is a secondary reinforcer?
a. money
b. a bar of candy
c. attention
d. a drink of water

A

141. Which of the following would be classified as a secondary reinforcer?
a. a sandwich
b. the word "good"
c. reduction of pain
d. a drink of soda

B

142. Which of the following is a primary reinforcer?
a. a sandwich
b. praise
c. money
d. grades

A

143. Electric shock, scoldings, and bad grades are:
a. secondary reinforcers.
b. primary reinforcers.
c. aversive stimuli.
d. conditioned stimuli.

C

144. The fact that a reward will increase the future likelihood of a response that produced it is known as:
a. the discrimination principle.
b. the law of practice.
c. the law of effect.
d. the Premack principle.

C

145. A woodchuck tries to crack a walnut shell in two different ways: with his paws and with his teeth. The last method worked and the first did not; hence, the woodchuck will be more likely to rely on his teeth for splitting the next nut. This observation illustrates:
a. the discrimination principle.
b. the Law of Practice.
c. the Law of Effect.
d. the Premack principle.

C

146. When the removal of an event increases the likelihood of a prior response, _______ has occurred.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

B

147. The Internal Revenue Service threatens Sue with a penalty if she fails to pay her back taxes. She pays, and the threat is withdrawn. In the future, she is more prompt in meeting her obligation. This is an example of the use of _______ to control behavior.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

B

148. What is the typical dependent variable used in studies of the operant conditioning of lever pressing in rats?
a. the number of responses per minute
b. the cumulative record of lever presses
c. the average intensity of lever presses
d. none of the above

B

149. If a rat has learned to press a lever to obtain pellets of food and, all of a sudden, the response permanently ceases to produce any food, then _______ will occur.
a. shaping
b. discrimination
c. generalization
d. extinction

D

150. Which of the following is a conditioned positive reinforcer?
a. money
b. sex
c. food
d. warmth

A

151. The presentation of an aversive stimulus following a particular operant response is called:
a. negative reinforcement.
b. discrimination training.
c. aversion conditioning.
d. punishment.

D

152. Analogy: Negative reinforcement is to punishment as _______ is to _______.
a. presenting; withdrawing
b. withdrawing; presenting
c. aversive; pleasant
d. give; take

B

153. Which of the following is NOT a negative reinforcer?
a. turning off an electric shock
b. giving a spanking
c. removing a noxious odor
d. silencing a banging door

B

154. Which of the following is a primary reinforcer?
a. grades
b. water
c. money
d. recognition

B

156. Billy throws rocks. Each time he throws a rock, he is immediately spanked. Spanking is a ________.
a. positive reinforcer
b. negative reinforcer
c. secondary reinforcer
d. punishment

D

157. Negative reinforcement is negative in the sense that:
a. a consequence stimulus is delivered in a negative manner.
b. it results in the removal of the behavior.
c. the behavior results in the removal of a negative reinforcer.
d. the behavior is decreased/weakened.

C

158. Aunt Bea gave Opie fried chicken livers every time he made his bed. Opie began making his bed more often than he used to. In this example, chicken livers are a _______ reinforcer.
a. neutral
b. negative
c. secondary
d. primary

D

159. Which two learning processes seem to be opposites?
a. acquisition and generalization
b. discrimination and extinction
c. discrimination and generalization
d. acquisition and discrimination

C

160. Giving different responses to the same stimulus to which you were classically conditioned illustrates ____________.
a. response generalization
b. spontaneous recovery
c. stimulus generalization
d. vicarious conditioning

A

161. A pigeon learns to peck only at a red disk. It will not peck at an identical disk of any other color. This illustrates the concept of ___________.
a. extinction
b. discrimination
c. avoidance training
d. desensitization

B

162. The process of presenting the conditioned stimulus alone so often that the learner no longer associates it with the unconditioned stimulus and stops making the conditioned response is called _________ .
a. extinction
b. generalization
c. spontaneous recovery
d. shaping

A

163. The process of learning to respond only to a single specific object or event is called _________ .
a. extinction
b. inhibition
c. stimulus generalization
d. discrimination

D

164. Reacting to a stimulus that is similar to the one you have learned to react to is called ___________ .
a. stimulus generalization
b. response generalization
c. higher-order conditioning
d. modeling

A

165. If a dog salivates when it sees a green light or a yellow light, it is exhibiting ________.
a. generalization
b. discrimination
c. higher-order conditioning
d. extinction

A

166. The spread of conditioning to stimuli similar to the conditioned stimulus is called:
a. associative linkage.
b. generalization.
c. higher-order conditioning.
d. spontaneous recovery.

B

167. Of the following phenomena, which one best explains the spreading of phobias to objects similar to the one to which the phobia was originally acquired?
a. discrimination
b. extinction
c. generalization
d. spontaneous recovery

C

168. A small boy has just recently delighted his parents because he learned to call his father "daddy." However, it has now become an embarrassment to his mother when she takes him out with her because he keeps calling other men "daddy." This is an example of:
a. associative linkage.
b. generalization.
c. higher-order conditioning.
d. spontaneous recovery.

B

169. Once conditioning has been acquired, presenting just the conditioned stimulus without the unconditioned stimulus produces:
a. extinction.
b. generalization.
c. a new conditioned response.
d. spontaneous recovery.

A

170. Repeatedly presenting a CS by itself will result in:
a. extinction.
b. spontaneous recovery.
c. stimulus discrimination.
d. stimulus generalization.

A

171. As she walked through her neighborhood, Jodie, a 6-year-old girl, frequently saw a large brown dog. She repeatedly walked to the dog to pet it, but as her hand approached the animal, it barked and bit her. The bite was painful and caused her to cry. Now Jodie cries when she sees dogs of any color or size. This illustrates which of the following?
a. generalization
b. discrimination
c. extinction
d. spontaneous recovery

A

172. When a CS is repeatedly presented by itself, ______ will occur.
a. generalization
b. discrimination
c. extinction
d. stimulus substitution

C

173. John’s heart has been conditioned to beat rapidly whenever he smells Windsong perfume on a woman. However, John’s heart also races when he smells Chanel #5 and other perfumes. This illustrates:
a. stimulus generalization.
b. discrimination.
c. extinction.
d. spontaneous recovery.

A

174. Stimulus discrimination:
a. is a response followed by a reinforcer.
b. occurs when responses are made to stimuli that are similar to the original CS.
c. is the removal of a stimulus.
d. occurs when responses are made to certain stimuli, but not to others.

D

175. The process of presenting the conditioned stimulus alone so often that the learner no longer associates it with the unconditioned stimulus and stops making the conditioned response is called _______.
a. extinction
b. generalization
c. spontaneous recovery
d. shaping

A

176. When a CR has been conditioned to a particular stimulus, the organism will also tend to make the CR in response to other stimuli. This phenomenon is called:
a. discrimination.
b. spread of effect.
c. generalization.
d. response shifting.

C

177. Spontaneous recovery:
a. occurs before the pairing of the CS and US.
b. occurs after a fixed interval schedule of reinforcement.
c. is an unlearned response.
d. can occur once a response has been extinguished.

D

178. Laura’s heart rate had been conditioned to increase whenever she smelled Old Spice After Shave. However, her heart would also race to the aroma of Brut and English Leather. This reaction is known as:
a. shaping.
b. stimulus generalization.
c. operant conditioning.
d. discrimination.

B

179. Stimulus generalization occurs:
a. only when a response is followed by a reinforcer.
b. only to those with a high capacity to learn.
c. after extinction.
d. when a conditioned response is elicited by stimuli similar to the CS.

D

180. This is the first exam you have ever taken in Professor Smith’s class. You know nothing about her tests, and she has never done anything harmful to you or anyone else. Nonetheless, you are anxious about the test. Your anxiety in this situation is an example of:
a. generalization.
b. discrimination.
c. backward conditioning.
d. none of the above.

A

181. Corky’s mouth waters when he sees Ball Park Franks, but not when he sees other brands of franks. This response is known as:
a. extinction.
b. discrimination.
c. generalization.
d. intelligence.

B

182. The opposite of stimulus generalization is:
a. stimulus discrimination.
b. unconditioned stimulus.
c. conditioned stimulus.
d. response generalization.

A

183. Reacting to a stimulus that is similar to the one to which you have learned to react is called _______.
a. stimulus generalization
b. response generalization
c. higher-order conditioning
d. modeling

A

184. The process of learning to respond only to a single specific object or event is called _______.
a. extinction
b. inhibition
c. stimulus generalization
d. discrimination

D

185. A person is conditioned to fear white rats. Soon after, she also begins to fear white cats, white dogs, and white rabbits. These new fears, which were never directly conditioned, result from _______.
a. modeling
b. discrimination
c. response generalization
d. stimulus generalization

D

186. A person originally feared great heights, such as standing on top of tall buildings. Now the person has also developed fears of flying in airplanes, standing on ladders, and even watching high-wire artists perform. These new fears are probably the result of _______.
a. modeling
b. discrimination
c. stimulus generalization
d. response generalization

C

187. A pigeon learns to peck only at a red disk. It will not peck at an identical disk of any other color. This illustrates the concept of _______.
a. extinction
b. discrimination
c. avoidance training
d. desensitization

B

188. A child who calls all four-legged animals "dogs" is exhibiting ______.
a. simplification
b. response generalization
c. stimulus generalization
d. equipotentiality

C

189. Being able to solve new problems faster because of previous experience with similar problems is called ________.
a. rote behavior
b. a learning set
c. latent learning
d. insight learning

B

190. You have a class in which you have a quiz every Friday. Your studying for quizzes is reinforced on what type of schedule?
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval

B

191. Which schedule of reinforcement is programmed into slot machines?
a. fixed ratio
b. variable interval
c. variable ratio
d. fixed interval

C

192. On a fixed-interval schedule, reinforcement is given for the __________.
a. first correct response after a fixed amount of time has passed
b. first correct response after varying amounts of time have passed
c. next correct response after a fixed number of responses have occurred
d. next correct response after a varying number of responses have occurred

A

193. According to the law of effect, a behavior is MOST likely to be stamped in, or repeated, when it is __________ .
a. ignored
b. preceded by reinforcement
c. followed by reinforcement
d. accompanied by a neutral stimulus

C

194. A reinforcer that removes something unpleasant from a situation is called a __________ reinforcer.
a. positive
b. negative
c. primary
d. secondary

B

195. A reinforcer that is reinforcing in and of itself is called a ________ reinforcer.
a. direct
b. delayed
c. primary
d. secondary

C

196. Research suggests that delayed reinforcement ____________ .
a. is much more effective than immediate reinforcement
b. is slightly more effective than immediate reinforcement
c. is equally effective as immediate reinforcement
d. is less effective than immediate reinforcement

D

197. Lila doesn’t like her psychology class because the instructor uses unannounced pop exams to test the class. As a result, she never knows when she will be tested. Her instructor is testing her on a __________ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

D

198. Sandy’s favorite activity is to go to Las Vegas and play the slot machines. Her gambling behavior is being reinforced on a __________ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

C

199. An animal is placed in a box with a bar and also a wire floor that can deliver a mild shock. The experimenter first sounds a buzzer, then a few seconds later turns on the shock. Pressing the bar after the buzzer sounds but before the shock is delivered will prevent the shock from occurring. This is an example of __________ .
a. avoidance training
b. modeling
c. classical conditioning
d. punishment learning

A

200. On a fixed-ratio schedule reinforcement is given _________ .
a. for the first correct response after randomly varying amounts of time have passed
b. for the next correct response after a fixed number of responses have been made
c. for the first correct response after a fixed amount of time has passed
d. for the next correct response after a varying number of responses have been made

B

201. On a variable-ratio schedule, reinforcement is given ________ .
a. for the first correct response after a fixed amount of time has passed
b. for the first correct response after varying amounts of time have passed
c. for the next correct response after a fixed number of responses have been made
d. for the next correct response after a varying number of responses have been made

D

202. Anything that increases the likelihood that a behavior will occur more frequently is called a(n) __________ .
a. aversive control
b. punishment
c. antecedent
d. reinforcer

D

203. The schedule of reinforcement that yields the slowest increase in a behavior and the fastest extinction of the behavior when the schedule is stopped is ________.
a. variable ratio
b. continuous reinforcement
c. partial reinforcement
d. fixed interval

B

204. Five-year-old Tommy is helping get ready for a family reunion at Thanksgiving by polishing the good silverware. If his mother gives him a dime for each piece he polishes, what kind of reinforcement schedule is she using?
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio

B

205. Linda sees a sign on a farmer’s fence that reads: HELP ME PICK STRAWBERRIES! FOR EVERY 5 QUARTS YOU PICK, KEEP ONE FOR YOURSELF. If Linda decides to pick strawberries for this farmer, she would be under a _______ schedule of reinforcement.
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio

B

206. A very high rate of responding is produced by a _______ schedule of reinforcement.
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio

D

207. The only vending machine in your dorm is notorious for delivering its merchandise only occasionally when people put money in it. This is most similar to a _______ schedule of reinforcement.
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio

D

208. Your professor has informed you at the beginning of the term that you will have eight tests–but they will all be unannounced. This is most similar to a _______ reinforcement schedule.
a. fixed interval
b. fixed ratio
c. variable interval
d. variable ratio

C

209. Gretta spends a lot of time at the race track betting on ponies, and occasionally she wins. The frequency of her betting is controlled by which of the following?
a. fixed ratio schedules
b. a continuous reinforcement schedule
c. a partial schedule of reinforcement
d. luck

C

210. An infant who is fed every four hours is on a schedule that is SIMILAR to which of the following?
a. fixed ratio
b. variable ratio
c. fixed interval
d. variable interval

C

211. A person receiving a monthly salary is on a:
a. continuous reinforcement schedule.
b. fixed-ratio schedule of reinforcement.
c. fixed-interval schedule of reinforcement.
d. variable-ratio schedule of reinforcement.

C

212. On a fixed-ratio schedule, reinforcement is given ________.
a. for correct responses after randomly varying amounts of time have passed
b. after a specific number of responses are given
c. for the first correct response after a specific amount of time has passed
d. after a randomly varying number of responses are given

B

213. On a fixed-interval schedule, reinforcement is given ________.
a. for the first correct response after a specific amount of time has passed
b. for correct responses after randomly varying amounts of time have passed
c. after a specific number of responses are given
d. after a randomly varying number of responses are given

A

214. On a variable-ratio schedule, reinforcement is given _______.
a. for the first correct response after a specific amount of time has passed
b. for correct responses after randomly varying amounts of time have passed
c. after a specific number of responses are given
d. after a randomly varying number of responses are given

D

215. On a variable-interval schedule, reinforcement is given _______.
a. for the first correct response after a specific amount of time has passed
b. for correct responses after randomly varying amounts of time have passed
c. after a specific number of responses are given
d. after a randomly varying number of responses are given

B

216. Scott works at a job where he is paid a salary every 2 weeks. Scott is being reinforced on a _______ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

B

217. Sandy’s favorite activity is to go to Las Vegas and play the slot machines. Her gambling behavior is being reinforced on a _______ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

C

218. Perry works at a job where he is paid by commission. For every car Perry sells, he gets 10% of the profits. Perry is being reinforced on a _______ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

A

219. Abigail is trying to figure out how she can BEST use employee pay to shape her employees’ behavior. She is worried about consistent behavior, not speed. Therefore, she is interested in getting a slow but steady rate of response from her workers. According to reinforcement principles, she should probably use a ________ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

D

220. Abigail is trying to figure out how she can BEST use employee pay to shape her employees’ behavior. She is interested in short-term productivity (speed), not consistency, long-term productivity, or employee turnover. According to reinforcement theory, she should probably use a _______ schedule.
a. fixed-ratio
b. fixed-interval
c. variable-ratio
d. variable-interval

A

221. If Billy was praised every 4th time he collected rocks without throwing them, his behavior would be on which schedule of reinforcement?
a. variable ratio
b. fixed interval
c. fixed ratio
d. variable interval

C

222. Which schedule of reinforcement reinforces the first correct response after a constant interval of time has elapsed?
a. fixed ratio
b. variable ratio
c. fixed interval
d. variable interval

C

223. An example of a behavior that is learned through operant conditioning is _____________.
a. blinking in response to a flash of light
b. studying in order to get a teacher’s approval
c. sneezing in response to dust
d. pulling one’s hand away from a flame

B

224. Emitted, voluntary behavior is BEST modified by _____________.
a. operant conditioning
b. trial and error
c. classical conditioning
d. extinction

A

225. Shaping is achieved through:
a. discrimination training.
b. generalization.
c. higher-order conditioning.
d. successive approximations.

D

226. To teach a tiger to jump through a flaming hoop, the tiger is first reinforced for jumping up on a certain pedestal, then for leaping from that pedestal to another. Next the tiger has to jump through a hoop between the pedestals to get the reward. Finally, the hoop is set afire and the tiger must jump through it to get the reward. This is an example of __________ .
a. modeling
b. shaping
c. negative reinforcement
d. secondary learning

B

227. A young girl is just learning to dress herself. At first, the parents call her a "big girl" just for putting on her clothes "frontwards," even if they are not buttoned. Then, they call her a "big girl" if she tries to button them–even if the buttons are not in the right holes. Then, they call her a "big girl" only if she buttons them correctly. They have been using:
a. discrimination.
b. generalization.
c. higher-order conditioning.
d. successive approximation.

D

228. Operant conditioning assumes that:
a. events that follow behavior affect whether the behavior is repeated in the future.
b. one’s mental processes (e.g., memory and perception) mediate what behaviors one does in a situation.
c. voluntary behaviors are reflexive.
d. one learns by watching others’ behavior.

A

229. Peggy wanted to teach her dog how to roll over. She tried giving him instructions, but it didn’t work. She tried waiting for him to roll over so she could reinforce the behavior, but she had to go to bed before the dog rolled. Finally, she began reinforcing the dog when it made behaviors that more closely resembled rolling over. At last, using _______, she was able to teach the dog to do the trick.
a. shaping
b. positive reinforcement
c. positive reinforcers
d. secondary reinforcers

A

230. Reinforcing behaviors that more closely resemble a final, terminal behavior is called:
a. positive reinforcement.
b. shaping.
c. positive reinforcers.
d. secondary reinforcers.

B

231. A procedure used to teach a whole behavior by first training its parts is called:
a. higher order conditioning.
b. shaping.
c. modeling.
d. response generalization.

B

232. Changing behavior through the reinforcement of partial responses is called _______.
a. modeling
b. shaping
c. negative reinforcement
d. classical conditioning

B

233. The type of learning that involves a sudden coming together of the elements of a situation so that the solution to a problem is instantly clear is __________.
a. cognitive mapping
b. vicarious learning
c. latent learning
d. insight

D

234. Which type of learning occurs when we observe other people act?
a. operant conditioning
b. classical conditioning
c. insight learning
d. observational learning

D

235. What do we call learning that has taken place but is not demonstrated?
a. insight learning
b. serial enumeration
c. latent learning
d. shaping

C

236. Learning that occurs but is not immediately reflected in a behavior change is called __________.
a. vicarious learning
b. innate learning
c. latent learning
d. insight

C

237. In a study on learning, the psychologist conducting the study seeks to explain the inner needs and desires that make learners pursue their goals. She is interested in the inner processes that result in learning. She is studying ___________.
a. neurophysiological learning
b. primary learning
c. secondary learning
d. cognitive learning

D

238. The idea that for classical conditioning to occur, the presentation of the conditioned stimulus must tell you something about whether the unconditioned stimulus is going to occur is called ___________ theory.
a. social learning
b. contingency
c. operant conditioning
d. autonomic conditioning

B

239. In Bandura’s classic (1965) study of children exposed to a film of an adult hitting a Bobo doll, __________ .
a. children who saw the model punished learned to be more aggressive than children who saw the model rewarded
b. children who saw the model rewarded learned to be more aggressive than children who saw the model punished
c. children who saw the model punished performed more aggressively in a free play situation than children who saw the model rewarded
d. children who saw the model rewarded performed more aggressively in a free play situation than children who saw the model punished

D

240. Learning that depends on mental processes that are not able to be observed directly is called _________ learning.
a. autonomic
b. primary
c. secondary
d. cognitive

D

241. The concept of latent learning was developed by __________ .
a. Watson
b. Skinner
c. Thorndike
d. Tolman

D

242. The mental image of an area, such as a maze or the floor plan of a building, is called _____________.
a. a Gestalt
b. insight
c. a Skinner response
d. a cognitive map

D

243. The process by which prior conditioning prevents conditioning to a new (second) stimulus, even when the two stimuli are presented simultaneously, is called ___________ .
a. a learning set
b. learned helplessness
c. diffusion
d. blocking

D

244. Social learning theory’s foremost proponent is __________ .
a. Watson
b. Thorndike
c. Skinner
d. Bandura

D

245. A key to social learning theory is ____________ .
a. insight learning
b. cognitive mapping
c. latent learning
d. observational learning

D

246. Cognitive learning involves:
a. an association between events or phenomena.
b. an association between responses.
c. an association between behavior and its consequences.
d. internal representations of events in the world.

D

247. During your very first visit to your campus, you probably needed a map to get around efficiently. However, a little while later you no longer needed the map, because _______ had occurred.
a. classical conditioning
b. cognitive learning
c. instrumental conditioning
d. operant conditioning

B

248. Insight is a concept associated with _______ learning theory.
a. classical
b. operant
c. social
d. cognitive

D

249. Though Jenny tried in vain to reach a puzzle on the top shelf by standing on a chair, she simply could not reach. Suddenly she realized that by placing a thick catalog on the seat of the chair, she would be high enough to reach the puzzle. Jenny’s solution is best explained by which of the following?
a. her previous history of conditioning
b. her previous experiences with reaching objects that are out of reach
c. latent learning
d. insight

D

250. Cognitive learning theories attempt to explain how learning occurs using:
a. observation and imitation.
b. unobservable mental processes.
c. classical conditioning processes.
d. classical, operant, and observational processes.

B

251. A founder of Gestalt psychology and a pioneer of insight problem solving was:
a. Thorndike.
b. Terman.
c. Kohler.
d. Harlow.

C

252. The mental storing of an entire image of an area, such as a maze or the floor plan of a building, is called ______.
a. Gestalt
b. insight
c. Skinner response
d. a cognitive map

D

253. The type of learning that involves a sudden coming together of the elements of a situation so that the solution to a problem is instantly clear is _____.
a. insight
b. latent learning
c. cognitive mapping
d. contingency blocking

A

254. Cognitive maps are:
a. observable mental events.
b. consistent with conditioning theories.
c. learned without reinforcement.
d. inconsistent with insightful problem solving.

C

255. Insight is a concept associated with ________ learning theory.
a. classical
b. operant
c. social
d. cognitive

D
