Learn 1

The document contains multiple choice questions about concepts in operant and classical conditioning. Key points covered include: the role of reinforcement is strikingly significant in operant conditioning; a negative reinforcement in Skinnerian operant conditioning is the withdrawal or removal of a positive reinforcer; unadaptive habits like nail biting are best dealt with using the operant method rather than the classical method; and in continuous reinforcement schedules, every appropriate response is reinforced.

1. In the operant conditioning procedure, the role of reinforcement is:
(a) Strikingly significant

(b) Very insignificant

(c) Negligible

(d) Not necessary

(e) None of the above

2. According to Skinnerian Operant conditioning theory, a negative reinforcement is:
(a) Nothing but punishment

(b) A biofeedback

(c) A withdrawing or removal of a positive reinforcer

(d) An instinctive drift

(e) None of the above

3. Behaviour therapists believe that respondent or classical conditioning is effective in dealing with non-voluntary, automatic behaviour, whereas the operant method is successful predominantly with motor and cognitive behaviours. Thus, unadaptive habits such as nail biting, trichotillomania, enuresis, encopresis, thumb sucking, etc. are satisfactorily dealt with by the:
(a) Classical Method

(b) Operant Method

(c) Trial and Error Method


(d) Insightful learning procedure

(e) None of the above

4. Covert positive reinforcement requires the individual to imagine performing a particular task or behaviour followed by a:
(a) Negative consequence

(b) Zero consequence

(c) Positive Consequence

(d) Neutral consequence

(e) None of the above

5. Aversion is one of the conditioning procedures used in:


(a) Non-directive therapy

(b) Psychoanalytic therapy

(c) Behaviour therapy

(d) Chemotherapy

(e) None of the above

6. A very useful principle of learning is that a new response is strengthened by:
(a) Punishment

(b) Reinforcement

(c) Biofeedback

(d) Discriminative Stimulus

(e) None of the above


7. In continuous reinforcement schedule (CRF), every appropriate
response:
(a) Is reinforced

(b) Is not reinforced

(c) Is sometimes reinforced

(d) Is an instinctive drift

(e) None of the above

8. The continuous reinforcement schedule is generally used:


(a) In the last part of training

(b) In early stages of training

(c) In the middle period of training

(d) In both last and first part of training

(e) None of the above

9. In real life, reinforcement of every response (CRF) is:


(a) Of the nature of an exception rather than the rule

(b) Impossible

(c) Necessary

(d) Not necessary

(e) None of the above

10. Which schedule of reinforcement is a ratio schedule stating a ratio of responses to reinforcements?
(a) Variable Ratio Schedule

(b) Fixed Interval Schedule


(c) Variable Interval Schedule

(d) Fixed Ratio Schedule

(e) None of the above

11. Respondents are elicited and operants are not elicited but they are:
(a) Emitted spontaneously

(b) Emitted voluntarily

(c) Permanent responses

(d) Temporary responses

(e) None of the above

12. In which schedule of reinforcement are appropriate movements reinforced after a varying number of responses?
(a) Fixed Ratio Schedule

(b) Fixed Interval Schedule

(c) Variable ratio Schedule

(d) Variable Interval Schedule

(e) None of the above

13. Which schedule of reinforcement does not specify any fixed number but rather states the requirement in terms of an average?
(a) Variable Ratio Schedule

(b) Fixed Ratio Schedule

(c) Fixed Interval Schedule

(d) Variable Interval Schedule


(e) None of the above

14. As a rule, variable ratio schedule (VR) arrangements sustain:


(a) Low rates of responding

(b) High rates of responding

(c) Zero responding

(d) 90% of responding

(e) None of the above

15. Under a variable ratio schedule, the only sensible way to obtain more reinforcements is by emitting:
(a) 50% responses

(b) 90% responses

(c) A smaller number of responses

(d) A greater number of responses

(e) None of the above

16. In which schedule of reinforcement does the experimenter (E) reinforce the first correct response after a given length of time?
(a) Fixed Ratio Schedule

(b) Fixed Interval Schedule

(c) Variable Ratio Schedule

(d) Variable Interval Schedule

(e) None of the above

17. In our daily life, watching for the pot of milk to boil may be somewhat
similar to the behaviour pattern observed in:
(a) Fixed Interval Schedule

(b) Fixed Ratio Schedule

(c) Variable Ratio Schedule

(d) Variable Interval Schedule

(e) None of the above

18. In which schedule of reinforcement do the delay intervals vary according to a previously decided plan?
(a) Fixed Ratio Schedule

(b) Variable Ratio Schedule

(c) Fixed Interval Schedule

(d) Variable Interval Schedule

(e) None of the above

19. In our daily life, any kind of looking out for things that occur without any reference to our behaviour may illustrate the application of:
(a) Variable Interval Schedule

(b) Fixed Ratio

(c) Variable Ratio Schedule

(d) Fixed interval Schedule

(e) None of the above

20. In the case of continuous reinforcement, we get the least resistance to extinction and the:
(a) Highest response rate during training

(b) 50% response rate during training


(c) Smallest response rate during training

(d) 90% response rate during training

(e) None of the above
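
The schedule items above (Questions 7-20) all turn on a single point: the rule a schedule uses to decide when a response earns reinforcement. As a study aid, the short Python sketch below summarises those decision rules. It is not part of the original question set; the function name, the ratio of 5 and the interval of 10 seconds are illustrative assumptions only.

import random

def is_reinforced(schedule, responses_since_reward, seconds_since_reward):
    """Illustrative decision rules for the schedules named in Questions 7-20."""
    if schedule == "CRF":  # continuous reinforcement: every appropriate response
        return True
    if schedule == "FR":   # fixed ratio: e.g. every 5th response (example value)
        return responses_since_reward >= 5
    if schedule == "VR":   # variable ratio: on average every 5th response
        return random.random() < 1 / 5
    if schedule == "FI":   # fixed interval: first correct response after 10 s
        return seconds_since_reward >= 10
    if schedule == "VI":   # variable interval: the required delay itself varies
        return seconds_since_reward >= random.uniform(5, 15)
    raise ValueError(f"unknown schedule: {schedule}")

# Under CRF every response pays off (cf. Q.7); under VR only emitting a
# greater number of responses does (cf. Q.15).
print(is_reinforced("CRF", responses_since_reward=1, seconds_since_reward=0))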

21. The expression “Contingencies of reinforcement” occurs frequently in:
(a) Operant Conditioning Literature

(b) Classical Conditioning Literature

(c) Trial and Error Learning Literature

(d) Latent Learning Literature

(e) None of the above

22. Who elucidates the contiguity theory of reinforcement in the most pronounced and consistent manner?
(a) C. Hull

(b) Guthrie

(c) Tolman

(d) Mc Dougall

(e) J. B. Watson

23. In comparison with the drive-reduction or need-reduction interpretation, the stimulus intensity reduction theory has an added advantage in that:
(a) It offers a unified account of primary and learned drives, as well as of primary and conditioned reinforcement

(b) It is very precise and places importance on Trial and Error Learning

(c) It has some mathematical derivations which are helpful for learning theorists
(d) All learning theories can be explained through this

(e) None of the above

24. Who preferred to call “Classical Conditioning” by the name of “Sign Learning”?
(a) I. P. Pavlov

(b) Mowrer

(c) Miller

(d) Guthrie

(e) J. B. Watson

25. Which type of learning tells us what to do with the world and applies
to what is commonly called habit formation?
(a) Insightful Learning

(b) Latent Learning

(c) Trial and Error Learning

(d) Instrumental Learning

(e) Classical Conditioning

26. Who propounded the expectancy theory of learning?


(a) Guthrie

(b) C. Hull

(c) Tolman

(d) Thorndike

(e) I. P. Pavlov
27. Who said that any act is a movement but not vice versa?
(a) J.B. Watson

(b) W. Kohler

(c) Guthrie

(d) E. L. Thorndike

(e) C. Hull

28. Guthrie believed that conditioning takes place:


(a) After two trials

(b) After three trials

(c) After a single trial

(d) After ten trials

(e) None of the above

29. According to Guthrie, forgetting is not a matter of decay of old impressions and associations but:
(a) A result of inhibition of old connections by new ones

(b) A result of disinhibitions of old connections

(c) A result of generalizations of stimuli

(d) A result of discrimination

(e) None of the above

30. The great learning theorist Clark Hull was influenced by the moderate wing of:
(a) Gestalt Psychology

(b) Behaviouristic Orientation


(c) Psychoanalytic Literature

(d) Logical Positivism and conventionalism

(e) None of the above

31. Who defined “Need” as a state of the organism in which a deviation of the organism from the optimum of biological conditions necessary for survival takes place?
(a) Mc Dougall

(b) Clark L. Hull

(c) E.L Thorndike

(d) I.P. Pavlov

(e) None of the above

32. According to Hullian theory, under the pressure of needs and drives,
the organism undertakes:
(a) Adaptive actions

(b) Learning by foresight

(c) Learning by hindsight

(d) Transfer of training

(e) None of the above

33. Hull believes that no conditioning will take place unless there is:
(a) Food

(b) Need Reduction

(c) Puzzle Box

(d) Secondary Reinforcement


(e) None of the above

34. Who defined stimulus (S) in terms of physical energy such as mechanical pressure, sound, light etc.?
(a) E. L. Thorndike

(b) W. Kohler

(c) B. F. Skinner

(d) Clark Hull

(e) E. C. Tolman

35. “Whenever a reaction (R) takes place in temporal contiguity with an afferent receptor impulse (s) resulting from the impact upon a receptor of a stimulus energy (S), and this conjunction is followed closely by the diminution in a need and the associated diminution in the drive, D, and in the drive receptor discharge, sD, there will result an increment, Δ(s → R), in the tendency for that stimulus on subsequent occasions to evoke that reaction.” Who has given the above definition of “reinforcement”?
(a) Clark L. Hull

(b) E. L. Thorndike

(c) I.P. Pavlov

(d) W. Kohler

(e) None of the above

36. Most of Hull’s explanations are stated in two languages, one of empirical description and the other in:
(a) Psycho physiological terms

(b) Neurophysiological terms


(c) Physiological terms

(d) Physical terms

(e) None of the above

37. The molar approach deals with the organism as a whole; the molecular approach:
(a) Deals with parts

(b) Deals with stimuli

(c) Deals with responses

(d) Has nothing to do with the organism

(e) Deals with the detailed, fine and exact elements of action of the nervous
system

38. The hypothetico-deductive system in geometry was developed by:


(a) I.P. Pavlov

(b) E. L. Thorndike

(c) C. Hull

(d) Pieri

(e) E. C. Tolman

39. Whenever behaviour is correlated with specific eliciting stimuli, it is:


(a) Respondent Behaviour

(b) Operant Behaviour

(c) Stimulant Behaviour

(d) Fixed Behaviour


(e) Static Behaviour

40. Whenever behaviour is not correlated with any specific eliciting stimuli, it is:
(a) Respondent Behaviour

(b) Operant Behaviour

(c) Static Behaviour

(d) Modified Behaviour

(e) None of the above

41. According to Tolman, docile or teachable behaviour is:


(a) Molar

(b) Molecular

(c) Respondent

(d) Operant

(e) None of the above

42. According to Skinnerian theory, the “S” type of conditioning applies to:
(a) Modified Behaviour

(b) Stimulant Behaviour

(c) Operant Behaviour

(d) Respondent Behaviour

(e) None of the above

43. The sign-gestalt expectation represents a combination of:


(a) Intelligence and Perception
(b) Perception and Learning

(c) Intelligence and Learning

(d) Perception and Motivation

(e) None of the above

44. Who stated that appetites and aversions are “states of agitation”?
(a) E. L. Thorndike

(b) E. C. Tolman

(c) W. Kohler

(d) Clark Hull

(e) None of the above

45. Who said that the ultimate goal of aversion is the state of
physiological quiescence to be reached when the disturbing stimulus
ceases to act upon the organism?
(a) E. L. Thorndike

(b) W. Kohler

(c) E. C. Tolman

(d) Clark Hull

(e) None of the above

46. According to E. C. Tolman, there are two aversions: fright and pugnacity. Fright is avoidance of injury and pugnacity is avoidance of:
(a) Interference

(b) Affiliation

(c) Motivation
(d) Perception

(e) None of the above

47. “Equivalence Belief” is a connection between a positively cathected type of disturbance-object and a type of what may be called:
(a) An interfering object

(b) A sub disturbance object

(c) A motivating object

(d) A goal-oriented object

(e) None of the above

48. Who revealed that “Field expectancy” takes place when an organism is repeatedly and successively presented with a certain environmental set-up?
(a) E. C. Tolman

(b) C. L. Hull

(c) E. L. Thorndike

(d) I.P. Pavlov

(e) Guthrie

49. Dollard and Miller related Thorndike’s spread of effect to the:


(a) Gradient of reinforcement

(b) Biological constraints

(c) Principle of preparedness

(d) None of the above

50. Miller and Dollard are more concerned with:


(a) Biological factor in learning

(b) Social factor in learning

(c) Physiological and Social factors in learning

(d) Personal factors in learning

(e) None of the above

51. Mowrer’s Sign learning comes close to Guthrie’s contiguity and his
‘solution learning’ corresponds to:
(a) Pavlov’s Classical Conditioning

(b) Kohler’s Insightful learning

(c) Skinner’s instrumental learning

(d) Thorndike’s trial and error learning

(e) None of the above

52. Mowrer’s two-factor theory takes into consideration the fact that:
(a) Some forms of conditioning do not require reward and some do

(b) Every conditioning requires reinforcement

(c) The organism learns to make a response to a specific stimulus

(d) Learning is purposive and goal-oriented

(e) None of the above

53. When learning in one situation influences learning in another situation, there is evidence of:
(a) Avoidance learning

(b) Learned helplessness


(c) Premise of Equipotentiality

(d) Transfer of Training

(e) None of the above

54. If learning in situation ‘A’ favourably influences learning in situation ‘B’, then we have:
(a) Positive Transfer

(b) Negative Transfer

(c) Zero Transfer

(d) Bilateral Transfer

(e) Neutral Transfer

55. If learning in situation ‘A’ has a detrimental effect on learning in situation ‘B’, then we have:
(a) Positive Transfer

(b) Zero Transfer

(c) Neutral Transfer

(d) Negative transfer

(e) None of the above

56. Mediation occurs when one member of an associated pair is linked to the other by means of:
(a) A reinforcement

(b) An intervening element

(c) Generalization

(d) Secondary reinforcement


(e) None of the above

57. Zero transfer is otherwise known as:


(a) Neutral Transfer

(b) Positive Transfer

(c) Negative Transfer

(d) Bilateral Transfer

(e) None of the above

58. Negative Transfer of Training is otherwise known as:


(a) Neutral Transfer

(b) Habit interference

(c) Zero Transfer

(d) Bilateral Transfer

(e) None of the above

59. “If you do not like milk, you may not like all milk products like cheese, butter, ghee and curd.” This is due to:
(a) Generalization Gradient

(b) Avoidance Learning

(c) Biological Constraints

(d) Transfer of Training

(e) None of the above

60. Who said, “Although Classical Conditioning is a laboratory procedure, it is easy to find real-world examples”?
(a) B.J. Underwood (1983)
(b) G. H. Bower (1976)

(c) C. B. Osgood (1957)

(d) Kimble and Garmezy (1980)

(e) Mc Geoch (1942)

61. According to Hull, a systematic behaviour or learning theory can be made possible by a happy amalgamation of the technique of conditioning and the:
(a) Law of Effect

(b) Law of Exercise

(c) Law of Frequency

(d) Law of Recency

(e) None of the above

62. The methods of verbal learning are important because:


(a) The use of standard methods for learning makes comparisons of results
possible

(b) Rewards are not necessary here

(c) They minimise the effect of punishment

(d) Punishment has no effect on learning

(e) None of the above

63. Positive transfer of training is possible with:


(a) Dissimilar tasks

(b) Motor tasks

(c) Similar tasks


(d) Verbal tasks

(e) None of the above

64. A ‘Skinner Box’ is used for:


(a) Motor learning

(b) Verbal learning

(c) Sensory learning

(d) Problem Solving

(e) Incidental learning

65. Punishment is effective only when it weakens:


(a) Undesirable response

(b) Desirable response

(c) Positive response

(d) Negative response

(e) None of the above

66. Which one of the following psychologists is not associated with the
theories of learning?
(a) Sullivan

(b) C. Hull

(c) Tolman

(d) Thorndike

(e) Guthrie
67. In which method is the entire list exposed once to the subject (S), who is then asked to anticipate each item in the list before it is exposed on the memory drum?
(a) Recall

(b) Recognition

(c) Relearning and Saving

(d) Anticipation Method

(e) None of the above

68. The new items which are added to the original list in the recognition method are known as:
(a) Stimulants

(b) Respondents

(c) Gradients

(d) Distractors

(e) None of the above

69. Learning to make new responses to identical or similar stimuli results in a:
(a) Negative Transfer

(b) Positive Transfer

(c) Zero transfer

(d) Neutral transfer

(e) None of the above

70. Both positive and negative transfers are largely the result of:
(a) Similarity of responses in the first and the second task
(b) Dissimilarity of responses in the first and the second task

(c) Co-ordination of responses in the first and the second task

(d) Both similarity and dissimilarity of responses in the first and the second
task

(e) None of the above

71. The greater the similarity between the stimuli of the first task and the
second task:
(a) The less the extent of transfer

(b) The greater the extent of transfer

(c) The minimum the extent of transfer

(d) No transfer occurs

(e) None of the above

72. A high positive transfer results when stimuli are similar and
responses are:
(a) Identical

(b) Not Identical

(c) Haphazard

(d) Equipotential

(e) None of the above

73. It is possible to maximize positive transfer from a classroom situation to a real-life situation by making formal education more realistic or more closely connected with:
(a) Real-life problems

(b) Imaginary problems


(c) Temporary problems

(d) Easy Problems

(e) None of the above

74. In programmed learning, the importance is placed on:


(a) Trial and error learning

(b) Latent learning

(c) Classical conditioning

(d) Operant conditioning

(e) None of the above

75. Who is regarded as the father of ‘Programmed Learning’?


(a) B. F. Skinner

(b) I. P. Pavlov

(c) C.L. Hull

(d) J.B. Watson

76. Who first devised a machine for teaching in 1920?

(a) N. R. F. Maier

(b) A. Luchins

(c) S. L. Pressey

(d) H. F. Harlow

(e) D. O. Hebb

77. In the system of programmed learning, the learner becomes:


(a) An active agent in the process of acquisition

(b) A passive agent in the process of acquisition

(c) A neutral agent in the process of acquisition

(d) Instrumental in the process of acquisition

(e) None of the above

78. Programmed learning:


(a) Is not helpful for teaching

(b) Is not helpful in the socialization of the child

(c) Is not helpful in classroom situation

(d) Is not helpful for teachers

(e) None of the above

79. Lewin’s field theory gives more importance to behaviour and motivation and less to:
(a) Incentive

(b) Drive

(c) Experience

(d) Intelligence

(e) None of the above

80. Kurt Lewin regards the environment of the individual as his:


(a) life-space

(b) Instinctive drift

(c) Autoshaping

(d) Foresight
(e) None of the above

81. Guthrie’s theory of learning is known as the learning by:


(a) Interpretation

(b) Representation

(c) Substitution

(d) Response

(e) None of the above

82. For Skinner, the basic issue is how reinforcement sustains and
controls responding rather than:
(a) Which stimulus evokes a response

(b) Which response is helpful

(c) Which stimulus can be generalized

(d) Which stimulus can be discriminated

(e) None of the above

83. Who said that the event that is drive-reducing is satisfying?


(a) E. C. Tolman

(b) R. S. Woodworth

(c) E. L. Thorndike

(d) Clark L. Hull

(e) None of the above

84. Materials like food for hungry animals or water for thirsty animals are
called:
(a) Secondary reinforcers
(b) Primary Reinforcers

(c) Intermittent reinforcers

(d) Fixed reinforcers

(e) None of the above

85. When a thing acquires some characteristics of a reinforcer because of its consistent association with the primary reinforcement, we call it a/an:
(a) Secondary Reinforcer

(b) Primary Reinforcer

(c) Fixed Reinforcer

(d) Intermittent Reinforcer

(e) None of the above

86. In one experiment, chimpanzees were taught to insert poker chips into a vending machine in order to obtain grapes. When this was done, they were made to pull, with all their strength, an iron bar attached to a similar machine to obtain poker chips. The chimpanzees learned this too, because they were allowed to cash in those chips for grapes afterwards. Here the token chips had only a/an:
(a) Primary Reinforcing Value

(b) Extra Reinforcing Value

(c) Special Reinforcing Value

(d) Secondary Reinforcing Value

(e) None of the above

87. Partial Reinforcement is often called:


(a) Intermittent Reinforcement

(b) Schedules of Reinforcement

(c) Span of Reinforcement

(d) Reinforcement Schedule

(e) None of the above

88. Reinforcing a given response only on some of the trials is known as:
(a) Partial Reinforcement

(b) Continuous Reinforcement

(c) Reinforcement Schedule

(d) No Reinforcement

(e) None of the above

89. Most human habits are reinforced in a:


(a) Variable fashion

(b) Constant fashion

(c) Partial Manner

(d) Particular Time span

(e) None of the above

90. Most human habits are resistant to extinction because they are reinforced:
(a) In a constant fashion

(b) All the times


(c) Every now and then

(d) In a variable fashion

(e) Very often

91. Which type of learning experiments show how the behaviour of animals can be controlled or shaped in a desired direction by making careful use of reinforcement?
(a) Classical conditioning

(b) Operant conditioning

(c) Latent Learning

(d) Sign Learning

(e) None of the above

92. In Operant Conditioning, the strength of an operant response is usually measured in terms of the frequency of lever pressing:
(a) Per unit of time

(b) In every five minutes

(c) As a whole

(d) In a day

(e) None of the above

93. The method we use in memorising poetry is called:


(a) Paired-associate learning

(b) Distributed learning

(c) Serial memorisation

(d) Massed learning


(e) Syntactic Memorisation

94. Shifting from right-hand driving (in the U.S.A.) to left-hand driving (in India) is an illustration of:
(a) Negative transfer of training

(b) Positive transfer of training

(c) Neutral transfer of training

(d) Both neutral and positive transfer of training

(e) None of the above

95. The replacement of one conditioned response by the establishment of an incompatible response to the same conditioned stimulus is known as:
(a) Backward Conditioning

(b) Counter Conditioning

(c) Forward Conditioning

(d) High order conditioning

(e) None of the above

96. Experimental literature revealed that experiments on latent learning were done by:
(a) Tolman and Honzik (1930)

(b) Gibson and Harlow

(c) Pavlov and Watson

(d) Kohler and Wertheimer


97. Working with monkeys, Harlow (1949) propounded that the general
transfer effect from one situation to another may be accounted for by
the concept of:
(a) “Learning how to learn” or “Learning Sets”

(b) Sign learning

(c) Latent learning

(d) Gradient of learning

(e) Plateau

98. Proactive Inhibition refers to the learning of ‘A’ having a detrimental effect on the learning of ‘B’. So it is a:
(a) Neutral transfer of effect

(b) Zero transfer of effect

(c) Positive transfer of effect

(d) Negative transfer of effect

(e) None of the above

99. Who has defined “perceptual learning” as “an increase in the ability
to extract information from the environment as a result of experience or
practice with the stimulation coming from it.”?
(a) I. P. Pavlov

(b) Wertheimer

(c) B. F. Skinner

(d) Eleanor Gibson (1969)

(e) J.B. Watson

100. To distinguish the calls of birds:


(a) Sign learning is necessary

(b) Perceptual learning is needed

(c) Operant conditioning would be conducive

(d) Insight is needed

(e) CR will be helpful

Answers
1. (a)  2. (c)  3. (b)  4. (c)  5. (c)  6. (b)  7. (a)  8. (b)  9. (a)  10. (d)
11. (a)  12. (c)  13. (a)  14. (b)  15. (d)  16. (b)  17. (a)  18. (d)  19. (a)  20. (c)
21. (c)  22. (b)  23. (a)  24. (b)  25. (d)  26. (c)  27. (c)  28. (c)  29. (a)  30. (d)
31. (b)  32. (a)  33. (b)  34. (d)  35. (a)  36. (b)  37. (e)  38. (d)  39. (a)  40. (b)
41. (a)  42. (d)  43. (d)  44. (b)  45. (c)  46. (a)  47. (b)  48. (a)  49. (a)  50. (b)
51. (c)  52. (a)  53. (d)  54. (a)  55. (d)  56. (b)  57. (a)  58. (b)  59. (d)  60. (d)
61. (a)  62. (a)  63. (c)  64. (d)  65. (a)  66. (a)  67. (d)  68. (d)  69. (a)  70. (a)
71. (b)  72. (a)  73. (a)  74. (d)  75. (a)  76. (c)  77. (a)  78. (b)  79. (c)  80. (a)
81. (d)  82. (a)  83. (d)  84. (b)  85. (a)  86. (a)  87. (a)  88. (a)  89. (a)  90. (d)
91. (b)  92. (a)  93. (c)  94. (a)  95. (b)  96. (a)  97. (a)  98. (d)  99. (d)  100. (b)
