LEARNING
OPERANT CONDITIONING
Question
fixed interval
fixed ratio
variable ratio
variable interval
Detailed explanation-1: Ratio schedules deliver reinforcement after a certain number of responses have been emitted. A fixed-ratio schedule uses a constant number of responses. For example, if the rabbit is reinforced every time it pulls the lever exactly five times, it is being reinforced on an FR 5 schedule.
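As a rough illustration (not part of the original explanation), the Python sketch below simulates a fixed-ratio schedule; the function name fixed_ratio_schedule and the FR 5 setting are purely illustrative.

def fixed_ratio_schedule(ratio=5):
    """Yield True (deliver reinforcement) on every ratio-th response, else False."""
    count = 0
    while True:
        count += 1
        yield count % ratio == 0

# The rabbit on an FR 5 schedule: only the 5th and 10th lever pulls pay off.
schedule = fixed_ratio_schedule(ratio=5)
for pull in range(1, 11):
    print(f"lever pull {pull}: {'reinforced' if next(schedule) else 'nothing'}")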
Detailed explanation-2: In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule produces a steady, high rate of responding. Gambling and lottery games are good examples of rewards delivered on a variable-ratio schedule.
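A minimal sketch of the same idea for a variable-ratio schedule, assuming an average requirement of five responses drawn at random around that mean (the name variable_ratio_schedule and the VR 5 figure are assumptions for illustration, not from the source):

import random

def variable_ratio_schedule(mean_ratio=5, seed=None):
    """Reinforce after an unpredictable number of responses averaging mean_ratio."""
    rng = random.Random(seed)
    requirement = rng.randint(1, 2 * mean_ratio - 1)  # drawn so the mean is mean_ratio
    count = 0
    while True:
        count += 1
        if count >= requirement:
            count = 0
            requirement = rng.randint(1, 2 * mean_ratio - 1)
            yield True   # unpredictable payoff, like a slot-machine hit
        else:
            yield False

schedule = variable_ratio_schedule(mean_ratio=5, seed=42)
hits = [i for i, win in zip(range(1, 31), schedule) if win]
print("reinforced on responses:", hits)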
Detailed explanation-3: Continuous reinforcement means the subject receives a reward every time the behavior is exhibited. Example: every time a child remembers to raise their hand in class, the teacher gives them a sticker. Partial reinforcement means the subject is rewarded for the behavior only some of the time.
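Continuous and partial reinforcement can be seen as two ends of the same rule; a small sketch (function names and the every-3rd-response rule are illustrative only):

def continuous_reinforcement():
    """Continuous reinforcement: every single response is rewarded."""
    while True:
        yield True  # a sticker for every hand-raise

def partial_reinforcement(every_nth=3):
    """Partial reinforcement: only some responses are rewarded (here, every 3rd)."""
    count = 0
    while True:
        count += 1
        yield count % every_nth == 0

stickers = continuous_reinforcement()
print([next(stickers) for _ in range(4)])   # every response earns a sticker
sometimes = partial_reinforcement()
print([next(sometimes) for _ in range(4)])  # only the 3rd response is rewarded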
Detailed explanation-4: Variable interval: in a variable-interval (VI) schedule, the first response is reinforced after an unpredictable amount of time, averaging a set interval, has passed. Example: you praise Jane ("good job") the first time she says "please" after roughly every 55, 60, or 65 minutes.
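Finally, a sketch of a variable-interval schedule along the lines of the Jane example, assuming the interval is drawn uniformly between 55 and 65 minutes (the function variable_interval_schedule and its parameters are hypothetical, not from the source):

import random

def variable_interval_schedule(mean_interval=60.0, spread=5.0, seed=None):
    """Reinforce the first response after an unpredictable wait averaging
    mean_interval minutes (here, roughly 55-65 minutes)."""
    rng = random.Random(seed)
    next_available = rng.uniform(mean_interval - spread, mean_interval + spread)

    def respond(current_time):
        nonlocal next_available
        if current_time >= next_available:
            # Start a fresh, unpredictable interval and deliver the praise.
            next_available = current_time + rng.uniform(
                mean_interval - spread, mean_interval + spread)
            return True
        return False  # interval not over yet: saying "please" earns nothing

    return respond

says_please = variable_interval_schedule(seed=1)
for minute in (30, 58, 59, 120, 125):
    print(f"minute {minute}: {'praise Jane' if says_please(minute) else 'no praise'}")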