LEARNING
OPERANT CONDITIONING
Question
fixed ratio
fixed interval
variable ratio
variable interval
Detailed explanation-1: Ratio schedules deliver reinforcement after a certain number of responses has been emitted. A fixed ratio schedule uses a constant number of responses. For example, if the rabbit is reinforced each time it has pulled the lever exactly five times, it is being reinforced on an FR 5 schedule.
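The FR 5 rule above can be sketched as a simple counter. This is a minimal illustration, not a standard API; the helper name is hypothetical:

```python
def fr5_reinforced(response_count):
    # Hypothetical helper: an FR 5 schedule reinforces
    # every 5th response, at a fixed, predictable count.
    return response_count % 5 == 0

# The rabbit's presses 5, 10, 15, ... earn reinforcement.
rewarded = [n for n in range(1, 16) if fr5_reinforced(n)]
print(rewarded)  # [5, 10, 15]
```

Because the required count never changes, the animal can learn exactly how much work each reward costs.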
Detailed explanation-2: Variable Interval: In a variable interval (VI) schedule, the first response after a variable amount of time, averaging around a set value, is reinforced. Example: You praise Jane ("good job") the first time she says "please" after an interval that varies around an hour, e.g. 55, 60, or 65 minutes.
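The Jane example can be sketched by drawing each waiting interval at random around the hour mark. This is a rough sketch under assumed parameters (a 60-minute mean with ±5 minutes of variation); the function name is made up for illustration:

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

def next_vi_interval(mean_minutes=60, jitter=5):
    # Hypothetical VI rule: each interval varies unpredictably
    # around the mean (here, 55-65 minutes).
    return random.uniform(mean_minutes - jitter, mean_minutes + jitter)

# Praise becomes available again only after each variable interval
# elapses; the first "please" after that point is reinforced.
intervals = [next_vi_interval() for _ in range(3)]
print([round(i) for i in intervals])
```

The response itself is still required; the passage of time only makes the next reinforcement available.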
Detailed explanation-3: In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule produces a steady, high rate of responding. Gambling and lottery games are good examples of rewards delivered on a variable-ratio schedule.
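The slot-machine flavor of a variable-ratio schedule can be sketched by approximating it as a random-ratio rule, where each response pays off with a fixed probability so the number of responses per reward is unpredictable but averages out. The probability value and function name below are illustrative assumptions:

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def vr_reinforced(p=0.25):
    # Random-ratio approximation of roughly VR 4: each response
    # pays off unpredictably, about 1 in 4 times on average.
    return random.random() < p

# Count how many presses each payoff takes, over several payoffs.
runs = []
for _ in range(5):
    presses = 1
    while not vr_reinforced():
        presses += 1
    runs.append(presses)
print(runs)  # unpredictable counts per payoff, averaging near 4
```

Because the next response might always be the winning one, responding stays steady and rapid, just as with gambling.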
Detailed explanation-4: Your Employer Checking Your Work: Does your boss drop by your office a few times throughout the day to check your progress? This is an example of a variable-interval schedule: the check-ins occur at unpredictable times, so you never know when one might happen.