CHILD GROWTH AND DEVELOPMENT
PRINCIPLES OF LEARNING
Question: Gambling and lottery games reward players on which schedule of reinforcement?
Fixed-ratio schedule
Variable-ratio schedule
Fixed-interval schedule
Variable-interval schedule

Correct answer: Variable-ratio schedule
Detailed explanation-1: -Variable-ratio schedules occur when a response is reinforced after an unpredictable number of responses. This schedule creates a high, steady rate of responding. Gambling and lottery games are good examples of a reward based on a variable-ratio schedule.
Detailed explanation-2: -In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of response. Gambling and lottery games are good examples of a reward based on a variable-ratio schedule.
Detailed explanation-3: -Gambling machines pay out on a variable reinforcement schedule, which is a type of partial reinforcement where only a proportion of responses are reinforced and there is no fixed pattern; this lack of predictability keeps people gambling.
Detailed explanation-4: -Gambling works on a variable-ratio schedule of reinforcement. When someone gambles, they are rewarded with a win after an unpredictable number of bets. For example, someone playing bingo might have a general idea that they will win about every 50 games that they play.
Detailed explanation-5: -A variable-ratio schedule means that the reinforcer is delivered after a number of correct responses that varies around a set average. A variable-ratio schedule is therefore similar to a fixed-ratio schedule, except that the number of responses needed to receive the reinforcement changes after each reinforcer is presented.
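To make the contrast concrete, the short Python sketch below simulates a fixed-ratio and a variable-ratio schedule. It is a minimal illustration, not taken from the source: the function names fixed_ratio and variable_ratio, the parameter mean_ratio, and the choice of a ratio of 5 are assumptions made for the example. The variable-ratio schedule is modeled by reinforcing each response with probability 1/mean_ratio, so reinforcement arrives after an unpredictable number of responses that averages out to mean_ratio, as the explanations above describe.

import random

def fixed_ratio(n_responses, ratio=5):
    # Fixed-ratio schedule: reinforce exactly every `ratio`-th response.
    return [(i + 1) % ratio == 0 for i in range(n_responses)]

def variable_ratio(n_responses, mean_ratio=5):
    # Variable-ratio schedule (illustrative assumption): each response is
    # reinforced with probability 1/mean_ratio, so the number of responses
    # between reinforcers is unpredictable but averages mean_ratio.
    return [random.random() < 1 / mean_ratio for _ in range(n_responses)]

if __name__ == "__main__":
    random.seed(0)
    fr = fixed_ratio(1000, ratio=5)
    vr = variable_ratio(1000, mean_ratio=5)
    print("fixed-ratio reinforcers:   ", sum(fr))  # exactly every 5th response
    print("variable-ratio reinforcers:", sum(vr))  # roughly 1 in 5, unpredictable timing

Running the sketch shows that both schedules reinforce roughly one response in five, but only the fixed-ratio payoffs arrive on a predictable count; the unpredictability of the variable-ratio pattern is what sustains the high, steady rate of responding described in the explanations.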