CHILD GROWTH AND DEVELOPMENT
LEARNING THEORIES
Question: Gambling at a slot machine is an example of which schedule of reinforcement?
fixed ratio
variable ratio
variable interval
fixed interval

Correct answer: variable ratio
Detailed explanation-1: Variable ratio schedule. A variable ratio schedule occurs when a response is reinforced after an unpredictable number of responses. A slot machine is an example of this schedule because players do not know how many times they have to play before they win something.
Detailed explanation-2: Gambling at a slot machine is an example of a variable ratio reinforcement schedule. Gambling and lottery games reward unpredictably; each win requires a different number of lever pulls.
Detailed explanation-3: What is variable ratio reinforcement? Variable ratio reinforcement is one way to schedule reinforcements in order to increase the likelihood of a desired behavior. The reinforcement, like the jackpot on a slot machine, is delivered only after the behavior has been performed a varying, unpredictable number of times.
Detailed explanation-4: In a variable ratio reinforcement schedule, the number of responses needed for a reward varies. This is the most powerful partial reinforcement schedule. An example of the variable ratio reinforcement schedule is gambling.
Detailed explanation-5: In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of response. Gambling and lottery games are good examples of a reward based on a variable-ratio schedule.
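The mechanism described above can also be seen in a short simulation. The following Python sketch is not part of the quiz material; the function name variable_ratio_schedule and the VR-5 parameters are illustrative assumptions. It shows the defining property of a variable-ratio schedule: reinforcement arrives after a number of responses that is unpredictable on any given trial but averages out to a fixed ratio.

import random

def variable_ratio_schedule(mean_ratio: int, n_responses: int, seed: int = 0) -> int:
    """Simulate a VR-<mean_ratio> schedule.

    Reinforcement is delivered after an unpredictable number of responses
    whose average is mean_ratio. Returns the number of reinforcements
    earned over n_responses responses.
    """
    rng = random.Random(seed)
    # Unpredictable requirement for the first reinforcement (averages mean_ratio).
    next_requirement = rng.randint(1, 2 * mean_ratio - 1)
    responses_since_reward = 0
    reinforcements = 0
    for _ in range(n_responses):
        responses_since_reward += 1          # the learner responds (e.g., pulls the lever)
        if responses_since_reward >= next_requirement:
            reinforcements += 1              # unpredictable payoff, like a slot-machine win
            responses_since_reward = 0
            # Draw a new, unpredictable requirement for the next reinforcement.
            next_requirement = rng.randint(1, 2 * mean_ratio - 1)
    return reinforcements

if __name__ == "__main__":
    # On a VR-5 schedule roughly 1 in 5 responses is reinforced, but the player
    # never knows which one, which is why responding stays steady and high.
    print(variable_ratio_schedule(mean_ratio=5, n_responses=1000))

Because the next win can never be predicted, every response might be the reinforced one, which is what makes this schedule so resistant to extinction compared with fixed schedules.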