COMPUTER FUNDAMENTALS

EMERGING TRENDS IN COMPUTING

ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING

Question
The exploding gradient problem is an issue in training deep networks where the gradients grow so large that the loss blows up to an extremely high value. What is the probable approach when dealing with the “Exploding Gradient” problem in RNNs?
A
Use modified architectures like LSTM and GRUs
B
Gradient clipping
C
Dropout
D
None of these
Explanation: (B) Gradient clipping is the standard remedy for exploding gradients in RNNs.

Detailed explanation-1: -Another popular technique to mitigate the exploding gradients problem is to clip the gradients during backpropagation so that they never exceed some threshold. This is called Gradient Clipping. For example, an optimizer configured this way clips every component of the gradient vector to a value between –1.0 and 1.0 before applying the weight update.
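As a rough illustration of per-component clipping, the sketch below configures a Keras optimizer with clipvalue=1.0; the small RNN model, learning rate, and loss are illustrative placeholders and are not part of the quiz material.

```python
# Minimal sketch of per-component gradient clipping, assuming TensorFlow/Keras.
# clipvalue=1.0 clips every component of each gradient to the range [-1.0, 1.0]
# before the optimizer applies the weight update.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(32, input_shape=(None, 8)),  # toy recurrent layer
    tf.keras.layers.Dense(1),
])

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, clipvalue=1.0)
model.compile(optimizer=optimizer, loss="mse")
```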

Detailed explanation-2: -Gradient clipping is a technique for preventing exploding gradients in recurrent neural networks. Gradient clipping can be implemented in a variety of ways, but one of the most common is to rescale the gradients so that their norm is at most a certain value.
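The norm-based variant can be sketched with PyTorch's clip_grad_norm_ utility; the tiny RNN, dummy batch, and hyperparameters below are illustrative assumptions only.

```python
# Minimal sketch of norm-based gradient clipping inside one training step, assuming PyTorch.
import torch
import torch.nn as nn

model = nn.RNN(input_size=8, hidden_size=32, batch_first=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(4, 10, 8)    # dummy batch: (batch, time, features)
y = torch.randn(4, 10, 32)   # dummy targets matching the RNN output shape

output, _ = model(x)
loss = loss_fn(output, y)
loss.backward()

# Rescale all gradients so that their combined norm is at most 1.0, then update.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
optimizer.zero_grad()
```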

Detailed explanation-3: -Reducing the number of layers: this solution can be used in both scenarios (exploding and vanishing gradients). However, by reducing the number of layers in our network, we give up some of our model's complexity, since having more layers makes the network more capable of representing complex mappings.
