Question

3.1. Which of the following are some common issues we experience while training deep neural networks?

- As the data gets big and complex, convergence might take a lot of time as the training gets slow.
- As the network gets deeper, i.e., by adding more layers, the magnitude of the gradients might become too small or too large, which might lead to vanishing or exploding gradients.
- Since deep learning models are complex by nature, we generally observe an under-fitting issue.
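The vanishing/exploding-gradient option can be illustrated with a small numerical sketch (not part of the original question): during backpropagation the gradient is multiplied by one layer Jacobian per layer, so its magnitude shrinks or grows roughly geometrically with depth. The function name `gradient_norm_through_depth` and the `weight_scale` values below are purely illustrative assumptions, using random linear layers as a stand-in for a real network.

```python
# Minimal sketch: repeated multiplication by layer weight matrices,
# mimicking backprop through a deep stack of linear layers.
import numpy as np

rng = np.random.default_rng(0)

def gradient_norm_through_depth(depth, weight_scale, width=64):
    """Propagate a gradient backward through `depth` random linear
    layers and return its resulting norm."""
    grad = rng.normal(size=width)
    for _ in range(depth):
        # Weights scaled so that, on average, each backward step
        # multiplies the gradient norm by roughly `weight_scale`.
        w = rng.normal(scale=weight_scale / np.sqrt(width), size=(width, width))
        grad = w.T @ grad  # backprop step through one linear layer
    return np.linalg.norm(grad)

for scale in (0.5, 1.0, 2.0):  # hypothetical initialization scales
    print(f"scale={scale}: depth 5 -> {gradient_norm_through_depth(5, scale):.2e}, "
          f"depth 50 -> {gradient_norm_through_depth(50, scale):.2e}")
```

With a scale below 1 the gradient norm collapses toward zero as depth grows (vanishing gradients), while a scale above 1 makes it blow up (exploding gradients); only a carefully balanced scale keeps it stable, which is why depth by itself can make training harder.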
