A Machine Learning Specialist built an image classification deep learning model. However, the Specialist ran into an overfitting problem: the training accuracy was 99% while the testing accuracy was only 75%.
How should the Specialist address this issue, and what is the reason behind it?
- A. The learning rate should be increased because the optimization process was trapped at a local minimum.
- B. The dropout rate at the flatten layer should be increased because the model is not generalized enough.
- C. The dimensionality of the dense layer next to the flatten layer should be increased because the model is not complex enough.
- D. The epoch number should be increased because the optimization process was terminated before it reached the global minimum.
Answer(s): B
Explanation:
Overfitting occurs when a model is too complex and memorizes the training data instead of learning the underlying pattern. As a result, the model performs well on the training data but poorly on new, unseen data.
Increasing the dropout rate, a regularization technique, can help combat overfitting by randomly dropping out some neurons during training, which prevents the model from relying too heavily on any single feature.
The other options would not help: the 99% training accuracy shows the optimizer is neither trapped at a poor local minimum nor stopped prematurely, so changing the learning rate or adding epochs addresses a problem that does not exist, and increasing the dense layer's dimensionality adds capacity, which tends to make overfitting worse, not better.
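To make the mechanism concrete, here is a minimal NumPy sketch of inverted dropout, the variant used by common deep learning frameworks. The 0.5 dropout rate and toy activations are illustrative assumptions, not values from the question.

```python
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Inverted dropout: during training, zero out a fraction `rate`
    of the units and scale the survivors by 1/(1 - rate) so the
    expected activation is unchanged; at inference, pass through."""
    if not training or rate == 0.0:
        return activations
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
x = np.ones((4, 8))                      # toy batch of flattened activations
y = dropout(x, rate=0.5, rng=rng)        # ~half the units become 0, the rest 2.0
z = dropout(x, rate=0.5, rng=rng, training=False)  # inference: unchanged
```

Because a different random subset of units is silenced on every training step, no single neuron can dominate a prediction, which is exactly the regularizing effect that improves generalization here.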