1.

Which of the following statements about Stochastic Gradient Descent (SGD) are true?

A. SGD requires several hyperparameters, such as the regularization parameter and the number of iterations
B. It is sensitive to feature scaling
C. SGD is very efficient
D. All of the above
Answer» D. All of the above
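The three statements can be seen in a minimal sketch of SGD for 1-D linear regression, written here in plain Python under stated assumptions (the function name `sgd_fit` and the chosen hyperparameter values are illustrative, not from any particular library): the loop updates on one sample at a time (efficiency, statement C), the caller must pick a learning rate and epoch count (hyperparameters, statement A), and because each step multiplies the error by the raw feature value, the step size effectively depends on feature scale (statement B).

```python
import random

def sgd_fit(xs, ys, lr=0.05, epochs=500, seed=0):
    """Minimal SGD sketch for fitting y = w*x + b by squared error."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)           # visit samples in random order
        for i in idx:              # one sample per update: cheap per step
            err = (w * xs[i] + b) - ys[i]
            w -= lr * err * xs[i]  # gradient of squared error w.r.t. w;
                                   # note the factor xs[i]: large-scale
                                   # features make large, unstable steps
            b -= lr * err          # gradient of squared error w.r.t. b
    return w, b

# Data generated from y = 2x + 1 with no noise, so SGD can recover
# the parameters essentially exactly.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = sgd_fit(xs, ys)
```

If `lr` were raised well above the stability threshold for these feature values, the updates would diverge, which is one reason libraries such as scikit-learn recommend standardizing features before using their SGD estimators.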

