Optimizers in Deep Learning

Types of Optimizers

1. Gradient Descent
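
Conceptually, batch gradient descent computes the gradient of the loss over the entire training set and takes one step per pass. A minimal sketch, assuming a toy least-squares loss; the names `X`, `y`, `w`, and `lr` are illustrative, not from the original post:

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, epochs=100):
    """Batch gradient descent: one update per full pass over the data."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient over ALL samples
        w -= lr * grad                         # single step per epoch
    return w
```

Each pass touches the whole dataset, which is what makes the stochastic variants below attractive at scale.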

2. Stochastic Gradient Descent
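
Stochastic gradient descent instead updates the weights after every single example: noisier steps, but far cheaper per step. A sketch under the same toy least-squares assumptions:

```python
import numpy as np

def sgd(X, y, lr=0.01, epochs=10):
    """SGD: one noisy update per training example."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):  # shuffle each epoch
            grad = 2 * (X[i] @ w - y[i]) * X[i]  # gradient of one sample
            w -= lr * grad
    return w
```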

3. Mini-Batch Gradient Descent
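
Mini-batch gradient descent is the middle ground: each step averages the gradient over a small batch, trading some noise for throughput. A sketch; the batch size of 32 is a common default chosen here for illustration:

```python
import numpy as np

def minibatch_gd(X, y, lr=0.05, epochs=20, batch_size=32):
    """Mini-batch gradient descent: one update per small random batch."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = np.random.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w
```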

4. SGD With Momentum

For the complete math behind SGD with momentum, refer to:
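
The core idea, though, fits in a few lines: a velocity term accumulates an exponentially decaying average of past gradients, smoothing the noisy SGD steps. A sketch under the same toy setup; `beta = 0.9` is a common default, assumed here rather than taken from the post:

```python
import numpy as np

def sgd_momentum(X, y, lr=0.01, beta=0.9, epochs=10):
    """SGD with momentum: velocity smooths successive gradient directions."""
    w = np.zeros(X.shape[1])
    v = np.zeros_like(w)
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):
            grad = 2 * (X[i] @ w - y[i]) * X[i]
            v = beta * v + grad  # accumulate velocity (PyTorch-style momentum)
            w -= lr * v          # step along the smoothed direction
    return w
```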

5. Adaptive Gradient Optimization (AdaGrad)
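
AdaGrad gives each parameter its own effective learning rate by dividing the step by the root of the accumulated squared gradients, so frequently updated parameters slow down. A sketch:

```python
import numpy as np

def adagrad(X, y, lr=0.1, eps=1e-8, epochs=10):
    """AdaGrad: per-parameter steps shrink as squared gradients accumulate."""
    w = np.zeros(X.shape[1])
    G = np.zeros_like(w)  # running SUM of squared gradients
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):
            grad = 2 * (X[i] @ w - y[i]) * X[i]
            G += grad ** 2
            w -= lr * grad / (np.sqrt(G) + eps)  # element-wise adaptive step
    return w
```

Because `G` only grows, the effective learning rate eventually vanishes; RMSprop and Adadelta were proposed to fix exactly that.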

6. RMSprop / Adadelta
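
RMSprop replaces AdaGrad's ever-growing sum with an exponentially decaying average, so old gradients fade and the learning rate no longer collapses; Adadelta goes a step further and derives the step size from a running average of past updates instead of a global `lr`. A sketch of the RMSprop side only, with an assumed default `beta = 0.9`:

```python
import numpy as np

def rmsprop(X, y, lr=0.001, beta=0.9, eps=1e-8, epochs=10):
    """RMSprop: decaying average of squared gradients instead of AdaGrad's sum."""
    w = np.zeros(X.shape[1])
    s = np.zeros_like(w)
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):
            grad = 2 * (X[i] @ w - y[i]) * X[i]
            s = beta * s + (1 - beta) * grad ** 2  # old gradients decay away
            w -= lr * grad / (np.sqrt(s) + eps)
    return w
```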

7. Adam Optimizer
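
Adam combines both ideas: a momentum-style first moment and an RMSprop-style second moment, each bias-corrected for the early steps. A sketch with the defaults from the Adam paper (`beta1 = 0.9`, `beta2 = 0.999`):

```python
import numpy as np

def adam(X, y, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, epochs=10):
    """Adam: bias-corrected momentum plus RMSprop-style per-parameter scaling."""
    w = np.zeros(X.shape[1])
    m, v, t = np.zeros_like(w), np.zeros_like(w), 0
    for _ in range(epochs):
        for i in np.random.permutation(len(y)):
            t += 1
            grad = 2 * (X[i] @ w - y[i]) * X[i]
            m = beta1 * m + (1 - beta1) * grad       # first moment (mean)
            v = beta2 * v + (1 - beta2) * grad ** 2  # second moment (uncentered variance)
            m_hat = m / (1 - beta1 ** t)             # bias correction for zero init
            v_hat = v / (1 - beta2 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w
```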
