Keeping up with my tradition of posting my old handwritten notes, here are my notes on basic Machine Learning optimization algorithms. Optimization algorithms in ML minimize the cost function, thereby reducing the error between the predicted and actual values.
1) Most popular – Gradient Descent

Gradient Descent is used in linear regression, logistic regression, and early implementations of neural networks. It repeatedly nudges each parameter in the direction that most steeply decreases the cost function.
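The idea above can be sketched in a few lines. This is a minimal, illustrative example, fitting a line to toy data with the mean squared error cost; the data, learning rate, and iteration count are my own assumed values, not from the notes:

```python
# Toy data from the line y = 2x + 1 (assumed for illustration)
X = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0   # parameters, initialized at zero
lr = 0.05         # learning rate (hypothetical choice)
n = len(X)

for _ in range(2000):
    # Gradients of the cost J = (1/n) * sum((w*x + b - y)^2)
    dw = (2 / n) * sum((w * xi + b - yi) * xi for xi, yi in zip(X, y))
    db = (2 / n) * sum((w * xi + b - yi) for xi, yi in zip(X, y))
    w -= lr * dw  # step opposite the gradient
    b -= lr * db

print(f"w = {w:.2f}, b = {b:.2f}")  # converges toward w=2, b=1
```

Note that a single learning rate `lr` is shared by every parameter here, which is exactly the limitation Adam addresses below.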

2) Adam = ADAptive Moment estimation
The Adam optimization algorithm does not use a single global learning rate; instead, it adapts a separate effective learning rate for every single parameter, using running estimates of the first and second moments of the gradients.
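A minimal sketch of one Adam update, showing where the per-parameter learning rates come from. The hyperparameter values and the demo function `f(p) = sum(p**2)` are my own illustrative choices:

```python
import numpy as np

def adam_step(params, grads, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; hyperparameter defaults here are illustrative."""
    m = beta1 * m + (1 - beta1) * grads        # 1st moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grads**2     # 2nd moment: running mean of squared gradients
    m_hat = m / (1 - beta1**t)                 # bias correction for the zero initialization
    v_hat = v / (1 - beta2**t)
    # Each parameter gets its own effective step size: lr / sqrt(v_hat)
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# Demo: minimize f(p) = sum(p**2), whose gradient is 2*p
p = np.array([5.0, -3.0])
m = np.zeros_like(p)
v = np.zeros_like(p)
for t in range(1, 1001):
    grads = 2 * p
    p, m, v = adam_step(p, grads, m, v, t)

print(p)  # both entries driven close to 0
```

Because `v_hat` tracks the squared gradient per parameter, a parameter with consistently large gradients takes smaller steps and one with small gradients takes larger steps, which is what "a different learning rate for every parameter" means in practice.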
I hope you liked my notes on basic machine learning optimization algorithms. Feel free to comment here or on my LinkedIn post.