Basic Machine Learning Optimization Algorithms

Keeping up with my tradition of posting my old handwritten notes, here are my notes on basic Machine Learning optimization algorithms. Optimization algorithms in ML minimize the cost function, thereby reducing the error between the predicted and actual values.

1) Most popular – Gradient Descent

[Image: gradient descent equation, from the handwritten notes]
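For reference, the standard gradient descent update shown in such notes is θ_j := θ_j − α · ∂J(θ)/∂θ_j for each parameter θ_j, where α is the learning rate and J(θ) is the cost function; all parameters are updated simultaneously and the step repeats until convergence.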

Gradient Descent is used in linear regression, logistic regression and early implementations of neural networks.

[Image: handwritten notes on the Gradient Descent ML algorithm]
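To make the update rule concrete, here is a minimal Python sketch of batch gradient descent for linear regression. The function name and the half mean-squared-error cost are my own illustrative choices, not something taken from the notes:

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Minimal batch gradient descent for linear regression (illustrative sketch)."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)                 # weights, one per feature
    b = 0.0                                  # bias term
    for _ in range(epochs):
        y_pred = X @ w + b                   # current predictions
        error = y_pred - y                   # predicted minus actual values
        grad_w = (X.T @ error) / n_samples   # dJ/dw for J = (1/2n) * sum(error^2)
        grad_b = error.mean()                # dJ/db
        w -= lr * grad_w                     # step against the gradient
        b -= lr * grad_b
    return w, b
```

Every parameter here shares the same global learning rate lr, which is exactly what Adam changes below.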

2) ADAM = ADAptive Moment estimation

The Adam optimization algorithm does not use a single global learning rate, but maintains a different learning rate for every single parameter, adapting each one using running estimates of the first and second moments of that parameter's gradients.

[Image: handwritten notes on the Adam Optimization ML algorithm]
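As a rough sketch of how Adam arrives at a per-parameter step size, here is an illustrative single update in Python. The function and state names are hypothetical, and the default hyperparameters follow the commonly used values (beta1 = 0.9, beta2 = 0.999, eps = 1e-8):

```python
import numpy as np

def adam_update(param, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a parameter array (illustrative sketch).

    'state' carries the running moment estimates m, v and the timestep t.
    """
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad         # 1st moment: running mean of gradients
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2    # 2nd moment: running mean of squared gradients
    m_hat = state["m"] / (1 - beta1 ** state["t"])               # bias-corrected 1st moment
    v_hat = state["v"] / (1 - beta2 ** state["t"])               # bias-corrected 2nd moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)           # effective step size differs per parameter

# Usage: initialize state once per parameter, then call every iteration.
# state = {"m": np.zeros_like(w), "v": np.zeros_like(w), "t": 0}
# w = adam_update(w, grad_w, state)
```

Because the division by sqrt(v_hat) happens elementwise, parameters with consistently large gradients take smaller steps and vice versa, which is what gives each parameter its own effective learning rate.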

I hope you liked my notes on basic machine learning optimization algorithms. Feel free to comment here or on my LinkedIn post.

