Gradient Descent with Momentum in Neural Network

Gradient Descent with momentum typically converges faster than the standard Gradient Descent algorithm. The basic idea of momentum is to compute an exponentially weighted average of the gradients over previous iterations, which smooths out oscillations and stabilizes convergence, and to use this averaged gradient to update the weight and bias parameters.
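Concretely, for a weight matrix W with gradient dW (and similarly for the bias b), the commonly used momentum update with learning rate α and momentum coefficient β is:

v_dW = β · v_dW + (1 − β) · dW
W = W − α · v_dW

This is the exponentially-weighted-average form of momentum; some libraries use the variant v_dW = β · v_dW + dW instead, which differs only by a rescaling of the learning rate.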

Let’s first understand what an exponentially weighted average is.

Exponentially Weighted Average

An exponentially weighted average, also known as an exponentially weighted moving average in statistics, smooths a sequence of values. Its general mathematical equation is:

v_t = β · v_(t−1) + (1 − β) · θ_t

where θ_t is the current value, v_t is the running average, and β ∈ [0, 1) is the smoothing factor. The equation gives more weight (β) to the accumulated previous average and a smaller weight (1 − β) to the current value, so each new value nudges the average rather than replacing it. Roughly speaking, with β = 0.9 the average covers approximately the last 1 / (1 − β) = 10 values.
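For example, starting from v_0 = 0 with β = 0.9 and incoming values θ = (10, 12):

v_1 = 0.9 · 0 + 0.1 · 10 = 1.0
v_2 = 0.9 · 1.0 + 0.1 · 12 = 2.1

Note that the early averages are biased toward zero; a bias-correction term v_t / (1 − β^t) is sometimes applied to compensate.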

Implementation:
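A minimal sketch in NumPy, implementing the equation above (the function name and parameter names are illustrative):

```python
import numpy as np

def exponentially_weighted_average(values, beta=0.9):
    """Return the running exponentially weighted average of a sequence.

    Implements v_t = beta * v_(t-1) + (1 - beta) * theta_t, starting from v_0 = 0.
    """
    v = 0.0
    averages = []
    for theta_t in values:
        v = beta * v + (1 - beta) * theta_t
        averages.append(v)
    return np.array(averages)

# Example: smooth a noisy signal.
np.random.seed(0)
noisy = np.sin(np.linspace(0, 3, 100)) + 0.3 * np.random.randn(100)
smoothed = exponentially_weighted_average(noisy, beta=0.9)
print(noisy[:3], smoothed[:3])
```

And a short sketch of how the same averaging drives the momentum update, using a toy loss f(w) = w² so the loop stays self-contained (the learning rate and momentum values are illustrative):

```python
# Gradient descent with momentum on a toy loss f(w) = w**2 (gradient 2*w).
alpha = 0.1   # learning rate (illustrative value)
beta = 0.9    # momentum coefficient (illustrative value)
w, v_dw = 5.0, 0.0
for step in range(100):
    dw = 2 * w                              # current gradient
    v_dw = beta * v_dw + (1 - beta) * dw    # exponentially weighted average of gradients
    w = w - alpha * v_dw                    # update with the averaged gradient
print(w)  # converges toward the minimum at w = 0
```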
