Code360 powered by Coding Ninjas
Last updated: Feb 7, 2022

Optimization of Neural Network

Learn about different algorithms, architectures, and techniques for optimising neural networks, improving both the accuracy of a model and the quality of its predictions and analysis.
Gradient Descent with Momentum
This article focuses on the limitations of gradient descent and how momentum overcomes them. Finally, we will walk through the mathematics.
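As a taste of what the article covers, here is a minimal sketch of the momentum update on the toy objective f(w) = w², whose gradient is 2w. The function name and hyperparameters are illustrative, not from the article.

```python
def momentum_descent(w0, lr=0.1, beta=0.9, steps=200):
    """Gradient descent with momentum on f(w) = w**2 (gradient 2*w)."""
    w, v = w0, 0.0
    for _ in range(steps):
        grad = 2 * w          # gradient of f(w) = w**2
        v = beta * v + grad   # velocity: decaying sum of past gradients
        w = w - lr * v        # step along the accumulated velocity
    return w
```

The velocity term damps oscillation across steep directions and accelerates progress along directions where gradients consistently agree.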
Nesterov Accelerated Gradient
This article covers Nesterov accelerated gradient, focusing on why it is needed and how it works.
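The key idea can be sketched in a few lines: instead of evaluating the gradient at the current point, NAG evaluates it at a "look-ahead" point where momentum is about to carry the parameters. This toy example on f(w) = w² uses illustrative names and hyperparameters.

```python
def nag_descent(w0, lr=0.1, beta=0.9, steps=200):
    """Nesterov accelerated gradient on f(w) = w**2 (gradient 2*w)."""
    w, v = w0, 0.0
    for _ in range(steps):
        lookahead = w - lr * beta * v   # peek where momentum will take us
        grad = 2 * lookahead            # gradient evaluated at the look-ahead point
        v = beta * v + grad
        w = w - lr * v
    return w
```

Evaluating the gradient at the look-ahead point lets the update correct course before overshooting, which is the main advantage over plain momentum.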
What is AdaGrad?
This blog is about AdaGrad: why it is used and the concept behind it.
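In brief, AdaGrad divides each update by the root of the accumulated squared gradients, so frequently updated parameters take smaller steps over time. A minimal sketch on f(w) = w², with illustrative names and hyperparameters:

```python
import math

def adagrad_descent(w0, lr=0.5, eps=1e-8, steps=500):
    """AdaGrad on f(w) = w**2 (gradient 2*w)."""
    w, g2 = w0, 0.0
    for _ in range(steps):
        grad = 2 * w
        g2 += grad ** 2                          # running sum of squared gradients
        w -= lr * grad / (math.sqrt(g2) + eps)   # per-parameter shrinking step
    return w
```

Because the accumulator g2 only grows, the effective learning rate keeps shrinking — the drawback that motivates RMSProp, covered below.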
In this blog, we will review the gradient descent algorithm, its drawbacks, and the various optimizers that address those drawbacks.
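For reference, vanilla gradient descent — the baseline the later optimizers improve on — is just a fixed-size step against the gradient. A minimal sketch on f(w) = w² (names and learning rate are illustrative):

```python
def gradient_descent(w0, lr=0.1, steps=100):
    """Plain gradient descent on f(w) = w**2 (gradient 2*w)."""
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w   # step opposite the gradient of f(w) = w**2
    return w
```

Its main drawbacks — a single global learning rate and slow progress through flat or ill-conditioned regions — are what momentum and the adaptive methods below are designed to fix.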
RMSProp
This article focuses on the RMSProp optimizer: the reason for using it and the mathematics behind it. Finally, we will look at a Python implementation.
Adaptive Moment Estimation (ADAM)
In this article, we will learn about the ADAM (adaptive moment estimation) optimizer, its derivation, and its implementation.
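Adam combines momentum's first-moment estimate with RMSProp's second-moment estimate, plus bias correction for the early steps. A minimal sketch of the update rule on f(w) = w² (names and hyperparameters are illustrative, not the article's code):

```python
import math

def adam_descent(w0, lr=0.01, b1=0.9, b2=0.999, eps=1e-8, steps=2000):
    """Adam on f(w) = w**2 (gradient 2*w)."""
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        grad = 2 * w
        m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * grad ** 2     # second-moment (uncentred variance) estimate
        m_hat = m / (1 - b1 ** t)             # bias correction: m, v start at 0
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w
```

Without the bias correction, the zero-initialised moment estimates would make the first updates far too small.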
Limited-Memory Broyden-Fletcher-Goldfarb-Shanno Algorithm
In this article, we will discuss the L-BFGS (Limited-Memory Broyden-Fletcher-Goldfarb-Shanno) algorithm and the mathematics behind it.
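The core of L-BFGS is the two-loop recursion, which approximates the inverse-Hessian-times-gradient product from only the last few (s, y) = (step, gradient-change) pairs. The sketch below runs it on the toy quadratic f(x, y) = x² + 10y², with a simple backtracking search in place of a full Wolfe line search; all names, the memory size m, and the fallback scaling are illustrative assumptions.

```python
def lbfgs_quadratic(steps=30, m=5):
    """Minimal L-BFGS sketch on f(x, y) = x**2 + 10*y**2 (minimum at the origin)."""
    def f(p):
        return p[0] ** 2 + 10 * p[1] ** 2

    def grad(p):
        return [2 * p[0], 20 * p[1]]

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def axpy(a, u, v):                     # a*u + v, elementwise
        return [a * ui + vi for ui, vi in zip(u, v)]

    x = [5.0, 1.0]
    g = grad(x)
    s_hist, y_hist = [], []                # recent curvature pairs (s_k, y_k)
    for _ in range(steps):
        # Two-loop recursion: r approximates H^{-1} g from the stored pairs.
        q, alphas = g[:], []
        for s, y in zip(reversed(s_hist), reversed(y_hist)):
            a = dot(s, q) / dot(y, s)
            alphas.append(a)
            q = axpy(-a, y, q)
        if y_hist:                         # initial Hessian guess gamma * I
            gamma = dot(s_hist[-1], y_hist[-1]) / dot(y_hist[-1], y_hist[-1])
        else:
            gamma = 0.05                   # cautious scaling with no curvature info yet
        r = [gamma * qi for qi in q]
        for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
            b = dot(y, r) / dot(y, s)
            r = axpy(a - b, s, r)
        d = [-ri for ri in r]              # quasi-Newton search direction

        # Backtracking line search: halve the step until f decreases.
        t, fx = 1.0, f(x)
        while f(axpy(t, d, x)) >= fx and t > 1e-10:
            t *= 0.5
        x_new = axpy(t, d, x)
        g_new = grad(x_new)
        s_k = axpy(-1.0, x, x_new)         # s_k = x_new - x
        y_k = axpy(-1.0, g, g_new)         # y_k = grad_new - grad
        if dot(s_k, y_k) > 1e-12:          # keep only positive-curvature pairs
            s_hist.append(s_k)
            y_hist.append(y_k)
            if len(s_hist) > m:            # limited memory: drop the oldest pair
                s_hist.pop(0)
                y_hist.pop(0)
        x, g = x_new, g_new
    return x
```

Storing only m pairs instead of a full n-by-n inverse Hessian is what makes L-BFGS practical in high dimensions.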