Last updated: Feb 7, 2022

Neural Networks

Neural networks are the foundation of deep learning. A network is built by combining many nodes (neurons) arranged in a meaningful, layered structure to make accurate predictions. They are also known as Artificial Neural Networks. Learn about their different types and applications, and get hands-on experience.
Artificial Neural Networks
This blog provides a high-level view of ANNs, focusing on their architecture and their real-world applications.
Graph Neural Networks MEDIUM
In this article on Graph Neural Networks (GNNs), we will cover the fundamentals of GNNs along with practical examples, including code and output.
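For a taste of the core idea, here is a minimal NumPy sketch of one message-passing step, where each node's new feature is the mean of its neighbours' features passed through a linear map (the graph, features, and weights below are illustrative, not taken from the article):

```python
import numpy as np

# Adjacency matrix of a toy 3-node graph (node 0 connects to 1 and 2).
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
X = np.eye(3)                         # one-hot node features
W = np.random.randn(3, 2)             # a (hypothetical) learnable weight matrix

deg = A.sum(axis=1, keepdims=True)    # node degrees, for mean aggregation
H = np.maximum(0, (A / deg) @ X @ W)  # aggregate neighbours, transform, ReLU
print(H)                              # updated node features, shape (3, 2)
```
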
Multilayer Perceptron
The objective of this blog is to understand what multilayer perceptrons are.
Loss Functions in Neural Networks
In this article, we will learn about loss functions in neural networks: why they matter and the common types.
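For a flavour of what the article covers, here is a minimal NumPy sketch of two common loss functions, mean squared error and binary cross-entropy (the test values are illustrative):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average of squared differences.
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7])
print(mse(y_true, y_pred))                   # small value -> predictions fit well
print(binary_cross_entropy(y_true, y_pred))
```
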
What is Neural Network EASY
In this article, we will explore what a neural network is, how neural networks work, and the different types of neural networks. We will also see a simple implementation of a neural network in Python.
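As a preview, here is a minimal sketch of a single forward pass through a tiny two-layer network in NumPy (the layer sizes and random weights are illustrative):

```python
import numpy as np

# A minimal forward pass through a tiny two-layer network.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)                         # a 3-feature input
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)  # hidden layer parameters
W2, b2 = rng.standard_normal((1, 4)), np.zeros(1)  # output layer parameters

h = np.maximum(0, W1 @ x + b1)                     # hidden layer, ReLU activation
y = W2 @ h + b2                                    # linear output layer
print(y)                                           # the network's raw prediction
```
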
Quantization and Pruning
This blog post will explore two crucial techniques, quantization and pruning, that enable the development of efficient deep neural networks while maintaining accuracy.
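To make the two ideas concrete, here is a minimal NumPy sketch of unstructured magnitude pruning and simulated uniform quantization (the function names and parameter values are illustrative, not the article's API):

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    # Zero out the smallest-magnitude entries until `sparsity`
    # fraction of the weights are zero (unstructured pruning).
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def quantize_uniform(weights, num_bits=8):
    # Map float weights onto 2**num_bits evenly spaced levels and
    # back, simulating the precision loss of low-bit storage.
    qmin, qmax = weights.min(), weights.max()
    scale = (qmax - qmin) / (2 ** num_bits - 1)
    return np.round((weights - qmin) / scale) * scale + qmin

w = np.random.randn(4, 4)
print(magnitude_prune(w, sparsity=0.75))  # ~75% of entries become zero
print(quantize_uniform(w, num_bits=4))    # coarsely quantized copy
```
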
Introduction to Hopfield Neural Network
This blog gives an introduction to the Hopfield neural network. We will also discuss its architecture, energy function, and training model.
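As a preview of the energy function, here is a minimal sketch computing the Hopfield energy E(s) = -1/2 · sᵀWs for a binary state s ∈ {-1, +1}ⁿ (the weights below are illustrative, symmetric with zero diagonal as the model requires):

```python
import numpy as np

# Symmetric weight matrix with zero diagonal, as Hopfield networks require.
W = np.array([[ 0.,  1., -1.],
              [ 1.,  0.,  1.],
              [-1.,  1.,  0.]])
s = np.array([1., -1., 1.])   # a candidate binary state

E = -0.5 * s @ W @ s          # energy; the update dynamics only decrease it
print(E)
```
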

Optimization of Neural Network

Learn about the algorithms, architectures, and techniques used to optimize neural networks for better prediction and analysis, and to improve the accuracy of any model.
Gradient Descent with Momentum
This article will focus on the limitations of gradient descent and how momentum overcomes those shortcomings. Finally, we will look at the mathematics behind it.
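As a small preview of that maths, here is a sketch of the momentum update v ← βv + ∇f(x), x ← x − ηv on the toy objective f(x) = x² (the hyperparameters are illustrative):

```python
# A sketch of gradient descent with momentum on f(x) = x^2.
def grad(x):
    return 2 * x   # derivative of x^2

x, velocity = 5.0, 0.0
lr, beta = 0.1, 0.9
for _ in range(200):
    velocity = beta * velocity + grad(x)  # running average of gradients
    x -= lr * velocity                    # step along the smoothed direction
print(x)  # very close to the minimum at 0
```
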
Nesterov Accelerated Gradient EASY
Nesterov Accelerated Gradient (NAG) is an optimization technique that improves momentum-based gradient descent by evaluating the gradient at a look-ahead point, giving faster convergence and reduced oscillations.
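A sketch of the difference from plain momentum: the gradient is evaluated at a look-ahead point rather than at the current parameters (same toy objective and illustrative hyperparameters as above):

```python
# A sketch of Nesterov Accelerated Gradient on f(x) = x^2.
def grad(x):
    return 2 * x

x, velocity = 5.0, 0.0
lr, beta = 0.1, 0.9
for _ in range(200):
    lookahead = x - lr * beta * velocity      # peek where momentum is heading
    velocity = beta * velocity + grad(lookahead)
    x -= lr * velocity
print(x)  # very close to the minimum at 0
```
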
What is AdaGrad?
This blog is about AdaGrad: why it is used and the concept behind it.
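The concept in one sketch: each parameter's step is divided by the square root of the running sum of its squared gradients, so dimensions with large accumulated gradients take smaller steps (toy objective and illustrative hyperparameters):

```python
import numpy as np

# A sketch of the AdaGrad update on f(x) = x^2.
def grad(x):
    return 2 * x

x, accum = 5.0, 0.0
lr, eps = 1.0, 1e-8
for _ in range(200):
    g = grad(x)
    accum += g ** 2                        # lifetime sum of squared gradients
    x -= lr * g / (np.sqrt(accum) + eps)   # per-parameter adaptive step
print(x)  # close to the minimum at 0
```
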
AdaDelta
In this blog, we will review the gradient descent algorithm and its drawbacks, and see how AdaDelta and other optimizers address them.
RMSProp
This article focuses on the RMSProp optimizer: the motivation for using it and the mathematics behind how it works. Finally, we will look at a Python implementation.
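A sketch of the core update: unlike AdaGrad's lifetime sum, RMSProp keeps an exponential moving average of squared gradients, so the effective step size does not shrink forever (toy objective, illustrative hyperparameters):

```python
import numpy as np

# A sketch of the RMSProp update on f(x) = x^2.
def grad(x):
    return 2 * x

x, sq_avg = 5.0, 0.0
lr, rho, eps = 0.02, 0.9, 1e-8
for _ in range(500):
    g = grad(x)
    sq_avg = rho * sq_avg + (1 - rho) * g ** 2  # decayed average of g^2
    x -= lr * g / (np.sqrt(sq_avg) + eps)
print(x)  # near the minimum at 0 (residual oscillation on the order of lr)
```
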
Adaptive Moment Estimation (ADAM)
In this article, we will learn about the Adam (Adaptive Moment Estimation) optimizer, including its derivation and implementation.
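As a preview of the update the derivation arrives at, here is a sketch of Adam with the commonly cited default decay rates, on a toy objective (the objective, learning rate, and step count are illustrative):

```python
import numpy as np

# A sketch of the Adam update on f(x) = x^2.
def grad(x):
    return 2 * x

x, m, v = 5.0, 0.0, 0.0
lr, beta1, beta2, eps = 0.05, 0.9, 0.999, 1e-8
for t in range(1, 501):
    g = grad(x)
    m = beta1 * m + (1 - beta1) * g         # first moment: mean of gradients
    v = beta2 * v + (1 - beta2) * g ** 2    # second moment: mean of g^2
    m_hat = m / (1 - beta1 ** t)            # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    x -= lr * m_hat / (np.sqrt(v_hat) + eps)
print(x)  # near the minimum at 0
```
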
Limited-Memory Broyden-Fletcher-Goldfarb-Shanno Algorithm
In this article, we will discuss the L-BFGS (Limited-Memory Broyden-Fletcher-Goldfarb-Shanno) algorithm and the mathematics behind it.