1. 📜Introduction📜
2. 📜Dense Layers📜
   2.1. 📜Matrix-Vector Multiplication📜
3. 📜Dense Layers With Keras📜
   3.1. 🎯Syntax🎯
4. 💡Keras Dense Layer Parameters💡
5. 💡Keras Dense Layer Methods💡
6. Basic Operations With Dense Layer
7. Implementation
   7.1. 🎯Sequential Model With Single Dense Layer🎯
      7.1.1. 🧑‍💻Code🧑‍💻
   7.2. 🎯Sequential Model With Multiple Dense Layers🎯
8. Frequently Asked Questions
   8.1. Why do we use dense layers?
   8.2. Are dense layers fully connected?
   8.3. What is the use of dense Layer in CNN?
   8.4. What does dense do?
   8.5. Is a dense layer a hidden layer?
9. Conclusion
Last Updated: Mar 27, 2024

Dense In Deep Learning

Mayank Goyal

📜Introduction📜

Layers are the basic building blocks of a deep learning model. A model can contain various kinds of layers, such as LSTM, convolutional, and dense layers, each important for its own features.

A dense layer is a fully connected layer, typically used in the final stages of a neural network to change the dimensionality of the output from the preceding layer. Dense layers help the model learn the relationships between the values in the data it is working on.

📜Dense Layers📜

A dense layer is deeply connected to its preceding layer: each neuron in the dense layer receives input from every neuron of the preceding layer. Dense layers are the most commonly used layers in artificial neural network models.

Because every neuron in a dense layer receives the outputs of all neurons in the preceding layer, the dense layer performs a matrix-vector multiplication in the background: the vector of outputs from the previous layer is multiplied by the weight matrix of the dense layer.

📜Matrix-Vector Multiplication📜

The general formula of matrix-vector multiplication can be represented as:

👉y = A · x

Here A is an (M × N) matrix and x is an (N × 1) column vector, so the product y is an (M × 1) vector. The values in A are parameters that are trained and updated with the help of backpropagation. Backpropagation is the algorithm we use for training feedforward neural networks: it computes the gradient of the loss function with respect to the network's weights for a single input and its associated output. From the formula above, we can see that the output of the dense layer is an M-dimensional vector, so the layer changes the dimensionality of the vector from N to M. So basically, a dense layer is used to change the size of the vector by using every neuron.
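This dimension change can be sketched in NumPy; the sizes N = 4 and M = 3 below are chosen purely for illustration:

```python
import numpy as np

# Illustrative sizes: N = 4 input features, M = 3 neurons in the dense layer.
N, M = 4, 3

A = np.ones((M, N))   # trainable weight matrix of shape (M, N)
x = np.arange(N)      # input vector with N components: [0, 1, 2, 3]

y = A @ x             # matrix-vector multiplication
print(y.shape)        # (3,) -- the output has M components, one per neuron
```

An N-component vector goes in, and an M-component vector (one value per neuron) comes out.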


📜Dense Layers With Keras📜

🎯Syntax🎯

```python
keras.layers.Dense(units, activation=None, use_bias=True,
                   kernel_initializer='glorot_uniform', bias_initializer='zeros',
                   kernel_regularizer=None, bias_regularizer=None,
                   activity_regularizer=None, kernel_constraint=None,
                   bias_constraint=None)
```

Now we will look at each of the hyperparameters quickly.

💡Keras Dense Layer Parameters💡

- units: a positive integer; the dimensionality of the output space, i.e., the number of neurons in the layer.
- activation: the activation function to use; if None, no activation is applied (linear activation).
- use_bias: a boolean; whether the layer uses a bias vector.
- kernel_initializer: initializer for the kernel weights matrix (default 'glorot_uniform').
- bias_initializer: initializer for the bias vector (default 'zeros').
- kernel_regularizer: regularizer function applied to the kernel weights matrix.
- bias_regularizer: regularizer function applied to the bias vector.
- activity_regularizer: regularizer function applied to the output of the layer.
- kernel_constraint: constraint function applied to the kernel weights matrix.
- bias_constraint: constraint function applied to the bias vector.
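As a quick illustration of these parameters in use (the unit count, input size, and regularization factor below are arbitrary choices for the example, not values from the article):

```python
import tensorflow as tf

# A Dense layer with 64 neurons, ReLU activation, explicit initializers,
# and an L2 penalty on the kernel weights.
layer = tf.keras.layers.Dense(
    units=64,
    activation='relu',
    use_bias=True,
    kernel_initializer='glorot_uniform',
    bias_initializer='zeros',
    kernel_regularizer=tf.keras.regularizers.l2(0.01),
)

# Building the layer on a known input shape creates its weights.
layer.build(input_shape=(None, 16))
print(layer.kernel.shape)  # (16, 64): one column of weights per neuron
print(layer.bias.shape)    # (64,): one bias value per neuron
```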

Further, Keras also provides some of the primary methods discussed below:

💡Keras Dense Layer Methods💡

- get_weights(): returns the current weights of the layer (kernel and bias) as a list of NumPy arrays.
- set_weights(weights): sets the weights of the layer from a list of NumPy arrays with matching shapes.
- get_config(): returns the configuration of the layer (its constructor arguments) as a dictionary, which can be used to re-create the layer.
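A small sketch of these methods in action (the layer sizes are arbitrary):

```python
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dense(units=3)
layer.build(input_shape=(None, 2))   # creates the (2, 3) kernel and (3,) bias

# get_weights() returns the kernel and bias as NumPy arrays.
kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)      # (2, 3) (3,)

# set_weights() replaces them, e.g. with all-ones weights and zero biases.
layer.set_weights([np.ones((2, 3)), np.zeros(3)])

# get_config() returns the constructor arguments as a dict.
print(layer.get_config()['units'])   # 3
```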

Also see, Introduction to Machine Learning

Basic Operations With Dense Layer

The three main components of a dense layer operation are the activation function, the weight matrix (kernel), and the bias vector. Using these, the dense layer operation can be represented as:

👉output = activation(dot(input, kernel) + bias)

In the above equation, the activation function performs element-wise activation, the kernel is the weight matrix created by the layer, and the bias is a bias vector created by the layer. The Keras dense layer computes the dot product of the input tensor and the kernel weight matrix.
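The equation can be sketched directly in NumPy; the input values, weights, and the choice of tanh as the activation below are illustrative:

```python
import numpy as np

def dense_forward(inputs, kernel, bias, activation=np.tanh):
    # output = activation(dot(input, kernel) + bias)
    return activation(np.dot(inputs, kernel) + bias)

x = np.array([[1.0, 2.0]])   # one sample with 2 features
kernel = np.ones((2, 3))     # weight matrix mapping 2 inputs to 3 units
bias = np.zeros(3)           # one bias per unit

y = dense_forward(x, kernel, bias)
print(y.shape)               # (1, 3): one output per unit for the sample
```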

Implementation

🎯Sequential Model With Single Dense Layer🎯

We use the Sequential model, a built-in Keras model. First, we feed an input layer to the model, and then we add a dense layer with ReLU activation.

🧑‍💻Code🧑‍💻

```python
import tensorflow as tf

model_1 = tf.keras.models.Sequential()
model_1.add(tf.keras.Input(shape=(16,)))                   # input layer (feature size chosen for illustration)
model_1.add(tf.keras.layers.Dense(32, activation='relu'))  # dense layer with 32 neurons
model_1.output_shape
```

👉Output

``(None, 32)``

🎯Sequential Model With Multiple Dense Layers🎯

We now look at a model where multiple hidden dense layers are used, as in deep neural networks. Here we use the ReLU activation function in the neurons of the hidden dense layers.

🧑‍💻Code🧑‍💻

```python
import tensorflow as tf

model1 = tf.keras.models.Sequential()
model1.add(tf.keras.Input(shape=(16,)))                   # input layer (feature size chosen for illustration)
model1.add(tf.keras.layers.Dense(32, activation='relu'))  # first hidden dense layer
model1.add(tf.keras.layers.Dense(32, activation='relu'))  # second hidden dense layer
model1.add(tf.keras.layers.Dense(32))                     # output dense layer
print(model1.output_shape)
```

👉Output

``(None, 32)``

Frequently Asked Questions

Why do we use dense layers?

A Dense layer feeds all outputs from the previous layer to all its neurons, each providing one output to the next layer. It's the most basic layer in neural networks.

Are dense layers fully connected?

Yes. The dense layer, also called the fully connected layer, is a layer in which each neuron connects to every neuron in the preceding layer.

What is the use of dense Layer in CNN?

The dense layer is a simple layer of neurons in which each neuron receives input from all the neurons of the previous layer, hence the name. In a CNN, dense layers are used to classify images based on the features extracted by the convolutional layers.
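A minimal sketch of this arrangement in Keras (the image size, filter counts, layer sizes, and class count below are assumptions for illustration, not from the article):

```python
import tensorflow as tf

# Convolutional layers extract features, Flatten unrolls them into a vector,
# and Dense layers perform the final classification.
model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                 # e.g. grayscale 28x28 images
    tf.keras.layers.Conv2D(8, 3, activation='relu'),   # feature extraction
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),                         # features -> flat vector
    tf.keras.layers.Dense(32, activation='relu'),      # hidden dense layer
    tf.keras.layers.Dense(10, activation='softmax'),   # one output per class
])
print(model.output_shape)  # (None, 10)
```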

What does dense do?

A Dense layer feeds all outputs from the previous layer to all of its neurons, with each neuron providing one output to the next layer; in the simplest models, it is the only actual network layer.

Is a dense layer a hidden layer?

The first Dense object is the first hidden layer. The input layer is specified as a parameter to the first Dense object's constructor.

Conclusion

Let us briefly recap the article. First, we saw the intuition behind the dense layer. Then, we saw how it can be implemented using Keras, along with the different parameters and methods associated with it. Since the dense layer is a primary part of any neural network, we should know how it works.

You can also refer to Stochastic Gradient Descent, Feature Selection, Logistic Regression, Sigmoid Neuron, and many more articles to enhance your knowledge.