Layers are the basic building blocks of a deep learning model. A model can contain various kinds of layers, such as LSTM, convolutional, and dense layers, each important for the features it provides.

A dense layer is a fully connected layer, typically used in the final stages of a neural network to change the dimensionality of the output from the preceding layer. Dense layers help define the relationships between the values in the data the model is working on.

A dense layer is deeply connected with its preceding layer: each neuron in the dense layer receives input from every neuron of the preceding layer. Dense layers are the most commonly used layers in artificial neural network models. Under the hood, the dense layer performs a matrix-vector multiplication between the output of the preceding layer and the dense layer's weight matrix, which requires the length of the row vector coming from the previous layer to match the corresponding dimension of the weight matrix.

📜Matrix-Vector Multiplication📜

The general formula of matrix-vector multiplication can be represented as:

y = A · x

where A is an (M x N) weight matrix and x is an (N x 1) column vector, so the result y is an (M x 1) vector. The values in the matrix are trainable parameters that are updated with the help of backpropagation. Backpropagation is the algorithm used for training feedforward neural networks; it computes the gradient of the loss function with respect to the network's weights for a single input and its associated output. From the above, the output of the dense layer is an M-dimensional vector, where M is the number of units in the layer. So, fundamentally, a dense layer is used to change the size of the vector passing through it, using every neuron.
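For example, the dimension change can be seen with a small NumPy sketch (the values here are illustrative):

```python
import numpy as np

# A maps a 4-dimensional input to a 3-dimensional output: (M x N) = (3 x 4).
A = np.arange(12, dtype=float).reshape(3, 4)  # trainable weight matrix
x = np.array([1.0, 0.0, 2.0, 1.0])            # N-dimensional input vector

y = A @ x  # matrix-vector multiplication
print(y.shape)  # (3,) -- the output is M-dimensional
```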


Now we will quickly look at each of the hyperparameters of the Keras Dense layer.

💡Keras Dense Layer Parameters💡

units: The most basic and essential of all the parameters. It accepts a positive integer and defines the size of the output vector of the dense layer, i.e., it determines the size of the weight matrix and the bias vector.

activation: Defines the activation function applied element-wise to the output of the dense layer. Keras provides various activation functions, such as relu, sigmoid, softmax, selu, elu, tanh, and many more. If not specified, a linear activation is used.

use_bias: A boolean value that decides whether the layer includes a bias vector in its calculation. By default, use_bias is set to True.

kernel_initializer: Initializes the kernel weight matrix of the dense layer. By default, it is set to glorot_uniform.

bias_initializer: Initializes the bias vector. The bias vector is an additional set of weights that requires no input and corresponds to the output units. By default, it is set to zeros.

kernel_regularizer: Defines the regularizer (a penalty added to the loss function) applied to the kernel weight matrix. By default, it is set to None.

bias_regularizer: Defines the regularizer for the bias vector, if one is used. By default, it is set to None.

activity_regularizer: Defines the regularizer applied to the output of the layer (its activation). By default, it is set to None.

kernel_constraint: The constraint function applied to the kernel weight matrix.

bias_constraint: The constraint function applied to the bias vector.
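As a quick illustration, here is a hypothetical Dense layer configured with several of the parameters above (the specific values are arbitrary, chosen only for the sketch):

```python
from tensorflow import keras
import numpy as np

# Illustrative values only: a 64-unit dense layer with ReLU activation.
layer = keras.layers.Dense(
    units=64,                                        # size of the output vector
    activation="relu",                               # element-wise activation
    use_bias=True,                                   # include a bias vector
    kernel_initializer="glorot_uniform",             # default kernel initializer
    bias_initializer="zeros",                        # default bias initializer
    kernel_regularizer=keras.regularizers.l2(1e-4),  # L2 penalty on the kernel
    kernel_constraint=keras.constraints.MaxNorm(3),  # cap kernel norms
)

# The layer builds its weights on the first call; any input width works here.
y = layer(np.zeros((2, 8), dtype="float32"))
print(tuple(y.shape))  # a (2, 8) input becomes a (2, 64) output
```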

Further, Keras also provides several useful methods and attributes, discussed below:

💡Keras Dense Layer Methods💡

get_weights(): Returns the complete list of weights used in the layer, as NumPy arrays.

set_weights(): Sets the weights of the layer from a list of NumPy arrays with matching shapes.

summary(): Called on the model, it displays all the layers, including each layer's name, output shape, and number of parameters.

input_shape: Returns the input shape if the layer has a single node.

input: Returns the input tensor if the layer has a single node.

get_input_shape_at(): Returns the input shape at the specified index if the layer has multiple nodes.

get_input_at(): Returns the input tensor at the specified index if the layer has multiple nodes.

output_shape: Returns the output shape if the layer has a single node.

output: Returns the output tensor if the layer has a single node.

get_output_shape_at(): Returns the output shape at the specified index if the layer has multiple nodes.

get_output_at(): Returns the output tensor at the specified index if the layer has multiple nodes.
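A short sketch of get_weights and set_weights on a standalone Dense layer (the shapes here are chosen for illustration):

```python
from tensorflow import keras
import numpy as np

layer = keras.layers.Dense(3)
layer.build((None, 4))  # create the kernel (4 x 3) and bias (3,)

kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)  # (4, 3) (3,)

# set_weights expects arrays with the same shapes that get_weights returns.
layer.set_weights([np.ones((4, 3)), np.zeros(3)])
```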

The three main components of a dense layer operation are the activation function, the weight matrix (kernel), and the bias vector. Using these, the dense layer operation can be represented as:

output = activation(dot(input, kernel) + bias)

In the above equation, the activation function is applied element-wise, the kernel is the weight matrix created by the layer, and the bias is the bias vector created by the layer. The Keras dense layer thus performs a dot product of the input tensor with the kernel weight matrix, adds the bias, and applies the activation.
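To make the equation concrete, here is a minimal NumPy sketch of the same operation, using small illustrative values:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.array([[1.0, 2.0, 0.5]])          # input row vector, shape (1, 3)
kernel = np.array([[ 0.2, -0.1],
                   [ 0.4,  0.3],
                   [-0.5,  0.6]])         # weight matrix, shape (3, 2)
bias = np.array([0.1, -0.1])              # bias vector, shape (2,)

# output = activation(dot(input, kernel) + bias)
output = relu(x @ kernel + bias)
print(output)  # [[0.85 0.7 ]]
```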

Implementation

🎯Sequential Model With Single Dense Layer🎯

We use the Sequential model, a built-in Keras model. First, we feed the input to the model and then add a dense layer with ReLU activation.
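A minimal sketch of such a model, assuming a 16-feature input and an 8-unit dense layer (both sizes are illustrative):

```python
from tensorflow import keras
import numpy as np

# Sequential model with a single Dense layer and ReLU activation.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu"),
])

x = np.random.rand(4, 16).astype("float32")  # batch of 4 samples, 16 features each
y = model(x)                                  # weights are built on the first call
print(tuple(y.shape))  # (4, 8)
```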

Next, we look at a model with multiple hidden layers, as used in deep neural networks. Here we use the ReLU activation function in the neurons of the hidden dense layers.
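A sketch of such a deeper network: two hidden Dense layers with ReLU, followed by a softmax output layer (the layer sizes and input width are illustrative):

```python
from tensorflow import keras
import numpy as np

# Two hidden ReLU layers feeding a 10-way softmax output layer.
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])

x = np.random.rand(2, 20).astype("float32")  # 2 samples, 20 features each
probs = model(x)
print(tuple(probs.shape))  # (2, 10) -- each row is a probability distribution
```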

A Dense layer feeds all outputs from the previous layer to all its neurons, each providing one output to the next layer. It's the most basic layer in neural networks.

💡Frequently Asked Questions💡

Are dense layers fully connected?

Yes. A dense layer, also called a fully connected layer, is a layer in which every neuron connects to every neuron in the preceding layer.

What is the use of dense Layer in CNN?

The dense layer is a simple layer of neurons in which each neuron receives input from all the neurons of the previous layer, which is why it is called dense. In a CNN, dense layers are used to classify images based on the output of the convolutional layers.

What does dense do?

In a simple Sequential model, Dense is often the only trainable layer. It connects every output of the previous layer to every one of its neurons, and each neuron produces one output for the next layer.

Is a dense layer a hidden layer?

Often, yes. In a Keras Sequential model, the first Dense object is the first hidden layer; the input shape is specified as a parameter to its constructor.

Conclusion

To summarize the article: we first saw the intuition behind the dense layer, then how it can be implemented using Keras, along with the different parameters and methods associated with it. Since the dense layer is a core part of almost any neural network, it is worth knowing its parameters and behavior well.