Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. Model Architecture
3. Differences between the McCulloch-Pitts Model and the Perceptron Model
4. Similarities between the McCulloch-Pitts Model and the Perceptron Model
5. Geometric Interpretation of the McCulloch-Pitts Model
5.1. OR Function
5.2. AND Function
6. Frequently Asked Questions
6.1. What is the role of the summation function?
6.2. What is the role of the threshold function?
6.3. What are inhibitory and excitatory inputs?
7. Conclusion
Last Updated: Mar 27, 2024

McCulloch-Pitts Model of Neuron

Author Arun Nawani

Introduction  

Many people mistake the Perceptron for the first Artificial Neural Network. To clear up that misconception: the Perceptron is the most fundamental unit of the modern-day ANN, not the first Artificial Neural Network. The McCulloch-Pitts model of neuron is the earliest logical simulation of a biological neuron, developed by Warren McCulloch and Walter Pitts in 1943, hence the name. As simplistic as it may seem, we have to keep in mind that it was built several decades ago. In this blog, we are going to understand the origin of modern-day Artificial Neural Networks.


Model Architecture

The motivation behind the McCulloch-Pitts model is the biological neuron. A biological neuron receives input signals through its dendrites, processes them, and, if the signal is received positively, passes an output signal on to other connected neurons through its axon and synapses. This basic working of a biological neuron is what the McCulloch-Pitts model interprets and mimics.


The McCulloch-Pitts model of neuron is a fairly simple model consisting of some number (n) of binary inputs, each with a weight associated with it. An input is known as an ‘inhibitory input’ if its associated weight is negative, and as an ‘excitatory input’ if its associated weight is positive. As the inputs are binary, each can take one of 2 values, 0 or 1.


Then we have a summation junction that aggregates all the weighted inputs and passes the result to the activation function. The activation function is a threshold function that outputs 1 if the sum of the weighted inputs is equal to or above the threshold value, and 0 otherwise.

So let’s say we have n inputs X = { X1, X2, X3, …, Xn }

and a weight for each input W = { W1, W2, W3, …, Wn }.

So the summation of weighted inputs is X.W = X1.W1 + X2.W2 + X3.W3 + … + Xn.Wn


If X.W ≥ θ (threshold value)

     Output = 1

Else

     Output = 0
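The decision rule above can be sketched in a few lines of Python. This is a minimal illustration; the function name and example values are ours, not from the original 1943 paper:

```python
def mcculloch_pitts_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of the binary inputs
    reaches the threshold, otherwise return 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Two excitatory inputs (weight +1) with threshold 2 behave like AND:
print(mcculloch_pitts_neuron([1, 1], [1, 1], 2))  # 1
print(mcculloch_pitts_neuron([1, 0], [1, 1], 2))  # 0
```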

 

Let’s take a real-world example:

A bank wants to decide whether it can sanction a loan or not. There are 2 parameters in the decision: salary and credit score. So there are 4 scenarios to assess:

  1. High Salary and Good Credit Score
  2. High Salary and Bad Credit Score
  3. Low Salary and Good Credit Score
  4. Low Salary and Bad Credit Score

Let X1 = 1 denote high salary and X1 = 0 denote low salary, and let X2 = 1 denote good credit score and X2 = 0 denote bad credit score.

Let both weights be 1 and the threshold value be 2. The truth table is as follows:

X1    X2    X1 + X2    Loan approved
1     1     2          1
1     0     1          0
0     1     1          0
0     0     0          0

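The truth table above can be reproduced with a short sketch, assuming equal weights of 1 for both parameters (the helper name is ours):

```python
def loan_approved(x1, x2, threshold=2):
    """x1 = 1 for high salary, x2 = 1 for good credit score."""
    return 1 if x1 + x2 >= threshold else 0

# Print every row of the truth table: X1, X2, X1 + X2, decision.
for x1 in (1, 0):
    for x2 in (1, 0):
        print(x1, x2, x1 + x2, loan_approved(x1, x2))
```

Only the (1, 1) row reaches the threshold of 2, so only a high salary combined with a good credit score gets the loan approved.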
The truth table shows when the loan should be approved across all the varying scenarios. In this case, the loan is approved only if the salary is high and the credit score is good. The McCulloch-Pitts model of neuron was mankind’s first attempt at mimicking the human brain, and a fairly simple one too. It’s no surprise it had many limitations:

  1. The model could not capture or compute cases of non-binary inputs. It was limited to computing every case with 0 and 1 only.
  2. The threshold had to be decided beforehand and needed manual computation, instead of being learned by the model itself.
  3. Functions that are not linearly separable (such as XOR) couldn’t be computed.
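The third limitation is easy to verify by brute force: with unit weights, no single threshold on x1 + x2 reproduces XOR, because XOR is not linearly separable (the helper names below are ours):

```python
def mp_output(x1, x2, threshold):
    """Unit-weight McCulloch-Pitts neuron on two binary inputs."""
    return 1 if x1 + x2 >= threshold else 0

# Desired XOR behaviour for every input combination.
xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Try every integer threshold from 0 to 3; collect the ones that
# match XOR on all four inputs.
matches = [
    t for t in range(4)
    if all(mp_output(x1, x2, t) == y for (x1, x2), y in xor_table.items())
]
print(matches)  # [] — no threshold works
```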

Differences between the McCulloch-Pitts Model and the Perceptron Model

  1. The McCulloch-Pitts model accepts only Boolean inputs, whereas the Perceptron model can process real-valued inputs.
  2. In the McCulloch-Pitts model the weights are fixed rather than learned, which makes the model inflexible; the Perceptron model learns an adjustable weight for each input, which makes it much more flexible.

Similarities between the McCulloch-Pitts Model and the Perceptron Model

  1. Both models can handle linearly separable data.
  2. The threshold can be adjusted in both models so that they fit their respective datasets.

Geometric Interpretation of the McCulloch-Pitts Model

Let us understand the geometric interpretation of the model using the following functions.

OR Function

We know that the thresholding parameter for the OR function is 1, i.e. θ = 1. The possible combinations of inputs are (0,0), (0,1), (1,0), and (1,1). Considering the OR function’s aggregation inequality, x_1 + x_2 ≥ 1, let us plot the graph.


The graph shows that all inputs lying ON or ABOVE the line x_1 + x_2 = 1 give output 1 (positive) when passed through the OR-function M-P neuron, and all inputs lying BELOW the line give output 0 (negative).

Therefore, the McCulloch-Pitts model has drawn a linear decision boundary which splits the inputs into two classes, positive and negative.

AND Function

Similar to the OR function, we can plot the graph for the AND function, for which the inequality is x_1 + x_2 ≥ 2.


Here, only the point (1,1) lies ON or ABOVE the decision boundary x_1 + x_2 = 2, so it is the only input that gives output 1 when passed through the AND function.

From these examples, we can see that as the number of inputs increases, so does the dimension of the space in which the inputs are plotted. For example, with 3 inputs and the OR function, we would plot the points in three-dimensional (3D) space, and the decision boundary would be drawn in 3 dimensions.
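Both boundaries can be checked programmatically: the same thresholding rule classifies each input point against the line x_1 + x_2 = θ (a small sketch with our own helper names):

```python
def classify(point, theta):
    """Return 1 if the point lies on or above the boundary
    x_1 + x_2 + ... = theta, else 0."""
    return 1 if sum(point) >= theta else 0

points = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([classify(p, theta=1) for p in points])  # OR:  [0, 1, 1, 1]
print([classify(p, theta=2) for p in points])  # AND: [0, 0, 0, 1]

# With 3 inputs the boundary becomes a plane in 3D; for 3-input OR,
# every point except (0, 0, 0) lies on or above it:
points_3d = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
print([classify(p, theta=1) for p in points_3d])
```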


Frequently Asked Questions

What is the role of the summation function? 

The summation function aggregates the weighted inputs and passes their sum to the activation function.
 

What is the role of the threshold function?

The threshold function is the activation function in the McCulloch-Pitts model of neuron. The neuron fires if and only if the sum of the weighted inputs is greater than or equal to the threshold value.
 

What are inhibitory and excitatory inputs?

Inhibitory inputs are those which can single-handedly decide the output: if an inhibitory input is active, the neuron does not fire regardless of the other inputs. Excitatory inputs may not fire the neuron on their own but can do so in combination with other excitatory inputs.
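Using this article’s weighted formulation (inhibitory = negative weight), a tiny sketch shows an inhibitory input vetoing the neuron (helper name and example weights are ours):

```python
def mp_neuron(inputs, weights, threshold):
    """Fire (1) when the weighted sum of binary inputs reaches the threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

# x1 excitatory (weight +1), x2 inhibitory (weight -1), threshold 1:
# the neuron fires only when x1 is on AND the inhibitory input x2 is off.
print(mp_neuron([1, 0], [1, -1], 1))  # 1
print(mp_neuron([1, 1], [1, -1], 1))  # 0  (inhibitory input vetoes firing)
print(mp_neuron([0, 0], [1, -1], 1))  # 0
```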

Conclusion

The McCulloch-Pitts model of neuron marks the beginning of the development of Artificial Neural Networks. The model is simple to understand and has been discussed in detail in this blog. We suggest you go through the blog again, as it’s important from the college exam perspective. If the blog intrigued you to learn more about ANNs and Deep Learning, check out our industry-oriented Machine Learning course curated by industry experts.
