Table of contents
1. Introduction
2. What is the Bayesian Belief Network?
3. Parts of Bayesian Network
  3.1. Directed Acyclic Graph
  3.2. Table of Conditional Probabilities
4. A Bayesian Belief Network Graph
5. Components of Bayesian Network
  5.1. Causal Component
  5.2. Actual Numbers
6. Joint probability distribution
7. Example of Bayesian Belief Network
  7.1. Example
  7.2. Problem
  7.3. Solution
8. Semantics of Bayesian Network
9. Frequently Asked Questions
  9.1. What is the Bayesian belief model? Explain briefly.
  9.2. What is a Bayesian belief network graph theory?
  9.3. What is an example of a Bayesian model?
10. Conclusion
Last Updated: Mar 27, 2024

Bayesian Belief Network

Author yuvatimankar
Introduction

In the field of Artificial Intelligence and decision-making, Bayesian Belief Networks (BBNs) have emerged as powerful modeling tools. Bayesian Belief Networks provide a framework for analyzing and representing complex systems by explicitly modeling the relationships among uncertain variables. With their capability to reason under uncertainty, Bayesian Belief Networks have found applications in fields such as finance, healthcare, environmental management, and many more.


In this article, we will explore the fundamentals of BBNs, the components of Bayesian networks, and examples with detailed explanations, so let's get started.

What is the Bayesian Belief Network?

A Bayesian Belief Network is a probabilistic graphical model used for representing uncertain knowledge and making decisions on the basis of that knowledge. It is a graphical model depicting the probabilistic relationships between variables. A Bayesian Belief Network is also known as a belief network, Bayes network, Bayesian model, or decision network.

Bayesian belief networks are probabilistic because they are built from a probability distribution and use probability theory for prediction and anomaly detection.

Parts of Bayesian Network

A Bayesian belief network can be used for creating models from data and experts' opinions, and it comprises two parts:

  • Directed Acyclic Graph 
     
  • Table of Conditional Probabilities

Directed Acyclic Graph

This is a visual representation of the network's variables and their relationships with one another. In a Directed Acyclic Graph, the nodes indicate variables, and the edges indicate the dependencies between them. The arrows in the graph depict the direction of causality.

Table of Conditional Probabilities

For every node in the directed acyclic graph, there is a corresponding table of conditional probabilities that states the probability of every possible value of the node given the values of its parents in the directed acyclic graph. These tables show the probabilistic connection between the variables in the network.

A Bayesian Belief Network Graph

A Bayesian belief network graph is made up of nodes and arcs (directed links), where:

  • Each node corresponds to a random variable, which may be discrete or continuous

  • Arcs (directed links) indicate causal relationships or conditional dependencies between random variables. These arrows connect pairs of nodes in the graph

  • A link shows that one node directly affects the other; if there is no directed connection between two nodes, they are independent of each other

  • Consider, for example, a network graph whose nodes depict four random variables A, B, C, and D

  • If node B is connected to node A by a directed arrow from A to B, then Node A is known as the parent of Node B

  • Node C is not dependent on Node A
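Because the graph must be acyclic, a quick structural check is useful before attaching any probabilities. The sketch below is a minimal illustration: it represents a network as a map from each node to its parents and verifies there is no directed cycle. The parent sets here are assumed for illustration (only A being a parent of B, and C's independence from A, are stated above):

```python
# Hypothetical parent map matching the bullets above: A is a parent of B,
# C has no link to A; D's parents (B and C) are assumed for illustration.
dag = {"A": [], "B": ["A"], "C": [], "D": ["B", "C"]}

def is_acyclic(graph):
    """Return True if the parent map contains no directed cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / in progress / finished
    color = {n: WHITE for n in graph}

    def visit(node):
        if color[node] == BLACK:
            return True
        if color[node] == GRAY:    # back-edge: we found a directed cycle
            return False
        color[node] = GRAY
        if not all(visit(p) for p in graph[node]):
            return False
        color[node] = BLACK
        return True

    return all(visit(n) for n in graph)

print(is_acyclic(dag))  # True: this network is a valid DAG
```

A graph such as `{"X": ["Y"], "Y": ["X"]}` would fail this check and could not serve as a Bayesian network structure.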

Components of Bayesian Network

The Bayesian Belief Network has mainly two components: the causal component and the actual numbers.

Causal Component

  • The causal component of a Bayesian belief network indicates the causal relationships between variables in the system

  • It consists of a directed acyclic graph that represents the direction of causal relationships among the variables

  • The causal component of a Bayesian belief network is important for understanding how the system's variables are connected

  • It gives us a graphical representation of the causal relationships between the variables. This representation can be used to make predictions and also to understand how changes in one variable will influence the others

Actual Numbers

  • The numerical component of a Bayesian belief network consists of conditional probability tables for each node in the directed acyclic graph
     
  • These tables specify the probability of each variable given the values of its parent variables
     
  • The numerical component of a Bayesian belief network gives us the actual numbers that are used to make predictions and calculate probabilities
     
  • Every node in the network contains a conditional probability table that specifies the probability of that node given the values of its parent nodes
     
  • These probabilities are used for calculating the overall probabilities of the system given several inputs or observations

Joint probability distribution

In the Bayesian network model, joint probability distribution describes the probability of all possible configurations of the variables of the network. It is the product of the conditional probabilities of each node given its parent in the network. This means that the joint probability distribution gives us a complete description of the probability distribution of all the variables in the network.

Representation:

If we have variables x1, x2, x3, x4, ..., xn, then the probabilities of the various combinations of x1, x2, x3, x4, ..., xn are known as the joint probability distribution.

P[x1, x2, x3, x4, ..., xn] can be written in the following way in terms of conditional probabilities:

= P[x1 | x2, x3, x4, ..., xn] P[x2, x3, x4, ..., xn]

= P[x1 | x2, x3, x4, ..., xn] P[x2 | x3, x4, ..., xn] P[x3 | x4, ..., xn] ... P[xn-1 | xn] P[xn]

In general, for each variable Xi:

P(Xi | Xi-1, ..., X1) = P(Xi | Parents(Xi))
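As a concrete sketch of this factorization, the snippet below computes a joint probability as the product of each node's conditional probability given its parents. The three-node network and its numbers are illustrative, not taken from this article:

```python
# Illustrative network: Rain -> Sprinkler, and both -> WetGrass.
# parents[node] lists the node's parents; cpt[node] maps a tuple of
# parent values to P(node = True | those parent values).
parents = {"Rain": [], "Sprinkler": ["Rain"], "WetGrass": ["Rain", "Sprinkler"]}
cpt = {
    "Rain":      {(): 0.2},
    "Sprinkler": {(True,): 0.01, (False,): 0.4},
    "WetGrass":  {(True, True): 0.99, (True, False): 0.8,
                  (False, True): 0.9, (False, False): 0.0},
}

def prob(node, value, assignment):
    """P(node = value | parent values), read from the node's CPT."""
    key = tuple(assignment[p] for p in parents[node])
    p_true = cpt[node][key]
    return p_true if value else 1.0 - p_true

def joint(assignment):
    """Chain rule: P(x1, ..., xn) = product of P(xi | Parents(xi))."""
    result = 1.0
    for node, value in assignment.items():
        result *= prob(node, value, assignment)
    return result

# 0.2 * (1 - 0.01) * 0.8 ≈ 0.1584
print(joint({"Rain": True, "Sprinkler": False, "WetGrass": True}))
```

Summing `joint` over all eight value combinations gives 1, confirming that the product of the CPT entries defines a valid probability distribution.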

Example of Bayesian Belief Network

Let's see an example of constructing a directed acyclic graph to illustrate the Bayesian Belief Network.

Example

Henry installed a new burglar alarm at his home to detect burglary. The alarm consistently reacts to a break-in, but it also reacts to minor earthquakes. Henry has two neighbors, John and Olivia, who have taken responsibility for calling Henry at work when they hear an alarm.

John always calls Henry when he hears the alarm, but sometimes he confuses the telephone ringing with the alarm and calls then as well. On the other hand, Olivia likes to listen to loud music, so sometimes she misses the alarm. Here, we would like to compute the probability of a burglary alarm.

Problem 

Determine the probability that the alarm has sounded but that neither a burglary nor an earthquake has occurred and that John and Olivia have phoned Henry.

Solution

  • The Bayesian belief network for the above problem is provided below. The network structure shows that Burglary and Earthquake are the parent nodes of Alarm and directly affect the probability of the alarm going off, whereas John's and Olivia's calls depend only on the alarm

  • The network also encodes our assumptions: the neighbors do not observe the burglary directly, do not notice minor earthquakes, and do not confer with each other before calling

  • The conditional distribution for each node is provided as a conditional probability table (CPT)

  • Each row in a conditional probability table must sum to 1 because the entries in a row depict an exhaustive set of cases for the variable

  • In a conditional probability table, a boolean variable with k boolean parents has 2^k rows of probabilities. Hence, if there are two parents, the conditional probability table will contain 4 rows of probability values
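The 2^k rule above can be checked with a short sketch (illustrative only): `itertools.product` enumerates the parent-value combinations that make up the rows of a CPT.

```python
from itertools import product

def cpt_rows(k):
    """All parent-value combinations for a boolean node with k boolean parents."""
    return list(product([True, False], repeat=k))

# Two boolean parents (e.g. Burglary and Earthquake) give 2**2 = 4 rows.
rows = cpt_rows(2)
print(len(rows))  # 4
print(rows)       # [(True, True), (True, False), (False, True), (False, False)]
```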


List of all events occurring in this network:

  • Burglary(B)
  • John Calls(J) 
  • Olivia Calls (O) 
  • Alarm (A)
  • Earthquake(E)


We can write the events of the problem statement in the form of a probability: P[J, O, A, B, E]. We can rewrite this probability using the joint probability distribution:

P[J, O, A, B, E] = P[J | O, A, B, E] . P[O, A, B, E]

= P[J | O, A, B, E] . P[O | A, B, E] . P[A, B, E]

= P[J | A] . P[O | A, B, E] . P[A, B, E]      (John's call depends only on the alarm)

= P[J | A] . P[O | A] . P[A | B, E] . P[B, E]

= P[J | A] . P[O | A] . P[A | B, E] . P[B | E] . P[E]

[Diagram: Bayesian network for the burglary-alarm example, with Burglary and Earthquake pointing to Alarm, and Alarm pointing to John Calls and Olivia Calls]

Let's take the prior probabilities for the Burglary and Earthquake nodes:

  • The probability of burglary is P(B= True) = 0.002
     
  • The probability of no burglary is P(B= False)= 0.998
     
  • The probability of a minor earthquake is  P(E= True)= 0.001
     
  • The probability that an earthquake has not occurred is P(E= False)= 0.999


Conditional probability table for Alarm A:

The conditional probability of Alarm A depends on Burglary and Earthquake:

B      E      P(A=True)   P(A=False)
True   True   0.94        0.06
True   False  0.95        0.05
False  True   0.31        0.69
False  False  0.001       0.999

Conditional probability table for John Calls:

The conditional probability that John will call depends on the probability of Alarm.

A      P(J=True)   P(J=False)
True   0.91        0.09
False  0.05        0.95

Conditional probability table for Olivia Calls:

The conditional probability that Olivia calls depends on her parent node "Alarm."

A      P(O=True)   P(O=False)
True   0.75        0.25
False  0.02        0.98

Using the formula for the joint probability distribution, we can now express the problem statement as a probability and compute it:

P(O, J, A, ~B, ~E)

= P(O | A) * P(J | A) * P(A | ~B, ~E) * P(~B) * P(~E)

= 0.75 * 0.91 * 0.001 * 0.998 * 0.999

= 0.00068045
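The arithmetic above can be verified with a few lines of Python, plugging in the CPT values from the tables:

```python
# Values taken from the probability tables above.
p_not_B = 0.998                 # P(~B): no burglary
p_not_E = 0.999                 # P(~E): no earthquake
p_A_given_notB_notE = 0.001     # Alarm CPT row (B=False, E=False)
p_J_given_A = 0.91              # John calls given the alarm sounds
p_O_given_A = 0.75              # Olivia calls given the alarm sounds

result = p_O_given_A * p_J_given_A * p_A_given_notB_notE * p_not_B * p_not_E
print(round(result, 8))  # 0.00068045
```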

Semantics of Bayesian Network

There are two ways of understanding the semantics of the Bayesian belief network, which are given below:

  • Understanding the network as a representation of the joint probability distribution

  • Understanding the network as an encoding of a collection of conditional independence statements

Frequently Asked Questions

What is the Bayesian belief model? Explain briefly.

A Bayesian belief model is a probabilistic graphical model that depicts a set of variables and their conditional dependencies via a directed acyclic graph. It is one of the multiple forms of causal notation.

What is a Bayesian belief network graph theory?

A Bayesian network is a directed acyclic graph in which each node has quantitative probabilistic information attached to it.

What is an example of a Bayesian model?

Take a coin toss as an example: the probability of getting heads is 50%. A Bayesian would say this is because there are only two possible outcomes, a head and a tail, and each of them is equally likely to appear.

Conclusion

In this article, we discussed the Bayesian Belief Network. We explored its components, the Bayesian belief network graph, the joint probability distribution, and a worked example with a detailed explanation. We hope this blog has helped you enhance your knowledge of the Bayesian Belief Network.


Happy Learning!
