LM42, LM43 - Exact Inference in Bayesian Networks

This document contains information about a course on Artificial Intelligence taught at KGiSL Institute of Technology. It includes the following: 1. The course code and name is AL3391 / Artificial Intelligence, a 3rd-semester course taught in the department of AI & Data Science. 2. The course is divided into 5 units covering intelligent agents, problem solving, game playing, logical reasoning, and probabilistic reasoning. 3. The expected learning outcomes are that students will be able to explain intelligent agent frameworks, apply problem-solving and game-playing techniques, and perform logical and probabilistic reasoning. 4. Details are provided on exact inference methods in Bayesian networks, including inference by enumeration and variable elimination.


KGiSL Institute of Technology

(Approved by AICTE, New Delhi; Affiliated to Anna University, Chennai)


Recognized by UGC, Accredited by NBA (IT)
365, KGiSL Campus, Thudiyalur Road, Saravanampatti, Coimbatore – 641035.

Department of Artificial Intelligence & Data Science


Name of the Faculty : Mr.MOHANRAJ.S

Subject Name & Code : AL3391 / ARTIFICIAL INTELLIGENCE

Branch & Department : B.Tech & AI&DS

Year & Semester : 2022 / III

Academic Year : 2022-23


UNIT I INTELLIGENT AGENTS 9
Introduction to AI – Agents and Environments – concept of rationality – nature of environments – structure of
agents. Problem solving agents – search algorithms – uninformed search strategies.

UNIT II PROBLEM SOLVING 9


Heuristic search strategies – heuristic functions. Local search and optimization problems – local search in
continuous space – search with non-deterministic actions – search in partially observable environments – online
search agents and unknown environments

UNIT III GAME PLAYING AND CSP 9


Game theory – optimal decisions in games – alpha-beta search – Monte Carlo tree search – stochastic games –
partially observable games. Constraint satisfaction problems – constraint propagation – backtracking search for
CSP – local search for CSP – structure of CSP.

UNIT IV LOGICAL REASONING 9


Knowledge-based agents – propositional logic – propositional theorem proving – propositional model checking –
agents based on propositional logic. First-order logic – syntax and semantics – knowledge representation and
engineering – inferences in first-order logic – forward chaining – backward chaining – resolution.

UNIT V PROBABILISTIC REASONING 9


Acting under uncertainty – Bayesian inference – naïve Bayes models. Probabilistic reasoning – Bayesian networks –
exact inference in BN – approximate inference in BN – causal networks.
SYLLABUS

UNIT V PROBABILISTIC REASONING

Acting under uncertainty – Bayesian inference – naïve Bayes models.

Probabilistic reasoning – Bayesian networks – exact inference in BN –

approximate inference in BN – causal networks.

AL3391/AI/II AI&DS/III SEM/KG-KiTE


Course Outcomes

At the end of this course, the students will be able to:


CO1: Explain intelligent agent frameworks
CO2: Apply problem solving techniques
CO3: Apply game playing and CSP techniques
CO4: Perform logical reasoning
CO5: Perform probabilistic reasoning under uncertainty



EXACT INFERENCES IN BAYESIAN NETWORKS

Purpose of inferences:

•The basic task for any probabilistic inference system is to compute the posterior

probability distribution for a set of query variables, given some observed event - that

is, some assignment of values to a set of evidence variables.



EXACT INFERENCES IN BAYESIAN NETWORKS

Notations:
•X  The query variable.
•E  The set of evidence variables E1, . . . ,Em.
•e  Particular observed event.
•Y  The non-evidence, non-query variables Y1, . . . , Yn(called the hidden variables).
•Thus, the complete set of variables is {X} ∪ E ∪ Y.
•A typical query asks for the posterior probability distribution P(X | e).



EXACT INFERENCES IN BAYESIAN NETWORKS

Inferences:

•Inference by Enumeration

•Inference by Variable Elimination



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:
•Any conditional probability can be computed by summing terms from the full joint
distribution.
•A query P(X | e) can be answered using the equation:

P(X | e) = α P(X, e) = α ⅀y P(X, e, y)

•A Bayesian network gives a complete representation of the full joint distribution.


•A query can be answered using a Bayesian network by computing sums of products of
conditional probabilities from the network.



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:

•Consider the query P(Burglary | JohnCalls =true, MaryCalls =true).

•Burglary  Query variable(X)

•JohnCalls  Evidence variable 1 (E1)

•MaryCalls  Evidence variable 2 (E2)

•The hidden variables for this query are Earthquake and Alarm. (Y1 and Y2).



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:

•Using initial letters for the variables to shorten the expressions, we have:

P(B | j, m) = α P(B, j, m) = α ⅀e ⅀a P(B, j, m, e, a)

•The semantics of Bayesian networks gives us an expression in terms of CPT entries. For simplicity, we do this just for Burglary = true:

P(b | j, m) = α ⅀e ⅀a P(b) P(e) P(a | b, e) P(j | a) P(m | a)


EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:

•An improvement can be obtained from the following simple observations:

•The P(b) term is a constant and can be moved outside the summations over a and e, and the P(e) term can be moved outside the summation over a. Hence, we have:

P(b | j, m) = α P(b) ⅀e P(e) ⅀a P(a | b, e) P(j | a) P(m | a)          (14.4)

•The evaluation process for the expression in Equation (14.4) is shown as an expression tree in Figure 14.8.



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Enumeration:

•Note that the tree in Figure 14.8 makes explicit the repeated subexpressions evaluated by the algorithm.

•The products P(j | a) P(m | a) and P(j | ¬a) P(m | ¬a) are computed twice, once for each value of e.

•A general method, the variable elimination algorithm, avoids such wasted computations.
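The summation in Equation (14.4) can be sketched directly in Python. This is a minimal illustrative sketch, not the textbook's ENUMERATION-ASK pseudocode; the CPT values follow the standard burglary network of Figure 14.2, and the names (`cpt`, `parents`, `enumeration_ask`) are assumptions of this sketch.

```python
# Minimal sketch of inference by enumeration for the burglary network.
cpt = {
    'B': {(): 0.001},                         # P(Burglary)
    'E': {(): 0.002},                         # P(Earthquake)
    'A': {(True, True): 0.95, (True, False): 0.94,
          (False, True): 0.29, (False, False): 0.001},   # P(Alarm | B, E)
    'J': {(True,): 0.90, (False,): 0.05},     # P(JohnCalls | A)
    'M': {(True,): 0.70, (False,): 0.01},     # P(MaryCalls | A)
}
parents = {'B': [], 'E': [], 'A': ['B', 'E'], 'J': ['A'], 'M': ['A']}
order = ['B', 'E', 'A', 'J', 'M']             # a topological ordering

def prob(var, value, assignment):
    """P(var = value | parents(var)) read from the CPT."""
    p = cpt[var][tuple(assignment[u] for u in parents[var])]
    return p if value else 1.0 - p

def enumerate_all(variables, assignment):
    if not variables:
        return 1.0
    first, rest = variables[0], variables[1:]
    if first in assignment:                   # query or evidence variable
        return prob(first, assignment[first], assignment) * \
               enumerate_all(rest, assignment)
    return sum(prob(first, v, assignment) *   # hidden variable: sum it out
               enumerate_all(rest, {**assignment, first: v})
               for v in (True, False))

def enumeration_ask(X, evidence):
    dist = {x: enumerate_all(order, {**evidence, X: x})
            for x in (True, False)}
    alpha = 1.0 / sum(dist.values())          # normalization constant
    return {x: alpha * p for x, p in dist.items()}

posterior = enumeration_ask('B', {'J': True, 'M': True})
# posterior[True] ≈ 0.284, the familiar P(b | j, m)
```

Note that `enumerate_all` recomputes the same subexpressions for each value of e, exactly the waste the variable elimination algorithm below removes.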



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination:

•The enumeration algorithm can be improved substantially by eliminating repeated calculations.

•The idea is simple: do the calculation once and save the results for later use. This is a form of

dynamic programming.

•The simplest such approach is the variable elimination algorithm.

•Variable elimination works by evaluating expressions such as Equation (14.4) in right-to-left

order (that is, bottom up in Figure 14.8).



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination:

•Intermediate results are stored, and summations over each variable are done only for those

portions of the expression that depend on the variable.

•Let us illustrate this process for the burglary network. We evaluate the expression:

P(B | j, m) = α P(B) ⅀e P(e) ⅀a P(a | B, e) P(j | a) P(m | a)

•We have annotated each part of the expression with the name of the corresponding factor: f1(B), f2(E), f3(A,B,E), f4(A), and f5(A) for the five probability terms in order. Each factor is a matrix indexed by the values of its argument variables.


EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination:

•For example, the factors f4(A) and f5(A) corresponding to P(j | a) and P(m | a) depend just on A because J and M are fixed by the query. They are therefore two-element vectors:

f4(A) = ⟨P(j | a), P(j | ¬a)⟩        f5(A) = ⟨P(m | a), P(m | ¬a)⟩

•In terms of factors, the query expression is written as:

P(B | j, m) = α f1(B) × ⅀e f2(E) × ⅀a f3(A,B,E) × f4(A) × f5(A)

•where the "×" operator is not ordinary matrix multiplication but instead the pointwise product operation.
EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Operations on factors


•The pointwise product of two factors f1 and f2 yields a new factor f whose variables are the

union of the variables in f1 and f2 and whose elements are given by the product of the
corresponding elements in the two factors.
•Suppose the two factors have variables Y1, . . . , Yk in common. Then we have:

f(X1 . . . Xj, Y1 . . . Yk, Z1 . . . Zl) = f1(X1 . . . Xj, Y1 . . . Yk) f2(Y1 . . . Yk, Z1 . . . Zl)

•If all the variables are binary, then f1 and f2 have 2^(j+k) and 2^(k+l) entries, respectively, and the pointwise product has 2^(j+k+l) entries.

•For example, given two factors f1(A,B) and f2(B,C), the pointwise product f1 × f2 = f3(A,B,C) has 2^3 = 8 entries.
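The pointwise product can be sketched as follows. The factor representation used here, a pair of a variable-name tuple and a dict keyed by Boolean value tuples, is an assumption of this sketch, not a standard API.

```python
from itertools import product

# Sketch of the pointwise product of two factors. A factor is a pair
# (variables, table): a tuple of names and a dict mapping value tuples
# to numbers.
def pointwise_product(f1, f2):
    vars1, t1 = f1
    vars2, t2 = f2
    # variables of the result = union of the input variables
    out_vars = vars1 + tuple(v for v in vars2 if v not in vars1)
    table = {}
    for vals in product((True, False), repeat=len(out_vars)):
        asg = dict(zip(out_vars, vals))
        # multiply the matching entries of the two input factors
        table[vals] = (t1[tuple(asg[v] for v in vars1)] *
                       t2[tuple(asg[v] for v in vars2)])
    return (out_vars, table)

# the f1(A,B) x f2(B,C) example from the slide (values illustrative)
f1 = (('A', 'B'), {(True, True): 0.3, (True, False): 0.7,
                   (False, True): 0.9, (False, False): 0.1})
f2 = (('B', 'C'), {(True, True): 0.2, (True, False): 0.8,
                   (False, True): 0.6, (False, False): 0.4})
f3 = pointwise_product(f1, f2)   # f3(A,B,C) with 2^3 = 8 entries
```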
EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Operations on factors

•Notice that the factor resulting from a pointwise product can contain more variables than any of the factors being multiplied, and that the size of a factor is exponential in the number of variables.
EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Operations on factors

•Summing out a variable from a product of factors is done by adding up the submatrices formed by fixing the variable to each of its values in turn.

•For example, to sum out A from f3(A,B,C), we write:

⅀a f3(A,B,C) = f3(a,B,C) + f3(¬a,B,C)
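Summing out can be sketched in the same spirit. The (variables, table) factor representation is again an illustrative assumption.

```python
# Sketch of summing a variable out of a factor. A factor is a pair
# (variables, table): a tuple of names and a dict keyed by Boolean
# value tuples.
def sum_out(var, factor):
    variables, table = factor
    i = variables.index(var)
    out_vars = variables[:i] + variables[i + 1:]
    out = {}
    for vals, p in table.items():
        key = vals[:i] + vals[i + 1:]      # drop var's value...
        out[key] = out.get(key, 0.0) + p   # ...and add up the submatrices
    return (out_vars, out)

# a uniform 2x2x2 factor f3(A,B,C)
f3 = (('A', 'B', 'C'),
      {(a, b, c): 0.125 for a in (True, False)
                        for b in (True, False)
                        for c in (True, False)})
f_bc = sum_out('A', f3)   # f(B,C): each of the 4 entries is 0.25
```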


EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Variable ordering and variable relevance


•The time and space requirements of variable elimination are dominated by the size of the
largest factor constructed during the operation of the algorithm.
•That size is determined by the order in which variables are eliminated and by the structure of the network.
•It turns out to be intractable to determine the optimal ordering, but several good heuristics are
available.
•One fairly effective method is a greedy one: eliminate whichever variable minimizes the size
of the next factor to be constructed.
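One common way to realize this greedy heuristic is sketched below, under the assumption that the size of the next factor can be approximated by a variable's number of still-live neighbours in the network's interaction graph; `greedy_order` and the graph encoding are illustrative names, not a standard API.

```python
# Sketch of the greedy elimination-ordering heuristic: repeatedly
# eliminate whichever variable would create the smallest next factor,
# approximated here by its count of live neighbours.
def greedy_order(neighbours, to_eliminate):
    nbrs = {v: set(ns) for v, ns in neighbours.items()}
    remaining = set(to_eliminate)
    order = []
    while remaining:
        # pick the variable with the fewest live neighbours
        v = min(remaining, key=lambda x: len(nbrs[x] & remaining))
        order.append(v)
        remaining.remove(v)
        # eliminating v connects its surviving neighbours (fill-in edges)
        live = nbrs[v] & remaining
        for u in live:
            nbrs[u] |= live - {u}
    return order

# interaction graph of the burglary network (A's CPT links B and E)
neighbours = {'B': {'A', 'E'}, 'E': {'A', 'B'},
              'A': {'B', 'E', 'J', 'M'}, 'J': {'A'}, 'M': {'A'}}
order = greedy_order(neighbours, ['J', 'M', 'E', 'A'])
# A, the highest-degree variable, is never eliminated first
```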



EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Variable ordering and variable relevance


•Let us consider a query: P(JohnCalls | Burglary = true). As usual, the first step is to write out the nested summation:

P(J | b) = α P(b) ⅀e P(e) ⅀a P(a | b, e) P(J | a) ⅀m P(m | a)

•Evaluating this expression from right to left, we notice something interesting: ⅀m P(m | a) is equal to 1 by definition!

•Hence the variable M is irrelevant to this query; the result of the query P(JohnCalls | Burglary = true) is unchanged if we remove MaryCalls from the network altogether.
EXACT INFERENCES IN BAYESIAN NETWORKS

Inference by Variable Elimination: Variable ordering and variable relevance

•In general, we can remove any leaf node that is not a query variable or an evidence variable.

•After its removal, there may be some more leaf nodes, and these too may be irrelevant.

•Continuing this process, we eventually find that every variable that is not an ancestor of a

query variable or evidence variable is irrelevant to the query.

•A variable elimination algorithm can therefore remove all these variables before evaluating the

query.
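The ancestor test above can be sketched as a simple graph walk; the function and variable names here are illustrative.

```python
# Sketch of relevance pruning: keep only variables that are ancestors
# of a query or evidence variable (including those variables
# themselves); everything else cannot affect the query.
def relevant_variables(parents, query, evidence):
    keep = set()
    frontier = [query, *evidence]
    while frontier:
        v = frontier.pop()
        if v not in keep:
            keep.add(v)
            frontier.extend(parents[v])   # walk up to the ancestors
    return keep

# burglary network: for P(JohnCalls | Burglary = true),
# MaryCalls is a leaf that is neither query nor evidence
parents = {'B': [], 'E': [], 'A': ['B', 'E'], 'J': ['A'], 'M': ['A']}
keep = relevant_variables(parents, 'J', ['B'])   # M is pruned
```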



EXACT INFERENCES IN BAYESIAN NETWORKS

The complexity of exact inference:

•The complexity of exact inference in Bayesian networks depends strongly on the structure of

the network.

•The burglary network of Figure 14.2 belongs to the family of networks in which there is at most one undirected path between any two nodes in the network.

•These are called singly connected networks or polytrees, and they have a particularly nice property: the time and space complexity of exact inference in polytrees is linear in the size of the network.


EXACT INFERENCES IN BAYESIAN NETWORKS

The complexity of exact inference:


•For multiply connected networks, such as that of Figure 14.12(a), variable elimination can
have exponential time and space complexity in the worst case, even when the number of
parents per node is bounded.
•Inference in Bayesian networks is NP-hard.
•There is a close connection between the complexity of Bayesian network inference and the
complexity of constraint satisfaction problems (CSPs).
•Moreover, the variable elimination algorithm can be generalized to solve CSPs as well as
Bayesian networks.



EXACT INFERENCES IN BAYESIAN NETWORKS

The complexity of exact inference:



EXACT INFERENCES IN BAYESIAN NETWORKS

Summary:

•Inference in Bayesian networks means computing the probability distribution of a set of

query variables, given a set of evidence variables.

•Exact inference algorithms, such as variable elimination, evaluate sums of products of

conditional probabilities as efficiently as possible.

