5 - Uncertainty and Knowledge Reasoning
CSE3013 - "Artificial Intelligence"
Dr. Pradeep K V, Assistant Professor (Sr.)
School of Computer Science and Engineering, VIT - Chennai

Introduction

Uncertainty is a state of doubt about the future, or about what is the right thing to do.

Examples:
- A doctor does not know exactly what is going on inside a patient.
- A teacher does not know exactly what a student understands.
- A robot does not know what is in a room it left a few minutes ago.
- The period of a politician in office.
- The way of life in a coastal region.

We used FOL and propositional logic to represent knowledge with confidence (certainty), so we were assured of the predicates.

Example: Given A -> B, if A is true then B must be true. But consider a situation where we are not sure whether A is true or not; then we cannot express this statement. This situation is called uncertainty.

So, to represent uncertain knowledge, where we are not sure about the predicates, we need uncertain reasoning or probabilistic reasoning.

Causes of uncertainty:
- Information obtained from unreliable sources.
- Experimental errors or equipment faults.
- Temperature variation or climate change.

Probabilistic Reasoning

- It is a way of representing knowledge in which we apply the concept of probability to indicate the uncertainty in knowledge.
- Here, we combine probability theory with logic to handle uncertainty.

In the real world there are many scenarios where the certainty of something is not confirmed, such as:
- "It will rain today."
- "The behavior of someone in some situations."
- "A match between two teams or two players."
These are probable sentences, for which we can assume that they will happen but cannot be sure; here we use probabilistic reasoning.

Need of probabilistic reasoning in AI:
- When there are unpredictable outcomes.
- When the specifications or possibilities of predicates become too large to handle.
- When an unknown error occurs during an experiment.

In probabilistic reasoning, there are two ways to solve problems with uncertain knowledge:
- Bayes' rule
- Bayesian statistics

Probability is the chance that an uncertain event will occur: a numerical measure, between 0 and 1, of the likelihood that the event happens.
- 0 <= P(A) <= 1, where P(A) is the probability of an event A.
- P(A) = 0 indicates that event A is impossible.
- P(A) = 1 indicates total certainty of event A.
- P(¬A) = 1 - P(A), the probability of the complementary event.

We can find the probability of an uncertain event using the formula:

    Probability of occurrence = Number of desired outcomes / Total number of outcomes

Basic terms:
- Event: each possible outcome of a variable is called an event.
- Sample space: the collection of all possible events is called the sample space.
- Random variables: used to represent events and objects in the real world.
- Prior probability: the probability of an event computed before observing new information.
- Posterior probability: the probability calculated after all evidence or information has been taken into account.
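The basic formula and the complement rule above can be sketched in a few lines of Python (the die example is illustrative, not from the slides):

```python
# Probability of an event as desired outcomes / total outcomes.
def probability(desired_outcomes, total_outcomes):
    if total_outcomes <= 0:
        raise ValueError("total_outcomes must be positive")
    p = desired_outcomes / total_outcomes
    assert 0 <= p <= 1  # axiom: 0 <= P(A) <= 1
    return p

# Illustrative example: rolling an even number on a fair six-sided die.
p_even = probability(3, 6)   # desired outcomes {2, 4, 6} out of {1..6}
p_not_even = 1 - p_even      # complement rule: P(not A) = 1 - P(A)

print(p_even, p_not_even)    # 0.5 0.5
```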
It is a combination of the prior probability and new information.

Conditional Probability

Conditional probability is the probability of an event occurring when another event has already happened.

Suppose we want to calculate the probability of event A when event B has already occurred — "the probability of A under the condition B". It is written as:

    P(A|B) = P(A ∧ B) / P(B)

where
- P(A ∧ B) is the joint probability of A and B, and
- P(B) is the marginal probability of B.

If the probability of A is given and we need the probability of B, then:

    P(B|A) = P(A ∧ B) / P(A)

This can be explained using a Venn diagram: once B has occurred, the sample space is reduced to the set B, and we can only calculate event A given that B has occurred, by dividing P(A ∧ B) by P(B).

Example: In a class, 70% of the students like English and 40% of the students like both English and Mathematics. What percentage of the students who like English also like Mathematics?

Solution: Let A be the event that a student likes Mathematics and B the event that a student likes English.

    P(A|B) = P(A ∧ B) / P(B) = 0.4 / 0.7 ≈ 57%

Hence, about 57% of the students who like English also like Mathematics.

Bayes' Theorem

It is also known as Bayes' rule, Bayes' law, or Bayesian reasoning; it determines the probability of an event under uncertain knowledge. It relates the conditional and marginal probabilities of two random events, and was named after the British mathematician Thomas Bayes. Bayesian inference is an application of Bayes' theorem, which is fundamental to Bayesian statistics.

It is a way to calculate the value of P(B|A) with the knowledge of P(A|B). It allows updating the probability prediction of an event by observing new information from the real world.
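The English/Mathematics example can be checked directly from the definition P(A|B) = P(A ∧ B) / P(B); a minimal sketch:

```python
def conditional(p_a_and_b, p_b):
    """P(A|B) = P(A and B) / P(B), defined only when P(B) > 0."""
    if p_b <= 0:
        raise ValueError("P(B) must be positive")
    return p_a_and_b / p_b

# 70% of students like English; 40% like both English and Mathematics.
p_english = 0.7
p_both = 0.4
p_math_given_english = conditional(p_both, p_english)
print(round(p_math_given_english, 4))  # 0.5714, i.e. about 57%
```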
Example: If cancer corresponds to one's age, then using Bayes' theorem we can determine the probability of cancer more accurately with the help of age.

Bayes' theorem can be derived using the product rule and the conditional probability of event A with known event B. From the product rule we can write:

    P(A ∧ B) = P(A|B) P(B)            (1)

Similarly, for event B with known event A:

    P(A ∧ B) = P(B|A) P(A)            (2)

Equating the right-hand sides of both equations, we get:

    P(A|B) = P(B|A) P(A) / P(B)       (3)

Equation (3) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.

It shows the simple relationship between joint and conditional probabilities. Here:
- P(A|B) is the posterior, which we need to calculate; it reads as the probability of hypothesis A given that we have observed evidence B.
- P(B|A) is the likelihood: assuming the hypothesis is true, the probability of the evidence.
- P(A) is the prior probability: the probability of the hypothesis before considering the evidence.
- P(B) is the marginal probability: the probability of the evidence alone.

In general we can write P(B) = Σᵢ P(Aᵢ) P(B|Aᵢ); hence Bayes' rule can be written as:

    P(Aᵢ|B) = P(Aᵢ) P(B|Aᵢ) / Σₖ P(Aₖ) P(B|Aₖ)    (4)

where A₁, A₂, ..., Aₙ is a set of mutually exclusive and exhaustive events.

Applying Bayes' rule:
- Bayes' rule allows us to compute the single term P(B|A) in terms of P(A|B), P(B), and P(A).
- This is very useful when we have good estimates of three of these terms and want to determine the fourth one.
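Equation (4) can be implemented directly: given priors P(Aᵢ) over mutually exclusive, exhaustive hypotheses and likelihoods P(B|Aᵢ), the marginal P(B) is the normalizing sum. The numbers below are illustrative, not from the slides:

```python
def bayes_posteriors(priors, likelihoods):
    """Posterior P(Ai|B) for mutually exclusive, exhaustive hypotheses Ai.

    priors:      list of P(Ai), must sum to 1
    likelihoods: list of P(B|Ai), same length
    """
    assert abs(sum(priors) - 1.0) < 1e-9, "priors must sum to 1"
    marginal = sum(p * l for p, l in zip(priors, likelihoods))  # P(B)
    return [p * l / marginal for p, l in zip(priors, likelihoods)]

# Illustrative: two hypotheses with priors 0.25 / 0.75 and
# likelihoods 0.8 / 0.4 of the evidence under each hypothesis.
post = bayes_posteriors([0.25, 0.75], [0.8, 0.4])
print(post)  # posteriors sum to 1 by construction
```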
Suppose we want to perceive the effect of some unknown cause and compute that cause; then Bayes' rule becomes:

    P(cause|effect) = P(effect|cause) P(cause) / P(effect)

Example-1: What is the probability that a patient has meningitis, given a stiff neck?

Given data: A doctor is aware that meningitis causes a patient to have a stiff neck 80% of the time. He is also aware of some more facts:
- The known probability that a patient has meningitis is 1/30,000.
- The known probability that a patient has a stiff neck is 2%.

Let a be the proposition that the patient has a stiff neck and b the proposition that the patient has meningitis. Then:
- P(a|b) = 0.8
- P(b) = 1/30000
- P(a) = 0.02

    P(b|a) = P(a|b) P(b) / P(a) = (0.8 × (1/30000)) / 0.02 ≈ 0.00133

Hence, we can assume that about 1 patient out of 750 with a stiff neck has meningitis.

Example-2: From a standard deck of playing cards, a single card is drawn. The probability that the card is a king is 4/52. Calculate the posterior probability P(King|Face), i.e., the probability that the drawn card is a king given that it is a face card.

Solution:

    P(King|Face) = P(Face|King) P(King) / P(Face)

- P(King): probability that the card is a king = 4/52 = 1/13.
- P(Face): probability that the card is a face card = 12/52 = 3/13.
- P(Face|King): probability that the card is a face card given that it is a king = 1.

    P(King|Face) = (1 × 1/13) / (3/13) = 1/3

So 1/3 is the probability that the drawn face card is a king.

Applications of Bayes' theorem:
- It is used to calculate the next step of a robot when the already executed step is given.
- Bayes' theorem is helpful in weather forecasting.
- It can solve the Monty Hall problem.

Bayesian Belief Network

It is a key computer technology for dealing with probabilistic events and for solving problems that involve uncertainty.
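The two Bayes'-rule examples above (the meningitis diagnosis and the face-card posterior) can be verified with a short helper:

```python
def bayes(p_b_given_a, p_a, p_b):
    """P(A|B) = P(B|A) * P(A) / P(B)  -- Bayes' rule, equation (3)."""
    return p_b_given_a * p_a / p_b

# Meningitis example: P(meningitis | stiff neck)
p_meningitis = bayes(0.8, 1 / 30000, 0.02)
print(p_meningitis)          # ~0.001333, i.e. about 1 in 750

# Face-card example: P(King | Face)
p_king_face = bayes(1.0, 4 / 52, 12 / 52)
print(p_king_face)           # ~0.3333, i.e. 1/3
```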
"It is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."

It is also called a Bayes network, belief network, decision network, or Bayesian model.

Bayesian networks are probabilistic because they are built from a probability distribution, and they use probability theory for prediction and anomaly detection. They can be used for building models from data and expert opinions, and they consist of two parts:
- A directed acyclic graph (DAG).
- A table of conditional probabilities.

The generalized form of a Bayesian network that represents and solves decision problems under uncertain knowledge is known as an influence diagram.

Bayesian Network Graph

It is made up of nodes and arcs (a directed acyclic graph, or DAG), where:
- Each node corresponds to a random variable; a variable can be continuous or discrete.
- Arcs (directed arrows) represent causal relationships or conditional probabilities between random variables. These directed links connect pairs of nodes in the graph; a link means that one node directly influences the other, and if there is no directed link, the nodes are independent of each other.
- In the diagram with nodes A, B, C, and D: if node B is connected to node A by a directed arrow from A to B, then node A is called the parent of node B. Node C is independent of node A.

Components of a Bayesian Network

The Bayesian network has two main components:
- The causal component (the graph structure).
- The actual numbers (the conditional probabilities).

Each node Xᵢ in the Bayesian network has a conditional probability distribution P(Xᵢ | Parents(Xᵢ)), which determines the effect of the parents on that node. It is based on the joint probability distribution and conditional probability.
Joint Probability Distribution: The probabilities of the different combinations of the variables x₁, x₂, x₃, ..., xₙ are known as the joint probability distribution. By the chain rule, it can be written as:

    P[x1, x2, ..., xn] = P[x1 | x2, ..., xn] P[x2 | x3, ..., xn] ... P[xn-1 | xn] P[xn]

In general, for every variable Xᵢ in a Bayesian network we can write:

    P(Xi | Xi-1, ..., X1) = P(Xi | Parents(Xi))

Explanation of a Bayesian Network

Let's understand the Bayesian network through an example by creating a directed acyclic graph:

Harry installed a new burglar alarm at his home to detect burglary. The alarm reliably responds to a burglary, but also responds to minor earthquakes. Harry has two neighbors, David and Sophia, who have taken responsibility for informing Harry at work when they hear the alarm. David always calls Harry when he hears the alarm, but sometimes he gets confused with the phone ringing and calls then too. On the other hand, Sophia likes to listen to loud music, so sometimes she misses the alarm. Here we would like to compute the probability of the burglary alarm.

Problem: Calculate the probability that the alarm has sounded, but there is neither a burglary nor an earthquake, and both David and Sophia called Harry.

Solution:
- The network structure shows that Burglary and Earthquake are the parent nodes of Alarm, directly affecting the probability of the alarm going off, while David's and Sophia's calls depend on the alarm probability.
- The network also represents our assumptions: the neighbors do not directly perceive the burglary, do not notice the minor earthquake, and do not confer before calling.
- The conditional distribution for each node is given as a conditional probabilities table, or CPT.
- Each row in the CPT must sum to 1, because the entries in a row represent an exhaustive set of cases for the variable.
- A boolean variable with k boolean parents has a CPT with 2^k rows.
Hence, if there are two parents, the CPT will contain 4 rows of probability values.

List of events:
- Burglary (B)
- Earthquake (E)
- Alarm (A)
- David calls (D)
- Sophia calls (S)

We can write the events of the problem statement as the joint probability P[D, S, A, B, E]. Using the chain rule and the conditional independences encoded in the network, we can rewrite it as:

    P[D, S, A, B, E] = P[D | S, A, B, E] . P[S, A, B, E]
                     = P[D | S, A, B, E] . P[S | A, B, E] . P[A, B, E]
                     = P[D | A] . P[S | A] . P[A | B, E] . P[B, E]
                     = P[D | A] . P[S | A] . P[A | B, E] . P[B | E] . P[E]

Figure: Bayesian network for the burglary-alarm problem (Burglary and Earthquake point to Alarm; Alarm points to David Calls and Sophia Calls), with a CPT at each node.

Let's take the observed probabilities for the Burglary and Earthquake components:
- P(B=True) = 0.002, the probability of a burglary.
- P(B=False) = 0.998, the probability of no burglary.
- P(E=True) = 0.001, the probability of a minor earthquake.
- P(E=False) = 0.999, the probability that no earthquake occurred.

Table: CPT for Alarm (A depends on B and E)

    B     | E     | P(A=True) | P(A=False)
    True  | True  | 0.94      | 0.06
    True  | False | 0.95      | 0.05
    False | True  | 0.31      | 0.69
    False | False | 0.001     | 0.999

Table: CPT for David Calls (depends on A)

    A     | P(D=True) | P(D=False)
    True  | 0.91      | 0.09
    False | 0.05      | 0.95

Table: CPT for Sophia Calls (depends on A)

    A     | P(S=True) | P(S=False)
    True  | 0.75      | 0.25
    False | 0.02      | 0.98

From the formula of the joint distribution, we can write the problem statement in the form of a probability distribution:

    P(S, D, A, ¬B, ¬E) = P(S|A) * P(D|A) * P(A | ¬B ∧ ¬E) * P(¬B) * P(¬E)
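The factored joint distribution can be evaluated directly from the CPTs; a minimal sketch for the query P(S, D, A, ¬B, ¬E), using the table values above:

```python
# CPT values from the tables (True entries only; False = 1 - True).
P_B = {True: 0.002, False: 0.998}
P_E = {True: 0.001, False: 0.999}
P_A = {(True, True): 0.94, (True, False): 0.95,
       (False, True): 0.31, (False, False): 0.001}   # P(A=True | B, E)
P_D = {True: 0.91, False: 0.05}                      # P(D=True | A)
P_S = {True: 0.75, False: 0.02}                      # P(S=True | A)

def joint(d, s, a, b, e):
    """P(D=d, S=s, A=a, B=b, E=e) = P(d|a) P(s|a) P(a|b,e) P(b) P(e)."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pd = P_D[a] if d else 1 - P_D[a]
    ps = P_S[a] if s else 1 - P_S[a]
    return pd * ps * pa * P_B[b] * P_E[e]

# Alarm sounded, no burglary, no earthquake, both neighbors called:
p_query = joint(d=True, s=True, a=True, b=False, e=False)
print(p_query)  # ~0.00068045
```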
    P(S, D, A, ¬B, ¬E) = 0.75 × 0.91 × 0.001 × 0.998 × 0.999
                       ≈ 0.00068045

Hence, a Bayesian network can answer any query about the domain by using the joint distribution.

The semantics of a Bayesian network can be understood in two ways:
- As a representation of the joint probability distribution (helpful for understanding how to construct the network).
- As an encoding of a collection of conditional independence statements (helpful for designing inference procedures).

Problems to Solve...!

Problem-1: Three people A, B, and C have applied for a job at a private company. The likelihoods of their selection are in the ratio 1:2:4. The chances that A, B, and C can implement changes to increase the company's profitability are, respectively, 0.8, 0.5, and 0.3. If the change does not occur, determine the likelihood that C's selection is responsible.

Problem-2: Four balls are in a bag. Without replacement, two balls are picked at random, and both turn out to be blue. How likely is it that every ball in the bag is blue?

Problem-3: 90% of the children in a community were sick with the flu, 10% with measles, and no other illnesses were present. For measles, the likelihood of seeing rashes is 0.95, while for the flu it is 0.08. Determine the likelihood that a child has the flu, given that they develop rashes.
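As a worked sketch, Problem-3 is a direct application of Bayes' rule with two mutually exclusive causes (flu and measles); the computation below uses only the numbers stated in the problem:

```python
# Problem-3: P(flu) = 0.9, P(measles) = 0.1,
# P(rash | measles) = 0.95, P(rash | flu) = 0.08.
p_flu, p_measles = 0.9, 0.1
p_rash_flu, p_rash_measles = 0.08, 0.95

p_rash = p_rash_flu * p_flu + p_rash_measles * p_measles  # total probability
p_flu_given_rash = p_rash_flu * p_flu / p_rash            # Bayes' rule

print(round(p_flu_given_rash, 4))  # 0.4311
```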
Problem-4: There are three similar cards, except that the first card is red on both sides, the second is blue on both sides, and the third has one red side and one blue side. One of these three cards is picked at random and placed on the table with a red side visible. How likely is it that the other side is blue?

Problem-5: There are three urns with white and black balls: the first contains three white and two black balls, the second two white and three black balls, and the third four white and one black ball. One urn is selected at random, without any bias, and one white ball is drawn from it. What is the likelihood that it came from each of the three urns?

Decision Networks

A decision network (also called an influence diagram) is a graphical representation of a finite sequential decision problem. It extends belief networks to include decision variables and utility. It extends the single-stage decision network to allow for sequential decisions, and allows both chance nodes and decision nodes to be parents of decision nodes.

It is a directed acyclic graph (DAG) with chance nodes (drawn as ovals), decision nodes (drawn as rectangles), and a utility node (drawn as a diamond). The meaning of the arcs is:
- Arcs coming into decision nodes represent the information that will be available when the decision is made.
- Arcs coming into chance nodes represent probabilistic dependence.
- Arcs coming into the utility node represent what the utility depends on.

Example-1: A simple decision network for the decision of whether the agent should take an umbrella when it goes out. The agent's utility depends on the weather and on whether it takes the umbrella. The agent does not get to observe the weather; it only observes the forecast, which probabilistically depends on the weather.

Figure: Decision network for the decision of whether to take an umbrella.
As part of this network, the designer must specify the domain of each random variable and the domain of each decision variable.
- Random variables:
  - Weather: {norain, rain}
  - Forecast: {sunny, cloudy, rainy}
- Decision variable:
  - Umbrella: {take_it, leave_it}

There is no domain associated with the utility node. The designer also must specify the probability of the random variables given their parents. Suppose P(Weather) is defined by P(Weather=rain) = 0.3, and P(Forecast | Weather) is given by:

    Weather | Forecast | P
    norain  | sunny    | 0.7
    norain  | cloudy   | 0.2
    norain  | rainy    | 0.1
    rain    | sunny    | 0.15
    rain    | cloudy   | 0.25
    rain    | rainy    | 0.6

And suppose the utility function u(Weather, Umbrella) is:

    Weather | Umbrella | Utility
    norain  | take_it  | 20
    norain  | leave_it | 100
    rain    | take_it  | 70
    rain    | leave_it | 0

There is no table specified for the Umbrella decision variable. It is the task of the planner to determine which value of Umbrella to select, as a function of the forecast.

Example-2: Consider a simple case of diagnosis where a doctor first chooses some tests and then treats the patient, taking into account the outcome of the tests. The reason the doctor may decide to do a test is so that some information (the test results) will be available at the next stage, when treatment may be performed. The test results are information available when the treatment is decided, but not when the test is decided. It is often a good idea to test, even if testing itself can harm the patient.

Figure: Decision network for diagnosis (Test -> Treatment).

Example-3: A decision network for an alarm. The agent can receive a report of people leaving a building and has to decide whether or not to call the fire department. Before calling, the agent can check for smoke, but this has some cost associated with it. The utility depends on whether it calls, whether there is a fire, and the cost associated with checking for smoke.

Figure: Decision network for the alarm problem.
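The umbrella network can be solved by computing, for each forecast, the posterior over Weather (by Bayes' rule) and then the expected utility of each decision; a sketch using the numbers from the tables above:

```python
# Decision-network numbers from the slides.
P_rain = 0.3
P_forecast = {  # P(Forecast | Weather)
    "norain": {"sunny": 0.7, "cloudy": 0.2, "rainy": 0.1},
    "rain":   {"sunny": 0.15, "cloudy": 0.25, "rainy": 0.6},
}
utility = {  # u(Weather, Umbrella)
    ("norain", "take_it"): 20, ("norain", "leave_it"): 100,
    ("rain", "take_it"): 70,   ("rain", "leave_it"): 0,
}

def best_decision(forecast):
    """Pick the umbrella decision that maximizes expected utility given the forecast."""
    prior = {"rain": P_rain, "norain": 1 - P_rain}
    # Posterior P(Weather | Forecast) by Bayes' rule.
    evidence = sum(prior[w] * P_forecast[w][forecast] for w in prior)
    post = {w: prior[w] * P_forecast[w][forecast] / evidence for w in prior}
    # Expected utility of each decision under the posterior.
    eu = {d: sum(post[w] * utility[(w, d)] for w in post)
          for d in ("take_it", "leave_it")}
    return max(eu, key=eu.get), eu

for f in ("sunny", "cloudy", "rainy"):
    decision, eu = best_decision(f)
    print(f, decision, {d: round(v, 1) for d, v in eu.items()})
# The resulting policy takes the umbrella only when the forecast is rainy.
```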