Unit-4 (AI)
First Order Predicate Logic
* In propositional logic (PL) we have seen how to represent statements, but PL can only represent facts that are either true or false.
* PL is not sufficient to represent complex sentences or natural language statements.
* PL has very limited expressive power; some sentences cannot be represented in PL at all, for example:
  "All books are available in the library."
  "Some dogs like biscuits."
* Because PL is not expressive enough, a more powerful logic is required, such as first-order logic (FOL).

First Order Predicate Logic (continued)
* It is an extension of propositional logic.
* FOL is sufficiently expressive to represent natural language statements in a concise way.
* FOL is a powerful language that describes information about objects in an easier way and can also express the relationships between those objects.
* FOL does not only assume that the world contains facts (as PL does), but also assumes objects, functions, and predicates or relations in the world.
* Like a natural language, first-order logic has two main parts:
  * Syntax
  * Semantics

FOL Statement
* First-order logic statements can be divided into two parts:
  * Subject: the main part of the statement.
  * Predicate: a relation, which binds two atoms together in a statement.
* Examples: Ajay is a student. Tommy is a dog. Ravi and Ajay are brothers.

Quantifiers in FOL
* A quantifier is a language element which generates quantification, and quantification specifies the quantity of specimens in the universe of discourse.
* Quantifiers are the symbols that allow us to determine or identify the range and scope of a variable in a logical expression. There are two types of quantifier:
  * Universal quantifier (for all, everyone, everything)
  * Existential quantifier (for some, at least one)

Universal Quantifier
* The universal quantifier is a symbol of logical representation which specifies that the statement within its range is true for everything, or every instance, of a particular thing.
* The universal quantifier is represented by the symbol ∀, which resembles an inverted A.
* Note: with the universal quantifier we use implication "→".
* If x is a variable, then ∀x is read as: for all x, for each x, for every x.

Universal Quantifier examples
* All men drink coffee ({x1, x2, x3, ..., xn} drink coffee):
  ∀x man(x) → drink(x, coffee)
* All birds fly:
  ∀x bird(x) → fly(x)
* Every man respects his parent:
  ∀x man(x) → respects(x, parent)
* Not all students like both Math and Science:
  ¬∀(x) [student(x) → like(x, Math) ∧ like(x, Science)]

Existential Quantifier
* It is the type of quantifier which expresses that the statement within its scope is true for at least one instance of something.
* It is denoted by the logical operator ∃, which resembles a reversed E. When it is used with a predicate variable, it is called an existential quantifier.
* With the existential quantifier we always use AND, the conjunction symbol (∧).
* If x is a variable, then the existential quantifier is written ∃x or ∃(x), and it is read as: there exists an x, for some x, for at least one x.

Existential Quantifier examples
* Some boys are intelligent:
  ∃x: boys(x) ∧ intelligent(x)
* Some boys play cricket:
  ∃x boys(x) ∧ play(x, cricket)
* Only one student failed in Math:
  ∃(x) [student(x) ∧ failed(x, Math) ∧ ∀(y) [¬(x = y) ∧ student(y) → ¬failed(y, Math)]]
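To make the quantifier semantics concrete, here is a minimal Python sketch (not part of the original slides) that evaluates a universally and an existentially quantified statement over a small finite universe of discourse. The domain and the man / drinks_coffee / boy / intelligent predicates are invented for illustration; the point is to show why the universal quantifier pairs naturally with implication and the existential quantifier with conjunction.

# Minimal sketch: evaluating quantified statements over a small, finite
# universe of discourse. The domain and predicates below are hypothetical.

domain = ["Ajay", "Ravi", "Tommy", "Neha"]

man = {"Ajay", "Ravi"}              # man(x)
drinks_coffee = {"Ajay", "Ravi"}    # drink(x, coffee)
boy = {"Ajay", "Tommy"}             # boys(x)
intelligent = {"Ajay"}              # intelligent(x)

# Universal quantifier with implication:  for all x, man(x) -> drink(x, coffee)
all_men_drink_coffee = all((x not in man) or (x in drinks_coffee) for x in domain)

# Existential quantifier with conjunction:  there exists x, boys(x) and intelligent(x)
some_boy_is_intelligent = any((x in boy) and (x in intelligent) for x in domain)

print(all_men_drink_coffee)     # True: every man in the domain drinks coffee
print(some_boy_is_intelligent)  # True: at least one boy is intelligent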
Unification
* Unification is a kind of binding logic between two or more variables.
* In propositional logic it is easy to determine that two literals cannot both be true at the same time. For example, Man(Sunil) and ¬Man(Sunil) is a contradiction, while Man(Sunil) and ¬Man(Arun) is not.
* In predicate logic, this matching process is more complicated, since bindings of variables must be considered. To determine a contradiction in predicate logic we need a matching procedure that compares two literals and discovers whether there exists a set of substitutions that makes them identical.

Unification algorithm
* The goal of unification is to make two expressions look identical by using substitution.
* The meaning of the sentence should not change, but it may be expressed in multiple ways.
* The UNIFY algorithm takes two sentences as input and returns a unifier if one exists:
  * Substitution means replacing one variable with another term.
  * It takes two literals as input and makes them identical using substitution.
  * It returns fail if the expressions do not match with each other.
* UNIFY(p, q) = θ where SUBST(θ, p) = SUBST(θ, q).

Substitution example
* Let's say there are two different expressions, P(x, y) and P(a, f(z)), and we need to make both statements identical to each other by performing a substitution.
* Substitute x with a, and y with f(z), in the first expression; the substitutions are written a/x and f(z)/y.
* With both substitutions the first expression becomes identical to the second, and the substitution set is [a/x, f(z)/y].

Unification example
* Given the predicate Knows(Ram, x): whom does Ram know?
* The UNIFY algorithm will search all the related sentences in the knowledge base which could unify with Knows(Ram, x):
  UNIFY(Knows(Ram, x), Knows(Ram, Shyam)) = {x/Shyam}
  UNIFY(Knows(Ram, x), Knows(y, Aakash)) = {x/Aakash, y/Ram}
  UNIFY(Knows(Ram, x), Knows(x, Raman)) = fails; the unifier is empty
* The last one fails because the same variable x has been used for two different persons; the two sentences happen to use the same variable name.
* Unifications are attempted only with sentences that have some chance of unifying. For example, there is no point in trying to unify Knows(Ram, x) with Brother(Laxman, Ram).

Conditions for unification
* The predicate symbols must be the same; atoms or expressions with different predicate symbols can never be unified, e.g. tryassassinate(Marcus, Caesar) and hate(Marcus, Caesar).
* The number of arguments in both expressions must be identical, e.g. hate(Marcus) and hate(Marcus, Caesar) cannot be unified.
* Unification will fail if two similar variables are present in the same expression.
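The UNIFY procedure described above can be sketched in a few lines of Python. This is a simplified illustration under stated assumptions, not the full textbook algorithm: literals are tuples such as ('Knows', 'Ram', '?x'), variables are strings beginning with '?', and the occurs check is omitted.

def is_variable(t):
    # Variables are written as strings beginning with '?', e.g. '?x'.
    return isinstance(t, str) and t.startswith("?")

def substitute(theta, t):
    # Apply the substitution set theta to a term or compound expression.
    if is_variable(t):
        return substitute(theta, theta[t]) if t in theta else t
    if isinstance(t, tuple):
        return tuple(substitute(theta, a) for a in t)
    return t

def unify(p, q, theta=None):
    # Return a substitution that makes p and q identical, or None on failure.
    if theta is None:
        theta = {}
    p, q = substitute(theta, p), substitute(theta, q)
    if p == q:
        return theta
    if is_variable(p):
        return {**theta, p: q}
    if is_variable(q):
        return {**theta, q: p}
    if isinstance(p, tuple) and isinstance(q, tuple) and len(p) == len(q):
        # The first elements are the predicate symbols, so differing
        # predicates (or differing arities) fall through to failure.
        for a, b in zip(p, q):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None  # different predicate symbols or argument counts -> fail

print(unify(("Knows", "Ram", "?x"), ("Knows", "Ram", "Shyam")))  # {'?x': 'Shyam'}
print(unify(("Knows", "Ram", "?x"), ("Knows", "?y", "Aakash")))  # {'?y': 'Ram', '?x': 'Aakash'}
print(unify(("Knows", "Ram", "?x"), ("Knows", "?x", "Raman")))   # None: same variable reused for two persons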
Lifting
* Lifting is a technique used in AI to generalize knowledge from specific examples.
* It involves creating higher-level rules or concepts from lower-level ones.
* Lifting allows AI systems to reason and make decisions based on abstract knowledge.

Forward Chaining
* Forward chaining is a form of reasoning which starts with the atomic sentences in the knowledge base and applies inference rules (Modus Ponens) in the forward direction to extract more data until a goal is reached.
* Facts are held in a working memory, and condition-action rules represent actions to take when specified facts occur in working memory; firing a rule may add or delete facts from working memory.
* It is a bottom-up approach, as it moves from the bottom to the top.
* It is a process of making a conclusion based on known facts or data, starting from the initial state and reaching the goal state.
* The forward-chaining approach is also called data-driven, as the data determines which rules are selected and used.
* Example:
  A        He exercises regularly.
  A → B    If he is exercising, he is fit.
  B        He is fit.

Backward Chaining
* Backward chaining is a goal-driven method of deriving a particular goal from a given knowledge base and set of inference rules.
* The inference system knows the final decision or goal; it starts from the goal and works backwards to determine what facts must be asserted so that the goal can be achieved.
* Example:
  B        He is fit.
  A → B    If he is exercising, he is fit.
  A        He exercises regularly.
* It is known as a top-down approach.
* Backward chaining is based on the Modus Ponens inference rule.
* The goal is broken into a sub-goal or sub-goals to prove the facts true.
* It is called a goal-driven approach, as a list of goals decides which rules are selected and used.

Forward Chaining vs Backward Chaining
* Forward chaining starts with the atomic sentences in the knowledge base and applies inference rules in the forward direction to extract more data until a goal is reached ("bottom-up"); backward chaining starts with the goal and works backward, chaining through rules to find known facts that support the goal ("top-down").
* Forward chaining:
  * Follows a bottom-up approach.
  * Makes a conclusion based on known data, starting from the initial state and reaching the goal state.
  * Is also called data-driven, as we reach the goal using the available data.
  * Is commonly used in expert systems such as CLIPS, business rule systems, and production rule systems.
* Backward chaining:
  * Follows a top-down approach.
  * Breaks the goal into a sub-goal or sub-goals to prove the facts true.
  * Is called a goal-driven approach, as a list of goals decides which rules are selected and used.
  * Is used in game theory, automated theorem-proving tools, inference engines, proof assistants, and various AI applications.

Example (solved in class)
1. John likes all kinds of food.
2. Apples are food.
3. Chicken is food.
4. Anything anyone eats and is still alive after eating is food.
5. Bill eats peanuts and is still alive.
6. Peter eats everything Bill eats.
Question: prove that John likes peanuts, using forward chaining. A minimal sketch of the forward-chaining loop itself follows below.
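The data-driven loop described in the Forward Chaining section can be sketched as follows. The encoding (propositional-style facts, rules as premise-set/conclusion pairs) and the fitness example values are assumptions made for illustration; a full first-order version would also apply unification when matching premises.

# Minimal forward-chaining sketch over propositional-style facts.
# The facts and rules below are hypothetical and mirror the fitness example.

facts = {"exercises_regularly"}                  # A: he exercises regularly
rules = [({"exercises_regularly"}, "is_fit")]    # A -> B: if he is exercising, he is fit

def forward_chain(facts, rules, goal):
    facts = set(facts)          # working memory
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire a rule when all its premises are in working memory
            # and its conclusion is not yet known.
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
                if conclusion == goal:
                    return True
    return goal in facts

print(forward_chain(facts, rules, "is_fit"))   # True: B (he is fit) is derived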
Uncertainty
* When an agent knows enough facts about its environment, logical plans and actions produce a guaranteed result.
* Unfortunately, agents never have access to the whole truth about their environment, so agents must act under uncertainty.

About Probability
* Probability means possibility.
* It is a branch of mathematics that deals with the occurrence of a random event.
* Probability is a measure of the likelihood of an event occurring.
* Example: when we toss one coin, we get either Head or Tail; only two outcomes are possible.
* But if we toss two coins in the air, there are three possible kinds of outcome: both coins show heads, both show tails, or one shows heads and one shows tails, i.e. (H, H), (H, T), (T, T).
* Probability formula: P(A) = (number of favourable outcomes for A) / (total number of outcomes).

Conditional Probability
* The probability of an event occurring given that another event has already occurred is called a conditional probability.
* Conditional probability formula (A given B, evaluated on the new sample space after B has occurred):
  P(A|B) = P(A ∩ B) / P(B)

Conditional Probability example
* Neha took two tests.
* The probability of her passing both tests is 0.6.
* The probability of her passing the first test is 0.8.
* What is the probability of her passing the second test given that she has passed the first test?
* Solution:
  P(second | first) = P(first and second) / P(first) = 0.6 / 0.8 = 0.75

About Bayes' Theorem
* Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning; it determines the probability of an event with uncertain knowledge.
* Bayes' theorem was named after the British mathematician Thomas Bayes.
* It is a way to calculate the value of P(B|A) with the knowledge of P(A|B).
* Bayes' theorem allows updating the probability prediction of an event by observing new information about the real world.

Bayes' Theorem formula
* Bayes' theorem gives the conditional probability of an event A given that another event B has occurred:
  P(A|B) = P(B|A) P(A) / P(B)

Example of Bayes' Theorem
* From a deck of cards, find the probability that the card picked is a king given that it is a face card, i.e. P(King|Face).
* There are 52 cards in a deck (n = 52), of which 12 are face cards: king, queen and jack in clubs, diamonds, hearts and spades.
* There is a set of face cards with 12 members and a subset of king cards with 4 members, so 4 of the 12 face cards are kings.
* According to Bayes' theorem:
  P(King|Face) = P(Face|King) P(King) / P(Face)
* 4 of the 52 cards are kings: P(King) = 4/52 = 1/13.
* 12 of the 52 cards are face cards: P(Face) = 12/52 = 3/13.
* Probability of being a face card given a king: P(Face|King) = 1. Here we have prior knowledge that if the card is a king it is certainly a face card, so this probability is always 1.
* Putting it all into Bayes' formula:
  P(King|Face) = (1 × 4/52) / (12/52) = 4/12 = 1/3
* Check using the definition of conditional probability: P(King|Face) = P(King ∩ Face) / P(Face) = (4/52) / (12/52) = 4/12 = 1/3.
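The two worked examples above (Neha's tests and the king/face card) reduce to a few divisions. A quick check in Python, using only the numbers given in the slides:

# Conditional probability: P(second | first) = P(first and second) / P(first)
p_first_and_second = 0.6
p_first = 0.8
p_second_given_first = p_first_and_second / p_first
print(p_second_given_first)          # 0.75

# Bayes' theorem on the card example:
# P(King | Face) = P(Face | King) * P(King) / P(Face)
p_king = 4 / 52                      # 4 kings in a 52-card deck
p_face = 12 / 52                     # 12 face cards in the deck
p_face_given_king = 1.0              # every king is a face card
p_king_given_face = p_face_given_king * p_king / p_face
print(p_king_given_face)             # 0.333... = 1/3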
Artificial Neural Network (ANN)
* An Artificial Neural Network (ANN) is inspired by the working of the human brain.
* Just as the human brain has neurons interconnected with one another, an ANN also has neurons, interconnected with one another in the various layers of the network. These neurons are known as nodes.
* Neural networks are a set of algorithms that try to recognize patterns, relationships, and information in data through a process that is inspired by, and works like, the human brain.

Components of ANN
* A simple neural network consists of three components:
  * Input layer
  * Hidden layer (middle layer)
  * Output layer
* Input layer: input nodes receive inputs/information from the outside world.
* Hidden layer: the set of neurons where all the computations are performed on the input data. There can be any number of hidden layers in a neural network; the simplest network has a single hidden layer.
* Output layer: the output/conclusions derived from all the computations performed. There can be a single node or multiple nodes in the output layer. For a binary classification problem there is one output node, but for multi-class classification there can be more than one.

Perceptron
* A perceptron is a simple form of neural network and consists of a single layer where all the mathematical computations are performed.
* (Figure: a single perceptron with inputs x1, x2, x3, connection weights w1, w2, w3, and one output.)

Multilayer Perceptron
* A multilayer perceptron, also known as an artificial neural network, consists of more than one perceptron grouped together to form a multiple-layer neural network, for example:
  * an input layer with 6 input nodes,
  * hidden layer 1 with 4 hidden nodes (4 perceptrons),
  * hidden layer 2 with 4 hidden nodes,
  * an output layer with 1 output node.

Working of the Artificial Neural Network
* (Figure: an artificial neuron with inputs x1, x2, ..., xn, connection weights w1, w2, ..., wn, a bias b, a summer, and a threshold/activation unit producing the output.)
* The computation performed in the hidden layers is done in two steps:
* Step 1: all the inputs are multiplied by their weights, and then a bias is added.
  * A weight is the gradient or coefficient of each variable; it shows the strength of the particular input.
  * The bias is a constant that helps the model fit in the best way possible.
  * z1 = w1*x1 + w2*x2 + ... + wn*xn + b
  * where w1, w2, ..., wn are the weights assigned to the inputs x1, x2, ..., xn, and b is the bias.
* Step 2: an activation function is applied to the linear combination z1.
  * The activation function is a non-linear transformation applied to the value before it is sent to the next layer of neurons.
  * Activation functions may be linear or non-linear.

Hyperbolic Tangent (tanh) Function
* tanh(z) = (e^z − e^(−z)) / (e^z + e^(−z))
* Calculation for z = 1: tanh(1) = (e^1 − e^(−1)) / (e^1 + e^(−1)) ≈ 0.762

Rectified Linear Unit (ReLU) Function
* ReLU(x) = max(0, x)
* Calculation for x = −3: ReLU(−3) = max(0, −3) = 0
* A sketch of a single neuron combining the two computation steps with these activation functions appears at the end of this section.

Advantages of ANN
* They are massively parallel, i.e. they can perform multiple tasks in parallel.
* They are fault-tolerant: like biological neural networks, ANNs can survive and keep functioning even if some of their functional units stop working.
* They are capable of learning and generalizing, i.e. they keep updating their knowledge with time, exposure and experience.
* They support black-box functioning (a system that produces results without the user being able to see or understand how it works).

Disadvantages of ANN
* ANNs need a massive amount of data to be trained.
* The black-box nature of ANNs also turns out to be a disadvantage: the developer cannot figure out how or why the ANN arrived at a certain output.
* ANNs are computationally more expensive than traditional machine learning algorithms.
* ANNs are not suitable for every type of problem, as they are more complicated and their development takes longer.
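To tie together the two computation steps and the activation-function calculations above, here is a minimal single-neuron forward pass in Python; the input values, weights, and bias are made-up numbers used only for illustration.

import math

def tanh(z):
    # Hyperbolic tangent: (e^z - e^-z) / (e^z + e^-z)
    return (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))

def relu(z):
    # Rectified Linear Unit: max(0, z)
    return max(0.0, z)

def neuron(inputs, weights, bias, activation):
    # Step 1: weighted sum  z = w1*x1 + w2*x2 + ... + wn*xn + b
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step 2: activation function applied to z
    return activation(z)

x = [0.5, -1.0, 2.0]   # hypothetical inputs
w = [0.4, 0.3, 0.1]    # hypothetical weights
b = 0.1                # bias

print(neuron(x, w, b, tanh))   # tanh(0.2) ≈ 0.197
print(neuron(x, w, b, relu))   # relu(0.2) = 0.2
print(tanh(1))                 # ≈ 0.762, matching the slide's tanh calculation
print(relu(-3))                # 0, matching the slide's ReLU calculation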