AI_Unit - 4

The document covers the concepts of First Order Logic (FOL) in artificial intelligence, including its components, inference rules, and the differences between propositional and first-order inference. It explains key processes such as unification and the functioning of inference engines, detailing forward and backward chaining methods. Additionally, it provides examples and applications of these concepts in various AI domains, including natural language processing and expert systems.

B.Tech - II Year – IV Semester

Artificial Intelligence
AI&ML – BTMT2502 /
AI& DS – BTAT2502

Prepared by:
Prof. D. Jagadeesan, B.E (CSE), M.Tech (CSE), Ph.D (CSE),
MISTE, MIEEE,MCSI

Unit 4
Logic & Knowledge Representation: First order logic, Inference in
first order logic, Propositional vs. first order inference, Unification and
lifting, Forward chaining, Backward chaining, Resolution, Learning from
observation, Explanation-based learning, Statistical learning methods,
and Reinforcement learning.
First order logic
• First Order Logic (FOL), also known as Predicate Logic, is a formal system used in artificial intelligence,
mathematics and logic to represent relationships between objects and make inferences. It extends
Propositional Logic by incorporating quantifiers, predicates and variables, making it more
expressive.
• Components of Predicate Logic
1.Constants (Objects): Represent specific entities in the domain. Example: Alice, 5, Paris
2.Variables: Represent unspecified elements in the domain. Example: x, y, z
3.Predicates (Relations/Properties): Functions that define a property or a relation between
objects. Example: Loves(Alice, Bob) (Alice loves Bob)
4.Functions: Define deterministic mappings from objects to objects. Example: Father(John) = Robert
(John’s father is Robert)
5.Logical Connectives: ¬ (Negation), ∧ (AND), ∨ (OR), → (Implication), ↔ (Biconditional)
6.Quantifiers:
1. Universal Quantifier (∀): "For all" – expresses generality. Example: ∀x Loves(x, Chocolate)
(Everyone loves chocolate)
2. Existential Quantifier (∃): "There exists" – expresses existence. Example: ∃x Loves(x, Pizza)
(At least one person loves pizza)
First Order Logic
• Examples of Predicate Logic Sentences
1. Basic Statements
• "John is a student." → Student(John)
• "Mary is a doctor." → Doctor(Mary)
2. Relationships Between Objects
• "Alice loves Bob." → Loves(Alice, Bob)
• "Tom is taller than Jerry." → Taller(Tom, Jerry)
• "Paris is the capital of France." → Capital(Paris, France)
3. Universal Quantifier (∀) – "For all"
• "All humans are mortal." → ∀x (Human(x) → Mortal(x))
• "Every student studies." → ∀x (Student(x) → Studies(x))
• "If a person is a teacher, they teach." → ∀x (Teacher(x) → Teaches(x))
4. Existential Quantifier (∃) – "There exists"
• "Some people like ice cream." → ∃x (Person(x) ∧ Likes(x, IceCream))
• "There exists a person who is a millionaire." → ∃x (Person(x) ∧ Millionaire(x))
• "Some dogs are friendly." → ∃x (Dog(x) ∧ Friendly(x))
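These translations can be mirrored in code. The sketch below uses our own illustrative encoding (not a standard library): FOL sentences are represented as nested Python tuples, with a predicate as `(name, args...)` and quantifiers wrapping a variable name and a body formula.

```python
# Illustrative encoding of FOL sentences as nested Python tuples.
# Each predicate is a tuple: (name, arg1, arg2, ...).
fact = ("Loves", "Alice", "Bob")            # "Alice loves Bob."

# Quantifiers wrap a variable name and a body formula.
all_humans_mortal = ("forall", "x",
                     ("implies", ("Human", "x"), ("Mortal", "x")))
some_dogs_friendly = ("exists", "x",
                      ("and", ("Dog", "x"), ("Friendly", "x")))

print(fact[0], fact[1:])  # Loves ('Alice', 'Bob')
```

This tuple form is convenient because substitution and unification (used later in this unit) become simple recursive walks over the tuples.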
Inference in first order logic
• Inference in First-Order Logic is used to deduce new facts or sentences from
existing sentences. Before understanding the FOL inference rule, let's understand
some basic terminologies used in FOL.
• Substitution:
• Substitution is a fundamental operation performed on terms and formulas; it occurs in all
inference systems in first-order logic. Substitution becomes more involved in the presence of
quantifiers in FOL. The notation F[a/x] means the result of substituting the constant "a" for the
variable "x" in F.
• FOL inference rules for quantifier:
• As in propositional logic, we also have inference rules in first-order logic. The following
are the basic inference rules for quantifiers in FOL:
• Universal Generalization -
• Universal Instantiation
• Existential Instantiation
• Existential introduction
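To make the F[a/x] notation concrete, here is a small sketch (helper names are ours, built on a tuple encoding of formulas) that applies a substitution and uses it to perform Universal Instantiation:

```python
# Sketch (hypothetical helper names): F[a/x] as a substitution over
# nested-tuple formulas, then Universal Instantiation built on top of it.

def subst(term, binding):
    """Replace variables in a nested-tuple formula per {var: value}."""
    if isinstance(term, tuple):
        return tuple(subst(t, binding) for t in term)
    return binding.get(term, term)

# forall x (Human(x) -> Mortal(x))
sentence = ("forall", "x", ("implies", ("Human", "x"), ("Mortal", "x")))

def universal_instantiation(forall_sentence, constant):
    """From forall x P(x), infer P(c) for a chosen constant c."""
    _, var, body = forall_sentence
    return subst(body, {var: constant})

print(universal_instantiation(sentence, "Socrates"))
# ('implies', ('Human', 'Socrates'), ('Mortal', 'Socrates'))
```

Existential Instantiation would work the same way, except the constant must be a fresh symbol not used elsewhere in the knowledge base.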
Inference in first order logic
• Universal Generalization
• Universal generalization is a valid inference rule which states that if premise P(c) is true for any arbitrary
element c in the universe of discourse, then we can conclude ∀x P(x).
• It can be represented as: P(c) ⟹ ∀x P(x), for arbitrary c
• Universal Instantiation
• Universal instantiation, also called universal elimination or UI, is a valid inference rule: from ∀x P(x)
we can infer P(c) for any constant c in the domain. It can be applied multiple times to add new sentences.
• It can be represented as: ∀x P(x) ⟹ P(c)
• Existential Instantiation
• Existential instantiation, also called existential elimination, is a valid inference rule in first-order
logic: from ∃x P(x) we can infer P(c), where c is a new constant symbol that does not appear elsewhere
in the knowledge base.
• It can be represented as: ∃x P(x) ⟹ P(c), for a new constant c
• Existential Introduction
• Existential introduction, also known as existential generalization, is a valid inference rule
in first-order logic: from P(c) we can infer ∃x P(x).
• It can be represented as: P(c) ⟹ ∃x P(x)
Propositional vs. First Order Inference

| Feature | Propositional Logic | First-Order Logic |
|---|---|---|
| Basic Unit | Propositions | Predicates, constants, variables |
| Expressiveness | Limited to true/false statements | Expressive; can represent relationships and properties |
| Quantifiers | None | Universal (∀) and Existential (∃) |
| Syntax | Combines propositions using logical connectives | Uses predicates and quantifiers |
| Semantics | Truth tables | Interpretation over a domain |
| Use Cases | Simple problems (e.g., circuit design, rule-based systems) | Complex problems (e.g., AI reasoning, ontology modeling) |
| Example | P → Q | ∀x ∃y Likes(x, y) |


Unification
• Unification is a fundamental process in artificial intelligence (AI) and symbolic
reasoning that involves finding a common solution or "unified" form for
expressions containing variables.

• It is the process of making different expressions or terms identical by
assigning values to variables in a way that allows them to match, or unify.

• Unification is a key component of all first-order inference algorithms.

UNIFY(p, q) = θ, where SUBST(θ, p) = SUBST(θ, q)
Here θ is the unifier (a substitution).
Unification
• Following are some basic conditions for unification:
• The predicate symbol must be the same; atoms or expressions with different
predicate symbols can never be unified.
• The number of arguments in both expressions must be identical.
• Unification will fail if the same variable occurs in both expressions with
conflicting bindings.
• Example
• UNIFY(Knows(John, x), Knows(John, kumar))
SUBST θ = {x/kumar}
(Knows(John, kumar), Knows(John, kumar))
Successfully unified.
Unification
• Example
• UNIFY(Knows(John, x), Knows(y, kumar))
SUBST θ = {x/kumar}
S1 = (Knows(John, kumar), Knows(y, kumar))
SUBST θ = {y/John}
S2 = (Knows(John, kumar), Knows(John, kumar))
Unification SUBST θ = {x/kumar, y/John}
Successfully unified.

• UNIFY(Knows(John, x), Knows(y, mother(y)))
SUBST θ = {x/mother(y)}
S1 = (Knows(John, mother(y)), Knows(y, mother(y)))
SUBST θ = {y/John}
S2 = (Knows(John, mother(John)), Knows(John, mother(John)))
Unification SUBST θ = {x/mother(John), y/John}
Successfully unified.

• UNIFY(Knows(John, x), Knows(x, kumar))
Unification fails because the same variable x appears in both expressions:
x would have to bind to both John and kumar at once.
Applications of Unification
Natural Language Processing (NLP):
• Unification is used in NLP for various tasks, such as parsing and semantic
analysis.
• In parsing, unification helps identify the relationships between words in
a sentence, allowing the system to build syntactic and semantic structures.
• Unification is also essential for handling ambiguous language constructs
and resolving pronoun references.
• For example, unification can help determine that "he" refers to a specific
person or entity mentioned earlier in a text.
Logic Programming:
• Unification is a cornerstone of logic programming languages like Prolog.
In logic programming, unification is used to match query predicates with
database predicates.
Symbolic Reasoning:
• In symbolic reasoning and theorem proving, unification is employed to
determine whether two logical expressions are equivalent, or whether one can
be transformed into the other by substituting values for variables.
Expert Systems:
• Unification is used in expert systems to match user queries with the
knowledge stored in the system's database. It helps determine which rules or
pieces of information are relevant to a specific problem or query,
facilitating the expert system's decision-making process.
Example of unification
Find the MGU of {Q(b, X, f(g(z))), Q(z, f(y), f(y))}
Here, Ψ1 = Q(b, X, f(g(z))) and Ψ2 = Q(z, f(y), f(y))
S0 => {Q(b, X, f(g(z))); Q(z, f(y), f(y))}
SUBST θ = {z/b}
S1 => {Q(b, X, f(g(b))); Q(b, f(y), f(y))}
SUBST θ = {X/f(y)}
S2 => {Q(b, f(y), f(g(b))); Q(b, f(y), f(y))}
SUBST θ = {y/g(b)}
S3 => {Q(b, f(g(b)), f(g(b))); Q(b, f(g(b)), f(g(b)))}, Successfully unified.
Example of unification
Find the MGU of {Q(a, g(x, a), f(y)), Q(a, g(f(b), a), x)}
Here, Ψ1 = Q(a, g(x, a), f(y))
and Ψ2 = Q(a, g(f(b), a), x)

S0 => {Q(a, g(x, a), f(y)); Q(a, g(f(b), a), x)}

SUBST θ = {x/f(b)}

S1 => {Q(a, g(f(b), a), f(y)); Q(a, g(f(b), a), f(b))}

SUBST θ = {y/b}

S2 => {Q(a, g(f(b), a), f(b)); Q(a, g(f(b), a), f(b))}, Successfully unified.
Inference engine
• The inference engine is the component of an intelligent system in artificial
intelligence that applies logical rules to the knowledge base to infer new
information from known facts.
• The first inference engines were components of expert systems.
• Inference engine commonly proceeds in two modes, which are:
1.Forward chaining
2.Backward chaining
Forward chaining
• Forward chaining is also known as forward deduction or forward reasoning when using
an inference engine.
• Forward chaining is a form of reasoning which starts with atomic sentences in the knowledge base
and applies inference rules in the forward direction to extract more data until a goal is reached.
• The forward-chaining algorithm starts from known facts, triggers all rules whose premises are
satisfied, and adds their conclusions to the known facts. This process repeats until the problem is
solved.
• Properties of Forward-Chaining:
• It is a bottom-up approach.
• It is a process of making a conclusion based on known facts or data, starting from the
initial state and reaching the goal state.
• The forward-chaining approach is also called data-driven, as we reach the goal using the
available data.
• The forward-chaining approach is commonly used in expert systems (such as CLIPS),
business rule systems, and production rule systems.
Forward chaining
• Example:
• Facts: It is raining
• Rule: If it is raining, the street is wet
• Conclusion: The Street is wet

• From A and A → B, conclude B (modus ponens)

Facts → Rules → Conclusion


Forward chaining
• Example:
• "As per the law, it is a crime for an American to sell weapons to hostile
nations. Country A, an enemy of America, has some missiles, and all the
missiles were sold to it by Robert, who is an American citizen."
• Prove that "Robert is criminal."
Forward chaining
• Facts Conversion into FOL:
• It is a crime for an American to sell weapons to hostile nations. (Let p, q, and r be variables)
American(p) ˄ Weapon(q) ˄ Sells(p, q, r) ˄ Hostile(r) → Criminal(p) ...(1)
• Country A has some missiles: ∃p Owns(A, p) ˄ Missile(p). This can be written as two definite clauses by using
Existential Instantiation, introducing a new constant T1.
Owns(A, T1) ......(2)
Missile(T1) .......(3)
• All of the missiles were sold to country A by Robert.
∀p Missile(p) ˄ Owns(A, p) → Sells(Robert, p, A) ......(4)
• Missiles are weapons.
Missile(p) → Weapon(p) .......(5)
• An enemy of America is known as hostile.
Enemy(p, America) → Hostile(p) ........(6)
• Country A is an enemy of America.
Enemy(A, America) .........(7)
• Robert is American.
American(Robert) ..........(8)
Forward chaining
• Forward chaining proof:
• Step-1:
• We start with the known facts and choose the sentences which do not have implications, such
as: American(Robert), Enemy(A, America), Owns(A, T1), and Missile(T1). All these facts are
represented as below.

Step-2: Now we consider the facts which can be inferred from the available facts whose premises are satisfied.
Rule (1) does not yet have its premises satisfied, so it is not applied in the first iteration.
Facts (2) and (3) are already added.
Rule (4) is satisfied with the substitution {p/T1}, so Sells(Robert, T1, A) is added, inferred from the
conjunction of facts (2) and (3).
Rule (6) is satisfied with the substitution {p/A}, so Hostile(A) is added, inferred from fact (7).
Forward chaining
• Step-3:
• Rule (1) is now satisfied with the substitution {p/Robert, q/T1, r/A}, so we can add
Criminal(Robert), which is inferred from all the available facts. Hence we have reached the goal statement.

Hence it is proved that Robert is a criminal using the forward chaining approach.
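The three proof steps above can be sketched as a naive forward-chaining loop. The encoding below is our own minimal prototype (not an efficient rule engine): lowercase strings act as variables, and the rules and facts mirror formulas (1)-(8).

```python
# Naive forward chaining on the "Robert is criminal" knowledge base.
# Encoding (ours): lowercase strings are variables; tuples are atoms.

def is_var(t):
    return isinstance(t, str) and t.islower()

def match(pattern, fact, bindings):
    """Match one premise pattern against one ground fact."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if is_var(p):
            if b.get(p, f) != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def substitute(pattern, b):
    return tuple(b.get(t, t) for t in pattern)

rules = [  # (premises, conclusion), numbered as in the text
    ([("American", "p"), ("Weapon", "q"), ("Sells", "p", "q", "r"), ("Hostile", "r")],
     ("Criminal", "p")),                                      # (1)
    ([("Missile", "p"), ("Owns", "A", "p")],
     ("Sells", "Robert", "p", "A")),                          # (4)
    ([("Missile", "p")], ("Weapon", "p")),                    # (5)
    ([("Enemy", "p", "America")], ("Hostile", "p")),          # (6)
]
facts = {("Owns", "A", "T1"), ("Missile", "T1"),              # (2), (3)
         ("Enemy", "A", "America"), ("American", "Robert")}   # (7), (8)

changed = True
while changed:                       # repeat until no new facts are produced
    changed = False
    for premises, conclusion in rules:
        bindings = [{}]
        for prem in premises:        # join each premise against known facts
            bindings = [b2 for b in bindings for f in facts
                        if (b2 := match(prem, f, b)) is not None]
        for b in bindings:
            new = substitute(conclusion, b)
            if new not in facts:
                facts.add(new)
                changed = True

print(("Criminal", "Robert") in facts)  # True
```

Iteration 1 derives Sells(Robert, T1, A), Weapon(T1) and Hostile(A); iteration 2 fires rule (1) and adds Criminal(Robert), exactly mirroring Steps 2 and 3.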


Backward chaining
• Backward chaining is also known as backward deduction or backward reasoning when
using an inference engine.
• Properties of Backward-Chaining:
• It is known as a top-down approach.
• Backward chaining is based on the modus ponens inference rule.
• In backward chaining, the goal is broken into sub-goals to prove the facts true.
• It is called a goal-driven approach, as a list of goals decides which rules are selected and used.
• The backward-chaining algorithm is used in game theory, automated theorem proving tools,
inference engines, proof assistants, and various AI applications.
• The backward-chaining method mostly uses a depth-first search strategy for proofs.
Backward chaining
• Example:
• "As per the law, it is a crime for an American to sell weapons to hostile
nations. Country A, an enemy of America, has some missiles, and all the
missiles were sold to it by Robert, who is an American citizen."
• Prove that "Robert is criminal."
Backward chaining
• Facts Conversion into FOL:
• It is a crime for an American to sell weapons to hostile nations. (Let p, q, and r be variables)
American(p) ˄ Weapon(q) ˄ Sells(p, q, r) ˄ Hostile(r) → Criminal(p) ...(1)
• Country A has some missiles: ∃p Owns(A, p) ˄ Missile(p). This can be written as two definite clauses by using
Existential Instantiation, introducing a new constant T1.
Owns(A, T1) ......(2)
Missile(T1) .......(3)
• All of the missiles were sold to country A by Robert.
∀p Missile(p) ˄ Owns(A, p) → Sells(Robert, p, A) ......(4)
• Missiles are weapons.
Missile(p) → Weapon(p) .......(5)
• An enemy of America is known as hostile.
Enemy(p, America) → Hostile(p) ........(6)
• Country A is an enemy of America.
Enemy(A, America) .........(7)
• Robert is American.
American(Robert) ..........(8)
Backward chaining
• Backward chaining proof:
• Step-1:
• We start with the goal fact, and from the goal fact we infer other facts; at the end we prove those
facts true. Our goal fact is "Robert is criminal," so Criminal(Robert) is its predicate form.

Step-2: We infer other facts from the goal fact which satisfy the rules. As we can see in Rule (1), the goal
predicate Criminal(Robert) is present with the substitution {p/Robert}. So we add all the conjunctive premises below
the first level and replace p with Robert.
Backward chaining
• Backward chaining proof:
• Step-3: The sub-goal Weapon(q) is reduced to Missile(q), as it satisfies Rule (5). Weapon(q) is then
true with the substitution of the constant T1 for q.

• Step-4: The sub-goal Sells(Robert, T1, r) is reduced, via Rule (4) with the substitution of A in place of r, to
the facts Missile(T1) and Owns(A, T1). So these two statements are proved here.
Backward chaining
• Backward chaining proof:
• Step-5: The sub-goal Hostile(A) is reduced, via Rule (6), to Enemy(A, America), which is a known
fact. Hence all the statements are proved true using backward chaining.
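The goal-directed search above can likewise be sketched as a naive backward chainer. The encoding and helper names below are our own (lowercase strings are variables, renamed per rule use to avoid variable capture); it is a minimal prototype, not a production prover.

```python
# Naive backward chaining on the same knowledge base (encoding ours).

def is_var(t):
    return isinstance(t, str) and t.islower()

def unify(x, y, b):
    if b is None:
        return None
    if x == y:
        return b
    if is_var(x):
        return unify_var(x, y, b)
    if is_var(y):
        return unify_var(y, x, b)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            b = unify(xi, yi, b)
        return b
    return None

def unify_var(v, t, b):
    if v in b:
        return unify(b[v], t, b)
    return {**b, v: t}

def subst(t, b):
    if isinstance(t, tuple):
        return tuple(subst(x, b) for x in t)
    if is_var(t) and t in b:
        return subst(b[t], b)
    return t

def rename(t, n):
    """Standardize apart: give rule variables a per-use suffix."""
    if isinstance(t, tuple):
        return tuple(rename(x, n) for x in t)
    return t + "_" + str(n) if is_var(t) else t

facts = [("Owns", "A", "T1"), ("Missile", "T1"),
         ("Enemy", "A", "America"), ("American", "Robert")]
rules = [  # (premises, conclusion)
    ([("American", "p"), ("Weapon", "q"), ("Sells", "p", "q", "r"), ("Hostile", "r")],
     ("Criminal", "p")),
    ([("Missile", "p"), ("Owns", "A", "p")], ("Sells", "Robert", "p", "A")),
    ([("Missile", "p")], ("Weapon", "p")),
    ([("Enemy", "p", "America")], ("Hostile", "p")),
]

def ask(goal, b, ctr):
    goal = subst(goal, b)
    for f in facts:                        # try known facts first
        b2 = unify(goal, f, dict(b))
        if b2 is not None:
            yield b2
    for premises, concl in rules:          # then reduce goal via rule conclusions
        ctr[0] += 1
        n = ctr[0]
        b2 = unify(goal, rename(concl, n), dict(b))
        if b2 is not None:
            yield from ask_all([rename(p, n) for p in premises], b2, ctr)

def ask_all(goals, b, ctr):
    if not goals:
        yield b
        return
    for b2 in ask(goals[0], b, ctr):       # prove sub-goals left to right
        yield from ask_all(goals[1:], b2, ctr)

def prove(goal):
    return next(ask(goal, {}, [0]), None) is not None

print(prove(("Criminal", "Robert")))  # True
```

The search expands Criminal(Robert) into the four premises of Rule (1) and discharges them depth-first, just as Steps 2-5 do by hand.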
Forward and Backward chaining
• Problems:
• Problem 1: As per medical law, it is a violation for a licensed doctor to prescribe banned drugs to
patients. Drug X is a banned drug. Dr. Smith is a licensed doctor and has prescribed Drug X to a
patient. (Using Forward Chaining)

• Problem 2: According to medical ethics, it is considered negligence if a doctor fails to treat an


emergency patient without a valid reason. Patient B was in a critical condition and was denied
treatment by Dr. Johnson. Dr. Johnson did not provide any valid reason for refusing treatment.
(Using Forward and Backward Chaining)

• Problem 3: If a student has passed high school and scored above 90% in their final exams, they are
eligible for admission. If a student is eligible for admission and has extracurricular achievements,
they are shortlisted for an interview. If a student passes the interview, they are admitted to the
university. (Using Forward and Backward Chaining)
Resolution
• Resolution is a theorem-proving technique that proceeds by building
refutation proofs, i.e., proofs by contradiction. It was invented by the
mathematician John Alan Robinson in 1965.
• Resolution is used when various statements are given and we need to
prove a conclusion from those statements. Unification is a key concept
in proofs by resolution. Resolution is a single inference rule which can
efficiently operate on sentences in conjunctive normal form, or clausal form.
• Clause: A disjunction of literals (atomic sentences) is called a clause.
A clause containing a single literal is known as a unit clause.
• Conjunctive Normal Form: A sentence represented as a conjunction of
clauses is said to be in conjunctive normal form, or CNF.
Example:
We can resolve the two clauses given below:
[Animal(g(x)) V Loves(f(x), x)] and [¬Loves(a, b) V ¬Kills(a, b)]
• The two complementary literals are:
Loves(f(x), x) and ¬Loves(a, b)
• These literals can be unified with the unifier θ = {a/f(x), b/x}, and resolution
generates the resolvent clause: [Animal(g(x)) V ¬Kills(f(x), x)].
The Resolution Steps
• Conversion of facts into first-order logic.
• Convert FOL statements into CNF.
• Negate the statement which needs to be proved (proof by contradiction).
• Draw the resolution graph (using unification).
Example
Step-1: Conversion of Facts into FOL – In the first step we convert all the given statements into first-order logic.
• John likes all kinds of food.
• Apples and vegetables are food.
• Anything anyone eats and is not killed by is food.
• Anil eats peanuts and is still alive.
• Harry eats everything that Anil eats.
• Prove by resolution that:
• John likes peanuts.
Example
• Step-2: Conversion of FOL into CNF
• In first-order logic resolution, it is required to convert the FOL statements into CNF, as the CNF
form makes resolution proofs easier.
• Eliminate all implications (→) and rewrite (A → B becomes ¬A V B):
∀x [food(x) → likes(John, x)]  becomes  ∀x ¬food(x) V likes(John, x)
food(Apple) Λ food(vegetables)
∀x ∀y [(eats(x, y) ∧ ¬killed(x)) → food(y)]  becomes  ∀x ∀y ¬[eats(x, y) Λ ¬killed(x)] V food(y)
eats(Anil, Peanuts) Λ alive(Anil)
∀x [eats(Anil, x) → eats(Harry, x)]  becomes  ∀x ¬eats(Anil, x) V eats(Harry, x)
∀x [¬killed(x) → alive(x)]  becomes  ∀x ¬[¬killed(x)] V alive(x)
∀x [alive(x) → ¬killed(x)]  becomes  ∀x ¬alive(x) V ¬killed(x)
likes(John, Peanuts)
Example
• Move negation (¬) inwards and rewrite:
∀x ¬food(x) V likes(John, x)
food(Apple) Λ food(vegetables)
∀x ∀y ¬eats(x, y) V killed(x) V food(y)
eats(Anil, Peanuts) Λ alive(Anil)
∀x ¬eats(Anil, x) V eats(Harry, x)
∀x killed(x) V alive(x)
∀x ¬alive(x) V ¬killed(x)
likes(John, Peanuts)

• Rename (standardize) variables:
∀x ¬food(x) V likes(John, x)
food(Apple) Λ food(vegetables)
∀y ∀z ¬eats(y, z) V killed(y) V food(z)
eats(Anil, Peanuts) Λ alive(Anil)
∀w ¬eats(Anil, w) V eats(Harry, w)
∀g killed(g) V alive(g)
∀k ¬alive(k) V ¬killed(k)
likes(John, Peanuts)
Example
• Drop universal quantifiers.
• In this step we drop all universal quantifiers, since all the variables
are implicitly universally quantified, so the quantifiers are not needed.
¬food(x) V likes(John, x)
food(Apple)
food(vegetables)
¬eats(y, z) V killed(y) V food(z)
eats(Anil, Peanuts)
alive(Anil)
¬eats(Anil, w) V eats(Harry, w)
killed(g) V alive(g)
¬alive(k) V ¬killed(k)
likes(John, Peanuts)
Example
Step-3: Negate the statement to be proved – We apply negation to the conclusion
statement, which is written as
¬likes(John, Peanuts)

Step-4: Draw the resolution graph:
• In this step, we solve the problem with a resolution tree using substitution.
For the above problem, resolving the negated goal against the clauses step by
step eventually derives the empty clause, which completes the proof.
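As a mechanical check of this refutation, the sketch below treats hand-instantiated ground versions of the clauses propositionally (variables pre-substituted with Peanuts and Anil for brevity; the string encoding is ours) and searches for the empty clause:

```python
# Propositional resolution refutation on ground instances of the CNF
# clauses above. Literals are strings; "~" marks negation.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

clauses = [
    frozenset({"~food(Peanuts)", "likes(John,Peanuts)"}),
    frozenset({"eats(Anil,Peanuts)"}),
    frozenset({"alive(Anil)"}),
    frozenset({"~eats(Anil,Peanuts)", "killed(Anil)", "food(Peanuts)"}),
    frozenset({"~alive(Anil)", "~killed(Anil)"}),
    frozenset({"~likes(John,Peanuts)"}),   # negated goal
]

def refute(clause_list):
    cs = set(map(frozenset, clause_list))
    while True:
        new = set()
        for c1 in cs:
            for c2 in cs:
                for r in resolve(c1, c2):
                    if not r:
                        return True        # empty clause: contradiction found
                    new.add(frozenset(r))
        if new <= cs:
            return False                   # saturated without contradiction
        cs |= new

print(refute(clauses))  # True -> likes(John, Peanuts) is entailed
```

The derivation it finds mirrors the resolution tree: ¬likes resolves to ¬food(Peanuts), then to killed(Anil), then to ¬alive(Anil), which clashes with alive(Anil) to give the empty clause.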
Supervised Learning
• Supervised learning is a category of machine learning that uses labeled datasets to train algorithms to
predict outcomes and recognize patterns.
• For example, a labeled dataset of images of Elephant, Camel and Cow would have each image
tagged with either “Elephant“, “Camel” or “Cow.”

• Types of Supervised Learning: Supervised learning is classified into two categories of algorithms:
• Regression: A regression problem is when the output variable is a real value, such as “dollars” or
“weight”.

• Classification: A classification problem is when the output variable is a category, such as “Yes” or “No”
, “disease” or “no disease”.
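As a toy illustration of supervised classification, the sketch below memorizes a tiny labeled dataset and predicts with a 1-nearest-neighbour rule (data and names invented; stdlib only):

```python
# Minimal supervised classification: 1-nearest-neighbour on labeled data.
import math

labeled = [  # (feature vector, label) - invented toy data
    ((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
    ((6.0, 6.5), "dog"), ((5.8, 7.0), "dog"),
]

def predict(x):
    """Return the label of the closest training example."""
    _, label = min(labeled, key=lambda pair: math.dist(pair[0], x))
    return label

print(predict((1.1, 0.9)))  # cat
print(predict((6.2, 6.8)))  # dog
```

Swapping the labels for real numbers and the nearest-neighbour vote for an average would turn this same setup into a regression rather than classification example.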
Unsupervised Learning
• Unsupervised learning in artificial intelligence is a type of machine learning that learns from
data without human supervision.
• Unsupervised learning is a type of machine learning that works with data that has no labels or
categories. The main goal is to find patterns and relationships in the data without any guidance.
• For example, an unlabeled dataset of images of elephants, camels and cows would
have no tags at all; the algorithm must group similar images together on its own.

• Types of Unsupervised Learning: Unsupervised learning is classified into two categories of algorithms:
• Clustering: A clustering problem is where you want to discover the inherent groupings in the data,
such as grouping customers by purchasing behavior.

• Association: An association rule learning problem is where you want to discover rules that describe
large portions of your data, such as people that buy X also tend to buy Y
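For contrast with the supervised case, here is a toy clustering example: a hand-rolled 2-means loop that discovers the groupings in unlabeled points (data invented; stdlib only):

```python
# Minimal unsupervised clustering: 2-means (Lloyd's algorithm) by hand.
import math

points = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),
          (6.0, 6.5), (5.8, 7.0), (6.2, 6.8)]   # unlabeled, invented data
centers = [points[0], points[3]]                # naive initialization

for _ in range(10):
    groups = [[], []]
    for p in points:                            # assign to nearest center
        nearest = min((0, 1), key=lambda k: math.dist(p, centers[k]))
        groups[nearest].append(p)
    # recompute each center as the mean of its group
    centers = [tuple(sum(c) / len(g) for c in zip(*g)) for g in groups]

print([len(g) for g in groups])  # [3, 3]
```

No labels were used anywhere: the two clusters emerge purely from the geometry of the data, which is the defining property of unsupervised learning.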
Reinforcement learning
• Reinforcement Learning (RL) is a branch of machine learning that focuses on how agents can learn to
make decisions through trial and error to maximize cumulative rewards. RL allows machines to learn by
interacting with an environment and receiving feedback based on their actions. This feedback comes in the
form of rewards or penalties.

• Reinforcement Learning revolves around the idea that an agent (the learner or decision-maker) interacts with an
environment to achieve a goal. The agent performs actions and receives feedback to optimize its decision-making
over time.
• Agent: The decision-maker that performs actions.
• Environment: The world or system in which the agent operates.
• State: The situation or condition the agent is currently in.
• Action: The possible moves or decisions the agent can make.
• Reward: The feedback or result from the environment based on the agent’s action.
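The agent/environment loop above can be sketched with tabular Q-learning on a toy 5-state corridor (the environment, reward scheme and hyperparameters are invented for illustration):

```python
# Tabular Q-learning on a toy corridor: states 0..4, goal at state 4.
import random

random.seed(0)
n_states = 5                      # state 4 is terminal (reward +1)
actions = [-1, +1]                # move left / move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(200):
    s = 0
    while s != 4:
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), n_states - 1)          # environment step
        r = 1.0 if s2 == 4 else 0.0                    # reward on reaching goal
        best_next = 0.0 if s2 == 4 else max(Q[(s2, act)] for act in actions)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(actions, key=lambda act: Q[(s, act)]) for s in range(4)]
print(policy)   # greedy action per non-terminal state (+1 = move right)
```

The agent, environment, state, action and reward of the bullet list map directly onto the loop: the reward only arrives at the goal, yet the discount factor propagates value back so earlier states also learn to move right.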
Statistical Learning methods
• Statistics is the science of collecting, organizing, analyzing, interpreting, and
presenting data. It encompasses a wide range of techniques for summarizing
data, making inferences, and drawing conclusions.
• Statistical methods help quantify uncertainty and variability in data, allowing
researchers and analysts to make data-driven decisions with confidence.
Types of Statistics
• There are commonly two types of statistics, which are discussed below:
• Descriptive Statistics: It helps us simplify and organize big chunks of
data. This makes large amounts of data easier to understand.
• Inferential Statistics: It is a little different. It uses smaller data to draw
conclusions about a larger group. It helps us predict and draw
conclusions about a population.
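The two kinds of statistics can be contrasted in a few lines with the stdlib `statistics` module (sample data invented; the 1.96 factor assumes an approximately normal sampling distribution):

```python
# Descriptive vs. inferential statistics on an invented sample.
import statistics as st

sample = [2.1, 2.5, 1.9, 2.3, 2.6, 2.2, 2.4, 2.0]

# Descriptive: summarize the sample itself.
mean = st.mean(sample)
stdev = st.stdev(sample)            # sample standard deviation

# Inferential: quantify uncertainty about the larger population.
sem = stdev / len(sample) ** 0.5    # standard error of the mean
ci95 = (mean - 1.96 * sem, mean + 1.96 * sem)  # approx. 95% interval

print(round(mean, 3), round(sem, 3))  # 2.25 0.087
```

The first two lines of computation are descriptive (they only summarize the data we have); the standard error and confidence interval are inferential, because they make a claim about the population the sample was drawn from.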
Statistical Learning methods
• Applications of Statistics in Machine Learning
• Statistics is a key component of machine learning, with broad applicability in various
fields.
• Feature engineering relies heavily on statistics to convert geometric features into
meaningful predictors for machine learning algorithms.
• In image processing tasks like object recognition and segmentation, statistics
accurately reflect the shape and structure of objects in images.
• Anomaly detection and quality control benefit from statistics by identifying
deviations from norms, aiding in the detection of defects in manufacturing
processes.
• Environmental observation and geospatial mapping leverage statistical analysis to
monitor land cover patterns and ecological trends effectively.
