AIML Assignment Answers

Let's go through each question in detail based on your syllabus from Unit 3: Knowledge
Representation and Reasoning.

1. What is Knowledge Representation? Explain Semantic Networks with a Diagram.

Knowledge Representation (KR):

Knowledge Representation is a branch of Artificial Intelligence (AI) that deals with how knowledge
can be represented symbolically and manipulated in an automated way by reasoning programs. It
enables AI systems to reason, learn from data, and make intelligent decisions.

Features of a Good Knowledge Representation:

- Representational Adequacy – Ability to represent a wide variety of knowledge.
- Inferential Adequacy – Ability to derive new knowledge.
- Inferential Efficiency – Ability to perform inference quickly.
- Acquisitional Efficiency – Ability to acquire new knowledge easily.

Semantic Networks:

A semantic network is a graph consisting of nodes (representing objects, concepts, or situations) and
edges (representing relationships between the nodes). It visually represents relationships in
knowledge.

Diagram Example:

              [Animal]
                 ^
                 | is-a
                 |
[Fly] <--can-- [Bird] --has--> [Wings]

- Animal is a superclass.
- Bird is a subclass of Animal.
- Bird "has" Wings.
- Bird "can" Fly.

Semantic networks are useful for inheritance and reasoning. For instance, if "Penguin" is a subclass
of "Bird" and it cannot fly, we can override the "can Fly" property in Penguin.
2. Describe Predicate Logic with Examples. Translate English Statements to Predicate Logic.

Predicate Logic:

Predicate logic (also called First-Order Logic, or FOL) extends propositional logic by introducing:

- Objects (constants, like John)
- Predicates (properties or relations, like IsStudent(x))
- Quantifiers (like ∀ for "for all", ∃ for "there exists")

Syntax:

- Predicates: Loves(John, Pizza)
- Quantifiers:
  - ∀x – For all x
  - ∃x – There exists an x

Examples of Translation:

1. English: All humans are mortal.
   Predicate Logic: ∀x (Human(x) → Mortal(x))
2. English: Some students are intelligent.
   Predicate Logic: ∃x (Student(x) ∧ Intelligent(x))
3. English: Every cat is an animal.
   Predicate Logic: ∀x (Cat(x) → Animal(x))
4. English: John loves Mary.
   Predicate Logic: Loves(John, Mary)

Predicate logic allows machines to represent and manipulate facts, rules, and relationships.
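
As an added illustration (not part of the original answer), here is a minimal Python sketch that evaluates quantified statements like these over a small, hypothetical finite domain:

# Evaluating ∀ and ∃ over a finite domain.
# The domain and facts below are hypothetical, chosen to mirror the translations above.

domain = ["John", "Mary", "Socrates", "Tom"]
human = {"John", "Mary", "Socrates"}
mortal = {"John", "Mary", "Socrates"}
student = {"John", "Tom"}
intelligent = {"Tom"}

# ∀x (Human(x) → Mortal(x)): the implication holds when the antecedent is false
# or the consequent is true.
all_humans_mortal = all((x not in human) or (x in mortal) for x in domain)

# ∃x (Student(x) ∧ Intelligent(x))
some_student_intelligent = any((x in student) and (x in intelligent) for x in domain)

print(all_humans_mortal)          # True
print(some_student_intelligent)   # True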

3. What are the Different Approaches of Knowledge Representation in AI?

There are five major approaches to Knowledge Representation in AI:

1. Logical Representation:

- Based on formal logic (Propositional & Predicate logic).
- Deals with facts and rules.
- Example: ∀x (Bird(x) → CanFly(x))

2. Semantic Networks:

- Uses a graph-based structure.
- Nodes represent concepts, edges represent relationships.

3. Frames:

- Data structures for representing stereotyped situations.
- Each frame has slots (attributes) and values.
- Example: a Car frame may have slots like color, model, and speed.

4. Production Rules:

 "If-Then" rules used in expert systems.

 Example: IF temperature > 100, THEN turn_on_fan

5. Procedural Representation:

- Encodes knowledge in the form of procedures or algorithms.
- Useful for "how-to" knowledge.
- Example: cooking recipes or sorting algorithms.

Each approach has its strengths: logic is precise, semantic networks are intuitive, frames support
inheritance, and rules allow dynamic reasoning.
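
To make the production-rule approach concrete, here is a minimal Python sketch (not part of the original answer; the working memory and the single fan rule are hypothetical examples):

# A minimal production-rule ("If-Then") sketch.
# Each rule is a (condition, action) pair over a simple working memory (a dict).

def turn_on_fan(memory):
    memory["fan"] = "on"

rules = [
    (lambda m: m["temperature"] > 100, turn_on_fan),
]

def run_rules(memory, rules):
    # Simplest form of forward chaining: fire every rule whose condition holds.
    for condition, action in rules:
        if condition(memory):
            action(memory)
    return memory

memory = {"temperature": 120, "fan": "off"}
print(run_rules(memory, rules))   # {'temperature': 120, 'fan': 'on'}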

4. Implement a Basic Classification Algorithm (e.g., K-NN or Naive Bayes) Using a Real Dataset

Let's explain both K-NN and Naive Bayes, and give an example of each in Python using the Iris dataset.

K-NN (K-Nearest Neighbors) Classifier:

- Type: Supervised, lazy (instance-based) learner.
- Idea: Classify a point based on the majority class of its K nearest neighbors.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load dataset
iris = load_iris()
X = iris.data
y = iris.target

# Train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# K-NN model (K = 3)
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# Prediction
y_pred = model.predict(X_test)

# Accuracy
print("Accuracy:", accuracy_score(y_test, y_pred))

Naive Bayes Classifier:

- Type: Probabilistic model; assumes feature independence.
- Based on: Bayes' Theorem.
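
For clarity (this is the standard formulation, added here rather than taken from the original answer), the Naive Bayes decision rule for features x1, ..., xn and class c is:

P(c | x1, ..., xn) ∝ P(c) × P(x1 | c) × P(x2 | c) × ... × P(xn | c)

predicted class = argmax over c of P(c) × P(x1 | c) × ... × P(xn | c)

GaussianNB, used below, estimates each P(xi | c) as a normal (Gaussian) distribution fitted to the training data.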

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Load dataset
iris = load_iris()
X = iris.data
y = iris.target

# Train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# Naive Bayes model
model = GaussianNB()
model.fit(X_train, y_train)

# Prediction
y_pred = model.predict(X_test)

# Accuracy
print("Accuracy:", accuracy_score(y_test, y_pred))

Both algorithms are useful for classification tasks: K-NN is simple and instance-based, while Naive Bayes works well for text classification and probabilistic modeling.

