AIML Assignment Answers
These answers cover the questions from Unit 3: Knowledge Representation and Reasoning.
1. Explain Knowledge Representation and Semantic Networks with a Diagram.
Knowledge Representation is a branch of Artificial Intelligence (AI) that deals with how knowledge
can be represented symbolically and manipulated in an automated way by reasoning programs. It
enables AI systems to reason, learn from data, and make intelligent decisions.
Semantic Networks:
A semantic network is a graph consisting of nodes (representing objects, concepts, or situations) and
edges (representing relationships between the nodes). It visually represents relationships in
knowledge.
Diagram Example:

   [Animal]
      ^
      | is-a
   [Bird] ----has----> [Wings]
      |
      | can
      v
    [Fly]

Here, Animal is the superclass of Bird, so Bird inherits Animal's general properties through the is-a link.
Semantic networks are useful for inheritance and reasoning. For instance, if "Penguin" is a subclass
of "Bird" and it cannot fly, we can override the "can Fly" property in Penguin.
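The inheritance-with-override idea can be sketched in a few lines of Python (a minimal illustration; the Node class and its property names are invented for this example):

```python
class Node:
    """A node in a semantic network with is-a inheritance."""
    def __init__(self, name, parent=None, **properties):
        self.name = name
        self.parent = parent          # the "is-a" edge
        self.properties = properties  # e.g., can_fly=True

    def get(self, prop):
        # Walk up the is-a chain until the property is found.
        node = self
        while node is not None:
            if prop in node.properties:
                return node.properties[prop]
            node = node.parent
        return None

animal = Node("Animal")
bird = Node("Bird", parent=animal, can_fly=True, has="Wings")
penguin = Node("Penguin", parent=bird, can_fly=False)  # local override

print(bird.get("can_fly"))     # True  (own property)
print(penguin.get("can_fly"))  # False (overridden)
print(penguin.get("has"))      # Wings (inherited from Bird)
```

The lookup walks the is-a chain from the most specific node upward, which is exactly why a property set on Penguin shadows the one inherited from Bird.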
2. Describe Predicate Logic with Examples. Translate English Statements to Predicate Logic.
Predicate Logic:
Predicate logic (also called First-Order Logic, or FOL) extends propositional logic by introducing objects, relations, and quantification.
Syntax:
Constants: specific objects (e.g., John, Delhi)
Variables: placeholders for objects (e.g., x, y)
Predicates: properties or relations (e.g., Human(x), Loves(x, y))
Quantifiers:
o ∀x – For all x (universal quantifier)
o ∃x – There exists an x (existential quantifier)
Examples of Translation:
"All humans are mortal." → ∀x (Human(x) → Mortal(x))
"Some students like mathematics." → ∃x (Student(x) ∧ Likes(x, Mathematics))
"John loves Mary." → Loves(John, Mary)
Predicate logic allows machines to represent and manipulate facts, rules, and relationships.
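Over a finite domain, quantified statements like these can be checked mechanically with Python's all() and any() (a small sketch; the domain and predicate values are made up for illustration):

```python
# A finite domain of individuals with their properties.
domain = {
    "socrates": {"human": True,  "mortal": True},
    "plato":    {"human": True,  "mortal": True},
    "zeus":     {"human": False, "mortal": False},
}

def human(x):
    return domain[x]["human"]

def mortal(x):
    return domain[x]["mortal"]

# ∀x (Human(x) → Mortal(x)) — an implication P → Q is equivalent to (not P) or Q.
forall_holds = all((not human(x)) or mortal(x) for x in domain)

# ∃x (Human(x) ∧ Mortal(x))
exists_holds = any(human(x) and mortal(x) for x in domain)

print(forall_holds, exists_holds)  # True True
```

This only works because the domain is finite and fully enumerated; general FOL inference requires a theorem prover rather than exhaustive checking.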
3. Compare Different Approaches to Knowledge Representation.
Knowledge can be represented in several ways:
1. Logical Representation: uses formal logic (propositional or predicate) to state facts precisely and support sound inference.
2. Semantic Networks: represent knowledge as a graph of nodes (concepts) connected by labelled edges (relationships).
3. Frames: structured records with slots and fillers that describe stereotyped objects or situations.
Example: A Car frame may have slots like color, model, speed.
4. Production Rules: IF-THEN rules that fire when their conditions match the current facts.
5. Procedural Representation: encodes knowledge directly as procedures that specify how to perform a task.
Each approach has its strengths: logic is precise, semantic networks are intuitive, frames support
inheritance, and rules allow dynamic reasoning.
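Two of these approaches, frames and production rules, can be sketched together in a few lines of Python (the slot names and rules are illustrative, not part of any standard library):

```python
# Frame: a structured record with slots and fillers.
car_frame = {"color": "red", "model": "sedan", "speed": 120}

# Working memory: the facts the rules are matched against.
facts = {"speed": car_frame["speed"]}

# Production rules: (condition, action) pairs — IF condition THEN action.
rules = [
    (lambda f: f["speed"] > 100,  "issue_speeding_warning"),
    (lambda f: f["speed"] <= 100, "no_action"),
]

# Fire every rule whose condition matches the current facts.
fired = [action for condition, action in rules if condition(facts)]
print(fired)  # ['issue_speeding_warning']
```

A real production system would loop, letting fired actions update working memory so that further rules can trigger; this sketch shows only a single match-fire pass.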
4. Implement a Basic Classification Algorithm (e.g., K-NN or Naive Bayes) Using a Real Dataset
Let’s explain both K-NN and Naive Bayes, with example Python code using the Iris dataset.
K-Nearest Neighbors (K-NN):
Idea: Classify a point based on the majority class of its K nearest neighbors.
# Imports
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load dataset
iris = load_iris()
X = iris.data
y = iris.target

# Train-Test Split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# KNN Model
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# Prediction
y_pred = model.predict(X_test)

# Accuracy
print("K-NN accuracy:", accuracy_score(y_test, y_pred))
Naive Bayes:
Idea: Apply Bayes' theorem, assuming the features are conditionally independent given the class.

# Imports
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Load dataset
iris = load_iris()
X = iris.data
y = iris.target

# Train-Test Split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Naive Bayes Model
model = GaussianNB()
model.fit(X_train, y_train)

# Prediction
y_pred = model.predict(X_test)

# Accuracy
print("Naive Bayes accuracy:", accuracy_score(y_test, y_pred))
Both algorithms are useful for classification tasks, with K-NN being simple and instance-based, while
Naive Bayes works well for text classification and probabilistic modeling.