UNIT 5 Artificial Intelligence
Learning
2. Learning Agents
3. Classification of Learning
Supervised Learning: Learns a function from labeled data. (e.g., Spam email
detection)
Unsupervised Learning: Finds patterns in unlabeled data. (e.g., Clustering
customers)
Reinforcement Learning: Learns by interacting with the environment and receiving
rewards/punishments. (e.g., Chess-playing bots)
4. Learning Elements
5. Inductive Learning
Definition: Methods that generalize from specific examples to form general rules.
Common Techniques:
o Decision Trees
o Neural Networks
o Rule-based learning
6. Learning Decision Tree
7. Attribute-Based Representation
8. Choosing an Attribute
Definition: Selecting the best attribute for splitting data during decision tree
construction.
Criteria:
o Information gain
o Gini index
o Gain ratio
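A minimal C sketch of the first criterion: information gain computed for two hypothetical candidate attributes over the same 8 examples (the class counts are illustrative, not from the source). The attribute with the higher gain is chosen for the split.
#include <stdio.h>
#include <math.h>

/* Binary entropy: p = fraction of positive examples in a node. */
double entropy(double p) {
    if (p <= 0.0 || p >= 1.0) return 0.0;   /* pure node: no uncertainty */
    return -(p * log2(p) + (1.0 - p) * log2(1.0 - p));
}

/* Information gain of a binary split, given the positive/negative
   class counts in the left and right child nodes. */
double info_gain(int lp, int ln, int rp, int rn) {
    double l = lp + ln, r = rp + rn, n = l + r;
    double parent   = entropy((lp + rp) / n);
    double weighted = (l / n) * entropy(lp / l) + (r / n) * entropy(rp / r);
    return parent - weighted;
}

int main(void) {
    /* Hypothetical candidate attributes over the same 8 examples. */
    double a = info_gain(3, 1, 1, 3);   /* attribute A: mixed children */
    double b = info_gain(4, 0, 0, 4);   /* attribute B: pure children  */
    printf("Gain(A) = %.3f, Gain(B) = %.3f\n", a, b);  /* 0.189 vs 1.000 */
    printf("Split on %s\n", b > a ? "B" : "A");        /* B wins */
    return 0;
}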
9. Operator Precedence and Associativity in C
#include <stdio.h>

int main() {
    int result = 10 + 5 * 2;        // Multiplication has higher precedence than addition.
    printf("Result: %d\n", result); // Outputs: Result: 20
    return 0;
}
Key Concepts:
o Precedence determines the order of evaluation.
o Associativity determines the direction of evaluation when operators have equal precedence (see the sketch below).
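A short companion example of left-to-right associativity (the values are hypothetical):
#include <stdio.h>

int main(void) {
    /* Division is left-associative: evaluated as (20 / 5) / 2 = 2,
       not 20 / (5 / 2) = 10. */
    int result = 20 / 5 / 2;
    printf("Result: %d\n", result);  /* Outputs: Result: 2 */
    return 0;
}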
10. Decision Tree Learning
Hypothesis Space
Definition: The set of all possible hypotheses that a learning algorithm can consider when searching for a solution to a problem.
Key Points:
o A hypothesis is a proposed explanation or model that predicts outcomes based
on input features.
o The hypothesis space is determined by the model's parameters and structure
(e.g., linear models, decision trees, neural networks).
Information Theory
Definition:
Information theory is a mathematical framework for quantifying the transmission, processing,
and storage of information. It measures the uncertainty, or lack of predictability, in a dataset
and helps evaluate how much information is gained or lost in communication or decision-
making processes.
1. Entropy (H):
o A measure of uncertainty or randomness in a dataset.
o Higher entropy indicates higher uncertainty, while lower entropy indicates greater
predictability.
o Formula: H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i), where P(x_i) is the probability of outcome x_i. (A runnable C sketch of this calculation appears after this list.)
Example:
o A fair coin toss has two outcomes (Heads, Tails), each with a probability of 0.5.
H = -(0.5 \cdot \log_2 0.5 + 0.5 \cdot \log_2 0.5) = 1 \text{ bit}.
2. Redundancy:
o Excess information that does not add value or meaning.
o Example: Repeating the same data multiple times during communication.
3. Mutual Information:
o Quantifies how much information two variables share.
4. Applications of Information Theory:
o Data compression (e.g., JPEG, MP3).
o Error detection and correction in communication.
o Decision-making in machine learning (e.g., feature selection).
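A minimal C sketch of the entropy formula above, checking the fair-coin example (the biased-coin distribution is an added illustration, not from the source):
#include <stdio.h>
#include <math.h>

/* H(X) = -sum_i P(x_i) log2 P(x_i) for a discrete distribution. */
double entropy(const double *p, int n) {
    double h = 0.0;
    for (int i = 0; i < n; i++)
        if (p[i] > 0.0)                 /* treat 0 * log2(0) as 0 */
            h -= p[i] * log2(p[i]);
    return h;
}

int main(void) {
    double fair[]   = {0.5, 0.5};       /* fair coin (from the example) */
    double biased[] = {0.9, 0.1};       /* hypothetical biased coin     */
    printf("Fair coin:   %.3f bits\n", entropy(fair, 2));    /* 1.000  */
    printf("Biased coin: %.3f bits\n", entropy(biased, 2));  /* ~0.469 */
    return 0;
}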
Information Gain
Definition:
Information gain (IG) measures the reduction in uncertainty (entropy) of a dataset when it is
split based on a specific attribute. It is a key metric in decision trees for selecting the best
attribute to split the data.
Formula:
IG = H(\text{Parent}) - \sum \left( \frac{|\text{Child}|}{|\text{Parent}|} \cdot H(\text{Child}) \right)
Where H is entropy, |Parent| is the number of examples at the parent node, and |Child| is the number of examples in each child node after the split.
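Example (the numbers below are hypothetical, for illustration):
Dataset: a parent node with 8 examples (4 positive, 4 negative), split by an attribute into two children of 4 examples each, with class counts [3+, 1-] and [1+, 3-].
Parent entropy: H(Parent) = -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = 1 bit.
Child entropy: H(Child) = -(0.75 \log_2 0.75 + 0.25 \log_2 0.25) \approx 0.811 bits for each child.
Weighted entropy: (4/8)(0.811) + (4/8)(0.811) = 0.811 bits.
Information Gain: IG = 1 - 0.811 \approx 0.189 bits.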
Applications of Information Gain:
1. Decision Trees:
o Used to decide the best attribute for splitting data.
2. Feature Selection:
o Helps identify the most informative features in machine learning models.
3. Classification Tasks:
o Improves prediction accuracy by prioritizing attributes that reduce uncertainty.
2. Hypothesis
4. Naïve Bayes
5. Factorial of a Number
The factorial of a number n is the product of all positive integers less than or equal to n.
Example:
Input: 5
Output: 120 (5! = 5 × 4 × 3 × 2 × 1)
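A minimal recursive C implementation (one common way to compute it; the source shows only the input/output):
#include <stdio.h>

/* Recursive factorial: n! = n * (n-1)!, with 0! = 1! = 1. */
unsigned long long factorial(unsigned int n) {
    return (n <= 1) ? 1ULL : n * factorial(n - 1);
}

int main(void) {
    unsigned int n = 5;
    printf("%u! = %llu\n", n, factorial(n));  /* Outputs: 5! = 120 */
    return 0;
}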
6. Instance-Based Learning
7. Neural Networks
8. Reinforcement Learning
An agent interacts with the environment to learn a policy that maximizes cumulative rewards.
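For instance, Q-learning (one standard reinforcement-learning algorithm; the source does not name a specific method) updates its action-value estimate after each step:
Q(s, a) \leftarrow Q(s, a) + \alpha \left[ r + \gamma \max_{a'} Q(s', a') - Q(s, a) \right]
where s is the current state, a the action taken, r the reward received, s' the next state, \alpha the learning rate, and \gamma the discount factor.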
14. Laboratory 15: Program to Implement Five House Logic Puzzle Problem
A logic puzzle involving constraints where each house has unique attributes (e.g.,
color, nationality, pet, drink).
Goal: Use logical reasoning to determine attributes for all houses.
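A minimal C sketch of a brute-force solver, reduced to 3 houses and 2 attributes with hypothetical constraints; the full five-house puzzle applies the same generate-and-test idea with 5 houses and 5 attributes:
#include <stdio.h>

/* Toy 3-house puzzle (hypothetical constraints):
   1. The red house is immediately left of the green house.
   2. The dog lives in the blue house.
   3. The cat lives in the first house.
   Brute force: try every (color, pet) permutation pair and print
   the assignments that satisfy all constraints. */

static const char *COLORS[] = {"Red", "Green", "Blue"};
static const char *PETS[]   = {"Dog", "Cat", "Fish"};

/* All 6 permutations of {0,1,2}. */
static const int PERM[6][3] = {
    {0,1,2},{0,2,1},{1,0,2},{1,2,0},{2,0,1},{2,1,0}
};

/* House index holding value v in permutation p. */
static int pos(const int *p, int v) {
    for (int i = 0; i < 3; i++) if (p[i] == v) return i;
    return -1;
}

int main(void) {
    for (int c = 0; c < 6; c++) {           /* color assignment */
        const int *col = PERM[c];
        /* Constraint 1: red immediately left of green. */
        if (pos(col, 0) + 1 != pos(col, 1)) continue;
        for (int q = 0; q < 6; q++) {       /* pet assignment */
            const int *pet = PERM[q];
            /* Constraint 2: dog in the blue house. */
            if (pos(pet, 0) != pos(col, 2)) continue;
            /* Constraint 3: cat in house 1. */
            if (pos(pet, 1) != 0) continue;
            for (int h = 0; h < 3; h++)
                printf("House %d: %-5s %s\n", h + 1,
                       COLORS[col[h]], PETS[pet[h]]);
        }
    }
    return 0;
}
The only assignment that survives all three constraints is: House 1 Red/Cat, House 2 Green/Fish, House 3 Blue/Dog.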