Solution First Internal
GANPAT UNIVERSITY
B. TECH SEM-VI (COMPUTER ENGINEERING)
FIRST INTERNAL EXAMINATION – FEBRUARY 2025
2CEIT606: Artificial Intelligence
Q.2 Consider the following sliding puzzle problem: [05]
State Representation: 'W' (White tile), 'B' (Black tile), '_' (Empty space).
Moves:
1. Slide: A tile can move into an immediately adjacent empty cell.
2. Hop: A tile can jump over one adjacent tile into an empty cell immediately beyond.
Initial and Goal States:
Position       1   2   3   4   5
Initial State  B   W   W   B   _
Goal State     W   W   _   B   B
Heuristic Function:
h(n) = max(0, p1 - 1) + max(0, p2 - 2)
where n is the current state, and p1 and p2 are the 1-indexed positions (from left to right) of the two White (W) tiles in the current state, sorted so that p1 ≤ p2.
Using the Best First Search Algorithm, determine whether this heuristic is admissible for the given puzzle. Provide a detailed justification for your answer.
Answer: The given heuristic is admissible. A heuristic is admissible when h(n) never overestimates the true cost of reaching the goal from state n. Here the terms max(0, p1 - 1) and max(0, p2 - 2) count how many cells each White tile must still travel to the left to reach its goal position (positions 1 and 2). Taking the cost of a move as the number of cells the tile travels (slide = 1, hop = 2), a single move can reduce h by at most its own cost, so the cost of any solution from n is at least h(n); that is, h(n) ≤ h*(n) for every state n. For the initial state (B W W B _), p1 = 2 and p2 = 3, giving h = 1 + 1 = 2; for the goal state (W W _ B B), h = 0.
Search tree with heuristic function values [Marks: 4], given heuristic function shown to be admissible [Marks: 1]
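For illustration (an added sketch, not part of the marking scheme; the compact state strings 'BWWB_' and 'WW_BB' follow the table above, and the names h, successors, and best_first are ours), the following Python snippet runs greedy Best First Search on the puzzle, ordering the frontier by h(n) alone:

from heapq import heappush, heappop

def h(state):
    # 1-indexed positions of the two 'W' tiles, sorted so p1 <= p2
    p1, p2 = sorted(i + 1 for i, t in enumerate(state) if t == 'W')
    return max(0, p1 - 1) + max(0, p2 - 2)

def successors(state):
    # Slide: tile moves in from an adjacent cell; Hop: from two cells away.
    # With a single empty cell, the cell hopped over is always occupied.
    s, e = list(state), state.index('_')
    for src in (e - 1, e + 1, e - 2, e + 2):
        if 0 <= src < len(s):
            nxt = s[:]
            nxt[e], nxt[src] = nxt[src], '_'
            yield ''.join(nxt)

def best_first(start, goal):
    frontier = [(h(start), start, [start])]  # priority queue ordered by h only
    seen = {start}
    while frontier:
        _, state, path = heappop(frontier)
        if state == goal:
            return path
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                heappush(frontier, (h(nxt), nxt, path + [nxt]))

for s in best_first('BWWB_', 'WW_BB'):
    print(s, 'h =', h(s))

On this instance the search reaches the goal; note that greedy Best First Search expands by h alone and does not by itself guarantee an optimal path, whereas an admissible h guarantees optimality when used with A*.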
Q.3 Consider the following two Boolean functions: [05]
1. Function F₁:
F₁(x₁, x₂, x₃) = 1 if at least two of the inputs are 1, and 0 otherwise.
2. Function F₂:
F₂(x₁, x₂) = x₁ ∧ ¬x₂
For each of these functions, do the following:
Step 1: Construct the truth table of the function.
Step 2: Try to implement the function using a McCulloch-Pitts neuron. If this model cannot
implement the function, explain why.
Step 3: If the McCulloch-Pitts neuron fails, try using a single perceptron. If it still fails,
explain the reasons.
Answer:
Function F₁: "At least two inputs are 1"
• MP Neuron Implementation: [Marks: 2]
A single MP neuron works here: give all three inputs the same unit weight and set the threshold to 2.
• Threshold: 2
Explanation: The neuron fires (outputs 1) exactly when the sum of its inputs is at least 2, which matches F₁ on every row of its truth table.
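As a quick sanity check (an added sketch; the helper names mp_neuron and f1 are ours), this Python snippet prints the truth table of F₁ and confirms that the threshold-2 MP neuron reproduces it:

from itertools import product

def mp_neuron(inputs, threshold=2):
    # McCulloch-Pitts unit: every input carries the same unit weight;
    # it fires exactly when the input sum reaches the threshold.
    return int(sum(inputs) >= threshold)

def f1(x1, x2, x3):
    return int(x1 + x2 + x3 >= 2)  # 1 iff at least two inputs are 1

for x in product((0, 1), repeat=3):
    assert mp_neuron(x) == f1(*x)
    print(x, '->', mp_neuron(x))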
Function F₂: F₂(x₁, x₂) = x₁ ∧ ¬x₂
A single perceptron allows a different weight for each input. Choosing a positive weight for x₁ and a negative weight for x₂ (for example, w₁ = 1 and w₂ = −1) along with an appropriate bias (w₀ = −0.5), the perceptron computes the function correctly on all four inputs:
For (0,0): −0.5 + 1×0 + (−1)×0 = −0.5 (does not exceed 0, output 0)
For (0,1): −0.5 + 1×0 + (−1)×1 = −1.5 (does not exceed 0, output 0)
For (1,0): −0.5 + 1×1 + (−1)×0 = 0.5 (exceeds 0, output 1)
For (1,1): −0.5 + 1×1 + (−1)×1 = −0.5 (does not exceed 0, output 0)
Thus, while the MP neuron fails to implement the function because all of its inputs carry the same fixed weight and it therefore cannot treat x₁ and x₂ differently, a single perceptron with adjustable weights and a bias implements it successfully.
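As an added check (a minimal sketch, not part of the original solution; the function name perceptron is ours and the weights are those chosen above), the following Python snippet verifies the perceptron on all four input combinations of F₂:

from itertools import product

def perceptron(x1, x2, w0=-0.5, w1=1.0, w2=-1.0):
    # Fires when the weighted sum exceeds 0, using the weights from the solution.
    return int(w0 + w1 * x1 + w2 * x2 > 0)

for x1, x2 in product((0, 1), repeat=2):
    expected = int(x1 == 1 and x2 == 0)  # x1 AND (NOT x2)
    assert perceptron(x1, x2) == expected
    print((x1, x2), '->', perceptron(x1, x2))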
----------------------------------------------END OF PAPER--------------------------------------------