
AI Mid-Term

Q1. (a) Express the following facts in clause form:

(i) Blondes have more fun.
(ii) All men are created equal.
(iii) You always hurt the one you love.

(b) What is Bayes' theorem and what are its characteristics?
Q2. Differentiate between neural networks and reinforcement learning methods.
Q3. What are the steps in natural language processing? How are various grammars constructed?
Or
Write a short note on Genetic Algorithms.

Let's address each part of the questions in turn:

Q1. (a) Express the following facts in clause form

To express statements in clause form, we typically convert them into a formal logic representation.
Here's how we can do it:

(i) Blondes have more fun

Define the predicates:

`Blonde(x)`: x is a blonde.
`HaveMoreFun(x)`: x has more fun.

The fact in first-order logic is:

∀x (Blonde(x) → HaveMoreFun(x))

Eliminating the implication and dropping the universal quantifier (variables in clauses are implicitly universally quantified) gives the clause:

¬Blonde(x) ∨ HaveMoreFun(x)

(ii) All men are created equal

Define the predicates:

`Man(x)`: x is a man.
`Equal(x, y)`: x and y are equal.

The fact in first-order logic is:

∀x∀y (Man(x) ∧ Man(y) → Equal(x, y))

In clause form:

¬Man(x) ∨ ¬Man(y) ∨ Equal(x, y)

(iii) You always hurt the one you love

Define the predicates:

`You(x)`: x is you.
`Love(x, y)`: x loves y.
`Hurt(x, y)`: x hurts y.

The fact in first-order logic is:

∀x∀y (You(x) ∧ Love(x, y) → Hurt(x, y))

In clause form:

¬You(x) ∨ ¬Love(x, y) ∨ Hurt(x, y)
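As a quick sanity check, here is a minimal sketch of this implication-to-clause conversion using SymPy's propositional `to_cnf`. The symbol names are stand-ins for ground instances of the predicates, since `to_cnf` does not handle quantified formulas; output ordering may vary.

```python
# A minimal sketch of implication-to-clause conversion, assuming SymPy is
# installed. Propositional symbols stand in for ground instances of the
# predicates, because to_cnf works on propositional formulas only.
from sympy import symbols
from sympy.logic.boolalg import And, Implies, to_cnf

Blonde, HaveMoreFun = symbols('Blonde HaveMoreFun')
ManX, ManY, Equal = symbols('ManX ManY Equal')

print(to_cnf(Implies(Blonde, HaveMoreFun)))     # ~Blonde | HaveMoreFun
print(to_cnf(Implies(And(ManX, ManY), Equal)))  # Equal | ~ManX | ~ManY
```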

Q1. (b) What is Bayes' Theorem and what are its characteristics?

Bayes' Theorem is a fundamental theorem in probability theory that describes how to update the
probabilities of hypotheses when given evidence. It is expressed as:

P(A∣B) = P(B∣A) ⋅ P(A) / P(B)

where:

P(A∣B) is the posterior probability of event A given that B is true.
P(B∣A) is the likelihood of event B given that A is true.
P(A) is the prior probability of event A.
P(B) is the probability of event B (the evidence).

Characteristics of Bayes' Theorem:

Posterior Probability: It allows for the computation of the probability of a hypothesis after
taking into account the observed data.
Prior Probability: It takes into account the initial belief before new evidence is taken into
account.
Likelihood: It assesses the probability of the observed data under different hypotheses.
Normalization: The denominator P(B) acts as a normalizing constant, ensuring that posterior probabilities sum to one and the framework stays consistent.
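
As a worked example, here is a minimal numerical sketch of the theorem in Python; the disease-testing numbers (1% prevalence, 95% sensitivity, 10% false-positive rate) are invented purely for illustration:

```python
# A minimal numerical sketch of Bayes' theorem for a hypothetical medical
# test. All numbers below are illustrative assumptions, not real data.

def bayes_posterior(prior, likelihood, false_positive_rate):
    """Return P(A|B) given P(A), P(B|A), and P(B|not A)."""
    # P(B) via the law of total probability (the normalizing evidence term)
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# P(disease | positive test)
posterior = bayes_posterior(prior=0.01, likelihood=0.95, false_positive_rate=0.10)
print(f"P(disease | positive) = {posterior:.4f}")  # ~0.0876
```

Note how the small prior keeps the posterior low even with a fairly accurate test; the denominator P(B) does the normalizing work.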

Q2. Differentiate between neural networks and reinforcement learning methods

Neural Networks:

Definition: A type of machine learning model inspired by the structure and function of the brain,
consisting of layers of interconnected nodes (neurons).
Training: Trained using supervised learning, unsupervised learning, or semi-supervised learning
methods.
Use Cases: Image recognition, speech recognition, natural language processing, etc.
Structure: Includes input, hidden, and output layers, with weights adjusted through
backpropagation.
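
To make this structure concrete, here is a minimal sketch of a forward pass through one hidden layer, assuming NumPy. The network shape (3 inputs, 4 hidden units, 1 output) and the tanh/sigmoid activations are illustrative choices; the weights are random placeholders rather than trained values (training would adjust them via backpropagation):

```python
import numpy as np

# A two-layer network: 3 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)  # input-to-hidden weights, biases
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # hidden-to-output weights, biases

def forward(x):
    """One forward pass: each layer is a linear map plus a nonlinearity."""
    hidden = np.tanh(x @ W1 + b1)                 # hidden layer activations
    return 1 / (1 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output in (0, 1)

print(forward(np.array([0.5, -1.0, 2.0])))
```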

Reinforcement Learning (RL):

Definition: A type of machine learning where an agent learns to make decisions by performing
actions in an environment to maximize cumulative reward.
Training: Trained using a trial-and-error approach, receiving feedback in the form of rewards or
penalties.
Use Cases: Game playing, robotics, autonomous driving, etc.

Structure: Involves states, actions, and rewards, with learning focused on policy optimization,
value functions, and Q-learning.
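
To contrast with the neural-network sketch above, here is a toy tabular Q-learning sketch: an agent learns by trial and error to walk right along a 5-state corridor. The environment, reward scheme, and hyperparameter values are all illustrative assumptions:

```python
import random

# Toy Q-learning on a 5-state corridor: start at state 0, reward at state 4.
N_STATES, ACTIONS = 5, [0, 1]          # 0 = move left, 1 = move right
alpha, gamma, epsilon = 0.1, 0.9, 0.2  # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Environment: move left/right within bounds; reward 1 only at the goal."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # epsilon-greedy action selection (trial and error)
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward = step(state, action)
        # Q-learning update toward reward + discounted best future value
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = nxt

# learned greedy action per state (1 = right)
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES)])
```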

Q3. Steps in Natural Language Processing (NLP) and Grammar Construction

Steps in Natural Language Processing:

1. Tokenization: Splitting text into individual words or tokens.
2. Part-of-Speech Tagging: Assigning parts of speech to each token.
3. Lemmatization/Stemming: Reducing words to their base or root form.
4. Parsing: Analyzing the grammatical structure of sentences.
5. Named Entity Recognition (NER): Identifying and classifying named entities in text.
6. Sentiment Analysis: Determining the sentiment or emotion expressed in text.
7. Machine Translation: Translating text from one language to another.
8. Speech Recognition: Converting spoken language into text.
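
Here is a minimal sketch of the first two steps, assuming the NLTK library is installed; the first run may also need model downloads, whose exact resource names vary slightly across NLTK versions:

```python
# A minimal sketch of steps 1-2, assuming NLTK is installed. First run may
# need: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
# (resource names vary slightly across NLTK versions).
import nltk

text = "Natural language processing turns raw text into structure."
tokens = nltk.word_tokenize(text)  # step 1: tokenization
tagged = nltk.pos_tag(tokens)      # step 2: part-of-speech tagging
print(tagged)                      # e.g. [('Natural', 'JJ'), ('language', 'NN'), ...]
```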

Grammar Construction:

Formal Grammars: Consist of rules that define the syntax of a language. Common types
include:
Context-Free Grammars (CFGs): Used to define the syntax of programming languages and much of natural-language sentence structure.
Regular Grammars: Used for regular expressions and lexical analysis.
Context-Sensitive Grammars: More expressive, used for natural languages.
Probabilistic Grammars: Assign probabilities to rules to handle ambiguity in natural languages.
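
And a minimal sketch of defining and parsing with a context-free grammar, again assuming NLTK; the toy grammar below is invented for illustration:

```python
# A minimal CFG parsing sketch, assuming NLTK. The grammar is a toy example.
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP
Det -> 'the'
N -> 'dog' | 'cat'
V -> 'chased'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the dog chased the cat".split()):
    tree.pretty_print()  # prints the parse tree for the sentence
```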

Or: Write a Short Note on Genetic Algorithms

Genetic Algorithm (GA):

Definition: A search heuristic inspired by the process of natural selection and genetics. It is used
to find approximate solutions to optimization and search problems.
Process:
1. Initialization: Generate an initial population of candidate solutions.
2. Selection: Evaluate the fitness of each candidate and select the fittest individuals for
reproduction.
3. Crossover: Combine pairs of selected individuals to produce offspring with mixed traits.
4. Mutation: Introduce random changes to some offspring to maintain genetic diversity.
5. Replacement: Replace the least fit individuals with new offspring.
6. Iteration: Repeat the selection, crossover, mutation, and replacement steps until a
satisfactory solution is found or a stopping criterion is met.
Applications: Used in optimization problems, machine learning, engineering design, and
artificial life simulations.
Characteristics:
Population-Based: Maintains a population of solutions rather than a single solution.
Fitness Function: Evaluates the quality of solutions.
Genetic Operators: Uses crossover and mutation to explore the solution space.
Evolutionary Process: Mimics the process of natural evolution to iteratively improve
solutions.
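
Putting the steps above together, here is a toy sketch of a GA solving the OneMax problem (evolving a bit string toward all ones). The objective, the tournament-selection scheme, and every parameter value are illustrative assumptions:

```python
import random

# Toy GA for OneMax: evolve a 20-bit string toward all ones.
# All parameter values below are illustrative, not tuned.
BITS, POP, GENERATIONS, MUT_RATE = 20, 30, 100, 0.02

def fitness(ind):
    return sum(ind)  # fitness function: count of 1 bits

def select(pop):
    # tournament selection: the fitter of two random individuals wins
    return max(random.sample(pop, 2), key=fitness)

# initialization: a random population of candidate solutions
population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]

for gen in range(GENERATIONS):
    next_gen = []
    while len(next_gen) < POP:
        p1, p2 = select(population), select(population)
        point = random.randrange(1, BITS)        # single-point crossover
        child = p1[:point] + p2[point:]
        # mutation: flip each bit with small probability for diversity
        child = [b ^ 1 if random.random() < MUT_RATE else b for b in child]
        next_gen.append(child)
    population = next_gen                        # generational replacement

best = max(population, key=fitness)
print(fitness(best), best)  # best individual found
```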