Intelligent Agent: Dept. of Computer Science Faculty of Science and Technology
Course Code: CSC4226 Course Title: Artificial Intelligence and Expert System
Key terms:
- ENVIRONMENT: everything outside the agent that it perceives and acts upon
- SENSORS: through which the agent perceives the environment
- ACTUATORS: through which the agent acts on the environment
- PERCEPT: the agent's perceptual input at any given instant
- PERCEPT SEQUENCE: the complete history of everything the agent has perceived
- AGENT FUNCTION: maps any percept sequence to an action
- AGENT PROGRAM: the concrete implementation of the agent function
AGENT FUNCTION
The agent function is a mathematical function that maps any given percept
sequence to an action.
The function is implemented by the agent program.
The part of the agent that carries out an action is called an actuator.
environment -> sensors -> agent function -> actuators -> environment
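The pipeline above can be sketched in code. This is a minimal illustration, not a definitive implementation; the percept values ("dirty"/"clean") and the two functions are hypothetical names chosen for the example.

```python
def agent_function(percept_sequence):
    """The abstract mapping: a full percept sequence -> an action."""
    # For illustration, this mapping only looks at the latest percept.
    latest = percept_sequence[-1]
    return "clean" if latest == "dirty" else "move"

def run(environment_percepts):
    """The agent program: reads percepts from the sensors and
    feeds the chosen actions to the actuators as they are generated."""
    history = []   # the percept sequence so far
    actions = []
    for percept in environment_percepts:   # from sensors
        history.append(percept)
        actions.append(agent_function(history))  # to actuators
    return actions

print(run(["dirty", "clean", "dirty"]))  # ['clean', 'move', 'clean']
```

The separation mirrors the text: `agent_function` is the mathematical mapping, while `run` is the agent program that realizes it on the architecture.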
VACUUM CLEANING AGENT
GOOD BEHAVIOR: THE CONCEPT OF RATIONALITY
Rational Agent
- one that does the right thing
- every entry in the table for the agent function is filled out correctly
What does it mean to do the right thing?
- it is judged by considering the consequences of the agent's behavior
An agent is plunked down in an environment and generates a sequence of
actions according to the percepts it receives.
A rational agent is one that makes the right decision in every situation.
The architecture makes the percepts from the sensors available to the agent
program, runs the program, and feeds the program's action choices to the
actuators as they are generated.
Agent programs take the current percept as input from the sensors and
return an action to the actuators.
AGENT EXAMPLE: TABLE-DRIVEN AGENT
Table-driven agents: the function consists in a lookup table of actions to be
taken for every possible state of the environment.
If the environment has n variables, each with t possible states, then the table
size is t^n.
Only works for a small number of possible states for the environment.
Simple reflex agents: decide on the action to take based only on the
current percept, not on the history of percepts.
percepts = []  # persistent: the percept sequence observed so far
table = {}     # persistent: maps percept sequences to actions
(a) no physical agent would have the space to store the table,
(b) the designer would not have time to create the table, and
(c) no agent could ever learn all the right table entries from its
experience.
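A table-driven agent can be sketched as follows. The two-square vacuum world and its `(location, status)` percepts are an assumed example; note that the table is keyed by the whole percept *sequence*, which is exactly why it blows up as t^n.

```python
class TableDrivenAgent:
    """Looks up an action for the entire percept sequence seen so far."""

    def __init__(self, table):
        self.table = table     # maps percept-sequence tuples to actions
        self.percepts = []     # persistent: the percept sequence so far

    def __call__(self, percept):
        self.percepts.append(percept)
        # Returns None if the table has no entry for this sequence.
        return self.table.get(tuple(self.percepts))

# A tiny illustrative table for the two-square vacuum world (assumption):
table = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Dirty"), ("A", "Clean")): "Right",
    # ... one entry for EVERY possible percept sequence, hence t^n growth
}

agent = TableDrivenAgent(table)
print(agent(("A", "Dirty")))   # Suck
print(agent(("A", "Clean")))   # Right
```

Even this toy table already needs separate entries for length-1 and length-2 sequences, which illustrates drawbacks (a)-(c) above.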
1. Simple reflex agents
2. Model-based reflex agents
3. Goal-based agents
4. Utility-based agents.
SIMPLE REFLEX AGENTS
select actions on the basis of the current percept, ignoring the rest of the
percept history.
a condition-action rule, written as
if car-in-front-is-braking then initiate-braking.
A more general and flexible approach is first to build a
general-purpose interpreter for condition-action rules and
then to create rule sets for specific task environments.
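The interpreter idea can be sketched as below. The rule set and the percept fields (`car_in_front_is_braking`, `light`) are assumptions for illustration; only the current percept is consulted, never the history.

```python
# Condition-action rules: (condition over the current percept, action).
rules = [
    (lambda p: p.get("car_in_front_is_braking"), "initiate-braking"),
    (lambda p: p.get("light") == "red", "stop"),
    (lambda p: True, "cruise"),  # default rule: always matches last
]

def simple_reflex_agent(percept):
    """General-purpose interpreter: fire the first rule whose
    condition matches the current percept; history is ignored."""
    for condition, action in rules:
        if condition(percept):
            return action

print(simple_reflex_agent({"car_in_front_is_braking": True}))  # initiate-braking
print(simple_reflex_agent({"light": "red"}))                   # stop
```

Swapping in a different `rules` list adapts the same interpreter to another task environment, which is the flexibility the text describes.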
MODEL-BASED REFLEX AGENTS
The agent keeps background information as a current internal state,
which is used in the decision process.
Updating this internal state information as time goes by requires two kinds of knowledge to be
encoded in the agent program.
First, we need some information about how the world evolves independently of the agent
[e.g., an overtaking car].
Second, we need some information about how the agent's own actions affect the world
[e.g., turning the steering wheel clockwise...].
This knowledge about "how the world works", whether implemented in simple Boolean circuits
or in complete scientific theories, is called a model of the world.
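A model-based reflex agent can be sketched as below. The state representation and rules are assumptions for illustration; the point is that the condition-action rules now consult the maintained internal state rather than the raw percept alone.

```python
class ModelBasedReflexAgent:
    def __init__(self):
        self.state = {}          # the agent's best guess about the world
        self.last_action = None  # needed to predict our own effects

    def update_state(self, percept):
        """Folds the new percept into the internal state. A real model
        would also apply (1) how the world evolves on its own and
        (2) how self.last_action changed the world; here we simplify."""
        self.state.update(percept)

    def __call__(self, percept):
        self.update_state(percept)
        # Rules are evaluated against the *state*, not the bare percept.
        if self.state.get("car_in_front_is_braking"):
            action = "initiate-braking"
        else:
            action = "cruise"
        self.last_action = action
        return action

agent = ModelBasedReflexAgent()
print(agent({"car_in_front_is_braking": True}))   # initiate-braking
print(agent({"car_in_front_is_braking": False}))  # cruise
```

Because the state persists between calls, the agent can act sensibly even when the current percept alone is ambiguous.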
GOAL-BASED AGENT
Goal information guides the agent's choice of actions, via:
- Searching
- Planning
- Decision making
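Searching for a sequence of actions that reaches the goal can be sketched with a breadth-first search; the grid world, states, and move set below are assumptions for illustration.

```python
from collections import deque

def plan(start, goal, moves):
    """Breadth-first search from start state to goal state;
    returns a shortest list of action names, or None."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, actions = frontier.popleft()
        if state == goal:
            return actions
        for name, (dx, dy) in moves.items():
            nxt = (state[0] + dx, state[1] + dy)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, actions + [name]))
    return None

moves = {"right": (1, 0), "up": (0, 1)}
print(plan((0, 0), (2, 1), moves))  # e.g. ['right', 'right', 'up']
```

Unlike a reflex agent, the action chosen now depends on an explicit goal and on reasoning about future states, not just on the current percept.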
UTILITY-BASED AGENTS
Goals alone are not enough to generate high-quality behavior in most environments.
An agent's utility function is essentially an internalization of the performance measure.
1. When there are conflicting goals, only some of which can be achieved
(for example, speed and safety), the utility function specifies the appropriate tradeoff.
2. When there are several goals that the agent can aim for, none of which can be
achieved with certainty, utility provides a way in which the likelihood of success can be
weighed against the importance of the goals.
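Both points can be sketched numerically. The weights, outcome probabilities, and the two lane options below are made-up numbers for illustration only.

```python
def utility(speed, safety):
    # Point 1: conflicting goals -- the weights make the tradeoff explicit.
    return 0.3 * speed + 0.7 * safety

def expected_utility(outcomes):
    """Point 2: outcomes are (probability, speed, safety) triples;
    likelihood of success is weighed against the goals' importance."""
    return sum(p * utility(s, f) for p, s, f in outcomes)

fast_lane = [(0.9, 1.0, 0.4), (0.1, 0.0, 0.0)]  # quick but risky
slow_lane = [(1.0, 0.5, 0.9)]                   # slower but safe

options = {"fast_lane": fast_lane, "slow_lane": slow_lane}
best = max(options, key=lambda name: expected_utility(options[name]))
print(best)  # slow_lane
```

With these particular weights safety dominates, so the agent prefers the slow lane; changing the weights changes the tradeoff, which is exactly the role of the utility function as an internalized performance measure.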
LEARNING AGENT
HOW THE COMPONENTS OF
AGENT PROGRAMS WORK
References