Lecture - 3 - 6
Types of Environments
1. Fully observable vs Partially Observable: A fully
observable environment is one in which an agent's sensors can
perceive or access the complete state of the environment at any
given time; otherwise, it is a partially observable environment.
• When the agent has no sensors at all, the environment is said
to be unobservable.
• A fully observable environment is simpler to deal with because
the agent does not need to keep track of the environment's
history.
• Example:
• Chess — the board is fully observable, and so are the opponent's moves.
• Driving — the environment is partially observable because you
never know what’s around the corner.
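The distinction can be made concrete in code. Below is a minimal Python
sketch (the GridWorld class, its methods, and all values are illustrative
assumptions, not part of the lecture): the same underlying state is exposed
either completely or only within a small radius around the agent.

# Illustrative sketch only; names and values are assumptions, not from the lecture.
import random

class GridWorld:
    def __init__(self, size=5):
        self.size = size
        self.agent = (0, 0)
        self.obstacles = {(random.randrange(size), random.randrange(size))
                          for _ in range(3)}

    def full_observation(self):
        # Fully observable: the agent sees the entire board state.
        return {"agent": self.agent, "obstacles": set(self.obstacles)}

    def partial_observation(self, radius=1):
        # Partially observable: only cells within `radius` of the agent.
        ax, ay = self.agent
        visible = {(x, y) for (x, y) in self.obstacles
                   if abs(x - ax) <= radius and abs(y - ay) <= radius}
        return {"agent": self.agent, "visible_obstacles": visible}

env = GridWorld()
print(env.full_observation())     # complete state
print(env.partial_observation())  # local view only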
Types of Environments
2. Deterministic vs Stochastic: A deterministic environment
is one in which an agent’s current state and chosen action
totally determine the next state of the environment.
• Unlike deterministic environments, stochastic environments
are random in nature and cannot be totally predicted by an
agent.
• Example:
• Chess — each piece has a fixed set of legal moves from the
current position, so the outcome of any move can be predicted
exactly.
• Self-driving cars — traffic, pedestrians, and road conditions
cannot be fully predicted, so the same action may lead to
different outcomes.
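A minimal Python sketch of the difference (the function names and the slip
probability are illustrative assumptions, not from the lecture): with a
deterministic transition the next state follows exactly from the current
state and action, while the stochastic version may "slip" at random.

# Illustrative sketch only; names and values are assumptions, not from the lecture.
import random

def deterministic_step(position, action):
    # Next state is fully determined by the current state and action.
    moves = {"left": -1, "right": +1}
    return position + moves[action]

def stochastic_step(position, action, slip_prob=0.2):
    # With probability slip_prob the action "slips", so the outcome
    # cannot be fully predicted from the state and action alone.
    moves = {"left": -1, "right": +1}
    if random.random() < slip_prob:
        action = random.choice(list(moves))
    return position + moves[action]

print(deterministic_step(0, "right"))                    # always 1
print([stochastic_step(0, "right") for _ in range(5)])   # may vary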
3. Competitive vs Collaborative: When an agent competes
against another agent to optimize its output, it is said to be in a
competitive environment.
• Example: Chess is a competitive environment in which two
agents play against each other to win.
• When agents cooperate with one another to achieve a shared
goal, the environment is collaborative.
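One way to see the contrast is in how rewards are assigned. The sketch
below is an illustrative assumption, not part of the lecture: in a
competitive (zero-sum) setting one agent's gain is the other's loss, while
in a collaborative setting the agents share the same reward.

# Illustrative sketch only; payoffs and names are assumptions, not from the lecture.
def competitive_rewards(outcome):
    # Zero-sum: whatever one agent wins, the other loses (e.g. chess).
    score = {"agent1_wins": 1, "agent2_wins": -1, "draw": 0}[outcome]
    return {"agent1": score, "agent2": -score}

def collaborative_rewards(task_completed):
    # Shared objective: both agents receive the same reward.
    reward = 1 if task_completed else 0
    return {"agent1": reward, "agent2": reward}

print(competitive_rewards("agent1_wins"))   # {'agent1': 1, 'agent2': -1}
print(collaborative_rewards(True))          # {'agent1': 1, 'agent2': 1}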
Types of Environments
4. Static vs Dynamic: A static environment is one in which
there is no change in its state.
• When an agent enters a vacant house, there is no change in
the surroundings.
• A dynamic environment is one that keeps changing while the
agent is acting.
• A roller coaster ride is dynamic since it is in motion and the
surroundings change all the time.
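A minimal Python sketch of the distinction (the class names and the tick()
method are illustrative assumptions, not from the lecture): in the dynamic
environment the state keeps advancing even while the agent is only
deliberating, whereas the static one stays put.

# Illustrative sketch only; names are assumptions, not from the lecture.
class StaticEnv:
    def __init__(self):
        self.state = 0
    def tick(self):
        pass  # nothing changes while the agent deliberates

class DynamicEnv:
    def __init__(self):
        self.state = 0
    def tick(self):
        self.state += 1  # the world moves on regardless of the agent

static, dynamic = StaticEnv(), DynamicEnv()
for _ in range(3):   # the agent spends three "thinking" steps
    static.tick()
    dynamic.tick()
print(static.state, dynamic.state)  # 0 3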