AI Principles
(A statistical approach)
Jeremy Wyatt
Two examples
Bayes’ rule
Bayes’ nets
Bob lives in San Francisco. He has a burglar alarm on his house, which
can be triggered by burglars and earthquakes. He has two neighbours,
John and Mary, who will call him if the alarm goes off while he is at
work, but each is unreliable in their own way. All these sources of
uncertainty can be quantified. Mary calls: how likely is it that there has
been a burglary?
Using probabilistic reasoning we can calculate how likely a
hypothesised cause is.
e.g. a coin. Let’s use Throw as the random variable denoting the
outcome when we toss the coin.
The set of possible outcomes for a random variable is called its domain.
e.g. if our world consists of only two Boolean random variables, then the
world has four possible atomic events
P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
P(¬a ∨ b) = P(¬a) + P(b) − P(¬a ∧ b)
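To make this concrete, here is a minimal Python sketch enumerating the four atomic events of a two-variable Boolean world; the joint probabilities are invented for illustration, not taken from the lecture:

```python
from itertools import product

# A toy joint distribution over two Boolean variables A and B.
# The four atomic events are the keys; the probabilities must sum to 1.
joint = {
    (True, True): 0.3,
    (True, False): 0.2,
    (False, True): 0.1,
    (False, False): 0.4,
}

# Every possible world is one assignment to (A, B).
atomic_events = list(product([True, False], repeat=2))
assert len(atomic_events) == 4
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Marginals: sum the atomic events in which the proposition holds.
p_a = sum(p for (a, b), p in joint.items() if a)
p_b = sum(p for (a, b), p in joint.items() if b)

# Inclusion-exclusion: P(a or b) = P(a) + P(b) - P(a and b).
p_a_or_b = p_a + p_b - joint[(True, True)]
```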
So conditional probabilities reflect the fact that some events make other
events more (or less) likely
If one event doesn’t affect the likelihood of another event they are said to
be independent and therefore
P(a | b) = P(a)
E.g. if you roll a 6 on a die, it doesn’t make it more or less likely that you
will roll a 6 on the next throw. The rolls are independent.
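The die example can be checked by enumerating all 36 equally likely outcomes of two rolls; a small sketch:

```python
from itertools import product

# All 36 equally likely outcomes of two fair die rolls.
outcomes = list(product(range(1, 7), repeat=2))

# Unconditional probability that the second roll is a 6.
p_second_six = sum(1 for r1, r2 in outcomes if r2 == 6) / len(outcomes)

# Condition on the first roll being a 6 and recompute.
given_first_six = [(r1, r2) for r1, r2 in outcomes if r1 == 6]
p_second_six_given_first_six = (
    sum(1 for r1, r2 in given_first_six if r2 == 6) / len(given_first_six)
)

# Independence: conditioning on the first roll changes nothing.
```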
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
P(a | b) P(b) = P(b | a) P(a)
P(a | b) = P(b | a) P(a) / P(b)
This is known as Bayes’ rule.
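The rule drops straight into code. A minimal sketch; the cavity/toothache numbers are invented for illustration and are not the lecture's:

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Bayes' rule: P(a | b) = P(b | a) P(a) / P(b)."""
    return p_b_given_a * p_a / p_b

# Illustrative (made-up) numbers: cavity is the hidden cause,
# toothache the observable effect.
p_cavity_given_toothache = bayes(
    p_b_given_a=0.8,  # P(toothache | cavity)
    p_a=0.1,          # P(cavity)
    p_b=0.2,          # P(toothache)
)
```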
If we model how likely observable effects are given hidden causes (how
likely toothache is given a cavity)
Then Bayes’ rule allows us to use that model to infer the likelihood of the
hidden cause (and thus answer our question)
P(s | m) = 0.5
She also knows that the probability in the general population of someone
having a stiff neck at any time is 1/20:
P(s) = 0.05
She also has to know the incidence of meningitis in the population (1/50,000):
P(m) = 0.00002
Using Bayes’ rule she can calculate the probability the patient has meningitis:
P(m | s) = P(s | m) P(m) / P(s) = (0.5 × 0.00002) / 0.05 = 0.0002
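The same calculation as a quick check in Python:

```python
p_s_given_m = 0.5     # P(stiff neck | meningitis)
p_s = 1 / 20          # P(stiff neck) in the general population
p_m = 1 / 50_000      # P(meningitis): incidence in the population

# Bayes' rule: P(m | s) = P(s | m) P(m) / P(s)
p_m_given_s = p_s_given_m * p_m / p_s
```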
But sometimes it’s harder to find out P(effect|cause) for all causes
independently than it is simply to find out P(effect)
Note that Bayes’ rule here relies on the fact that the effect must have arisen
because of one of the hypothesised causes. You can’t reason directly
about causes you haven’t imagined.
AI Principles, Lecture on Reasoning Under Uncertainty
Bayes’ rule: combining evidence
Suppose we have several pieces of evidence we want to
combine:
• John rings and Mary rings
• I have toothache and the dental probe catches on my tooth
How do we do this?
P(cavity | toothache ∧ catch) ∝ P(toothache ∧ catch | cavity) P(cavity)
Toothache and catch are not independent, but they are independent
given the presence or absence of a cavity.
In other words we can use the knowledge that cavities cause toothache
and they cause the catch, but the catch and the toothache do not cause
each other (they have a single common cause).
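Conditional independence given Cavity lets us multiply the two likelihoods and then normalise. A sketch with invented numbers (none come from the lecture):

```python
# Hypothetical conditional probability tables, for illustration only.
p_cavity = 0.2
p_toothache_given = {True: 0.6, False: 0.1}  # P(toothache | Cavity)
p_catch_given = {True: 0.9, False: 0.2}      # P(catch | Cavity)

# Conditional independence given Cavity:
# P(cavity | toothache, catch)
#   is proportional to P(toothache|cavity) P(catch|cavity) P(cavity)
unnorm = {
    c: p_toothache_given[c] * p_catch_given[c]
       * (p_cavity if c else 1 - p_cavity)
    for c in (True, False)
}
z = sum(unnorm.values())
posterior = {c: v / z for c, v in unnorm.items()}
```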
Bayes’ nets
This can be captured in a picture, where the arcs capture conditional
independence relationships
Or in a new equation:
P(toothache ∧ catch | Cavity) = P(toothache | Cavity) P(catch | Cavity)
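Returning to the burglary example from the start: a Bayes net over Burglary, Earthquake, Alarm and the callers can answer the opening question (Mary calls; how likely is a burglary?) by summing the joint over the unobserved variables. A minimal enumeration sketch, using the standard textbook parameterisation of this network rather than figures from the lecture:

```python
from itertools import product

# Priors and conditional probability tables (standard textbook numbers,
# assumed here; substitute the lecture's own figures if they differ).
P_B = {True: 0.001, False: 0.999}            # P(Burglary)
P_E = {True: 0.002, False: 0.998}            # P(Earthquake)
P_A = {                                       # P(Alarm | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_M = {True: 0.70, False: 0.01}              # P(MaryCalls | Alarm)

def joint(b, e, a, m):
    """Probability of one full assignment, read off the network."""
    pa = P_A[(b, e)]
    pm = P_M[a]
    return (P_B[b] * P_E[e]
            * (pa if a else 1 - pa)
            * (pm if m else 1 - pm))

# P(Burglary = true | MaryCalls = true), summing out Earthquake and Alarm.
num = sum(joint(True, e, a, True) for e, a in product([True, False], repeat=2))
den = sum(joint(b, e, a, True) for b, e, a in product([True, False], repeat=3))
p_burglary_given_mary = num / den
```

With these numbers the posterior comes out a little under 6%: Mary's call raises the probability of a burglary far above its tiny prior, but the alarm's false positives and Mary's unreliability keep it well below certainty.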
Some kinds of inference don’t seem to be obviously explainable using
probabilistic reasoning alone
It is not the case that statistical methods are the only way