Bayesian Decision Making
© Copyright 2009 Herbert J. Bernstein

It is often the case that we get to make decisions with a substantial body of information supplementing our assumed probability distributions. This additional information allows us to revise our estimates appropriately.

The study of probability estimates in the presence of additional information is called Bayesian Analysis, named after the Reverend Thomas Bayes (ca. 1702–1761) (see http://en.wikipedia.org/wiki/Bayesian). There are differing schools of thought about Bayesian Analysis, but all make use of Bayes' formula:

P(A|B)P(B)
P(B|A) = ----------
P(A)

which says that the probability of B given A (i.e. under the assumption that A is true) is equal to the probability of A given B times the probability of B, divided by the probability of A.

The probability of A given B is essentially the probability of A confined to a universe consisting of B, so we need to take the intersection of A and B and scale by the reciprocal of P(B) to make the probability of our new universe equal to one. Thus

P(A ∩ B)
P(A|B) = ----------
P(B)

Similarly

P(A ∩ B)
P(B|A) = ----------
P(A)

To understand Bayesian analysis, consider the "Monty Hall problem". There are N doors. One of the doors hides a prize. The other doors have nothing behind them. You are the contestant who must guess which door hides the prize. You announce that you are choosing door number k. The game show host, knowing which door does hide the prize, opens some door, j, other than k, to show you that door j does not hide the prize. Should we change which door we have picked? Call the case that the prize is really behind door i, Ci. Clearly, when we make our initial choice, P(Ci) = 1/N. The host knows where the prize really is, and he knows we picked door k. With no additional information, the probability that the host opens door j is just 1/(N-1).

If the prize is really behind door k, he is free to pick any of the N-1 doors other than k to open. Therefore the probability that he opens door j given that the prize is behind door k is 1/(N-1).

If the prize is behind some door other than k and other than j, there are N-2 doors for the host to choose from. Therefore, the probability that he opens door j given that the prize is behind a door other than k and j is 1/(N-2).

If the prize is behind door j, the probability that the host opens door j is zero.

Therefore, the probability that the prize is behind door k, given that the host opens door j, is:

P(Ck | host opens door j) = P(host opens door j | Ck) * P(Ck) / P(host opens door j)
                          = (1/(N-1)) * (1/N) / (1/(N-1)) = 1/N

and the probability that the prize is behind some door n other than k and other than j is:

P(Cn | host opens door j) = P(host opens door j | Cn) * P(Cn) / P(host opens door j)
                          = (1/(N-2)) * (1/N) / (1/(N-1)) = (N-1)/(N*(N-2))
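These two posterior formulas can be evaluated exactly for a few values of N. This is a small sketch (the function name `posteriors` is my own, not from the text); as a sanity check it also confirms that the posterior for door k plus the posteriors for the N-2 other unopened doors sum to one.

```python
from fractions import Fraction

def posteriors(N):
    """Posterior probabilities after the host opens door j, for N >= 3 doors."""
    # Prize behind our original door k: (1/(N-1)) * (1/N) / (1/(N-1)) = 1/N
    p_stay = Fraction(1, N - 1) * Fraction(1, N) / Fraction(1, N - 1)
    # Prize behind any single other unopened door n:
    # (1/(N-2)) * (1/N) / (1/(N-1)) = (N-1)/(N*(N-2))
    p_switch = Fraction(1, N - 2) * Fraction(1, N) / Fraction(1, N - 1)
    return p_stay, p_switch

for N in (3, 4, 10):
    p_stay, p_switch = posteriors(N)
    # Door j has posterior zero, so the remaining posteriors must sum to one.
    assert p_stay + (N - 2) * p_switch == 1
    print(N, p_stay, p_switch)
# For N = 3 this gives the familiar result: stay = 1/3, switch = 2/3.
```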

Finally, the probability that the prize is behind door j is clearly zero, so the probability that the prize is behind the door we originally chose is lower than the probability that it is behind any other unopened door, and we should always change our original selection when the host offers the option of choosing a different door.
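The conclusion can also be checked empirically. The following Monte Carlo sketch (my own construction, not part of the original text) simulates the game and estimates the winning probability for a contestant who always stays versus one who always switches to a random remaining door.

```python
import random

def monty_trial(N, switch, rng):
    """Play one round with N doors; return True if the contestant wins."""
    prize = rng.randrange(N)
    pick = rng.randrange(N)
    # The host opens a door other than the pick that does not hide the prize.
    opened = rng.choice([d for d in range(N) if d != pick and d != prize])
    if switch:
        # Switch to a random door other than our pick and the opened door.
        pick = rng.choice([d for d in range(N) if d != pick and d != opened])
    return pick == prize

rng = random.Random(0)
N, trials = 3, 100_000
stay = sum(monty_trial(N, False, rng) for _ in range(trials)) / trials
switch = sum(monty_trial(N, True, rng) for _ in range(trials)) / trials
print(f"stay ~ {stay:.3f}, switch ~ {switch:.3f}")  # roughly 1/3 vs 2/3 for N = 3
```

For N = 3 the estimates converge to the exact posteriors 1/3 and 2/3 derived above; larger N shows the same ordering with a smaller gap.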

As in this case, a major use of Bayesian analysis in decision-making is to adjust the estimates of probabilities we have before gaining additional information, so-called "prior probabilities", by applying Bayes' formula to compute the probabilities in light of what we know, forming "posterior probabilities".

See http://www.bionicturtle.com/learn/article/fido_helps_explain_bayes_formula/