
December 21, 2007

Progic VI

Posted by David Corfield

I talked about Bayesian networks back here as a prelude to the shift from propositional to first-order relational structures. But before we take that step, it’s worth mentioning that there are other graphical models used in probabilistic reasoning. (If you prefer videos try this or this.)

In particular, there’s a class of models called Markov networks, which unlike Bayesian networks involve undirected edges. Given an undirected graph, a distribution is given by assigning non-negative real functions, ϕ_k, to the cliques k in the graph. These are called potential functions.

Then the probability of the node variables being in a given state is given by the product of the potential functions evaluated at the corresponding clique states, divided by a normalizing factor: P(x) = (1/Z) ∏_k ϕ_k(x_k), where Z sums the product over all joint states. All of which should put you in mind of certain models from statistical physics.
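As a minimal sketch of this factorization, here is a toy Markov network on three binary variables A, B, C with cliques {A, B} and {B, C}. The particular potential tables are illustrative assumptions, not anything from the post; the point is just the product-over-cliques-divided-by-Z recipe.

```python
import itertools

# Illustrative clique potentials (assumed values, favouring agreement):
phi_AB = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 2.0}
phi_BC = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

def unnormalized(a, b, c):
    """Product of the clique potentials at a given joint state."""
    return phi_AB[(a, b)] * phi_BC[(b, c)]

# The normalizing factor Z (the partition function, as in statistical
# physics) sums the potential product over all joint states.
Z = sum(unnormalized(a, b, c)
        for a, b, c in itertools.product([0, 1], repeat=3))

def prob(a, b, c):
    """P(A=a, B=b, C=c) = product of potentials / Z."""
    return unnormalized(a, b, c) / Z

# Dividing by Z makes the probabilities sum to 1.
total = sum(prob(a, b, c) for a, b, c in itertools.product([0, 1], repeat=3))
```

With these assumed tables Z works out to 12, so e.g. the all-ones state gets probability (2 · 2)/12 = 1/3.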

What Markov networks can’t represent is the classic Bayesian network converging-arrows situation, where a variable, C, has two independent causes, A and B. Whereas P(A, B) = P(A) ⋅ P(B), it is not the case that P(A, B | C) = P(A | C) ⋅ P(B | C). The best a Markov network can do is form a complete graph on A, B and C.
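One can check this "explaining away" effect by brute-force enumeration. In the sketch below (my own illustrative choice of distribution), A and B are independent fair coins and C is their logical OR: marginally A and B are independent, but conditional on C = 1 they are not.

```python
import itertools

# V-structure A -> C <- B: A, B independent fair coins (assumed),
# C deterministically the OR of A and B (assumed).
def joint(a, b, c):
    p_c_given_ab = 1.0 if c == (a | b) else 0.0
    return 0.5 * 0.5 * p_c_given_ab

def marginal(**fixed):
    """Sum the joint over all states consistent with the fixed values."""
    return sum(joint(a, b, c)
               for a, b, c in itertools.product([0, 1], repeat=3)
               if all({'a': a, 'b': b, 'c': c}[k] == v
                      for k, v in fixed.items()))

# Marginally, A and B are independent: P(A, B) = P(A) P(B).
lhs = marginal(a=1, b=1)
rhs = marginal(a=1) * marginal(b=1)

# Conditioned on C = 1, they are not: learning B = 1 "explains away" C,
# lowering the probability of A.
p_ab_given_c = marginal(a=1, b=1, c=1) / marginal(c=1)
p_a_given_c = marginal(a=1, c=1) / marginal(c=1)
p_b_given_c = marginal(b=1, c=1) / marginal(c=1)
```

Here P(A=1, B=1 | C=1) = 1/3 while P(A=1 | C=1) ⋅ P(B=1 | C=1) = (2/3)² = 4/9, so conditional independence fails.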

On the other hand, where Markov networks can surpass Bayesian networks is in situations of cyclic dependency. The most natural illustration of this structure involves four heterosexual people, but as this is a family blog, let’s do this with infected computers. So we have personal computers A and B, and servers C and D. Each computer is connected to the two servers, and vice versa, but the computers aren’t directly connected, nor are the servers.

Then we have independencies such as P(A is infected | B, C, D are infected) = P(A is infected | C, D are infected), which are not representable by Bayesian networks.
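Again this can be verified by enumeration. Below is the square network A–C–B–D–A with one potential per edge; the specific potential (favouring agreement between neighbours) is an illustrative assumption. Because the unnormalized joint factors as (a function of A, C, D) × (a function of B, C, D), conditioning on both servers renders A and B independent.

```python
import itertools

def phi(x, y):
    """Assumed edge potential: neighbours prefer to agree."""
    return 2.0 if x == y else 1.0

def unnormalized(a, b, c, d):
    # One potential per edge of the square; no A-B or C-D edge.
    return phi(a, c) * phi(a, d) * phi(b, c) * phi(b, d)

states = list(itertools.product([0, 1], repeat=4))
Z = sum(unnormalized(*s) for s in states)

def p(pred):
    """Probability of the event described by pred(a, b, c, d)."""
    return sum(unnormalized(*s) for s in states if pred(*s)) / Z

# P(A=1 | B=1, C=1, D=1) versus P(A=1 | C=1, D=1):
p_a_given_bcd = (p(lambda a, b, c, d: a == 1 and b == 1 and c == 1 and d == 1)
                 / p(lambda a, b, c, d: b == 1 and c == 1 and d == 1))
p_a_given_cd = (p(lambda a, b, c, d: a == 1 and c == 1 and d == 1)
                / p(lambda a, b, c, d: c == 1 and d == 1))
# The two agree: given both servers, B's state tells us nothing about A.
```

With these potentials both conditionals come out to 0.8, and the equality holds for any choice of edge potentials, since it follows from the factorization alone.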

So, neither Bayesian networks nor Markov networks capture all the independencies you might wish to represent. But graphical modellers haven’t stopped there. There are also factor graphs and their cousins directed factor graphs. Then there are chain graphs, which allow directed and undirected edges in the same graph. I’d like a clearer picture of how these models relate to each other.

Posted at December 21, 2007 12:03 PM UTC

2 Comments & 0 Trackbacks

Re: Progic VI

Markov nets and Bayesian nets are very closely related, and one can easily move from one to the other.

Lauritzen and Spiegelhalter’s Junction Tree Algorithm for computing probabilities in a Bayesian net first converts the Bayesian net to a Markov net. Going the other way round, a maximum entropy probability distribution is most naturally represented by a Markov net, but one can straightforwardly convert that to a Bayesian net representation, called an Objective Bayesian Net.

For probability logic it is often important to use another kind of network representation. If one is given premiss sentences and their probabilities, then these constrain a closed convex set of probability functions (a *credal set*). This set is most naturally represented by a *credal net*, which looks like a Bayesian net but contains probability intervals rather than probabilities in the probability tables. See here.

Some colleagues and I are currently working out how to use credal nets for inference in probability logic, in our progicnet project.

Posted by: Jon Williamson on December 21, 2007 3:00 PM | Permalink | Reply to this

Re: Progic VI

But there’s no mapping from Bayesian nets to Markov nets, or the other way, which retains precisely the independencies. E.g., when you move from the three-node converging-arrows Bayesian net to a Markov net, it’s to the complete graph on three nodes, i.e., one with no independencies.

With n binary variables, the space of probability distributions forms a (2^n − 1)-dimensional space. Network structures correspond to subspaces of this space. So the set of all n-node Bayesian networks and the set of all n-node Markov networks each form a poset, mapping to the poset of these subspaces ordered by inclusion.

These posets of subspaces are not identical, so one may lose information passing between them. If we restrict to graphs which are rooted trees, however, then we can translate accurately between them.

Directed factor graphs, it is claimed at Frey’s page, can express all independencies of Bayesian and Markov networks.

Posted by: David Corfield on December 21, 2007 3:36 PM | Permalink | Reply to this
