# Pearl's Belief Propagation Algorithm

More details later

## Purpose of Algorithm

"... deals with fusing and propagating the impact of new evidence and beliefs through Bayesian networks so that each proposition eventually will be assigned a certainty measure consistent with the axioms of probability theory." [Pearl, 1988]

## Algorithm

This propagation algorithm assumes that the Bayesian network is singly connected, i.e. the underlying undirected graph contains at most one path between any two nodes (a polytree). Note that this is stronger than merely being a directed acyclic graph (DAG): every Bayesian network is a DAG, but a DAG may still contain undirected cycles.

### Propagation Rules

The likelihood vector equals the term-by-term product of all the messages passed up from the node's children.

The prior probabilities vector equals the matrix product of the conditional probability matrix of X, given all possible combinations of its parents' values, and the messages passed down from its parents.

Our belief in the values of X is equal to the normalised term-by-term product of the likelihood vector and the prior probabilities vector.
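The three rules above can be sketched in plain Python for a binary variable X with one parent and two children. The probability tables and messages here are made-up illustrations (not from the source), and the names `lam`, `pi` and `bel` stand for Pearl's likelihood vector, prior probabilities vector and belief:

```python
def elementwise(u, v):
    return [a * b for a, b in zip(u, v)]

def normalise(v):
    s = sum(v)
    return [a / s for a in v]

# Rule 1: likelihood vector = term-by-term product of the
# messages passed up from X's children (illustrative numbers)
msgs_from_children = [[0.9, 0.2], [0.5, 0.5]]
lam = [1.0, 1.0]
for m in msgs_from_children:
    lam = elementwise(lam, m)

# Rule 2: prior probabilities vector = conditional probability
# matrix P(x | u) applied to the message from X's parent U
# (row u, column x; illustrative numbers)
P_x_given_u = [[0.7, 0.3],   # P(x | u = 0)
               [0.1, 0.9]]   # P(x | u = 1)
msg_from_parent = [0.4, 0.6]
pi = [sum(msg_from_parent[u] * P_x_given_u[u][x] for u in range(2))
      for x in range(2)]

# Rule 3: belief = normalised term-by-term product of the two
bel = normalise(elementwise(lam, pi))
```

Because of the normalisation in the last step, `bel` always sums to 1 regardless of the scale of the incoming messages.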

### Messages to Parents

The message that X passes up to a parent U is obtained by summing out X: multiply X's likelihood vector by the conditional probability matrix of X given U, and sum over the values of X. NB: If the likelihood vector is a unit vector (ie. all 1's) then the resulting message is also a unit vector, because each row of the conditional probability matrix sums to 1.

### Messages to Children

The message that X is going to pass on to a particular child equals the belief of X divided (term-by-term) by the message that child sent to X. Here, division by zero is only defined when the numerator is also equal to zero; zero divided by zero is defined as zero in this case.
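A minimal sketch of this rule, including the 0/0 := 0 convention from the text (the belief and message values are illustrative, not from the source):

```python
def safe_div(a, b):
    # Division by zero is only defined when the numerator is
    # also zero, in which case the result is defined as zero
    if b == 0:
        if a != 0:
            raise ZeroDivisionError("undefined: nonzero / zero")
        return 0.0
    return a / b

def message_to_child(bel, msg_from_that_child):
    # Term-by-term division removes that child's own contribution
    # from X's belief before passing it back down
    return [safe_div(a, b) for a, b in zip(bel, msg_from_that_child)]

bel = [0.6, 0.0, 0.4]
child_msg = [0.5, 0.0, 0.8]
msg = message_to_child(bel, child_msg)  # -> [1.2, 0.0, 0.5]
```

Dividing out the child's own message prevents evidence from being counted twice when it is echoed back to the child that reported it.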

### Boundary Conditions

1. Root nodes: If X is a node with no parents, we set its prior probabilities vector equal to the prior distribution P(x).

2. Anticipatory nodes: If X is a childless node that has not been instantiated, we set its likelihood vector to a vector of all 1's.

3. Evidence nodes: If evidence X=xi is obtained, we set its likelihood vector to (0,...,0,1,0,...,0) with a 1 at the ith position.
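The three boundary conditions can be sketched as small helpers; the function names are mine, not Pearl's, and `n` is the number of values X can take:

```python
def root_pi(prior):
    # 1. Root node: the prior probabilities vector is just P(x)
    return list(prior)

def anticipatory_lambda(n):
    # 2. Childless, uninstantiated node: likelihood vector of all 1's
    return [1.0] * n

def evidence_lambda(n, i):
    # 3. Evidence X = x_i: indicator vector with a 1 at position i
    return [1.0 if k == i else 0.0 for k in range(n)]
```

Together these supply the starting vectors from which the propagation rules above can run over the whole polytree.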

## Dealing with Cycles in the Graph
