More details later
"... deals with fusing and propagating the impact of new evidence and beliefs through Bayesian networks so that each proposition eventually will be assigned a certainty measure consistent with the axioms of probability theory." [Pearl, 1988]
This propagation algorithm assumes that the Bayesian network is singly connected, i.e. the underlying undirected graph contains at most one path between any two nodes (the network is a polytree). Note this is stronger than merely being a directed acyclic graph (DAG): every Bayesian network is a DAG, but a DAG may contain multiple undirected paths between two nodes.
The likelihood vector equals the term-by-term product of all the messages passed up from the node's children.
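A minimal sketch of this step, assuming a binary node X with two children; the message values are made up for illustration:

```python
import numpy as np

# Hypothetical lambda messages sent to X by its two children
# (one entry per value of X; here X is binary).
lam_child1 = np.array([0.9, 0.3])
lam_child2 = np.array([0.5, 0.4])

# The likelihood vector lambda(x) is the term-by-term product
# of all messages received from X's children.
lam_x = lam_child1 * lam_child2
print(lam_x)  # [0.45 0.12]
```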
The prior probability vector equals the dot product of the conditional probability matrix of X, given all possible combinations of its parents' values, and the messages passed down from its parents.
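A sketch of this step for the simplest case of a single binary parent U, so the "combinations of parent values" are just the two values of U; the conditional probability table and message values are hypothetical:

```python
import numpy as np

# Hypothetical CPT for a binary X with a single binary parent U:
# rows index values of X, columns index values of U, entries are P(x | u).
P_x_given_u = np.array([[0.7, 0.2],
                        [0.3, 0.8]])

# Hypothetical pi message passed down to X from its parent U.
pi_from_u = np.array([0.6, 0.4])

# The prior vector pi(x) is the dot (matrix-vector) product of the
# CPT with the parent's message.
pi_x = P_x_given_u @ pi_from_u
print(pi_x)  # [0.5 0.5]
```

With several parents, the same computation runs over every joint combination of parent values, weighting each column of the CPT by the product of the corresponding parent messages.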
Our belief in the values of X equals the normalised term-by-term product of the likelihood vector and the prior probability vector.
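Continuing the sketch with the (hypothetical) vectors from the previous steps, the belief is the element-wise product rescaled to sum to 1:

```python
import numpy as np

lam_x = np.array([0.45, 0.12])  # likelihood vector from the children
pi_x = np.array([0.5, 0.5])     # prior vector from the parents

# Belief in X: normalised term-by-term product of likelihood and prior.
bel_unnorm = lam_x * pi_x
bel_x = bel_unnorm / bel_unnorm.sum()
print(bel_x)  # sums to 1
```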
This is easier to show by example than to explain in words. NB: if λ(x) is a unit vector (i.e. all 1's), then the output of the formula is also a unit vector.
The message that X passes to a particular child equals the belief of X divided (term-by-term) by the message that child sent to X. Here, division by zero is only defined when the numerator is also zero: zero divided by zero is defined as zero.
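A sketch of this division with the 0/0 = 0 convention, using made-up values where the child's message has a zero entry exactly where the belief does:

```python
import numpy as np

bel_x = np.array([0.6, 0.0])        # hypothetical belief in X
lam_from_y = np.array([0.8, 0.0])   # hypothetical message child Y sent to X

# Message X passes to child Y: belief divided term-by-term by Y's own
# message. np.divide with `where` skips the zero denominators, and the
# zero-filled `out` array implements the 0/0 = 0 convention.
msg_to_y = np.divide(bel_x, lam_from_y,
                     out=np.zeros_like(bel_x),
                     where=lam_from_y != 0)
print(msg_to_y)  # [0.75 0.  ]
```

Dividing out the child's own message prevents Y's evidence from being fed back to Y, which would count it twice.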