opening it up with Common Lisp


Some Practical Issues in Constructing Belief Networks
Max Henrion, 1989
Tuesday, June 1, 2004

Max Henrion presents a detailed case study applying Knowledge Engineering (KE) to the construction of a Bayesian network for predicting damage in apple orchards. This is a well-written and mostly easy-to-follow paper (even for someone like me with little background in the field). He steps through the phases of belief-net construction and provides detailed explanations of noisy-or and sensitivity analysis. It is an old paper and not a deep one, but it provides welcome relief from some of the high-flying abstractions found in more recent work.

Noisy-or defined

What is Noisy-or anyways?

Noisy-or applies when there are several independent causes and each cause i has some probability q_i of producing the binary effect y even in the absence of the others. The probability of y given a subset of the causes then comes from multiplying: y fails to occur only if every present cause fails, so P(y) = 1 minus the product of (1 - q_i) over the causes that are present. The win here is that we need only n parameters instead of 2 to the n.
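The multiplication can be sketched in a few lines (the names here are mine, not the paper's):

```python
# Noisy-or sketch: each cause i, when present, independently produces
# effect y with probability q[i]; y fails only if every present cause
# fails to produce it, so P(y) = 1 - prod(1 - q[i]) over present causes.

def noisy_or(q, present):
    """q: per-cause activation probabilities; present: which causes hold."""
    p_fail = 1.0
    for qi, on in zip(q, present):
        if on:
            p_fail *= (1.0 - qi)
    return 1.0 - p_fail
```

With causes at q = 0.8 and 0.5 both present, P(y) = 1 - 0.2 * 0.5 = 0.9.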

Adding an additional leak probability (the chance that the effect occurs even when none of the (known) causes is true) is a handy modeling trick.
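The leak folds in naturally as one extra, always-present cause (a sketch; q0 is my name for the leak probability):

```python
# Leaky noisy-or sketch: the leak q0 is the probability that y occurs
# even when no known cause is active, treated as an always-on cause.

def leaky_noisy_or(q0, q, present):
    p_fail = 1.0 - q0  # the leak "cause" never turns off
    for qi, on in zip(q, present):
        if on:
            p_fail *= (1.0 - qi)
    return 1.0 - p_fail
```

With no known causes active, the effect still occurs with probability q0.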

Noisy-or can be generalized to non-boolean causes and non-boolean effects. In the first case, we need to assess the probability of the effect for each level of the non-boolean cause. In the second, we treat an n-ary discrete variable as n-1 binary ones. The final value for the effect is then the maximum of the levels produced by each influencing variable. Note that we are assuming causal independence, so even if every binary variable indicates "medium" (for example), we are still at medium, not high. [Frankly, I don't think I've understood what he means on this last one. Sigh.]
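One way to read the graded-effect case is as a "noisy-max" (this sketch and its names are my own guess at what Henrion means): each influencing variable independently proposes an effect level, and the effect takes the maximum proposal. Since the max is at most l exactly when every proposal is at most l, the cumulative probabilities multiply.

```python
# Noisy-max sketch: parent_dists gives, per parent, a probability list
# over effect levels 0..k-1. The effect is the max of the parents'
# independently drawn levels, so P(max <= l) = prod_i P(level_i <= l).

def noisy_max(parent_dists):
    k = len(parent_dists[0])
    cdf = []
    for l in range(k):
        p = 1.0
        for dist in parent_dists:
            p *= sum(dist[: l + 1])  # P(this parent's level <= l)
        cdf.append(p)
    # turn the CDF of the max back into a distribution over levels
    return [cdf[0]] + [cdf[l] - cdf[l - 1] for l in range(1, k)]
```

With two parents that each propose "none" or "medium" with probability 0.5 (and never "high"), the effect is medium with probability 0.75 and high with probability 0 — agreeing mediums stay medium.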

Sensitivity analysis

I'm sensitive to analysis, are you?

Sensitivity analysis indicates the relative importance of one variable with respect to another. One measure is the sensitivity range of y with respect to x: the maximum possible change in the probability of y as the probability of x goes from 0 to 1. Obviously, the magnitude of the sensitivity range must be less than or equal to 1. Since ranges multiply along a chain of links, this means that the further apart two nodes are (in the Bayes graph), the less effect they can have on each other.
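A small sketch of why the range is so simple in the one-link case (my formulation, not the paper's): if y depends on x alone, P(y) is linear in P(x), so the biggest change over the sweep is just the difference of the two conditionals.

```python
# P(y) = P(y|x) * P(x) + P(y|~x) * (1 - P(x)) is linear in P(x), so the
# maximum change in P(y) as P(x) goes from 0 to 1 is the slope itself.

def sensitivity_range(p_y_given_x, p_y_given_not_x):
    return p_y_given_x - p_y_given_not_x  # signed; magnitude <= 1

# Along a chain x -> z -> y the linear maps compose, so the magnitudes
# multiply and influence shrinks with graph distance.
def chain_range(range_z_wrt_x, range_y_wrt_z):
    return range_z_wrt_x * range_y_wrt_z
```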

Things work differently for diagnostic links. Suppose A influences B and that there is a chance of error E in this assessment (assume that A and E are independent). Then L(b,a | e) = p(b|a,e) / p(b|-a,e). I'm sorry to say that I lose track of the math about here, but Henrion asserts that we can get high sensitivity factors when the prior probability of A is high and L(b,a | e) is small. I'll try to update this when I understand it.
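For what it's worth, here is how a likelihood ratio like L(b,a|e) feeds a Bayes update in odds form (my own illustration of the standard machinery, not Henrion's derivation):

```python
# Odds-form Bayes update: observing b multiplies the prior odds of a
# by the likelihood ratio L = p(b|a,e) / p(b|-a,e).

def posterior_a(prior_a, p_b_given_a_e, p_b_given_not_a_e):
    L = p_b_given_a_e / p_b_given_not_a_e
    odds = (prior_a / (1.0 - prior_a)) * L  # posterior odds of a given b
    return odds / (1.0 + odds)               # back to a probability
```

With a 50% prior on A and L = 3 (say p(b|a,e) = 0.9 against p(b|-a,e) = 0.3), observing B moves the belief in A to 0.75.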


Copyright -- Gary Warren King, 2004 - 2006