Dependency Networks for Inference, Collaborative Filtering and Data Visualization

Heckerman et al. present Dependency Networks (DNs), a generalization of Bayesian Networks (BNs). A BN is a directed acyclic graph with a probability distribution associated with each node such that the joint probability distribution over all the nodes can be computed. A DN is a directed graph -- not necessarily acyclic -- with a probability distribution associated with each node such that local dependencies can be computed. There is no guarantee, however, that a consistent joint probability distribution exists that matches the local distributions.

The chief advantage of DNs over BNs is that they can be learned significantly faster (in both time and space). They also allow cyclic relations because they represent dependencies, not causality. More technically, a DN for a set of variables X defines a joint distribution over X via a Gibbs sampler. Note that Gibbs sampling is order dependent: we can get different distributions depending on the order in which we iterate over the variables in X. The nice property is that if the learned local distributions are close to the true local distributions, then Gibbs sampling will produce a joint distribution that is also close to the true joint distribution (Hoffman, 2000 proves this for perturbations measured with an L2 norm). Each local distribution in a DN can be learned with any regression/classification method: "The DN can be thought of as a mechanism for combining regression/classification models via Gibbs sampling to determine a joint distribution." Heckerman et al. use decision trees to learn each local distribution.

After setting the stage by introducing and explaining DNs, the paper goes on to describe using them for probabilistic inference, collaborative filtering (preference matching), and data visualization. DNs work well for all of these tasks because they tend to be so much cheaper computationally.
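The idea of combining local conditional models via Gibbs sampling can be sketched in a few lines. This is a toy illustration, not the paper's method: the three hand-set conditional probability tables below stand in for the learned decision trees, and the variable names X, Y, Z are made up for the example. Note the X-Y cycle, which a BN would forbid but a DN allows.

```python
import random
from collections import Counter

# Toy dependency network over three binary variables.
# Each local distribution returns P(var = 1 | current values of the others);
# in the paper these would be learned classifiers (decision trees).
def p_x(state):  # P(X=1 | Y)
    return 0.8 if state["Y"] == 1 else 0.2

def p_y(state):  # P(Y=1 | X) -- together with p_x this forms a cycle
    return 0.7 if state["X"] == 1 else 0.3

def p_z(state):  # P(Z=1 | Y)
    return 0.9 if state["Y"] == 1 else 0.1

local_dists = {"X": p_x, "Y": p_y, "Z": p_z}
order = ["X", "Y", "Z"]  # Gibbs sampling is order dependent

def gibbs_joint(n_sweeps=20000, burn_in=1000, seed=0):
    """Approximate the joint distribution defined by the ordered Gibbs sampler."""
    rng = random.Random(seed)
    state = {v: 0 for v in order}
    counts = Counter()
    for sweep in range(n_sweeps + burn_in):
        for v in order:  # resample each variable from its local distribution
            state[v] = 1 if rng.random() < local_dists[v](state) else 0
        if sweep >= burn_in:
            counts[tuple(state[v] for v in order)] += 1
    total = sum(counts.values())
    return {config: c / total for config, c in counts.items()}

joint = gibbs_joint()
```

The long-run frequencies of the sampled configurations serve as the joint distribution; any marginal or conditional query can then be answered by summing over the relevant configurations.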
Indeed, BNs give slightly better results in several of the tasks presented but at a far higher cost. This is a nice paper: clean, well presented and mathy without being too mired in the muck. |
Copyright -- Gary Warren King, 2004 - 2006