opening it up with Common Lisp

Fully Distributed Representations
Pentti Kanerva, 1997 (Paper URL)
Monday, August 8, 2005

Pentti Kanerva made a name for himself back in 1988 with a little book called Sparse Distributed Memory. In it, he outlined a computational model of memory that made sense from both computational and neurological perspectives. In this 1997 paper, he builds on the work of Plate, Hinton, Pollack, and others to describe a simple distributed representation for the kind of data the rest of us would store in fixed-size records with fields. The representation is slightly reminiscent of Bloom filters: represent each thing (field name or value) as a very long random bit string (vectors of reals or complex numbers work too); bind each (name, value) pair together with bitwise exclusive-OR; then chunk sets of these bound pairs into a single vector by majority rule (i.e., each bit of the result is set to whichever value appears most often in that position, with ties broken at random).
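The bind-and-chunk scheme is simple enough to sketch in a few lines. Here is a minimal Python illustration of the idea as I've described it (the field names, values, and the 10,000-bit dimensionality are my own choices for the example, not details from the paper):

```python
import random

random.seed(0)
N = 10_000  # dimensionality of the random bit vectors


def rand_vec():
    """A long random bit string representing one name or value."""
    return [random.getrandbits(1) for _ in range(N)]


def bind(a, b):
    """Bind a (name, value) pair with bitwise XOR. XOR is its own
    inverse, so binding the result with the name again unbinds it."""
    return [x ^ y for x, y in zip(a, b)]


def chunk(vecs):
    """Combine bound pairs by majority rule: each output bit takes the
    value that appears most often in that position; ties break at random."""
    out = []
    for bits in zip(*vecs):
        s = sum(bits)
        if 2 * s > len(bits):
            out.append(1)
        elif 2 * s < len(bits):
            out.append(0)
        else:
            out.append(random.getrandbits(1))
    return out


def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))


# A tiny "record" with two fields: name = Alice, age = thirty.
NAME, AGE = rand_vec(), rand_vec()
ALICE, THIRTY = rand_vec(), rand_vec()
record = chunk([bind(NAME, ALICE), bind(AGE, THIRTY)])

# Probe: XOR the field name back out of the chunked record. The result
# is a noisy copy of ALICE, recoverable by nearest-neighbor lookup,
# because unrelated vectors sit near Hamming distance N/2.
probe = bind(record, NAME)
dists = {label: hamming(probe, v)
         for label, v in [("ALICE", ALICE), ("THIRTY", THIRTY)]}
```

With `N = 10_000`, the probe lands roughly 2,500 bits away from `ALICE` but about 5,000 bits (chance level) away from everything else, so recovery is essentially certain despite the noise the majority rule introduces.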

The amazing thing is that even with all this randomness and these bitwise operations, the resultant chunked vectors remain similar to the pieces from which they were derived, and you can pull pairs and values back out of the vector in ways that support both regular lookup and more analogical search. In contrast to 'normal' representations, where a single flipped bit brings all to ruin, these holistic (holographic) representations tolerate noise and combine "structure and semantics" so that similarity between vectors actually reflects similarity of meaning. Since you cannot keep chunking values together without a disastrous loss of information, the encoding might also explain George Miller's magical number seven, plus or minus two.
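The noise tolerance is easy to see directly: corrupt a large fraction of a stored vector's bits and nearest-neighbor lookup against a codebook still finds it, because unrelated random vectors cluster near Hamming distance N/2. A self-contained sketch (the codebook and the 30% corruption level are my own illustrative choices):

```python
import random

random.seed(1)
N = 10_000  # dimensionality of the random bit vectors


def rand_vec():
    return [random.getrandbits(1) for _ in range(N)]


def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))


def flip(v, fraction):
    """Corrupt a vector by flipping a random fraction of its bits."""
    out = list(v)
    for i in random.sample(range(N), int(fraction * N)):
        out[i] ^= 1
    return out


# A small codebook of stored items.
codebook = {name: rand_vec() for name in ["a", "b", "c", "d"]}

# Flip 30% of item "a"'s bits -- far more damage than any single
# flipped bit, yet lookup still succeeds: the corrupted copy is 3,000
# bits from "a" but about 5,000 bits from everything else.
noisy = flip(codebook["a"], 0.30)
nearest = min(codebook, key=lambda k: hamming(noisy, codebook[k]))
```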

In sum, this is a wonderful, eye-opening paper that combines math, mind, and amazement.


Copyright -- Gary Warren King, 2004 - 2006