Oral
Feedforward Learning of Mixture Models
Matthew Lawlor · Steven W Zucker
We develop a biologically plausible learning rule that provably converges to the class means of general mixture models. This rule generalizes the classical BCM neural rule within a tensor framework, substantially increasing the generality of the learning problem it solves. It achieves this by incorporating triplets of samples from the mixtures, which provides a novel information-processing interpretation of spike-timing-dependent plasticity (STDP). We provide both proofs of convergence and a close fit to experimental data on STDP.
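For context, the classical BCM rule that the abstract generalizes can be sketched as follows. This is a minimal illustration of the standard (non-tensor) BCM update with a sliding modification threshold, not the triplet-based rule the paper proposes; the learning rate, threshold time constant, and random data here are illustrative assumptions.

```python
import numpy as np

def bcm_update(w, x, theta, eta=0.01, tau=0.1):
    """One step of the classical BCM rule.

    w     : synaptic weight vector
    x     : input sample
    theta : sliding modification threshold (running average of y**2)
    """
    y = w @ x                                # postsynaptic response
    w = w + eta * y * (y - theta) * x        # BCM weight change
    theta = (1 - tau) * theta + tau * y**2   # slide threshold toward y**2
    return w, theta

# Illustrative run on random inputs (not data from the paper).
rng = np.random.default_rng(0)
w, theta = rng.normal(size=5), 1.0
for _ in range(100):
    w, theta = bcm_update(w, rng.normal(size=5), theta)
```

The sliding threshold `theta` is what distinguishes BCM from plain Hebbian learning: responses above it potentiate the synapse, responses below it depress it, and the threshold itself tracks recent activity.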
Author Information
Matthew Lawlor (Yale University)
Steven W Zucker (Yale University)
Related Events (a corresponding poster, oral, or spotlight)
- 2014 Poster: Feedforward Learning of Mixture Models
  Wed. Dec 10th 12:00 -- 04:59 AM, Room Level 2, room 210D
More from the Same Authors
- 2013 Poster: Third-Order Edge Statistics: Contour Continuation, Curvature, and Cortical Connections
  Matthew Lawlor · Steven W Zucker
- 2009 Workshop: The Curse of Dimensionality Problem: How Can the Brain Solve It?
  Simon Haykin · Terrence Sejnowski · Steven W Zucker