

Session

Oral Session 3

Hugo Larochelle


Tue 9 Dec. 11:00 - 11:50 PST

Invited Talk
Role of Coupled Networks in 21st Century Energy Infrastructure

Arunava Majumdar

Our modern economy overwhelmingly depends on the electricity infrastructure, or the grid. The architecture of the grid owes its origins to Tesla, Edison, and their industrial partners, and has remained largely unchanged since then. But this paradigm is being challenged by a confluence of factors, namely the age of the physical assets, cyber-physical security, weather-related stresses, and the rapidly falling cost of renewable electricity and storage. Together these are producing some unmistakable trends toward increased integration of distributed generation and storage. Furthermore, the integration of communication, computing, and control to automate system operation is at an early stage. At scale, the resulting architecture will produce coupling of two large networks: electricity and information. Finally, the abundance and low cost of natural gas has already led to increased dependence on natural gas for electricity generation. The coupling between the natural gas and electricity networks is producing dramatic fluctuations in prices with severe economic impact. This talk will discuss the changing landscape of the electricity infrastructure, the challenges that may arise from network coupling, and opportunities to use machine learning to address these challenges.

Tue 9 Dec. 11:50 - 12:10 PST

Oral
Analog Memories in a Balanced Rate-Based Network of E-I Neurons

Dylan Festa · Guillaume Hennequin · Mate Lengyel

The persistent and graded activity often observed in cortical circuits is sometimes seen as a signature of autoassociative retrieval of memories stored earlier in synaptic efficacies. However, despite decades of theoretical work on the subject, the mechanisms that support the storage and retrieval of memories remain unclear. Previous proposals concerning the dynamics of memory networks have fallen short of incorporating some key physiological constraints in a unified way. Specifically, some models violate Dale's law (i.e. allow neurons to be both excitatory and inhibitory), while some others restrict the representation of memories to a binary format, or induce recall states in which some neurons fire at rates close to saturation. We propose a novel control-theoretic framework to build functioning attractor networks that satisfy a set of relevant physiological constraints. We directly optimize networks of excitatory and inhibitory neurons to force sets of arbitrary analog patterns to become stable fixed points of the dynamics. The resulting networks operate in the balanced regime, are robust to corruptions of the memory cue as well as to ongoing noise, and incidentally explain the reduction of trial-to-trial variability following stimulus onset that is ubiquitously observed in sensory and motor cortices. Our results constitute a step forward in our understanding of the neural substrate of memory.
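The core optimization the abstract describes, driving a set of analog patterns toward being fixed points of an E-I rate network while respecting Dale's law, can be illustrated with a minimal sketch. This is not the authors' control-theoretic method (which also optimizes for stability and balance); it only shows the fixed-point condition r = W φ(r) + h solved by projected gradient descent, with column signs of W clipped so each neuron is purely excitatory or purely inhibitory. All network sizes, the tanh nonlinearity, and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 8, 2                              # neurons (half E, half I), patterns
sign = np.repeat([1.0, -1.0], N // 2)    # Dale's law: fixed sign per presynaptic neuron
R = rng.uniform(0.1, 0.9, size=(P, N))   # target analog firing-rate patterns
phi = np.tanh                            # assumed rate nonlinearity (illustrative)

W = 0.1 * rng.normal(size=(N, N)) * sign # weights obey Dale's law from the start
h = np.zeros(N)                          # constant input current

def project(W):
    """Clip each column to its Dale-law sign (E columns >= 0, I columns <= 0)."""
    return np.where(sign * W >= 0, W, 0.0)

lr = 0.05
for step in range(3000):
    grad_W = np.zeros_like(W)
    grad_h = np.zeros_like(h)
    for r in R:
        e = W @ phi(r) + h - r           # fixed-point residual for this pattern
        grad_W += 2 * np.outer(e, phi(r))
        grad_h += 2 * e
    W = project(W - lr * grad_W / P)     # projected gradient step on W
    h -= lr * grad_h / P
```

After training, each target pattern approximately satisfies the fixed-point equation of the dynamics τ dr/dt = −r + W φ(r) + h; the full method in the paper additionally shapes the Jacobian at each fixed point so that retrieval is stable and noise-robust.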

Tue 9 Dec. 12:10 - 12:30 PST

Oral
Feedforward Learning of Mixture Models

Matthew Lawlor · Steven W Zucker

We develop a biologically-plausible learning rule that provably converges to the class means of general mixture models. This rule generalizes the classical BCM neural rule within a tensor framework, substantially increasing the generality of the learning problem it solves. It achieves this by incorporating triplets of samples from the mixtures, which provides a novel information processing interpretation to spike-timing-dependent plasticity. We provide both proofs of convergence, and a close fit to experimental data on STDP.
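For context, the classical BCM rule that this work generalizes can be sketched as below. This is the standard rule with a sliding modification threshold tracking the running average of the squared response, not the authors' tensor/triplet extension; the rectified-linear response, the two-direction input mixture, and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inputs drawn from a two-component mixture: noisy copies of two directions.
c1 = np.array([1.0, 0.0])
c2 = np.array([0.0, 1.0])

w = rng.normal(scale=0.1, size=2)   # synaptic weights
theta = 1.0                         # sliding threshold, tracks <y^2>
eta, tau = 0.01, 0.99               # learning rate, threshold time constant

for t in range(5000):
    x = (c1 if rng.random() < 0.5 else c2) + 0.05 * rng.normal(size=2)
    y = max(w @ x, 0.0)                      # rectified postsynaptic response
    w = w + eta * y * (y - theta) * x        # BCM update: Hebbian above theta,
                                             # anti-Hebbian below it
    theta = tau * theta + (1 - tau) * y**2   # sliding modification threshold
```

The sliding threshold is what makes the rule selective: responses above θ are potentiated and responses below it are depressed, so the neuron tends to commit to one mixture component. The paper's contribution replaces this pairwise statistic with triplets of samples, which is what yields provable convergence to the mixture-class means and the connection to spike-timing-dependent plasticity.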