Dimensionality reduction, a form of compression, can simplify representations of information to increase efficiency and reveal general patterns. Yet, this simplification also forfeits information, thereby reducing representational capacity. Hence, the brain may benefit from generating both compressed and uncompressed activity, and may do so in a heterogeneous manner across diverse neural circuits that represent low-level (sensory) or high-level (cognitive) stimuli. However, precisely how compression and representational capacity differ across the cortex remains unknown. Here we predict different levels of compression across regional circuits by using random walks on networks to model activity flow and to formulate rate-distortion functions, which are the basis of lossy compression. Using a large sample of youth ($n=1,040$), we test predictions in two ways: by measuring the dimensionality of spontaneous activity from sensorimotor to association cortex, and by assessing the representational capacity for 24 behaviors in neural circuits and 20 cognitive variables in recurrent neural networks. Our network theory of compression predicts the dimensionality of activity ($t=12.13, p<0.001$) and the representational capacity of biological ($r=0.53, p=0.016$) and artificial ($r=0.61, p<0.001$) networks. The model suggests that a basic form of compression is an emergent property of activity flow between distributed circuits that communicate with the rest of the network.
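The two modeling ingredients named in the abstract, random walks on a network as a model of activity flow and the dimensionality of the resulting activity, can be illustrated with a minimal toy sketch. This is not the paper's code: the toy adjacency matrix, the noise level, and the use of the participation ratio as the dimensionality measure are illustrative assumptions.

```python
import numpy as np

# Toy sketch: propagate activity along a network via a random-walk
# transition matrix, then estimate the dimensionality of the resulting
# activity with the participation ratio of its covariance spectrum.
rng = np.random.default_rng(0)

# Toy symmetric adjacency matrix for a small network (assumption)
n = 8
A = rng.random((n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0)

# Random-walk transition matrix: P[i, j] = A[i, j] / strength(i)
P = A / A.sum(axis=1, keepdims=True)

# Activity flow: repeatedly apply the transition matrix, plus noise
T = 200
x = rng.random(n)
X = np.empty((T, n))
for t in range(T):
    x = P.T @ x + 0.1 * rng.standard_normal(n)  # one walk step + noise
    X[t] = x

# Participation ratio: (sum of eigenvalues)^2 / (sum of squared
# eigenvalues) of the activity covariance; values near 1 indicate
# highly compressed (low-dimensional) activity, values near n do not.
lam = np.linalg.eigvalsh(np.cov(X.T))
pr = lam.sum() ** 2 / (lam ** 2).sum()
print(round(pr, 2))
```

In this sketch, circuits whose transition structure funnels activity through few pathways would yield a lower participation ratio, mirroring the abstract's link between activity flow and compression.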
Dale Zhou (University of Pennsylvania)
Jason Kim (Cornell University)
I am generally interested in designing and programming functions in distributed systems. More specifically, I am interested in how recurrent neural networks represent information, and how to program their interactions to run algorithms. I am currently working on how to decompile the internal models and algorithms learned by RNNs from their dynamics and recurrent weights. My long-term goal is to develop a low-level understanding of distributed neural computation that allows us to engineer neural networks with the same degree of precision and algorithmic complexity as silicon computers. My undergraduate degree was in Biomedical Engineering at Duke University, where I worked with Dr. Marc Sommer on characterizing the response of biological neurons to transcranial magnetic stimulation in macaques. I did my PhD in Bioengineering at the University of Pennsylvania with Dr. Dani Bassett on designing advanced functions in complex neural and mechanical systems. I am currently a postdoctoral fellow in Physics at Cornell University working with Dr. Itai Cohen and Dr. James Sethna to interface distributed neural computing with microrobotics.
Danielle S Bassett (University of Pennsylvania)
Prof. Bassett is the J. Peter Skirkanich Professor at the University of Pennsylvania, with appointments in the Departments of Bioengineering, Electrical & Systems Engineering, Physics & Astronomy, Neurology, and Psychiatry. Bassett is also an external professor of the Santa Fe Institute. Bassett is best known for blending neural and systems engineering to identify fundamental mechanisms of cognition and disease in human brain networks. Bassett is currently writing a book for MIT Press entitled Curious Minds, with co-author Perry Zurn, Professor of Philosophy at American University. Bassett received a B.S. in physics from Penn State University and a Ph.D. in physics from the University of Cambridge, UK, as a Churchill Scholar and an NIH Health Sciences Scholar. Following a postdoctoral position at UC Santa Barbara, Bassett was a Junior Research Fellow at the Sage Center for the Study of the Mind. Bassett has received multiple prestigious awards, including the American Psychological Association's ‘Rising Star’ (2012), Alfred P. Sloan Research Fellowship (2014), MacArthur Fellowship (2014), Early Academic Achievement Award from the IEEE Engineering in Medicine and Biology Society (2015), Harvard Higher Education Leader (2015), Office of Naval Research Young Investigator (2015), National Science Foundation CAREER (2016), Popular Science Brilliant 10 (2016), Lagrange Prize in Complex Systems Science (2017), Erdős–Rényi Prize in Network Science (2018), OHBM Young Investigator Award (2020), and AIMBE College of Fellows (2020). Bassett is the author of more than 300 peer-reviewed publications, which have garnered over 24,000 citations, as well as numerous book chapters and teaching materials. Bassett is the founding director of the Penn Network Visualization Program, a combined undergraduate art internship and K-12 outreach program bridging network science and the visual arts.
Bassett’s work has been supported by the National Science Foundation, the National Institutes of Health, the Army Research Office, the Army Research Laboratory, the Office of Naval Research, the Department of Defense, the Alfred P. Sloan Foundation, the John D. and Catherine T. MacArthur Foundation, the Paul Allen Foundation, the ISI Foundation, and the Center for Curiosity.
Related Events (a corresponding poster, oral, or spotlight)
2022 : Compression supports low-dimensional representations of behavior across neural circuits »
Sat. Dec 3rd, 03:50 -- 03:58 PM
More from the Same Authors
2022 : Characterizing information loss in a chaotic double pendulum with the Information Bottleneck »
Kieran Murphy · Danielle S Bassett
2020 : Panel discussion 2 »
Danielle S Bassett · Yoshua Bengio · Cristina Savin · David Duvenaud · Anna Choromanska · Yanping Huang
2020 : Invited Talk Danielle Bassett »
Danielle S Bassett