Visual Search Asymmetry: Deep Nets and Humans Share Similar Inherent Biases
Shashi Kant Gupta · Mengmi Zhang · Chia-Chien Wu · Jeremy Wolfe · Gabriel Kreiman

Thu Dec 09 12:30 AM -- 02:00 AM (PST)

Visual search is a ubiquitous and often challenging daily task, exemplified by looking for the car keys at home or a friend in a crowd. An intriguing property of some classical search tasks is an asymmetry such that finding a target A among distractors B can be easier than finding B among A. To elucidate the mechanisms responsible for asymmetry in visual search, we propose a computational model that takes a target and a search image as inputs and produces a sequence of eye movements until the target is found. The model integrates eccentricity-dependent visual recognition with target-dependent top-down cues. We compared the model against human behavior in six paradigmatic search tasks that show asymmetry in humans. Without prior exposure to the stimuli or task-specific training, the model provides a plausible mechanism for search asymmetry. We hypothesized that the polarity of search asymmetry arises from experience with the natural environment. We tested this hypothesis by training the model on augmented versions of ImageNet where the biases of natural images were either removed or reversed. The polarity of search asymmetry disappeared or was altered depending on the training protocol. This study highlights how classical perceptual properties can emerge in neural network models, without the need for task-specific training, but rather as a consequence of the statistical properties of the developmental diet fed to the model. All source code and data are publicly available at https://github.com/kreimanlab/VisualSearchAsymmetry.
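The abstract describes a model that repeatedly selects fixation locations until the target is found. As an illustration only (not the authors' implementation, which integrates eccentricity-dependent recognition with top-down target cues), the core fixation loop can be sketched as a greedy selection over a target-modulated attention map with inhibition of return; all function and variable names here are hypothetical:

```python
import numpy as np

def search_until_found(attention_map, target_loc, max_fixations=100):
    """Hypothetical sketch of a visual-search fixation loop:
    repeatedly fixate the current peak of a target-modulated
    attention map, applying inhibition of return to visited
    locations, until the target location is fixated."""
    amap = attention_map.astype(float).copy()
    fixations = []
    for _ in range(max_fixations):
        # Next fixation: the most target-like location remaining.
        loc = np.unravel_index(np.argmax(amap), amap.shape)
        fixations.append(loc)
        if loc == target_loc:
            return fixations  # target found
        # Inhibition of return: never revisit this location.
        amap[loc] = -np.inf
    return fixations  # gave up after max_fixations
```

In this toy setting, a search asymmetry would show up as the attention map for "A among B" concentrating mass on the target faster than the map for "B among A", so fewer fixations are needed in one direction than the other.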

Author Information

Shashi Kant Gupta (Indian Institute of Technology Kanpur)
Mengmi Zhang (Boston Children's Hospital; Harvard Medical School)
Chia-Chien Wu (Harvard Medical School)
Jeremy Wolfe
Gabriel Kreiman (Harvard Medical School)

Gabriel Kreiman is an Associate Professor at Children's Hospital, Harvard Medical School, and leads the thrust to study neural circuits in the Center for Brains, Minds and Machines (MIT/Harvard). He received the NSF CAREER Award, the NIH New Innovator Award, and the Pisart Award for Vision Research. Research in the Kreiman laboratory combines computational, neurophysiological, and behavioral tools to further our understanding of how intelligent computations are implemented by neural circuits in the brain. His work has shed light on the biological codes that represent information in cortex and on the fundamental principles underlying computations involved in vision and learning. For further details about his work, please visit klab.tch.harvard.edu.
