Seminar - Neural Computations Underlying Auditory-object Processing
Date: November 5, 2010
Time: 4:00 PM
Location: Matheson Hall, Room 109
Yale E. Cohen, Ph.D.
Departments of Otorhinolaryngology, Bioengineering, and Neuroscience
University of Pennsylvania
Object formation is fundamental to perceptual processing. Independent of modality, three general principles are involved in the analysis of sensory objects: (1) the extraction of sensory features, (2) the combination of those features to form an object, and (3) the formation of categories that represent higher-order properties of the objects. While feature extraction is well characterized in early areas of the auditory system, the neural computations underlying feature extraction in higher areas are not well understood. Moreover, the computations underlying object and category formation are not known, and consequently, the cortical circuits that mediate these computations are not well described. In this talk, I characterize the role of the superior temporal sulcus and the prefrontal cortex in (1) feature extraction, by characterizing their first-order (linear) filter properties, (2) object formation, through the application of a linear pattern discriminator and maximum-likelihood methods, and (3) category formation. These data demonstrate a hierarchical flow of information processing between these areas and suggest that they are part of a circuit involved in auditory-object formation.
Yale E. Cohen is an Associate Professor in the Departments of Otorhinolaryngology, Bioengineering, and Neuroscience at the University of Pennsylvania, where he also heads the Auditory Research Laboratory. He received his PhD in Bioengineering from the University of Pennsylvania and was a post-doctoral fellow at Stanford and Caltech, with a focus on neurobiology. Following his post-doctoral training, he established a laboratory at Dartmouth College in the Department of Psychological and Brain Sciences. His research program examines how the brain combines sensory, motor, and cognitive cues to form internal computational models of the external world. His work focuses on understanding the representation of auditory information in the cortex; how auditory information is integrated with cognitive processes such as attention, decision making, motor planning, and memory; and how auditory and visual information is combined to form unified sensory percepts.
Matheson Hall is located at 32nd and Market Streets.