Research
Our research aims to understand the brain and behavioral mechanisms involved in auditory perception and vocal communication. We are interested in how different brain areas process the sounds that we hear, and in particular how these regions interact to support vocal communication, a complex process involving both the production and perception of communication sounds. Such communication behaviors are important for the survival of a species, and understanding them can yield important insight into human communication, including speech and language. We approach these problems through research in both humans and vocal primates, combining neurophysiologic techniques with behavioral measurements, computational tools, and engineering approaches.
Current Projects
Neural basis of auditory-self monitoring and vocal production (auditory-vocal interaction)
Humans continuously monitor the sound of their own speech to ensure accurate vocal production, a process known as self-monitoring. The neural mechanisms of this control are largely unknown. We record neural activity in both auditory and frontal cortex during communicative vocal production while perturbing auditory feedback through headphones, and we examine the relationship between cortical activity, self-monitoring, and compensatory vocal control. We also manipulate cortical circuits to test whether we can change vocal production behaviors. We are particularly interested in sensory prediction during vocal production, and in how the brain might compute an auditory 'error signal' that tells us when our voice doesn't match what we expected.
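To make the error-signal idea concrete, here is a toy sketch (not our actual model) of a pitch-shift feedback experiment: the 'error signal' is the mismatch between the pitch the speaker expects to hear and the perturbed pitch delivered over headphones, and production is partially corrected in the opposite direction. The perturbation size, compensation gain, and all values are illustrative assumptions.

```python
# Toy sketch of auditory error-signal computation during vocal production.
# All names and numbers are illustrative, not a published model.

def error_signal(predicted_hz, heard_hz):
    """Frame-by-frame mismatch between the pitch the brain expects
    (an efference-copy prediction) and the pitch actually heard."""
    return [h - p for p, h in zip(predicted_hz, heard_hz)]

def compensate(produced_hz, error, gain=0.3):
    """Partial correction opposing the error, as in pitch-shift
    experiments where speakers compensate for only part of a shift."""
    return [f - gain * e for f, e in zip(produced_hz, error)]

# Simulate a +50 Hz perturbation of auditory feedback via headphones.
predicted = [200.0] * 5                     # expected voice pitch (Hz)
heard = [f + 50.0 for f in predicted]       # perturbed feedback
err = error_signal(predicted, heard)        # constant +50 Hz error
corrected = compensate(predicted, err)      # pitch lowered to oppose shift
```

The partial gain reflects the common empirical finding that compensation is incomplete rather than a one-to-one correction.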
Brain mechanisms of auditory self-monitoring during human speech
Despite advances in understanding vocal production in animal models, the relationship to more complex human speech remains unclear. In conjunction with collaborators in Neurosurgery, we investigate brain activity in human neurosurgical patients undergoing monitoring as part of their clinical treatment. In parallel with our animal work, we examine the role of different brain areas during speech production, allowing direct comparisons of human speech and animal vocalization. We are excited to ask questions about self-perception that we can't ask in animals, for example: how do you know whether the voice you just heard was your own? It's a question about metacognition, and one that is not well understood in the auditory/vocal field.
Neural coding of communication sounds in complex, naturalistic environments
Our current understanding of the neural coding of complex sounds is based upon responses to sounds delivered under carefully controlled conditions. It has recently become clear, however, that behavioral context strongly modulates neural coding at the cortical level. We record neural activity during interactive vocal communication in behaviorally relevant, naturalistic environments (akin to doing an experiment by recording everything one hears in day-to-day life). We examine the effects of this context not only on low-level sensory processing, but also on how neural activity supports vocal decision making in complex social situations.
Marmoset social communication
Marmosets have a rich vocal repertoire and complex social interactions. How do they coordinate their conversations among large groups? Does the high degree of variability in their vocal sounds convey information? Answering these questions may give us insight into the evolutionary origins of human social communication. We use multi-animal behavioral recording to construct a model of the 'social network' between animals and address these questions. We hope to ultimately pair this with video recording of visual/postural communication, as well as neural recording, to gain a better understanding of the 'social brain.'
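As an illustration of the kind of analysis this involves, the sketch below builds a directed call-response graph from timestamped, caller-labelled vocalizations: an edge from B to A is counted whenever B calls shortly after A. The 2-second response window, the data, and the function name are invented for the example and are not our actual pipeline.

```python
# Toy sketch: inferring a call-response 'social network' from
# timestamped, caller-labelled vocalizations. Window and data invented.
from collections import Counter

def response_graph(calls, window=2.0):
    """calls: list of (time_sec, animal_id) sorted by time.
    Count directed edges responder -> initiator whenever a call
    follows another animal's call within `window` seconds."""
    edges = Counter()
    for i, (t0, a0) in enumerate(calls):
        for t1, a1 in calls[i + 1:]:
            if t1 - t0 > window:
                break                      # calls are sorted; stop early
            if a1 != a0:
                edges[(a1, a0)] += 1       # a1 responded to a0
    return edges

calls = [(0.0, "A"), (1.0, "B"), (1.5, "A"), (5.0, "C"), (6.2, "A")]
g = response_graph(calls)
```

Edge weights from such a graph could then feed standard network measures (e.g., degree or centrality) to compare individuals' roles in group conversations.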
Marmosets as a neurologic and psychiatric disease model
Important insights into human neurologic and psychiatric disease have been gained from rodent models. Rodents, while excellent and convenient models, often lack many of the rich social and communicative behaviors seen in humans that these diseases frequently affect. Marmosets offer a convenient middle ground. For example, self-monitoring and auditory-vocal interaction in the brain appear to be dysfunctional in schizophrenia, and are thought by some to be the origin of auditory hallucinations. Our preliminary work suggests that we can temporarily reproduce much of this dysfunction using commonly available medications, which may yield valuable insights into the disorder.
Cochlear implant-related cortical plasticity
Despite decades of clinical use, the effects of cochlear implantation (CI) on the brains of recipients are poorly understood. Cortical plasticity is thought to affect the performance of CI users, but it has not been directly studied. We plan to chronically record cortical neural activity following cochlear implantation to better understand its long-term effects on neural processing, in the hopes of designing better rehabilitation and CI programming strategies.
Clinical and Translational Research
Feedback-dependent speech control in hearing loss and communication disorders
As in our basic science models, we are interested in how patient populations listen to the sound of their own voice and use this auditory feedback to help control their speech and voice. We record speech production in different patient groups while changing what they hear of their own voice; we have previously tested patients with cochlear implants and those with spasmodic dysphonia. This work will allow us to develop better strategies to help patients with hearing loss improve their speech, and will also yield better insight into the underlying mechanisms of speech motor control.
Neural coding and auditory performance in cochlear implant patients
Patients who receive cochlear implants (CI) have to learn to use their implant, a process that can take a year or more. How the brain accomplishes this, and how we can improve it, is unclear. We have begun examining patient-level factors that might affect learning and plasticity, and we are in discussions with collaborators in psychology and engineering to see how we can better measure and modulate the learning process. We are also working with industry to see how implant placement and programming, in particular pitch-place mismatches, might affect outcomes.
Vestibular testing and diagnoses
Patients who have dizziness and balance complaints often undergo a battery of physiologic tests to better understand the causes of their symptoms and to aid in diagnosis. Not surprisingly, these tests are imperfect, and their results often conflict with one another. We are using quantitative analytic approaches to see how often results disagree, and to determine whether those patterns can help better diagnose patients.
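One simple way to quantify how often two tests disagree is chance-corrected agreement such as Cohen's kappa. The sketch below is a minimal illustration with binary (abnormal/normal) results; the two test names and the data are invented, and this is one plausible approach rather than a description of our actual analysis.

```python
# Toy sketch: chance-corrected agreement (Cohen's kappa) between two
# vestibular tests coded abnormal=1 / normal=0 per patient.
# Test names and data are illustrative assumptions.

def cohens_kappa(a, b):
    """Kappa = (observed - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    p_chance = (sum(a) / n) * (sum(b) / n) + \
               ((n - sum(a)) / n) * ((n - sum(b)) / n)   # expected by chance
    return 1.0 if p_chance == 1 else (p_obs - p_chance) / (1 - p_chance)

caloric = [1, 1, 0, 0, 1, 0]   # hypothetical caloric test results
vhit    = [1, 0, 0, 0, 1, 1]   # hypothetical video head-impulse results
kappa = cohens_kappa(caloric, vhit)
```

Raw percent agreement can look deceptively high when most patients are normal on both tests, which is why a chance-corrected statistic is the more informative summary.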