Psychoacoustics, Physiology of Hearing, and Auditory Modelling, from the Ear to the Brain
19–24 Jun 2022, Lyon (France)


Rule-based and stimulus-based cues bias auditory decisions via different computational and (neuro)physiological mechanisms
Yale Cohen 1, *
1 : University of Pennsylvania
* : Corresponding author

Auditory perceptual decisions are modulated by expectations. Some expectations are based on learned rules about the statistical properties of the auditory scene. For example, ambiguous speech sounds can be resolved because a listener has expectations about their language's organization. Other expectations are inferred directly from the ongoing temporal regularities of a stimulus: after hearing a regularly repeating stimulus, a listener expects to hear the same stimulus again rather than a different one. Although both types of expectations can bias auditory decisions, it is unknown whether these sources of bias originate from the same or different computational and (neuro)physiological mechanisms.

To address this question, human and non-human primates performed a frequency-discrimination task in which they categorized a “target” tone burst as either “low” or “high” frequency; task difficulty was titrated by embedding the target in different levels of background noise. Prior to target onset, we modulated the participant's expectations by presenting either (1) a rule-based cue, a visual stimulus indicating the probability that the target would be “low” or “high” frequency, or (2) a stimulus-based cue, a temporal sequence of “low” and/or “high” tone bursts. For the human participants, we also measured pupil diameter, which indexes arousal and is sensitive to cognitive processes that contribute to perceptual decisions.

Although rule- and stimulus-based cues could have similar effects on behavior in certain signal-to-noise regimes, they had distinct computational and physiological signatures. When we modeled participants' decisions and response times with a drift-diffusion model, we found model variables that were coupled during rule-based trials but not during stimulus-based trials. Additionally, we found trial-by-trial adjustments in pupil size that depended on the current relationship between the rule-based cue and the auditory target: pupil modulations were larger when the frequency indicated by the rule-based cue and the actual target frequency were incongruent. These findings are consistent with top-down, goal-directed adjustments to the decision process that improve performance and facilitate flexible information processing. In contrast, the effects of stimulus-based cues were consistent with bottom-up processes, such as sensory adaptation.

In our non-human-primate (rhesus macaque) study, we recorded neural activity from regions of the ventral auditory pathway, which causally contributes to auditory decisions, in order to identify the mechanisms underlying rule- and stimulus-based cueing. Consistent with rule-based cues having a top-down origin and stimulus-based cues having a bottom-up origin, we identified a functional segregation between these two types of information processing: neurons in the prefrontal cortex were modulated preferentially by rule-based cues, whereas auditory-cortex neurons were modulated preferentially by stimulus-based cues and by the acoustic features of the stimulus. Overall, rule-based and stimulus-based expectations bias auditory decisions but rely on distinct computational, physiological, and neurophysiological mechanisms, consistent with top-down and bottom-up forms of information processing, respectively.
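The abstract does not specify how the drift-diffusion model was parameterized. As a minimal illustrative sketch only, the Python snippet below simulates a two-choice drift-diffusion process in which a rule-based cue is modeled, hypothetically, as a shift in the starting point of evidence accumulation; the function name `simulate_ddm_trial`, the parameters (`drift`, `start`, `bound`, `noise`), and all numerical values are assumptions for illustration and are not the authors' fitted model.

```python
# Minimal sketch of a drift-diffusion model (DDM) of a two-choice
# frequency task. All names and values are illustrative assumptions,
# not the fitted model reported in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def simulate_ddm_trial(drift, start, bound=1.0, noise=1.0, dt=0.001, max_t=3.0):
    """Accumulate noisy evidence until a decision bound is crossed.

    drift : mean evidence per second (sign encodes "high" vs. "low").
    start : starting point in [-bound, bound]; a cue-induced bias
            shifts this toward the expected choice (an assumption).
    Returns (choice, response_time); choice is +1 ("high"), -1 ("low"),
    or 0 if no bound is reached within max_t seconds.
    """
    x, t = start, 0.0
    while t < max_t:
        # Euler step of the diffusion: drift plus Gaussian noise.
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= bound:
            return +1, t
        if x <= -bound:
            return -1, t
    return 0, max_t

# Hypothetical comparison: an uncued baseline vs. a rule-based cue
# modeled as a starting-point shift toward "high", for a weak "high" target.
for label, start in [("no cue", 0.0), ("rule cue: likely high", 0.4)]:
    results = [simulate_ddm_trial(drift=0.5, start=start) for _ in range(2000)]
    choices, rts = zip(*results)
    p_high = np.mean([c == +1 for c in choices])
    print(f"{label:22s} P(choose high) = {p_high:.2f}, mean RT = {np.mean(rts):.2f} s")
```

A starting-point shift is one standard way that the DDM literature captures prior-probability cues; a drift-rate change is another. The abstract reports only that model variables were coupled on rule-based but not stimulus-based trials, so which parameters carried the cue effects here is not something this sketch can settle.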

