Abstract
Purpose
The interaural time difference (ITD) is a primary horizontal-plane sound localization cue computed in the auditory brainstem. ITDs are accessible in the temporal fine structure of pure tones only up to about 1400 Hz. The rapid transition of listeners’ ITD sensitivity, from best near 700 Hz to undetectable within roughly one octave above it, currently lacks a fully compelling physiological explanation. Here, it was hypothesized that this rapid decline is dictated not by a central neural limitation but by the initial peripheral sound encoding, specifically the low-frequency (apical) edge of the cochlear excitation pattern produced by a pure tone.

Methods
ITD sensitivity was measured in 16 normal-hearing listeners as a joint function of frequency (900-1500 Hz) and level (10-50 dB sensation level).

Results
Performance decreased with increasing frequency and decreasing sound level. The slope of the performance decline was 90 dB/octave, consistent with the low-frequency slope of the cochlear excitation pattern.

Conclusion
Fine-structure ITD sensitivity near 1400 Hz may be conveyed primarily by “off-frequency” activation of neurons tuned to lower frequencies, near 700 Hz. Physiologically, this could be realized by neurons sensitive to fine-structure ITD only up to about 700 Hz. A more extreme model would have only a single narrow channel near 700 Hz conveying fine-structure ITDs. Such a model is a major simplification of, and departure from, the classic formulation of the binaural display: a matrix of neurons tuned to a wide range of relevant frequencies and ITDs.
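As a back-of-the-envelope illustration of the frequency-level trade implied by the reported slope (the arithmetic below is illustrative only; the 66 dB figure is derived here, not reported in the study), a 90 dB/octave decline over the tested 900-1500 Hz range corresponds to:

% Illustrative arithmetic, assuming the abstract's reported slope
% S = 90 dB/octave and the tested frequency range of 900-1500 Hz.
\[
  \Delta f = \log_2\!\left(\frac{1500~\mathrm{Hz}}{900~\mathrm{Hz}}\right) \approx 0.74~\text{octaves}
\]
\[
  \Delta L = S \cdot \Delta f = 90~\tfrac{\mathrm{dB}}{\text{octave}} \times 0.74~\text{octaves} \approx 66~\mathrm{dB}
\]
% On this reading, raising the tone frequency from 900 to 1500 Hz degrades
% ITD performance about as much as a ~66 dB level reduction would,
% consistent with the apical-edge excitation-pattern account.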
Publisher
Cold Spring Harbor Laboratory