Abstract
In perceiving the sound produced by the movement of a visible object, the brain coordinates the auditory and visual inputs (refs 1–3) so that no delay is noticed even though the sound arrives later; for distant sources, such as aircraft or firework displays, this coordination is less effective. Here we show that coordination occurs because the brain uses information about distance, supplied by the visual system, to calibrate simultaneity. Our findings indicate that auditory and visual inputs are coordinated not because the brain has a wide temporal window for auditory integration, as was previously thought, but because the brain actively shifts the temporal location of that window depending on the distance of the visible sound source.
Main
Seven subjects with normal vision and hearing were presented through headphones with a burst of white noise (90 decibels sound-pressure level, 10-ms duration, with 4-ms rise and fall times), the spectrum of which had been processed (using head-related transfer functions) to simulate an external sound from a frontal direction. Brief light flashes (10 ms) were produced by an array of five green light-emitting diodes (LEDs) placed at different distances from the subjects (1–50 m; Fig. 1). The luminance of the flash was 14.5 candelas per square metre at a viewing distance of 1 m, and was scaled in proportion to the square of the viewing distance at the other distances so that the intensity reaching the eye remained constant. The difference in onset times between the sound and light stimuli was varied randomly from −125 ms to 175 ms in steps of 25 ms.
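The inverse-square luminance scaling described above can be sketched briefly; the 14.5 cd/m² baseline at 1 m is taken from the text, while the helper function name is our own illustration:

```python
BASE_LUMINANCE = 14.5  # cd/m^2 at the 1-m reference distance (from the text)

def led_luminance(distance_m: float) -> float:
    """Luminance required at `distance_m` so that the flash reaches the eye
    with the same intensity as the 1-m reference (inverse-square law)."""
    return BASE_LUMINANCE * distance_m ** 2

for d in (1, 5, 10, 20, 50):
    print(f"{d:>2} m: {led_luminance(d):9.1f} cd/m^2")
```

At 50 m, the farthest viewing distance used, this scaling implies a luminance 2,500 times the 1-m value.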
Subjects were instructed to look at the centre of the LED array and to imagine that the LEDs were the source of both the light and the sound, as though the sound reached them directly from that location. To eliminate possible response bias, we used a two-alternative forced-choice task to measure subjective simultaneity: observers judged whether the light was presented before or after the sound. Twenty responses were obtained for each condition. To determine the stimulus-onset asynchrony corresponding to subjective simultaneity, we estimated the 50% point (the point of subjective equality) by fitting a cumulative normal-distribution function to each subject's data with a maximum-likelihood curve-fitting technique.
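A maximum-likelihood fit of this kind can be sketched as follows. The stimulus-onset asynchronies and the 20 trials per condition are from the text; the response counts here are illustrative synthetic data, not the authors' results:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Stimulus-onset asynchronies in ms (negative = light leads), as in the text.
soa = np.arange(-125, 176, 25)

# Hypothetical counts of "light came after the sound" responses
# out of 20 trials per SOA (illustrative only).
n_trials = 20
k_after = np.array([1, 1, 2, 3, 5, 8, 12, 15, 17, 18, 19, 20, 20])

def neg_log_likelihood(params):
    """Binomial negative log-likelihood under a cumulative-normal model."""
    mu, sigma = params
    p = norm.cdf(soa, loc=mu, scale=sigma)
    p = np.clip(p, 1e-9, 1 - 1e-9)  # guard against log(0)
    return -np.sum(k_after * np.log(p) + (n_trials - k_after) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[0.0, 50.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x
print(f"Point of subjective equality: {mu_hat:.1f} ms (slope sigma {sigma_hat:.1f} ms)")
```

The fitted mean of the cumulative normal is the point of subjective equality; the standard deviation describes the steepness of the psychometric function.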
When the LED array was 1 m away, the point of subjective equality occurred at a sound delay of about 5 ms; however, the sound delay at this point increased with viewing distance (P < 0.001; Fig. 1a, b). This increased delay was roughly consistent with the velocity of sound (about 1 m per 3 ms at sea level and room temperature), so the point of subjective equality increased by about 3 ms with each 1-m increase in distance. This relationship was consistent at least up to a distance of 10 m.
Our results show that the brain probably takes sound velocity into account when judging simultaneity. However, sound takes about 120 ms to travel 40 m, and we found that the threshold for detecting the sound delay at a viewing distance of 40 m was 106 ms, so active compensation is likely to operate only at distances shorter than this.
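The travel-time arithmetic in the two paragraphs above can be checked directly, assuming a speed of sound of roughly 343 m/s in air at sea level and room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (approximate)

def travel_delay_ms(distance_m: float) -> float:
    """Time for sound to travel `distance_m`, in milliseconds."""
    return distance_m / SPEED_OF_SOUND * 1000.0

print(travel_delay_ms(1))   # ~2.9 ms: about 1 m per 3 ms, as quoted above
print(travel_delay_ms(40))  # ~117 ms: close to the ~120 ms quoted above
```

At 40 m the physical delay (~117 ms) already exceeds the measured 106-ms detection threshold, which is why compensation can only operate at shorter distances.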
We have shown that the brain takes sound velocity into account when integrating audiovisual information. It can therefore integrate audiovisual signals across a wide range of temporal gaps and correctly match a sound to its visible source.
References
1. Sekuler, R., Sekuler, A. B. & Lau, R. Nature 385, 308 (1997).
2. McDonald, J. J., Teder-Sälejärvi, W. A. & Hillyard, S. A. Nature 407, 906–908 (2000).
3. Shams, L., Kamitani, Y. & Shimojo, S. Nature 408, 788 (2000).
Competing interests
The authors declare no competing financial interests.
Cite this article
Sugita, Y., Suzuki, Y. Implicit estimation of sound-arrival time. Nature 421, 911 (2003). https://doi.org/10.1038/421911a
This article is cited by
- Perceptual simultaneity between nociceptive and visual stimuli depends on their spatial congruence. Experimental Brain Research (2023)
- The development of audio–visual temporal precision precedes its rapid recalibration. Scientific Reports (2022)
- Does suffering dominate enjoyment in the animal kingdom? An update to welfare biology. Biology & Philosophy (2019)
- Audiovisual integration in depth: multisensory binding and gain as a function of distance. Experimental Brain Research (2018)
- The sense of body ownership relaxes temporal constraints for multisensory integration. Scientific Reports (2016)