
Research reveals ventriloquism in motion: how sound can move light

Research led by Dr Elliot Freeman, lecturer in psychology at Brunel University's School of Social Sciences, published this week in Current Biology, confirms that what we see can sometimes depend as much on our ears as on our eyes.

The study, conducted in conjunction with Professor Jon Driver at University College London, revealed that the perceived direction of motion of a given visual object (in this case, red bars flashed across a screen) depends on minute variations in the timing of an accompanying sound (for example, a sequence of beeps). This provides evidence that the brain integrates these visual and auditory cues at a very early stage of processing.

Everyday examples of audio-visual integration include our ability to identify who is saying what in a noisy crowd, and the illusion that sound comes directly from an actor's lips seen on a television rather than from the loudspeakers; the latter is the well-known 'Ventriloquist Effect', in which what we see influences where sounds appear to come from.

The audiovisual illusion revealed by this new research could be dubbed 'reverse ventriloquism in motion', as it shows that sound affects what we see. This might explain why dancers appear to have no rhythm when we watch them without sound, and why the sound of a ball hitting a racket can help us to determine the direction of the ball in a game of tennis, even though the ball moves faster than the camera or eye can track.

Dr Freeman believes that his research could have profound implications for understanding the neural processes that underlie multisensory perception. This knowledge could be applied in a number of industries: “The illusion could be applied to novel displays that change their appearance depending on sound, which may be of use in advertising or in providing an eye-catching multisensory warning or alert in safety-critical applications. It may also eventually be useful in detecting and diagnosing the subtle perceptual differences thought to be characteristic of certain clinical conditions such as dyslexia and autistic spectrum disorders.”

Note to Editors

The research method: Eight observers each viewed 5-second sequences of alternately flashing bars on a computer monitor in a darkened laboratory, while listening to a sequence of beeps played from a central fixed loudspeaker. They then pressed one of two buttons on a keyboard depending on whether they saw the bars sweeping leftwards or rightwards across the screen. The critical timing of the beeps relative to the flashes varied unpredictably from one sequence to the next, and after up to 200 repetitions per subject the average perceived motion direction could be calculated for each audio-visual timing condition.
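
To illustrate this final analysis step, the short Python sketch below shows how responses might be averaged per timing condition. It is a hypothetical simulation, not the study's own analysis code: the timing offsets, trial counts and response probabilities are invented placeholders.

# Minimal analysis sketch, not the authors' code: the timing offsets and
# simulated responses below are invented placeholders, used only to show how
# an average perceived direction per audio-visual timing condition is tallied.
from collections import defaultdict
import random

random.seed(0)
offsets_ms = [-100, -50, 0, 50, 100]        # assumed beep-minus-flash timings (ms)
trials = [random.choice(offsets_ms) for _ in range(200)]   # ~200 repetitions per subject

responses = defaultdict(list)               # 1 = reported rightwards, 0 = leftwards
for offset in trials:
    p_right = 0.5 + 0.004 * offset          # toy relationship: positive offsets bias rightwards
    responses[offset].append(1 if random.random() < p_right else 0)

for offset in sorted(responses):
    reports = responses[offset]
    print(f"offset {offset:+4d} ms: {sum(reports) / len(reports):.2f} proportion 'rightwards'")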

In the cinema, objects photographed in different positions appear to move; this new audio-visual illusion recreates such illusory motion by regularly flashing a red bar on alternate sides of a screen (e.g. {Left-Flash} ... {Right-Flash} ... {Left-Flash} ... etc), but with each flash either preceded or followed by a beep (e.g. Beep-Flash or Flash-Beep), staggered in time by a fraction of a second. Remarkably, the ordering of flashes relative to beeps reliably determined whether observers reported rightwards motion (with a repeating '{Left-Flash}-Beep ... Beep-{Right-Flash} ... etc' sequence) or leftwards motion (for 'Beep-{Left-Flash} ... {Right-Flash}-Beep ... etc').
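
For readers who want a concrete picture of the two orderings, the Python sketch below lays out one possible event timeline. The flash interval and the size of the beep offset are assumptions chosen for illustration, not the study's actual parameters.

# Illustrative sketch only: the flash interval and beep offset are assumed
# values, not the published experimental parameters.
FLASH_INTERVAL = 0.5   # seconds between successive flashes (assumed)
BEEP_OFFSET = 0.07     # beep staggered by a fraction of a second (assumed)

def build_sequence(n_flashes, rightwards_pattern):
    """Return a time-ordered list of (time, event) pairs for one trial.

    rightwards_pattern=True gives '{Left-Flash}-Beep ... Beep-{Right-Flash}',
    the ordering the study linked with perceived rightwards motion; False gives
    'Beep-{Left-Flash} ... {Right-Flash}-Beep', linked with leftwards motion.
    """
    events = []
    for i in range(n_flashes):
        t_flash = i * FLASH_INTERVAL
        side = "Left-Flash" if i % 2 == 0 else "Right-Flash"
        # Beep trails the left flash and leads the right flash, or vice versa.
        beep_after = (side == "Left-Flash") == rightwards_pattern
        t_beep = t_flash + BEEP_OFFSET if beep_after else t_flash - BEEP_OFFSET
        events.append((t_flash, side))
        events.append((t_beep, "Beep"))
    return sorted(events)

for t, event in build_sequence(4, rightwards_pattern=True):
    print(f"{t:+.2f}s  {event}")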

For more information about Dr Freeman's research, please visit:

http://people.brunel.ac.uk/~hsstedf/

Contact details
For further information, please contact Katy Askew at the Racepoint Group.
Tel: 020 8752 3207
Email: katy.askew@racepointgroup.com