Publications


Preprints (under review)

  • Kononowicz, T.W., Roger, C., van Wassenhove, V. (2017) Temporal metacognition as the decoding of self-generated brain dynamics. 
  • Gauthier, B., Pestke, K., & van Wassenhove, V. (2017). The psychological arrow of time and the human brain dynamics of event ordering. bioRxiv
  • Bekhti, Y., Gramfort, A., Zilber, N., & van Wassenhove, V. (2017). Decoding the categorization of visual motion with magnetoencephalography. bioRxiv, 103044.


Peer-reviewed

2018

  • Polti, I., Martin, B., & van Wassenhove, V. (in press) The effect of attention and working memory on the estimation of elapsed time. Scientific Reports
Psychological models of time perception involve attention and memory: while attention typically regulates the flow of events, memory maintains timed events or intervals. The precise, and possibly distinct, roles of attention and memory in time perception remain debated. Herein, we discuss the dissociation between attention and working memory (WM) in timing and scalar variability from the perspective of Bayesian models of time estimation.
  • Lambrechts, A., Falter-Wagner, C. M., & van Wassenhove, V. (2018). Diminished neural resources allocation to time processing in Autism Spectrum Disorders. NeuroImage: Clinical, 17, 124-136. doi: 10.1016/j.nicl.2017.09.023
Our results suggest that compared to Typically Developing Control (TDC) individuals, individuals with Autism Spectrum Disorder (ASD) are less able to predict the duration of the standard tone accurately, affecting the sensitivity of the comparison process. Although individuals with ASD showed top-down adaptation to the context of the task, this neuronal strategy reflects a bias in the readiness to perform different types of tasks and, in particular, a diminished allocation of resources to duration processing, which could have cascading effects on learning and the development of other cognitive functions.

2017

  • Dupré la Tour, T., Tallot, L., Grabot, L., Doyère, V., van Wassenhove, V., & Grenier, Y. (2017). Non-linear auto-regressive models for cross-frequency coupling in neural time series. PLoS Comput Biol, 13(12), e1005893. doi: 10.1371/journal.pcbi.1005893 | bioRxiv preprint

Using simulations, we demonstrate that our parametric method can reveal neural couplings with shorter signals than non-parametric methods. We also show how the likelihood can be used to find optimal filtering parameters, suggesting new properties of the spectrum of the driving signal, and to estimate the optimal delay between the coupled signals, enabling an estimate of directionality in the coupling.
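The phase-amplitude coupling that such models target can be illustrated with a minimal, purely illustrative simulation (plain Python, hypothetical parameters; this is a toy sketch, not the authors' driven auto-regressive implementation):

```python
import math
import cmath

def simulate_pac(n=2000, fs=200.0, f_slow=4.0, f_fast=60.0, coupling=0.5):
    """Simulate phase-amplitude coupling: the amplitude of a fast
    oscillation is modulated by the phase of a slow driver.
    Returns the composite signal plus the known slow phase and fast amplitude."""
    sig, phases, amps = [], [], []
    for i in range(n):
        t = i / fs
        phi = 2 * math.pi * f_slow * t          # phase of the slow driver
        amp = 1.0 + coupling * math.cos(phi)    # fast amplitude follows slow phase
        sig.append(math.cos(phi) + amp * math.cos(2 * math.pi * f_fast * t))
        phases.append(phi)
        amps.append(amp)
    return sig, phases, amps

def modulation_index(phases, amps):
    """Canolty-style modulation index: |mean(A * e^{i*phi})|.
    Near 0 when amplitude is independent of phase; grows with coupling."""
    z = sum(a * cmath.exp(1j * p) for p, a in zip(phases, amps)) / len(amps)
    return abs(z)
```

Over an integer number of slow cycles the index comes out near coupling/2 for the coupled signal and near 0 without coupling; in real recordings the phase and amplitude would first have to be extracted by filtering, which is where the parametric versus non-parametric methods discussed above differ.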

  • van Wassenhove, V. (2017) Defining moments for conscious time and moments. PsyCh Journal, 6(2):168-169. doi: 10.1002/pchj.166
The spatiotemporal scales of natural phenomena define the observational granularity needed for scientific characterization. In cognitive neurosciences, neural oscillations delineate temporal scales that may naturally provide the computational resolution for information processing in the brain. Do neural oscillations define moments for the perception of time?
  • Grabot, L., Kösem, A., Azizi, L., & van Wassenhove, V. (2017). Prestimulus Alpha Oscillations and the Temporal Sequencing of Audio-visual Events. Journal of Cognitive Neuroscience. 29(9):1566-1582. doi: 10.1162/jocn_a_01145
Altogether, our results suggest that, under high perceptual uncertainty, the magnitude of prestimulus alpha (de)synchronization indicates the amount of compensation needed to overcome an individual’s prior in the serial ordering and temporal sequencing of information.
After an accident, people often report that the event felt much longer than it actually was: time seemed to slow down. We designed a safe laboratory experiment to simulate a dangerous situation. This is the first study to show the brain regions associated with a perceived slowing of time during a threatening situation.
  • van Wassenhove, V. (2017) Time consciousness in a computational mind/brain. Journal of Consciousness Studies, 24 (3-4), 177-202.
Time consciousness may elicit different concepts for each of us: some may imagine a directional flow mapping life events, others may think of the time needed to accomplish a task, or hear the musical tempo pacing their morning jogs. While we are all experts in experiencing time, introspection provides little intuition regarding the mechanisms supporting psychological time. To understand a conscious mind, using the brain’s internal time metrics – as opposed to a physical time arrow – is essential.
  • Grabot, L., & van Wassenhove, V. (2017). Time order as psychological bias. Psychological Science, 28(5), 670-678. doi: 10.1177/0956797616689369
Incorrectly perceiving the chronology of events can fundamentally alter our understanding of the causal structure of the world. Here, we show that temporal order perception is a psychological bias that attention can modulate but not fully eradicate.

Several lines of theoretical and empirical work posit the existence of a common magnitude system in the brain. Our results suggest that a generalized magnitude system based on Bayesian computations would minimally necessitate multiple priors.

2016

  • Gauthier, B., & van Wassenhove, V. (2016). Time Is Not Space: Core Computations and Domain-Specific Networks for Mental Travels. Journal of Neuroscience, 36(47), 11891-11903. doi: 10.1523/jneurosci.1400-16.2016
As humans, we can consciously imagine ourselves at a different time (mental time travel) or at a different place (mental space navigation). Are such abilities domain-general, or are the temporal and spatial dimensions of our conscious experience separable?

  • Kösem, A., & van Wassenhove, V. (2016). Distinct contributions of low- and high-frequency neural oscillations to speech comprehension. Language, Cognition and Neuroscience, 1-9. doi: 10.1080/23273798.2016.1238495
In the last decade, the involvement of neural oscillatory mechanisms in speech comprehension has been increasingly investigated. Current evidence suggests that low-frequency and high-frequency neural entrainment to the acoustic dynamics of speech are linked to its analysis. One crucial question is whether acoustical processing primarily modulates neural entrainment, or whether entrainment instead reflects linguistic processing. The interdependence between low-frequency and high-frequency neural oscillations, as well as their causal role in speech comprehension, is discussed with regard to neurophysiological models of speech processing.
  • Kösem, A., Basirat, A., Azizi, L., & van Wassenhove, V. (2016). High-frequency neural activity predicts word parsing in ambiguous speech streams. Journal of Neurophysiology, 116(6), 2497-2512. doi: 10.1152/jn.00074.2016

During speech listening, the brain parses a continuous acoustic stream of information into computational units (e.g., syllables or words) necessary for speech comprehension. Whereas changes in low-frequency neural oscillations were compatible with the encoding of prelexical segmentation cues, high-frequency activity specifically informed on an individual’s conscious speech percept.
The ability to imagine ourselves in the past, in the future or in different spatial locations suggests that the brain can generate cognitive maps that are independent of the experiential self in the here and now. Using three experiments, we asked to what extent Mental Time Travel (MTT; imagining the self in time) and Mental Space Navigation (MSN; imagining the self in space) shared similar cognitive operations. Altogether, our findings suggest that MTT and MSN are separately mapped, although they require comparable allo- to ego-centric map conversion.

  • van Wassenhove, V. (2016). Temporal cognition and neural oscillations. Current Opinion in Behavioral Sciences, 8, 124-130.
  • Kononowicz, T. W., & van Wassenhove, V. (2016). In Search of Oscillatory Traces of the Internal Clock. Frontiers in Psychology, 7, 224.  doi: 10.3389/fpsyg.2016.00224
Illustration of the main interval timing theories that rely on the notion of neural oscillations. Panel (A) illustrates the idea that faster alpha rhythms result in longer estimates of time, as more pulses can be accumulated in a given physical time interval (Treisman, 1963). Panel (B) illustrates the SBF model. The gray sinusoids depict oscillators in an example trial. The amplitude of each oscillator is represented by the size of the gray circle at times t1 and t2, respectively. Panel (C) illustrates the main brain regions engaged in interval timing (PFC, SMA, PPC) and their presumed projections to the striatum, as suggested by the SBF model.
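The pacemaker-accumulator idea in panel (A) can be written down in two lines: pulses are accumulated at a clock rate and read out against a habitual calibration rate, so a transiently faster clock yields a longer subjective estimate. A toy sketch with hypothetical rates:

```python
def perceived_duration(physical_s, clock_hz, calibration_hz=10.0):
    """Pacemaker-accumulator readout: count pulses at the current clock
    rate, then convert back to seconds using the habitual (calibrated) rate."""
    pulses = physical_s * clock_hz      # accumulation stage
    return pulses / calibration_hz      # readout against the habitual rate

baseline = perceived_duration(1.0, clock_hz=10.0)  # clock at the habitual rate
speeded = perceived_duration(1.0, clock_hz=11.0)   # faster clock dilates subjective time
```

The rates are arbitrary; the point is only the direction of the effect, which is what the alpha-frequency account in panel (A) predicts.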

2015

The effect of stimulation history on the perception of a current event can yield two opposite effects: adaptation or hysteresis. The perception of the current event thus goes in the opposite or in the same direction as prior stimulation, respectively. The present findings suggest that knowing when to estimate a stimulus property has a crucial impact on perceptual simultaneity judgments. Our results extend beyond AV timing perception, and have strong implications regarding the comparative study of hysteresis and adaptation phenomena.

The estimation of duration can be affected by context and surprise. Our findings suggest that interval timing undergoes time compression by capitalizing on the predicted offset of an auditory event.

  • van Wassenhove, V., & Grzeczkowski, L. (2015). Visual-induced expectations modulate auditory cortical responses. Frontiers in Neuroscience, 9, 11. doi: 10.3389/fnins.2015.00011.
Here, we asked whether in the absence of saccades, the position of the eyes and the timing of transient color changes of visual stimuli could selectively affect the excitability of auditory cortex by predicting the where and the when of a sound, respectively. Visual transience could automatically elicit a prediction of when a sound will occur by changing the excitability of auditory cortices irrespective of the attended modality, eye position or spatial congruency of auditory and visual events. In contrast, auditory cortical responses were not significantly affected by eye position, suggesting that where predictions may require active sensing or saccadic reset to modulate auditory cortex responses, notably in the absence of spatial orientation to sounds.

  • Strauss, M., Sitt, J. D., King, J. R., Elbaz, M., Azizi, L., Buiatti, M., … & Dehaene, S. (2015). Disruption of hierarchical predictive coding during sleep. Proceedings of the National Academy of Sciences, 112(11), E1353-E1362.
  • Strauss, M., Sitt, J., King, J. R., Elbaz, M., Azizi, L., Buiatti, M., … & Dehaene, S. (2015). Atteinte des processus de prédiction mais conservation de l’adaptation sensorielle au cours du sommeil. Médecine du Sommeil, 12(1), 58.
  • Strauss, M., Sitt, J., King, J. R., Elbaz, M., Naccache, L., van Wassenhove, V., & Dehaene, S. (2015). Atteinte des systèmes prédictifs dans le sommeil. Revue Neurologique, 171, A172.

2014

We hypothesized that the phase of neural oscillations encodes timing. Audiovisual lag-adaptation was used to shift participants’ perceived simultaneity. During lag-adaptation, the phase of neural oscillations was not stationary. Phase shifts of entrained neural oscillations are correlated with subjective timing. The phase of neural oscillations is a temporal code that serves time awareness.

  • Zilber N, Ciuciu P, Gramfort A, Azizi L, van Wassenhove V (2014) Supramodal processing optimizes visual perceptual learning and plasticity. NeuroImage, 93 (Pt 1), 32-46. doi: 10.1016/j.neuroimage.2014.02.017


The temporal structure of sensory events is a multisensory feature. Discriminating visual motion coherence benefits from consistent acoustic textures. hMT+ selectivity improves with multisensory training. Ventro-lateral prefrontal cortex is selectively implicated in learning. Reverse hierarchy of learning capitalizes on supramodal processing.

2013

  • Gross J, Baillet S, Barnes G, Henson R, Hillebrand A, Jensen O, Jerbi K, Litvak V, Maess B, Oostenveld R, Parkkonen L, Taylor J, van Wassenhove V, Wibral M, Schoffelen J-M (2013) Good practice for conducting and reporting MEG research. NeuroImage, 65, 349-363. doi: 10.1016/j.neuroimage.2012.10.001
Magnetoencephalographic (MEG) recordings are a rich source of information about the neural dynamics underlying cognitive processes in the brain, with excellent temporal and good spatial resolution. In recent years there have been considerable advances in MEG hardware developments and methods. Sophisticated analysis techniques are now routinely applied and continuously improved, leading to fascinating insights into the intricate dynamics of neural processes. However, the rapidly increasing level of complexity of the different steps in an MEG study makes it difficult for novices, and sometimes even for experts, to stay aware of possible limitations and caveats. Furthermore, the complexity of MEG data acquisition and data analysis requires special attention when describing MEG studies in publications, in order to facilitate interpretation and reproduction of the results. This manuscript aims at making recommendations for a number of important data acquisition and data analysis steps and suggests details that should be specified in manuscripts reporting MEG studies. These recommendations will hopefully serve as guidelines that help to strengthen the position of the MEG research community within the field of neuroscience, and may foster discussion in order to further enhance the quality and impact of MEG research.
  • van Wassenhove V (2013) Speech through ears and eyes: interfacing the senses with the supramodal brain. Front. Psychol. 4:388. doi: 10.3389/fpsyg.2013.00388.
The comprehension of auditory-visual (AV) speech integration has greatly benefited from recent advances in neurosciences and multisensory research. AV speech integration raises numerous questions relevant to the computational rules needed for binding information (within and across sensory modalities), the representational format in which speech information is encoded in the brain (e.g., auditory vs. articulatory), or how AV speech ultimately interfaces with the linguistic system. It is argued here that the strength of predictive coding frameworks resides in the specificity of the generative internal models, not in their generality; specifically, internal models come with a set of rules applied to particular representational formats, themselves depending on the levels and the network structure at which predictive operations occur. As such, predictive coding accounts of AV speech must specify the level(s) and the kinds of internal predictions that are necessary to account for the perceptual benefits or illusions observed in the field. Among those specifications, the actual content of a prediction comes first and foremost, followed by the representational granularity of that prediction in time. This review presents a focused discussion of these issues.
Perceptual interferences in the estimation of quantities (time, space and numbers) have been interpreted as evidence for a common magnitude system. However, whereas duration estimation appears sensitive to spatial and numerical interferences, space and number estimation tend to be resilient to temporal manipulations. These observations question the relative contribution of each quantity in the elaboration of a representation in a common mental metric. Here, we designed a task in which perceptual evidence accumulated over time for all tested quantities (space, time and number) in order to match the natural requirement for building a duration percept. For this, we used a bisection task. Experimental trials consisted of dynamic dots of different sizes appearing progressively on the screen. Participants were asked to judge the duration, the cumulative surface or the number of dots in the display while the two non-target dimensions varied independently. In a prospective experiment, participants were informed before the trial which dimension was the target; in a retrospective experiment, participants had to attend to all dimensions and were informed only after a given trial which dimension was the target. Surprisingly, we found that duration was resilient to spatial and numerical interferences whereas space and number estimation were affected by time. Specifically, and counter-intuitively, results revealed that longer durations led to smaller number and space estimates whether participants knew before (prospectively) or after (retrospectively) a given trial which quantity they had to estimate. Altogether, our results support a magnitude system in which perceptual evidence for time, space and numbers integrates following Bayesian cue-combination rules.
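The Bayesian cue-combination rule invoked above has a compact closed form: independent Gaussian cues are fused by precision weighting, so the more reliable cue dominates and the fused estimate is more precise than either cue alone. A minimal sketch (illustrative only, with made-up numbers, not the authors' model):

```python
def fuse_cues(means, sigmas):
    """Precision-weighted (maximum-likelihood) fusion of independent
    Gaussian cues. Returns the fused mean and fused standard deviation."""
    precisions = [1.0 / s ** 2 for s in sigmas]
    total = sum(precisions)
    fused_mean = sum(m * p for m, p in zip(means, precisions)) / total
    fused_sigma = total ** -0.5   # fused variance = 1 / sum of precisions
    return fused_mean, fused_sigma

# Two equally reliable cues simply average; an unreliable cue is down-weighted.
m_eq, s_eq = fuse_cues([10.0, 12.0], [1.0, 1.0])   # fused mean 11.0
m_uneq, _ = fuse_cues([10.0, 20.0], [1.0, 2.0])    # pulled toward the precise cue
```

Adding a prior is just one more "cue" in the weighted sum, which is why a generalized magnitude system of this kind can accommodate multiple priors.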
  • Martin B, Giersch A, Huron C, van Wassenhove V (2013) Temporal event structure and timing in schizophrenia: preserved binding in a longer “now”. NeuroPsychologia, 51, 358-371. doi: 10.1016/j.neuropsychologia.2012.07.002
Patients with schizophrenia show no major impairment in AV speech integration. AV speech integration occurs within the temporal simultaneity profiles. Patients differ from controls in temporal integration and simultaneity profiles. Impaired temporal event structure points to a neural synchronization deficit.

2012

In natural environments, sensory information is embedded in temporally contiguous streams of events. This is typically the case when seeing and listening to a speaker or when engaged in scene analysis. In such contexts, two mechanisms are needed to single out and build a reliable representation of an event (or object): the temporal parsing of information and the selection of relevant information in the stream. It has previously been shown that rhythmic events naturally build temporal expectations that improve sensory processing at predictable points in time. Here, we asked to what extent temporal regularities can improve the detection and identification of events across sensory modalities. To do so, we used a dynamic visual conjunction search task accompanied by auditory cues synchronized or not with the color change of the target (horizontal or vertical bar). Sounds synchronized with the visual target improved search efficiency for temporal rates below 1.4 Hz but did not affect efficiency above that stimulation rate. Desynchronized auditory cues consistently impaired visual search below 3.3 Hz. Our results are interpreted in the context of the Dynamic Attending Theory: specifically, we suggest that a cognitive operation structures events in time irrespective of the sensory modality of input. Our results further support and specify recent neurophysiological findings by showing strong temporal selectivity for audiovisual integration in the auditory-driven improvement of visual search efficiency.
  • van Wassenhove V & Schroeder CS (2012) Multisensory Role of Human Auditory Cortex. In Poeppel D, Overath T, Popper AN & Fay RR (Eds) Springer Handbook of Auditory Research, vol. 43, pp. 295-331.
  • van Wassenhove V, Ghazanfar A, Munhall & Schroeder CS (2012) Bridging the Gap between Human and Nonhuman Studies of Audiovisual Integration. In Stein B (Ed.) The New Handbook of Multisensory Processing, pp. 153-167.
  • van Wassenhove, V (2012) From the dynamic structure of the brain to the emergence of time experiences. Kronoscope, 12:2, 199-216. doi: 10.1163/15685241-12341241

2011

  • Wacongne C, Labyt E, van Wassenhove V, Bekinschtein T, Naccache L, Dehaene S (2011). Evidence for a hierarchy of predictions and prediction errors in human cortex. Proc Natl Acad Sci 108: 20754-20759. doi: 10.1073/pnas.1117807108
According to hierarchical predictive coding models, the cortex constantly generates predictions of incoming stimuli at multiple levels of processing. Responses to auditory mismatches and omissions are interpreted as reflecting the prediction error when these predictions are violated. An alternative interpretation, however, is that neurons passively adapt to repeated stimuli. We separated these alternative interpretations by designing a hierarchical auditory novelty paradigm and recording human EEG and magnetoencephalographic (MEG) responses to mismatching or omitted stimuli. In the crucial condition, participants listened to frequent series of four identical tones followed by a fifth different tone, which generates a mismatch response. Because this response itself is frequent and expected, the hierarchical predictive coding hypothesis suggests that it should be cancelled out by a higher-order prediction. Three consequences ensue. First, the mismatch response should be larger when it is unexpected than when it is expected. Second, a perfectly monotonic sequence of five identical tones should now elicit a higher-order novelty response. Third, omitting the fifth tone should reveal the brain’s hierarchical predictions. The rationale here is that, when a deviant tone is expected, its omission represents a violation of two expectations: a local prediction of a tone plus a hierarchically higher expectation of its deviancy. Thus, such an omission should induce a greater prediction error than when a standard tone is expected. Simultaneous EEG and magnetoencephalographic recordings verify those predictions and thus strongly support the predictive coding hypothesis. Higher-order predictions appear to be generated in multiple areas of frontal and associative cortices.
  • van Wassenhove V, Wittmann M, Craig A and Paulus MP (2011) Psychological and neural mechanisms of subjective time dilation. Front. Neurosci. 5 (56). doi: 10.3389/fnins.2011.00056.

2010

  • Wittmann M, van Wassenhove V, Craig A and Paulus MP (2010) The neural substrates of subjective time dilation. Front. Hum. Neurosci. 4(2). doi:10.3389/neuro.09.002.2010
An object moving towards an observer is subjectively perceived as longer in duration than the same object that is static or moving away. This “time dilation effect” has been shown for a number of stimuli that differ from standard events along different feature dimensions (e.g. color, size, and dynamics). We performed an event-related functional magnetic resonance imaging (fMRI) study, while subjects viewed a stream of five visual events, all of which were static and of identical duration except the fourth one, which was a deviant target consisting of either a looming or a receding disc. The duration of the target was systematically varied and participants judged whether the target was shorter or longer than all other events. A time dilation effect was observed only for looming targets. Relative to the static standards, the looming as well as the receding targets induced increased activation of the anterior insula and anterior cingulate cortices (the “core control network”). The decisive contrast between looming and receding targets representing the time dilation effect showed strong asymmetric activation and, specifically, activation of cortical midline structures (the “default network”). These results provide the first evidence that the illusion of temporal dilation is due to activation of areas that are important for cognitive control and subjective awareness. The involvement of midline structures in the temporal dilation illusion is interpreted as evidence that time perception is related to self-referential processing.

2009

  • van Wassenhove V (2009) Minding time in an amodal representational space. Philos Trans R Soc Lond B Biol Sci. 364(1525):1815-30. doi: 10.1098/rstb.2009.0023
How long did it take you to read this sentence? Chances are your response is a ballpark estimate and its value depends on how fast you have scanned the text, how prepared you have been for this question, perhaps your mood or how much attention you have paid to these words. Time perception is here addressed in three sections. The first section summarizes theoretical difficulties in time perception research, specifically those pertaining to the representation of time and temporal processing. The second section provides a non-exhaustive review of temporal effects in multisensory perception. Sensory modalities interact in temporal judgement tasks, suggesting that (i) at some level of sensory analysis, the temporal properties across senses can be integrated in building a time percept and (ii) the representational format across senses is compatible for establishing such a percept. In the last section, a two-step analysis of temporal properties is sketched out. In the first step, it is proposed that temporal properties are automatically encoded at early stages of sensory analysis, thus providing the raw material for the building of a time percept; in the second step, time representations become available to perception through attentional gating of the raw temporal representations and via re-encoding into abstract representations.
  • Wittmann M & van Wassenhove V (2009) The experience of time: neural mechanisms and the interplay of emotion, cognition and embodiment. Philos Trans R Soc Lond B Biol Sci. 364(1525):1809-13. doi: 10.1098/rstb.2009.0025
Time research has been a neglected topic in the cognitive neurosciences of the last decades: how do humans perceive time? How and where in the brain is time processed? This introductory paper provides an overview of the empirical and theoretical papers on the psychological and neural basis of time perception collected in this theme issue. Contributors from the fields of cognitive psychology, psychiatry, neurology and neuroanatomy tackle this complex question with a variety of techniques ranging from psychophysical and behavioural experiments to pharmacological interventions and functional neuroimaging. Several (and some new) models of how and where in the brain time is processed are presented in this unique collection of recent research that covers experienced time intervals from milliseconds to minutes. We hope this volume will be conducive to developing a better understanding of the sense of time as part of a complex set of brain–body factors that include cognitive, emotional and body states.

  • Poeppel D, Idsardi WJ, van Wassenhove V (2009) Speech perception at the interface of neurobiology and linguistics. In Brian CJ Moore, Lorraine K Tyler and William D Marslen-Wilson (Eds) The Perception of Speech: From sound to meaning. Oxford: Oxford University Press. 249-274.

2008

  • Poeppel D, Idsardi W, van Wassenhove V (2008) Speech perception at the interface of neuroscience and linguistics. Philos Trans R Soc Lond B Biol Sci , 363(1493):1071-86. doi: 10.1098/rstb.2007.2160
Speech perception consists of a set of computations that take continuously varying acoustic waveforms as input and generate discrete representations that make contact with the lexical representations stored in long-term memory as output. Adopting the perspective of Marr, we argue and provide neurobiological and psychophysical evidence for the following research programme. First, at the implementational level, speech perception is a multi-time resolution process, with perceptual analyses occurring concurrently on at least two time scales (approx. 20–80 ms, approx. 150–300 ms), commensurate with (sub)segmental and syllabic analyses, respectively. Second, at the algorithmic level, we suggest that perception proceeds on the basis of internal forward models, or uses an ‘analysis-by-synthesis’ approach. Third, at the computational level (in the sense of Marr), the theory of lexical representation that we adopt is principally informed by phonological research and assumes that words are represented in the mental lexicon in terms of sequences of discrete segments composed of distinctive features. One important goal of the research programme is to develop linking hypotheses between putative neurobiological primitives (e.g. temporal primitives) and those primitives derived from linguistic inquiry, to arrive ultimately at a biologically sensible and theoretically satisfying model of representation and computation in speech.
  • van Wassenhove V, Buonomano D, Shimojo S, Shams L (2008) Distortions of subjective time perception within and across senses. PLoS ONE 3(1): e1437. doi: 10.1371/journal.pone.0001437
The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood. Our results support the existence of multisensory interactions in the perception of duration and suggest that vision can modify auditory temporal perception in a pure timing task. Insofar as distortions in subjective duration can neither be accounted for by the unpredictability of an auditory, visual or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

2007

  • Seitz A, Kim R, van Wassenhove V, Shams L (2007) Simultaneous and independent acquisition of multisensory and unisensory associations. Perception 36(10): 1445-1453. doi: 10.1068/p5843
Although humans are almost constantly exposed to stimuli from multiple sensory modalities during daily life, the processes by which we learn to integrate information from multiple senses to acquire knowledge of multisensory objects are not well understood. Here, we present results of a novel audio-visual statistical learning procedure where participants are passively exposed to a rapid serial presentation of arbitrary audio-visual pairings (comprised of artificial/synthetic audio and visual stimuli). Following this exposure, participants were tested with a two-interval forced-choice procedure in which their degree of familiarity with the experienced audio-visual pairings was evaluated against novel audio-visual combinations drawn from the same stimulus set. Our results show that subjects acquire knowledge of visual-visual, audio-audio, and audio-visual stimulus associations and that the learning of these types of associations occurs in an independent manner.
  • Skipper J, van Wassenhove V, Nusbaum HC, Small S (2007) Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception. Cerebral Cortex 17(10): 2387-99. doi: 10.1093/cercor/bhl147
Observing a speaker’s mouth profoundly influences speech perception. For example, listeners perceive an “illusory” “ta” when the video of a face producing /ka/ is dubbed onto an audio /pa/. Here, we show how cortical areas supporting speech production mediate this illusory percept and audiovisual (AV) speech perception more generally.
Normal hearing participants were tested in two experiments, which focused on temporal coincidence in auditory-visual (AV) speech perception. Characteristics of the temporal window over which simultaneity and fusion responses were maximal were quite similar, suggesting the existence of a 200 ms duration asymmetric bimodal temporal integration window.

  • van Wassenhove V, Nagarajan SS (2007) Auditory-cortical plasticity in learning to discriminate modulation rate. Journal of Neuroscience 27 (10): 2663-2672. doi: 10.1523/JNEUROSCI.4844-06.2007
The discrimination of temporal information in acoustic inputs is a crucial aspect of auditory perception, yet very few studies have focused on auditory perceptual learning of timing properties and associated plasticity in adult auditory cortex. Auditory cortex plasticity associated with short-term perceptual learning was manifested as an enhancement of auditory cortical responses to trained acoustic features only in the trained task. Plasticity was also manifested as induced non-phase–locked high gamma-band power increases in inferior frontal cortex during performance in the trained task. Functional plasticity in auditory cortex is here interpreted as the product of bottom-up and top-down modulations.

2005

  • Schorr, E., Fox, N., van Wassenhove, V., Knudsen, E. (2005) Audio-visual fusion in speech perception in children with cochlear implants. Proceedings of the National Academy of Sciences, 102 (51): 18748-18750. doi: 10.1073/pnas.0508862102
Speech, for most of us, is a bimodal percept whenever we both hear the voice and see the lip movements of a speaker. Children who are born deaf never have this bimodal experience. We tested children who had been deaf from birth and who subsequently received cochlear implants for their ability to fuse the auditory information provided by their implants with visual information about lip movements for speech perception. For most of the children with implants (92%), perception was dominated by vision when visual and auditory speech information conflicted. For some, bimodal fusion was strong and consistent, demonstrating a remarkable plasticity in their ability to form auditory-visual associations despite the atypical stimulation provided by implants. The likelihood of consistent auditory-visual fusion declined with age at implant beyond 2.5 years, suggesting a sensitive period for bimodal integration in speech perception.
  • van Wassenhove V, Grant KW, Poeppel D (2005) Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences, 102 (4): 1181-1186. doi: 10.1073/pnas.0408949102
Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a percept in either modality. In combined psychophysical and electroencephalography experiments we show that visual speech speeds up the cortical processing of auditory signals early (within 100 ms of signal onset). The auditory–visual interaction is reflected as an articulator-specific temporal facilitation (as well as a nonspecific amplitude reduction). The latency facilitation systematically depends on the degree to which the visual signal predicts possible auditory targets. The observed auditory–visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs. This is evidence for the existence of an “analysis-by-synthesis” mechanism in auditory–visual speech perception.

2001

  • Lakshminarayan K, Ben Shalom D, van Wassenhove V, Orbelo D, Houde J, Poeppel D (2001) The effect of spectral manipulations on the identification of affective and linguistic prosody. Brain and Language 84: 250-263. doi:10.1016/S0093-934X(02)00516-3