I’m a postdoc in the Laboratoire des Systèmes Perceptifs at the École Normale Supérieure in Paris studying perceptual and sensorimotor decision-making and confidence. I have also studied auditory motion perception and multisensory interactions. My methods of choice are psychophysics and computational modelling.
Sensorimotor Confidence
Real-world decision-making often entails a constant interaction between perception and action, yet very little is known about how we judge our own sensorimotor performance in the absence of feedback. We demonstrate that humans can form judgements of their own performance by monitoring the stimulus and their actions. These judgements correspond to objective performance to some degree; however, they are biased towards the most recent performance (paper). Future research will investigate what sensorimotor information contributes to such judgements.
Auditory Motion Perception
There is debate about how we perceive auditory motion. Do we possess velocity-sensitive neurons, as in the visual system, or do we compute motion from static snapshots of sound position? In one study, we found a striking inability in humans to detect changes in sound velocity (paper), with some people unable to detect a two- or even three-fold increase in velocity! In another study, we investigated a spatial bias in localising moving sounds, the Representational Momentum Effect, which is thought to result from the brain’s predictive processes. This bias was modulated by velocity (paper). It is unclear how we can have such poor mechanisms of velocity perception yet potentially use velocity to predict motion.
Perceptual Confidence
Perceptual confidence reflects the belief an observer has in the accuracy of their perceptual judgements. We investigated how prior knowledge and reward affect confidence. While both prior knowledge and rewards affect perceptual decision-making by shifting the decision criterion, only the priors should contribute to confidence: priors change the estimate of the probability of being correct, whereas rewards do not. Yet we found that observers did not respond normatively to priors and rewards, incorporating either both into their confidence judgements or neither (paper).
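The normative distinction can be made concrete with a toy equal-variance signal detection model (an illustrative sketch, not the model from the paper): both the prior odds and the payoff ratio shift the optimal decision criterion, but only the prior enters the posterior probability of being correct, which is what confidence should track.

```python
import math

# Equal-variance Gaussian SDT: the two stimulus classes produce internal
# responses x ~ N(+d'/2, 1) and x ~ N(-d'/2, 1), so the log-likelihood
# ratio at x is simply d' * x.

def optimal_criterion(d_prime, prior_pos=0.5, reward_ratio=1.0):
    """Criterion on the decision axis; respond 'positive' when x exceeds it.
    Both the prior probability of the positive stimulus and the reward
    ratio (positive/negative response payoff) shift this criterion."""
    beta = ((1 - prior_pos) / prior_pos) / reward_ratio
    return math.log(beta) / d_prime

def prob_correct(x, d_prime, prior_pos=0.5):
    """Posterior probability that the positive stimulus produced x.
    This normative basis for confidence depends on the prior but is
    untouched by the rewards."""
    posterior_odds = math.exp(d_prime * x) * prior_pos / (1 - prior_pos)
    return posterior_odds / (1 + posterior_odds)

# A prior favouring the positive stimulus and a reward favouring the
# positive response shift the criterion the same way...
print(optimal_criterion(1.0, prior_pos=0.75))    # negative shift
print(optimal_criterion(1.0, reward_ratio=3.0))  # negative shift
# ...but only the prior changes the probability of being correct at x = 0.
print(prob_correct(0.0, 1.0))                  # 0.5
print(prob_correct(0.0, 1.0, prior_pos=0.75))  # 0.75
```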
Multisensory Perception

I am interested in how our auditory and visual senses interact and combine information to make inferences about the world. Previously, I investigated multisensory causal inference: the inference about whether two signals arriving via different sensory modalities originate from a common source in the environment (e.g., the tweet of a bird and its movement in a tree). We investigated how sequences of auditory and visual information led to a common-source or separate-source percept, identifying the cues in the sensory stimulus that most influenced causal inference (paper).
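For context, the standard Bayesian causal-inference observer (in the style of Körding and colleagues' widely used formulation; a generic sketch, not necessarily the model fitted in the paper) computes the posterior probability that the auditory and visual signals share one source, which falls as the two position estimates diverge:

```python
import math

def p_common(x_a, x_v, sigma_a, sigma_v, sigma_p=10.0, prior_common=0.5):
    """Posterior probability of a common source given noisy auditory (x_a)
    and visual (x_v) position estimates, assuming Gaussian sensory noise
    and a Gaussian spatial prior N(0, sigma_p) over source locations."""
    # Likelihood of (x_a, x_v) under one common source, integrated over
    # its unknown location.
    var_c = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
             + sigma_v**2 * sigma_p**2)
    like_c = math.exp(-0.5 * ((x_a - x_v)**2 * sigma_p**2
                              + x_a**2 * sigma_v**2
                              + x_v**2 * sigma_a**2) / var_c) \
             / (2 * math.pi * math.sqrt(var_c))
    # Likelihood under two independent sources, each with its own location.
    var_a = sigma_a**2 + sigma_p**2
    var_v = sigma_v**2 + sigma_p**2
    like_i = math.exp(-0.5 * (x_a**2 / var_a + x_v**2 / var_v)) \
             / (2 * math.pi * math.sqrt(var_a * var_v))
    post_c = like_c * prior_common
    return post_c / (post_c + like_i * (1 - prior_common))

# Coincident signals favour a common source; discrepant ones do not.
print(p_common(0.0, 0.0, 1.0, 2.0))   # > 0.5
print(p_common(0.0, 15.0, 1.0, 2.0))  # close to 0
```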