I am currently a post-doctoral researcher in experimental psychology supported by a fellowship from the Fyssen Foundation. You can find me at the Laboratoire des Systèmes Perceptifs at the École Normale Supérieure in Paris working with Pascal Mamassian. My current research focus is confidence judgements for perceptual and sensorimotor decisions, but I also study spatial perception in audition and vision, as well as general perceptual decision-making. My method of choice is computational modelling combined with psychophysics.
In 2019, I obtained my PhD in Psychology from New York University, where I was advised by Mike Landy. My thesis was titled Perception, Action, and Metacognition, reflecting my broader goal of exploring the links between these processes, particularly in dynamic contexts that approximate real-world environments.
Before that, I was at the University of Sydney in Australia, where I obtained a Bachelor of Science with the University Medal, majoring in physiology and psychology. Several people who were key in igniting my passion for research were my honours supervisor Simon Carlile, who introduced me to auditory motion perception, and David Alais and Alex Holcombe, who mentored me during a brief research internship.
Perceptual confidence reflects the belief an observer has in the accuracy of their perceptual judgements. How the brain computes confidence is a highly debated topic. In my first investigation of perceptual confidence, my collaborators and I asked how prior knowledge and reward/punishment (i.e., payoffs) affect confidence. While both prior knowledge and payoffs affect perceptual decision-making by shifting the decision criterion, only the former should contribute to confidence: priors change the estimated probability of being correct, whereas payoffs do not. Yet we found that human observers did not respond normatively, either incorporating both priors and payoffs into their confidence judgements or neither (paper). Ongoing research investigates how we decide our relative confidence in two perceptual judgements. We gave participants an easy categorisation task and, after every two perceptual judgements, asked them to report whether they were more confident in the first or the second. Because they saw the same stimuli multiple times, we can now determine which of several competing confidence models accounts for their responses and for the consistency of their confidence judgements.
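The normative argument can be sketched with a toy equal-variance signal-detection model (an illustrative sketch only, not the model from the paper; the category means, sample values, and prior values are hypothetical):

```python
import math

def normpdf(x, mu):
    """Gaussian density with standard deviation 1, centred on mu."""
    return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def normative_confidence(x, prior_a, means=(1.0, -1.0)):
    """Posterior probability that the optimal choice is correct,
    given a sensory sample x and prior P(category A) = prior_a.

    Payoffs shift where the observer places the decision criterion,
    but they do not enter this posterior, so normatively they
    should leave confidence unchanged.
    """
    like_a = prior_a * normpdf(x, means[0])
    like_b = (1 - prior_a) * normpdf(x, means[1])
    post_a = like_a / (like_a + like_b)
    return max(post_a, 1 - post_a)

# With a flat prior, an ambiguous sample gives chance-level confidence;
# a biased prior raises confidence even though the sample is identical.
print(normative_confidence(0.0, 0.5))  # ~0.5
print(normative_confidence(0.0, 0.8))  # ~0.8
```

The point of the sketch is that the prior appears inside the posterior computation, while a payoff matrix would only move the choice criterion, so an observer whose confidence tracks the posterior probability of being correct should be sensitive to the former and not the latter.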
Real-world decision-making often entails a constant interaction between perception and action, yet very little is known about how we judge our own sensorimotor performance in the absence of feedback. We demonstrated that humans can form judgements of their own performance by monitoring the stimulus and their actions. These judgements do have some correspondence to objective performance; however, they are biased towards the most recent performance (paper). Future research will investigate what sensorimotor information contributes to such judgements.
Auditory Motion Perception
How humans perceive moving sounds is not well understood. There is debate about whether we possess velocity-sensitive neurons, as we do for vision, or whether we compute motion from static snapshots of sound position. By studying the perception of changes in velocity, my collaborators and I found a striking inability to detect changes within some velocity ranges (paper), with some participants unable to detect a two- or even three-fold increase in velocity! In subsequent research, we investigated a spatial bias in the perceived end-point of a sound's trajectory in the direction of motion (i.e., the representational momentum effect), thought to result from the brain's predictive processing of motion. We found the bias is clearly modulated by velocity (paper). It is unclear how we can have such poor mechanisms for perceiving velocity yet potentially use velocity to predict motion, so further research is needed on this topic. In current research, my collaborators and I are investigating the degree of similarity between visual and auditory motion biases.
I am interested in how our auditory and visual senses interact and combine information to make inferences about the world. Previously, I investigated multisensory causal inference: inferring whether two pieces of information arriving at different sensory modalities originate from a common source in the environment (e.g., the tweet of a bird and its movement in a tree). We investigated how sequences of auditory and visual information led to a common-source or separate-source percept, identifying the stimulus cues that most strongly influenced causal inference (paper).
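The causal-inference computation described above can be illustrated with a generic Bayesian causal-inference sketch (in the style of Körding et al., 2007, not the specific model from the paper; all noise and prior parameters below are hypothetical):

```python
import math

def p_common(x_a, x_v, sigma_a=2.0, sigma_v=1.0, sigma_p=10.0, p_c=0.5):
    """Posterior probability that auditory and visual samples x_a, x_v
    arose from a single source, assuming Gaussian sensory noise
    (sigma_a, sigma_v), a zero-mean Gaussian prior over source
    position (sigma_p), and a prior probability p_c of a common cause."""
    # Likelihood of the samples under one shared source
    # (the unknown source position is integrated out analytically).
    var_c = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
             + sigma_v**2 * sigma_p**2)
    like_c = math.exp(-((x_a - x_v)**2 * sigma_p**2
                        + x_a**2 * sigma_v**2
                        + x_v**2 * sigma_a**2) / (2 * var_c))
    like_c /= 2 * math.pi * math.sqrt(var_c)
    # Likelihood under two independent sources.
    var_a, var_v = sigma_a**2 + sigma_p**2, sigma_v**2 + sigma_p**2
    like_i = (math.exp(-x_a**2 / (2 * var_a) - x_v**2 / (2 * var_v))
              / (2 * math.pi * math.sqrt(var_a * var_v)))
    return p_c * like_c / (p_c * like_c + (1 - p_c) * like_i)

# Nearby samples favour a common-source percept; discrepant ones do not.
print(p_common(0.5, 0.0) > 0.5)   # True
print(p_common(8.0, -8.0) < 0.5)  # True
```

The key intuition is that the spatial (or temporal) discrepancy between the two sensory samples is the main cue driving the inference: small discrepancies are more plausible under a common cause, large ones under independent causes.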