Human perceptual development proceeds in a stereotyped manner: initially limited perceptual capabilities mature into robust proficiencies over the months or years following the onset of sensory experience. Based on studies of atypically developed children who skipped the initial stages of perceptual development, together with simulations of computational model systems, we recently obtained evidence across several domains that such initially degraded perceptual experience may act as a scaffold, rather than a hurdle, for the acquisition of robust perceptual proficiencies, whereas dispensing with these degradations compromises later development. This work has implications for understanding typical, atypical, and computational development.
Newborns have poor initial visual acuity. Based on data from late-sighted individuals and the results of computational simulations, we propose that such initially poor acuity may help instantiate extended spatial integration mechanisms, which are key to achieving robustness in tasks such as face recognition.
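To make the simulation idea concrete, one way to model initially poor acuity is to train a system on Gaussian-blurred images whose blur width shrinks over training, mimicking the gradual sharpening of infant vision. The sketch below is a minimal illustration, not the published model; the linear schedule, the sigma values, and the `model`/`train_step` placeholders are all assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def acuity_schedule(epoch, n_epochs, sigma_max=4.0):
    """Blur width shrinks linearly over training, mimicking the
    gradual improvement of infant visual acuity (values illustrative)."""
    return sigma_max * (1.0 - epoch / (n_epochs - 1))

def degrade(images, sigma):
    """Low-pass filter each image; sigma <= 0 leaves it intact."""
    if sigma <= 0:
        return images
    return np.stack([gaussian_filter(img, sigma) for img in images])

# Training-loop skeleton: feed progressively sharper images to
# whatever model is being trained (model/train_step are placeholders).
n_epochs = 10
for epoch in range(n_epochs):
    sigma = acuity_schedule(epoch, n_epochs)
    # batch = degrade(next_batch(), sigma)
    # train_step(model, batch)
```

On the hypothesis above, the informative comparison is a control trained on sharp images from the outset (i.e., `sigma_max=0`), which would be expected to develop narrower spatial integration.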
Akin to visual acuity, color sensitivity is initially poor in newborns. Experiments with late-sighted children and computational model systems indicate that these initial degradations may be adaptive. These results also help account for the remarkable resilience to chromatic changes that human vision exhibits.
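A comparable sketch for the chromatic case, assuming the degradation is modeled as a saturation ramp from grayscale to full color; the luminance weights and the linear schedule are illustrative choices, not values taken from the paper.

```python
import numpy as np

def desaturate(rgb, s):
    """Blend an RGB image (H, W, 3 floats in [0, 1]) toward its
    grayscale version; s=0.0 is fully gray, s=1.0 is full color."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])  # Rec. 601 luminance
    return s * rgb + (1.0 - s) * gray[..., None]

def color_schedule(epoch, n_epochs):
    """Saturation ramps from 0 to 1 over training, mirroring
    newborns' initially poor color sensitivity."""
    return epoch / (n_epochs - 1)

# Example: a random image is rendered fully gray at epoch 0.
img = np.random.rand(32, 32, 3)
first_epoch_view = desaturate(img, color_schedule(0, 10))
```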
A human fetus is able to register environmental sounds. This in-utero experience, however, is restricted to the low-frequency components of auditory signals. Computational simulations suggest that such inputs yield temporally extended receptive fields, which are critical for tasks such as emotion recognition.
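In simulations of this kind, in-utero-like input can be approximated by low-pass filtering waveforms before training. Below is a minimal sketch using a Butterworth filter; the 400 Hz cutoff and the filter order are illustrative assumptions, not values from the paper. A model trained on such input would then be probed for the temporal extent of its receptive fields.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def uterine_filter(audio, sr, cutoff_hz=400.0):
    """Keep only low-frequency content, approximating the
    low-pass character of hearing in the womb."""
    sos = butter(4, cutoff_hz, btype="low", fs=sr, output="sos")
    return sosfilt(sos, audio)

# Example: filter one second of noise sampled at 16 kHz.
sr = 16000
waveform = np.random.randn(sr)
fetal_like_input = uterine_filter(waveform, sr)
```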
In this paper, we review the ‘adaptive initial degradation’ hypothesis across the visual and auditory domains. We propose that early perceptual limitations may be akin to the flapping of a butterfly’s wings, setting up small eddies that manifest, in due time, as significant salutary effects on later perceptual skills.
The processing of temporal information is crucial for making sense of the dynamic sensory environment we live in. Throughout my PhD, I have studied long-lasting visual temporal integration in normally developed adults. Given the significance of extracting temporal relationships between environmental entities early in life, I have recently begun exploring the developmental dimension of this process as well. Specifically, I examine the mechanisms by which the brain extracts temporal regularities more broadly and across populations: throughout normal development, in congenitally blind children who gained sight late in life, and in individuals with autism.
In the sequential metacontrast paradigm, information is mandatorily integrated along motion trajectories for up to 450 ms. Here, we find that the extent of integration is determined by absolute time rather than by the number of elements presented, and that it can be further expanded by increases in the overall processing load.
Using the same paradigm, we find that external visual objects, such as an annulus, presented during the motion stream do not disrupt mandatory windows of integration. This highlights that temporal windows of integration, once initiated, are not easily disrupted.
Temporal coincidences between sensory events signal their potential association, and extracting such relationships critically underlies visual cognition. Here, we report that individuals treated for congenital blindness are capable of acquiring this ability despite extended periods of visual deprivation.