Cognitive Neuroscience and Sensorimotor Integration Lab

 Welcome to the CoNSens Lab!

In the CoNSens lab we study the neural and cognitive bases of attention and of the perception of space and objects. An important approach to understanding these mechanisms is to examine what functions they serve in the context of the brain's sensory inputs and motor outputs, that is, its sensorimotor processes, and to compare their performance to that of optimized models.

For our research we use large-visual-field presentations and 3D displays; we record behavioural and neural responses with multiple techniques (e.g., eye-tracking, brain imaging, and 3D positioning of the body and its parts in space); and we use computer simulations such as neural networks.

The CoNSens lab is part of the Centre for Cognition in Action, an interdisciplinary initiative within the Department of Life Sciences at UTSC that aims to examine how human cognitive processes operate in realistic environmental settings while maintaining rigorous experimental control over the environment. Currently the centre comprises experimental psychology labs and facilities for large-scale computer simulations of cognitive processes. An outpatient research clinic and an EEG lab are planned for 2006.

Selected research interests of the CoNSens Lab

Sensorimotor integration for spatial processes

Spatial perception includes perceiving the spatial relationships of objects relative to our body as well as the spatial relationships of objects relative to each other. For these processes the brain integrates multiple sources of sensory information, including vision, audition, proprioception and many others. We are particularly interested in the following functions:

Spatial perception across saccadic eye movements. When we move our eyes, the image on our retinas jumps around. Why do we nevertheless perceive the world as stable? How do we integrate visual information across eye movements? We have recently shown that the brain follows optimal strategies in these processes. Strangely, as a result we are blind to certain perceptual events.

For example, we cannot see our own eye movements in a mirror, and sometimes we perceive the movements of objects in space as nonlinearly distorted and contracted.

Processing of 3D object sizes for action and perception. We use stereo displays and 3D recordings of manual grasping movements to understand the optimization principles underlying action and perception.

Perceptual properties of the attentional bias. In many spatial and/or attentional tasks, neurologically healthy participants show subtle asymmetries favouring the left visual field and/or the left part of space. In contrast, lesions of the right hemisphere often lead to severe neuropsychological deficits such as spatial neglect. Together, these data suggest that the right cortical hemisphere is dominant for spatial and attentional tasks. But exactly which functions and underlying neural mechanisms are involved? We are currently exploring which visual and non-visual mechanisms are associated with the attentional bias.
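The optimal-integration idea running through the interests above is often modelled as reliability-weighted averaging of independent sensory cues, in which each cue contributes in proportion to its reliability (the inverse of its variance). A minimal sketch of that standard model follows; the cue values, variances, and the hand-position scenario are made-up illustrations, not data from the lab:

```python
# Sketch of optimal (maximum-likelihood) cue integration: each sensory
# cue is a Gaussian estimate with its own variance, and the combined
# estimate weights each cue by its reliability (1 / variance).

def integrate_cues(estimates, variances):
    """Optimally combine independent Gaussian cue estimates."""
    weights = [1.0 / v for v in variances]       # reliability of each cue
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    combined_variance = 1.0 / total              # never worse than the best single cue
    return combined, combined_variance

# Hypothetical example: visual and proprioceptive estimates of hand position.
visual = (10.0, 1.0)          # (estimate in cm, variance)
proprioceptive = (12.0, 4.0)  # less reliable cue
pos, var = integrate_cues([visual[0], proprioceptive[0]],
                          [visual[1], proprioceptive[1]])
print(pos, var)  # pulled toward the more reliable visual cue: 10.4, 0.8
```

Note that the combined variance (0.8) is smaller than that of either cue alone, which is the signature prediction of optimal integration that behavioural experiments test against.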

Feature-based attention and object perception

Visual exploration involves shifts of attention that enhance perception – not only shifts between spatial locations but also shifts to non-spatial features, such as colour, regardless of object location. This feature-based attention is known to support the perception of low-level features. Does it also aid more complex forms of perception associated with high-level visual areas?

Sensory and attentional mechanisms of the lateral occipital area LO. We have recently shown that the high-level object area LO exhibits significant effects of attention.

Feature-based attention and contour integration. Since higher-level areas are activated as a function of attention, the question arises whether this reflects increased neural input or whether attention changes processing within these areas themselves. Quite surprisingly, recent studies suggest that attention has little effect on high-level perception itself. To explore this further, we are using dual-task paradigms to probe whether feature-based attention improves the perception of simple objects, such as loops, outside the attentional focus.

This Web Page is maintained by niemeier@utsc.utoronto.ca
Last modified: Jan 3, 2006
© 2001 University of Toronto at Scarborough. All rights reserved.