A large-scale neurocomputational model of spatial cognition integrating memory with vision

Research output: Contribution to journal › Research article › Contributed › peer-review

Abstract

We introduce a large-scale neurocomputational model of spatial cognition called 'Spacecog', which integrates recent findings from mechanistic models of visual and spatial perception. As a high-level cognitive ability, spatial cognition requires the processing of behaviourally relevant features in complex environments and, importantly, the updating of this information during eye and body movements. The Spacecog model achieves this by interfacing spatial memory and imagery with mechanisms of object localisation, saccade execution, and attention through coordinate transformations in parietal areas of the brain. We evaluate the model in a realistic virtual environment in which our neurocognitive model steers an agent through complex visuospatial tasks. Our modelling approach opens up new possibilities for the assessment of neuropsychological data and human spatial cognition.
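The coordinate transformations mentioned in the abstract can be illustrated with a minimal sketch, assuming a simple 2-D vector formulation of eye-centred and head-centred reference frames and predictive remapping across a saccade. All function names and values below are hypothetical illustrations and are not taken from the Spacecog implementation.

```python
# Illustrative sketch only: the kind of spatial reference-frame
# transformation the abstract refers to. NOT the Spacecog code.
import numpy as np

def eye_to_head_centred(target_eye_xy, eye_position_xy):
    """Convert an eye-centred (retinotopic) target location to a
    head-centred location by adding the current eye-in-head position."""
    return np.asarray(target_eye_xy) + np.asarray(eye_position_xy)

def remap_across_saccade(target_eye_xy, saccade_vector_xy):
    """Update an eye-centred memory of a target when the eyes move:
    the retinal location shifts opposite to the saccade vector."""
    return np.asarray(target_eye_xy) - np.asarray(saccade_vector_xy)

if __name__ == "__main__":
    target_eye = [10.0, 5.0]   # target relative to current gaze (degrees)
    eye_pos = [-3.0, 2.0]      # eye-in-head position (degrees)
    saccade = [10.0, 5.0]      # planned saccade vector (degrees)

    print("head-centred target:", eye_to_head_centred(target_eye, eye_pos))
    print("eye-centred target after saccade:",
          remap_across_saccade(target_eye, saccade))
```

In this toy setting, the head-centred location stays constant while the eye-centred one is remapped, which is the basic bookkeeping any model of spatial updating across eye movements has to perform.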

Details

Original language: English
Pages (from-to): 473-488
Number of pages: 16
Journal: Neural Networks
Volume: 167
Publication status: Published - 7 Sept 2023
Peer-reviewed: Yes

External IDs

Scopus: 85172394879
Mendeley: 40c4e6ac-793b-3224-8934-c029b9da585a

Keywords

  • Brain-inspired neural networks
  • Parietal cortex
  • Spatial memory and imagery
  • Spatial reference transformation
  • Visual attention