Author:
Nina Lee, Lin Lawrence Guo, Adrian Nestor, Matthias Niemeier
Abstract
It is commonly held that computations of goal-directed behaviour are governed by conjunctive neural representations of the task features. However, support for this view comes from paradigms with arbitrary combinations of task features and task affordances that require representations in working memory. Therefore, in the present study we used a well-rehearsed task with features that afford minimal working-memory representation to investigate the temporal evolution of feature representations and their potential integration in the brain. Specifically, we recorded electroencephalography data from human participants while they first viewed and then grasped objects or touched them with a knuckle. The objects had different shapes and were made of heavy or light materials, with shape and weight being relevant for grasping but not for knuckling. Using multivariate analysis, we found that representations of object shape were similar for grasping and knuckling. However, only for grasping did early shape representations reactivate at later phases of grasp planning, suggesting that sensorimotor control signals feed back to early visual cortex. Grasp-specific representations of material/weight arose only during grasp execution, after object contact, during the load phase. A trend for integrated representations of shape and material also became grasp-specific, but only briefly at movement onset. These results argue against the view that goal-directed actions inevitably join all features of a task into a sustained and unified neural representation. Instead, they suggest that the brain generates action-specific representations of relevant features as required for the different subcomponents of its action computations.
Significance statement
The idea that all the features of a task are integrated into a joint representation, or event file, is widely supported but importantly rests on paradigms with arbitrary stimulus-response combinations. Our study is the first to investigate grasping with electroencephalography to search for the neural basis of feature integration in a daily-life task with overlearned stimulus-response mappings. Contrary to the notion of event files, we find limited evidence for integrated representations. Instead, we find that task-relevant features form representations at specific phases of the action. Our results show that integrated representations do not arise universally for any kind of goal-directed behaviour but are computed on demand.
Publisher
Cold Spring Harbor Laboratory