
Professor Martin Eimer


Research Fellow

Grants awarded

Scheme: Wolfson Research Merit Awards

Organisation: Birkbeck College, University of London

Dates: Jan 2005-Dec 2009

Value: £150,000

Summary: In our research, we study perception, attention, and action by measuring electrophysiological markers of cognitive processes, obtained non-invasively from EEG scalp electrodes. These markers track cognitive and brain processes on a millisecond-by-millisecond basis, making them highly useful for investigating how such processes unfold over time. In 2009 we studied several distinct but interrelated topics; two examples briefly illustrate this research.

In one study, we demonstrated that actively listening to a story (Michael Palin narrating his travel adventures), and remembering it for subsequent recall, impairs the perception and attentional selection of simultaneously presented task-relevant visual events. This finding has implications for current discussions about the impact of mobile phone use on driving performance, as it demonstrates that surprisingly early stages of visual processing are impaired during active listening.

In another set of experiments, we studied whether attention is invariably captured by highly noticeable visual events (such as a single red item flashed among uniform grey items – a “singleton” that is often assumed to “pop out” and capture attention automatically), or whether their ability to attract attention depends on current intentional task settings. We demonstrated that, far from being automatic and involuntary, attentional capture by such events depends on whether they are potentially task-relevant. For example, red singletons capture attention only when observers are looking for a red target. In contrast, when observers are instructed to search for a green target, or for a target not defined by colour (e.g., an item larger than its context), attentional capture by such singletons can be successfully avoided. This shows that attentional selection of visual events is driven in a top-down fashion by our current intentions, and is not controlled by external stimuli, even when these are highly salient.
