This paper reviews studies on eye movements in decision making and compares their observations to theoretical predictions concerning the role of attention in decision making. Four decision theories are examined: rational models, bounded rationality, evidence accumulation, and parallel constraint satisfaction models. Although most theories were confirmed with regard to certain predictions, none of them adequately accounted for the role of attention during decision making. Several observations emerged concerning the drivers and downstream effects of attention on choice, suggesting that attentional processes play an active role in constructing decisions. So far, decision theories have largely ignored the constructive role of attention by assuming that it is entirely determined by heuristics, or that it consists of stochastic information sampling. The empirical observations reveal that these assumptions are implausible, and that more accurate assumptions could have been made based on prior attention and eye movement research. Future decision making research would benefit from greater integration with attention research.
The present paper argues that when attention is spread across the visual field, visual selection during the first sweep of information through the brain is completely stimulus-driven. Only later in time, through recurrent feedback processing, does volitional control based on expectancy and goal set bias visual selection in a top–down manner. Here we review behavioral evidence as well as evidence from ERP, fMRI, TMS, and single-cell recording consistent with stimulus-driven selection. Alternative viewpoints that assume a large role for top–down processing are discussed. It is argued that in most cases, evidence supporting top–down control of visual selection in fact demonstrates top–down control of processes occurring later in time, following initial selection. We conclude that top–down knowledge regarding non-spatial features of objects cannot alter the initial selection priority. Only by adjusting the size of the attentional window can the initial sweep of information through the brain be altered in a top–down way.
Many recent studies have examined the association between number acuity, which is the ability to rapidly and non-symbolically estimate the quantity of items appearing in a scene, and symbolic math performance. However, various contradictory results have been reported. To comprehensively evaluate the association between number acuity and symbolic math performance, we conducted a meta-analysis to synthesize the results observed in previous studies. First, a meta-analysis of cross-sectional studies (36 samples, N = 4705) revealed a significant positive correlation between these skills (r = 0.20, 95% CI = [0.14, 0.26]); the association remained after considering other potential moderators (e.g., whether general cognitive abilities were controlled). Moreover, a meta-analysis of longitudinal studies revealed 1) that number acuity may prospectively predict later math performance (r = 0.24, 95% CI = [0.11, 0.37]; 6 samples) and 2) that number acuity is retrospectively correlated with early math performance as well (r = 0.17, 95% CI = [0.07, 0.26]; 5 samples). In summary, these pieces of evidence demonstrate a moderate but statistically significant association between number acuity and math performance. Based on the estimated effect sizes, power analyses were conducted, which suggested that many previous studies were underpowered due to small sample sizes. This may account, at least in part, for the disparity between findings in the literature. Finally, the theoretical and practical implications of our meta-analytic findings are presented, and future research questions are discussed.
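The kind of pooling this abstract describes can be sketched numerically. The following is a minimal illustration of inverse-variance pooling of correlations via Fisher's z-transform, with made-up sample correlations and sample sizes; the actual meta-analysis likely used a more elaborate (e.g., random-effects) model, so this fixed-effect sketch is illustrative only:

```python
import math

def pool_correlations(rs, ns):
    """Fixed-effect inverse-variance pooling of correlations via Fisher's z.
    rs: list of per-study correlations; ns: list of per-study sample sizes."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]  # Fisher z-transform
    ws = [n - 3 for n in ns]                              # weight = 1/var, var(z) = 1/(n - 3)
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)  # weighted mean in z-space
    se = (1 / sum(ws)) ** 0.5                             # standard error of pooled z
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se         # 95% CI in z-space
    back = lambda z: (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)  # inverse transform
    return back(z_bar), (back(lo), back(hi))

# Hypothetical study results: three correlations with their sample sizes.
r, ci = pool_correlations([0.25, 0.15, 0.20], [100, 250, 150])
```

The z-transform is used because the sampling distribution of a raw correlation is skewed, whereas z is approximately normal with a variance that depends only on sample size.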
Expert video game players often outperform non-players on measures of basic attention and performance. Such differences might result from exposure to video games, or they might reflect other group differences between those people who do or do not play video games. Recent research has suggested a causal relationship between playing action video games and improvements in a variety of visual and attentional skills (e.g., [Green, C. S., & Bavelier, D. (2003). Action video game modifies visual selective attention. Nature, 423, 534–537]). The current research sought to replicate and extend these results by examining both expert/non-gamer differences and the effects of video game playing on tasks tapping a wider range of cognitive abilities, including attention, memory, and executive control. Non-gamers played 20+ h of an action video game, a puzzle game, or a real-time strategy game. Expert gamers and non-gamers differed on a number of basic cognitive skills: experts could track objects moving at greater speeds, better detected changes to objects stored in visual short-term memory, switched more quickly from one task to another, and mentally rotated objects more efficiently. Strikingly, extensive video game practice did not substantially enhance performance for non-gamers on most cognitive tasks, although they did improve somewhat in mental rotation performance. Our results suggest that at least some differences between video game experts and non-gamers in basic cognitive performance result either from far more extensive video game experience or from pre-existing group differences in abilities that result in a self-selection effect.
On its 43rd anniversary, the Simon effect can look back on a long and varied history. First treated as a curious observation with implications for human factors research, it slowly evolved not only into a valuable target of psychological theorizing in its own right but also into a handy means of investigating attentional operations, the representation of space and of one's body, the cognitive representation of intentional action, and executive control. This article discusses the major characteristics of the Simon effect and the Simon task that laid the groundwork for this success, and reviews the major lines of research, theoretical developments, and ongoing controversies on and around the Simon effect and the cognitive processes it reflects.
We describe the key features of the visual world paradigm and review the main research areas where it has been used. In our discussion we highlight that the paradigm provides information about the way language users integrate linguistic information with information derived from the visual environment. Therefore the paradigm is well suited to study one of the key issues of current cognitive psychology, namely the interplay between linguistic and visual information processing. However, conclusions about linguistic processing (e.g., about activation, competition, and timing of access of linguistic representations) in the absence of relevant visual information must be drawn with caution.
A meta-analysis of 117 experiments evaluated the effects of cognitive load on duration judgments. Cognitive load refers to information-processing (attentional or working-memory) demands. Six types of cognitive load were analyzed to resolve ongoing controversies and to test current duration judgment theories. Duration judgments depend on whether or not participants are informed in advance that a duration judgment will be required: the prospective paradigm (informed) versus the retrospective paradigm (not informed). With higher cognitive load, the prospective duration judgment ratio (subjective duration to objective duration) decreases, but the retrospective ratio increases. Thus, the duration judgment ratio differs depending on the paradigm and the specific type of cognitive load. As assessed by the coefficient of variation, the relative variability of prospective, but not retrospective, judgments increases with cognitive load. The prospective findings support models emphasizing attentional resources, especially executive control. The retrospective findings support models emphasizing memory changes. Alternative theories do not fit the meta-analytic findings and are rejected.
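The two quantities this meta-analysis aggregates, the duration judgment ratio (subjective to objective duration) and the coefficient of variation, are simple to compute. A minimal sketch follows; the judgment values are hypothetical, chosen only to illustrate a ratio below 1 (the interval feeling shorter than it was):

```python
from statistics import mean, stdev

def duration_judgment_ratio(subjective_s, objective_s):
    """Subjective duration divided by objective duration.
    A value below 1 means the interval felt shorter than it actually was."""
    return subjective_s / objective_s

def coefficient_of_variation(judgments):
    """Relative variability of a set of duration judgments (SD / mean)."""
    return stdev(judgments) / mean(judgments)

# Hypothetical prospective judgments (in seconds) of a 60-s interval under high load:
judgments = [42.0, 50.0, 38.0, 55.0, 45.0]
ratio = duration_judgment_ratio(mean(judgments), 60.0)   # < 1: felt shorter than 60 s
cv = coefficient_of_variation(judgments)                 # relative variability
```

Because the coefficient of variation normalizes the standard deviation by the mean, it lets the meta-analysis compare the variability of judgments across intervals of different objective lengths.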
Mind-wandering refers to the occurrence of thoughts whose content is both decoupled from stimuli present in the current environment and unrelated to the task being carried out at the moment of their occurrence. The core of this phenomenon is therefore stimulus-independent and task-unrelated thoughts (SITUTs). In the present study, we designed a novel experience sampling method that allowed us to isolate SITUTs from other kinds of distractions (i.e., irrelevant interoceptive/exteroceptive sensory perceptions and interfering thoughts related to the appraisal of the current task). In Experiment 1, we examined the impact of SITUTs on performance of the Sustained Attention to Response Task (SART; a Go/No-Go task). Analyses demonstrated that SITUTs impair SART performance to the same extent as irrelevant sensory perceptions. In Experiment 2, we further examined SITUTs in order to assess the possible functions of mind-wandering. We observed that the content of most reported SITUTs referred to the anticipation and planning of future events. Furthermore, this "prospective bias" was increased when participants' attention had been oriented toward their personal goals before performing the SART. These data support the view that an important function of mind-wandering relates to the anticipation and planning of the future. ► Mind-wandering and external distractions similarly impair task performance. ► Most mind-wandering episodes are oriented toward the future. ► Directing attention to one's personal goals increases this prospective bias. ► An important function of mind-wandering is to plan for the future.
Presenting a face stimulus upside-down generally causes a larger deficit in perceiving metric distances between facial features ("configuration") than local properties of these features. This effect supports a qualitative account of face inversion: the same transformation affects the processing of different kinds of information differently. However, this view has recently been challenged by studies reporting equal inversion costs for discriminating featural and configural manipulations of faces. In this paper I argue that these studies did not replicate previous results due to methodological factors, rather than largely irrelevant parameters such as having equal performance for configural and featural conditions at upright orientation, or randomizing trials across conditions. I also argue that identifying similar diagnostic features (eyes and eyebrows) for discriminating individual faces at upright and inverted orientations by means of response classification methods in no way dismisses the qualitative view of face inversion. Considering these elements, as well as both behavioral and neuropsychological evidence, I propose that the generally larger effect of inversion on processing configural rather than featural cues is a mere consequence of the disruption of holistic face perception. That is, configural relations necessarily involve two or more distant features of the face, such that their perception is most dependent on the ability to perceive multiple features of a face simultaneously, as a whole.
Much recent research attention has focused on understanding individual differences in the approximate number system (ANS), a cognitive system believed to underlie human mathematical competence. To date researchers have used four main indices of ANS acuity, and have typically assumed that they measure similar properties. Here we report a study which questions this assumption. We demonstrate that the numerical ratio effect has poor test–retest reliability and that it does not relate to either Weber fractions or accuracy on nonsymbolic comparison tasks. Furthermore, we show that Weber fractions follow a strongly skewed distribution and that they have lower test–retest reliability than a simple accuracy measure. We conclude by arguing that in the future researchers interested in indexing individual differences in ANS acuity should use accuracy figures, not Weber fractions or numerical ratio effects.
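For readers unfamiliar with how a Weber fraction is typically derived from comparison-task data, here is a minimal sketch based on the standard ANS psychophysical model, in which accuracy follows the cumulative normal of the scaled numerical distance. The trial quantities and the grid-search fit are illustrative assumptions, not the procedure of any particular study:

```python
import math

def p_correct(n1, n2, w):
    """Standard ANS model: probability of a correct response when comparing
    numerosities n1 and n2, given Weber fraction w. This is the cumulative
    normal Phi(|n1 - n2| / (w * sqrt(n1^2 + n2^2))), written via erf."""
    return 0.5 * (1 + math.erf(abs(n1 - n2) / (w * math.sqrt(2 * (n1**2 + n2**2)))))

def fit_w(trials):
    """Grid-search estimate of w from (n1, n2, accuracy) triples,
    minimizing squared error between predicted and observed accuracy."""
    best_w, best_err = None, float("inf")
    for w in (i / 1000 for i in range(50, 601)):  # search w in [0.05, 0.60]
        err = sum((p_correct(n1, n2, w) - acc) ** 2 for n1, n2, acc in trials)
        if err < best_err:
            best_w, best_err = w, err
    return best_w

# Hypothetical accuracies generated from a true w of 0.20, then recovered:
pairs = [(10, 12), (10, 15), (10, 20), (16, 20)]
data = [(n1, n2, p_correct(n1, n2, 0.20)) for n1, n2 in pairs]
w_hat = fit_w(data)
```

A smaller fitted w means sharper discrimination; the skew and reliability problems the abstract reports arise when such fits are applied to noisy individual-participant data.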
We examined the relation between action video game practice and the optimization of the executive control skills that are needed to coordinate two different tasks. As action video games are similar to real-life situations and complex in nature, and include numerous concurrent actions, they may generate an ideal environment for practicing these skills (Green & Bavelier, 2008). In two types of experimental paradigms, dual-task and task switching, we obtained performance advantages for experienced video gamers compared to non-gamers in situations in which two different tasks were processed simultaneously or sequentially. This advantage was absent in single-task situations. These findings indicate optimized executive control skills in video gamers. Similar findings in non-gamers after 15 h of action video game practice, when compared to non-gamers who practiced a puzzle game, clarify the causal relation between video game practice and the optimization of executive control skills.
The numerical ratio effect (NRE) and the Weber fraction (w) are common metrics of the precision of the approximate number sense (ANS), a cognitive mechanism suggested to play a role in the development of numerical and arithmetic skills. The task most commonly used to measure the precision of the ANS is the numerical comparison task. Multiple variants of this task have been employed, yet it is currently unclear how these affect metrics of ANS acuity, and how these relate to arithmetic achievement. The present study investigates the reliability and validity of the NRE and w elicited by three variants of the nonsymbolic number comparison task, and their relationship to standardized measures of arithmetic fluency. Results reveal that the strengths of the NRE and w differ between task variants. Moreover, the reliability and validity of the reaction-time NRE and of w were generally significant across task variants, although reliability was stronger for w. None of the task variants revealed a correlation between ANS metrics and arithmetic fluency in adults. These results reveal important consistencies across nonsymbolic number comparison tasks, indicating a shared cognitive foundation. However, the relationship between ANS acuity and arithmetic performance remains unclear. ► We investigate the numerical ratio effect (NRE) and Weber fraction (w). ► We compare performance in three nonsymbolic number comparison task variants. ► We test NRE/w strength, reliability, validity and relationship to arithmetic fluency. ► NRE and w appear generally reliable and valid across tasks. ► NRE and w do not correlate with arithmetic fluency in any task variant.
It is widely accepted that human and nonhuman species possess a specialized system to process large approximate numerosities. The theory of an evolutionarily ancient approximate number system (ANS) has received converging support from developmental studies, comparative experiments, neuroimaging, and computational modelling, and it is one of the most dominant and influential theories in numerical cognition. The existence of an ANS is significant, as it is believed to be the building block of numerical development in general. The acuity of the ANS is related to future arithmetic achievements, and intervention strategies therefore aim to improve the ANS. Here we critically review current evidence supporting the existence of an ANS. We show that important shortcomings and confounds exist in the empirical studies on human and non-human animals, as well as in the logic used to build the computational models that support the ANS theory. We conclude that rather than taking the ANS theory for granted, a more comprehensive explanation might be provided by a sensory-integration system that compares or estimates large approximate numerosities by integrating the different sensory cues comprising number stimuli.
Dance is a rich source of material for researchers interested in the integration of movement and cognition. The multiple aspects of embodied cognition involved in performing and perceiving dance have inspired scientists to use dance as a means for studying motor control, expertise, and action–perception links. The aim of this review is to present basic research on cognitive and neural processes implicated in the execution, expression, and observation of dance, and to bring into relief contemporary issues and open research questions. The review addresses six topics: 1) dancers' exemplary motor control, in terms of postural control, equilibrium maintenance, and stabilization; 2) how dancers' movements are influenced by attention demands and motor experience; 3) the critical roles played by memory; 4) how dancers make strategic use of imagery; 5) the insights into the neural coupling between action and perception yielded through exploration of the brain architecture mediating dance observation; and 6) a perspective that sheds new light on the way audiences perceive and evaluate dance expression. Current and emerging issues are presented regarding future directions that will facilitate the ongoing dialog between science and dance. ► Dance is a valuable source for studying the integration of movement and cognition. ► Dance expertise involves motor control, memory, imagery, and synchronization. ► Neural correlates of watching dance yield insight into action-perception coupling.
A basic issue in the neurosciences of language is whether an L2 can be processed through the same neural mechanisms underlying L1 acquisition and processing. In the present paper I review data from functional neuroimaging studies focusing on grammatical and lexico-semantic processing in bilinguals. The available evidence indicates that the L2 seems to be acquired through the same neural structures responsible for L1 acquisition. This is also observed for grammar acquisition in late L2 learners, contrary to what one might expect from critical-period accounts. However, neural differences for an L2 may be observed, in terms of more extended activity of the neural system mediating L1 processing. These differences may disappear once a more 'native-like' proficiency is established, reflecting a change in language processing mechanisms: from controlled processing for a weak L2 system (i.e., a less proficient L2) to more automatic processing. The neuroimaging data reviewed in this paper also support the notion that language control is a crucial aspect specific to the bilingual language system. The activity of brain areas related to cognitive control during the processing of a 'weak' L2 may reflect competition and conflict between languages, which may be resolved through the intervention of these areas.
Although bilinguals rarely make random errors of language when they speak, research on spoken production provides compelling evidence that both languages are active even when only one language is spoken (e.g., [Poulisse, N. (1999). Slips of the tongue: Speech errors in first and second language production. Amsterdam/Philadelphia: John Benjamins]). Moreover, the parallel activation of the two languages appears to characterize the planning of speech for highly proficient bilinguals as well as second language learners. In this paper, we first review the evidence for cross-language activity during single word production and then consider the two major alternative models of how the intended language is eventually selected. According to language-specific selection models, both languages may be active, but bilinguals develop the ability to selectively attend to candidates in the intended language. The alternative model, that candidates from both languages compete for selection, requires that cross-language activity be modulated to allow selection to occur. On the latter view, the selection mechanism may require that candidates in the nontarget language be inhibited. We consider the evidence for such an inhibitory mechanism in a series of recent behavioral and neuroimaging studies.
Are challenging stimuli appreciated due to perceptual insights during elaboration? Drawing on the literature on aesthetic appreciation, several approaches can be identified. Both fluency of processing and perceptual challenge are supposed to increase appreciation: one account claims that fluency of processing increases appreciation, whereas others link aesthetics to engagement, holding that the creation and manipulation of sense should itself be rewarding. We experimentally tested the influence of insights during elaboration on liking. Pairs of stimuli, hardly detectable two-tone images containing a face (Mooney faces) and meaningless stimuli matched for complexity, were presented repeatedly. Having an insight, as well as the intensity of the insight, predicted subsequent gains in liking. This paper qualifies the role of insight (the "aha!" experience) in aesthetic appreciation through the effects of elaboration and problem-solving on understanding the processing of modern art.
The Gratton (or sequential congruency) effect is the finding that conflict effects (e.g., Stroop and Eriksen flanker effects) are larger following congruent trials than following incongruent trials. The standard account is that a cognitive control mechanism detects conflict when it occurs and adapts to this conflict on the following trial. Others, however, have questioned the conflict adaptation account and suggested that sequential biases might account for the Gratton effect. In two experiments, contingency biases were removed from the task and stimulus repetitions were deleted to control for stimulus bindings. This eliminated the Gratton effect in the response times in both experiments, supporting a non-conflict explanation of the Gratton effect. A Gratton effect did persist in the errors of Experiment 1; however, this effect was not produced by the type of errors (word reading errors) that a conflict adaptation account would predict. Instead, tentative support was found for a congruency switch cost hypothesis. In all, the conflict adaptation account failed to account for any of the reported data. Implications for future work on cognitive control are discussed. ► Contingency biases and stimulus repetitions explain the Gratton effect. ► Stimulus-repetition trimming was performed in two contingency-unbiased experiments. ► The Gratton effect was eliminated in response times. ► An effect in errors was observed, but was attributed to congruency switch costs. ► No support for the conflict adaptation account of Gratton effects was observed.
Many educated adults possess exact mathematical abilities in addition to an approximate, intuitive sense of number, often referred to as the Approximate Number System (ANS). Here we investigate the link between ANS precision and mathematics performance in adults by testing participants on an ANS-precision test and collecting their scores on the Scholastic Aptitude Test (SAT), a standardized college-entrance exam in the USA. In two correlational studies, we found that ANS precision correlated with SAT-Quantitative (i.e., mathematics) scores. This relationship remained robust even when controlling for SAT-Verbal scores, suggesting a small but specific relationship between our primitive sense for number and formal mathematical abilities. ► Investigated Approximate Number System (ANS) acuity and math ability in college ► Two correlational studies using scores on a college-entrance exam (the SAT) ► ANS acuity correlates with math but not with verbal skills on college-entrance exam. ► Link between ANS acuity and math abilities continues into college-years.
Current etiological models of anxiety disorders emphasize both internal diatheses, or risk factors, and external stressors as important in the development and maintenance of clinical anxiety. Although considerable evidence suggests personality, genetic, and environmental variables are important to these diathesis–stress interactions, this general approach could be greatly enriched by incorporating recent developments in experimental research on fear and anxiety learning. In this article, we attempt to integrate the experimental literature on fear/anxiety learning and the psychopathology literature on clinical anxiety, identify areas of inconsistency, and recommend directions for future research. First, we provide an overview of contemporary models of anxiety disorders involving fear/anxiety learning. Next, we review the literature on individual differences in associative learning among anxious and non-anxious individuals. We also examine additional possible sources of individual differences in the learning of both fear and anxiety, and indicate where possible parallels may be drawn. Finally, we discuss recent developments in basic experimental research on fear conditioning and anxiety, with particular attention to research on contextual learning, and indicate the relevance of these findings to anxiety disorders.