
A comprehensive review of attention tests: can we assess what we exactly do not understand?

Abstract

Attention, now defined as a process matching data from the environment to the needs of the organism, is one of the main aspects of human cognition. It has several aspects, including tonic alertness (a process of intrinsic arousal that varies by minutes to hours), phasic alertness (a process that causes a quick change in attention as a result of a brief stimulus), selective attention (a process differentiating multiple stimuli), and sustained attention (a process maintaining persistence of response and continuous effort over an extended period). Attention dysfunction is associated with multiple disorders; therefore, much effort has gone into assessing attention and its domains, yielding a battery of tests that evaluate one or several attentional domains, such as the Stroop Color and Word Test, the Test of Everyday Attention, the Wisconsin Card Sorting Test, and the Cambridge Neuropsychological Test Automated Battery. These tests vary in their utility, target age range, and the domains they cover. The role of attention in human life and the importance of assessing it merit an inclusive review of the efforts made to assess attention and the resulting tests. Here we collect the necessary data on the neuropsychological tests that assess human attentive function and investigate the evolution of attention tests over time. We also discuss ways of assessing attention in "untestable" patients, who have difficulty reading or using a computer and lack the ability to comprehend verbal instructions and executive tasks. By collecting and organizing information on the assessment of attention, this review can serve as a platform for researchers designing new studies on attention and on conditions that cause deficits in this aspect of cognitive function.

Introduction

Cognition is the general term used to refer to the process of gaining knowledge [1]. It encompasses several domains, including attention, language, learning, memory, perception, and higher reasoning [2]. These domains are highly interconnected; for instance, attention and memory cannot function without each other: because memory capacity is limited, attention filters what gets encoded [3]. As a domain of cognition, attention can be broadly described as a crucial process for keeping thoughts organized, though psychologists and philosophers have long struggled to define it precisely.

By the 1950s several models of attention had been proposed [4]. A now widely accepted definition of attention is a cognitive process matching data from the environment to the needs of the organism [5]. Being able to focus on one thing while ignoring other stimuli is another definition provided by some [6]. Despite these definitions, Bernhard Hommel and colleagues, after discussing the concept of attention, the difference between attention and intention, and the synthetic approach to attention, concluded that "no one knows, or can ever know, exactly what attention is" [7]. This contrasts with the earlier view of William James, who asserted that "Everyone knows what attention is."

There are several aspects to attention, including tonic alertness (defined as a process of intrinsic arousal that varies by minutes to hours), phasic alertness (defined as a process that causes a quick change in attention as a result of a brief stimulus), selective attention (defined as a process differentiating multiple stimuli), and sustained attention (defined as a process maintaining persistence of response and continuous effort over an extended period) [8,9,10,11]. Attention is also divided according to the source of its guidance: "bottom-up" attention is guided by external stimuli, whereas "top-down" attention is guided by intrinsic factors rooted in previously acquired knowledge [12].

It is generally agreed that attention can be focused voluntarily, using "top-down" signals derived from knowledge of the current task (finding your lost keys), and automatically, using "bottom-up" signals captured by salient sensory events in the sensorium (a flashing fire alarm) [13, 14]. Neuroimaging and neurophysiology studies have found that a large network, involving the posterior parietal cortex, temporoparietal junction (TPJ), superior temporal sulcus, and dorsal areas of the frontal cortex, supports the orienting of attention [14,15,16]. Although both types of attention have neural correlates in the frontal and posterior parietal cortex, and both can modulate processing in sensory regions, prefrontal neurons reflect the target location first during top-down attention, whereas parietal neurons signal it earlier during bottom-up attention [13, 17]. Studies suggest that involuntary attention is strongly tied to ventral regions (temporoparietal junction), can be engaged more rapidly, and is more resistant to interference than voluntary attention, which is related to more dorsal segments of the frontal and parietal lobes [14].

Regardless of definition, the need to assess attention, as one of the components of human cognition, has led to the design of different attention tests. Testing attention is part of assessing a subject's neuropsychological function [18]. Aside from a few exceptions, such as the Test of Everyday Attention for Children (TEA-Ch), most attention tests evaluate attention across a wide age range (from as young as 4 years old with no upper limit) [19]. Traditionally, most tests were administered on paper. With the invention and widespread use of computers, many tests are now conducted by computer, although some remain only on paper or in the form of cards. Audio tapes are also used to evaluate attention, specifically its auditory domain [20,21,22,23].

Attention dysfunction is a symptom associated with multiple disorders [24, 25]. In addition to attention deficit hyperactivity disorder (ADHD), schizophrenia [26], anxiety disorder [27], depression [28], posttraumatic stress disorder [29], epilepsy [30], dementia [31], hearing loss [32], and even heart disease [33] and anemia [34] can cause impairment of attentive function. Studies have suggested that episodic memory loss, the characteristic symptom of Alzheimer's disease (AD) type dementia, may be secondary to failures of attentional processes [35]. A recent study also found overlapping cognitive profiles between mild cognitive impairment (MCI) and ADHD [36]. Attention plays a vital role in one's ability to deal with problems, plan, organize thoughts and behavior, and choose targets and act toward them. Hence, deficits in this domain of cognition can herald difficulties in many everyday activities for subjects suffering from the aforementioned conditions. Therefore, assessing attention seems necessary in the evaluation of these conditions [37].

Studies on patients with AD revealed that signs of cognitive impairment and inattention appear with progressive dysfunction of the prefrontal cortex (PFC) and medial temporal lobe; thus, part of the attention circuit consists of the PFC and medial temporal lobe [38]. It is also believed that information related to bottom-up attention originates in the visual cortex and is directed toward the PFC, and the PFC has been observed to activate when differentiating stimuli of different color and shape. The PFC and posterior parietal cortex (PPC) are the origins of the signals responsible for top-down attention. The role of the PFC and PPC in maintaining attention was further established by studies stimulating these two areas during an attention-requiring activity, which resulted in shorter detection and response latencies. The PFC is activated earlier than the PPC in top-down attention. Deactivation of the PFC also impaired performance in all attentive tasks, whereas PPC disruption was noticeable only in more difficult tasks. These timing and difficulty differences suggest a more important role for the PFC in this aspect of attention [39].

Previous research has demonstrated a connection between attention dysfunction and several mental and neurological conditions. This association may be explained by the role of immune dysregulation, inflammation, anxiety, and stress in the development of ADHD, which can lead to a wide range of chronic illnesses: cardiovascular and metabolic diseases (due to inflammation), as well as psychiatric disorders such as schizophrenia and neurodegenerative diseases (due to immune dysregulation and an increase in kynurenic acid (KYNA) resulting from activation of the tryptophan (TRP)–kynurenine (KYN) metabolic pathway) [38, 40, 41]. Stressors can induce inflammation, and inflammation in turn can affect multiple brain sites, impairing cognitive function. Studies support the role of stress in inattention by showing a meaningful association between childhood trauma and the severity of ADHD. It has also been shown that under stressful conditions the levels of proinflammatory cytokines rise in the brain [40]. Therefore, it is crucial to diagnose and treat attention deficit disorders early on, which makes the assessment of attentive function even more important. The interaction between cognition and emotion is also considerable: biases in attention are identified as central characteristics of depression and anxiety [42], and the effect of emotions on attention has been illustrated by studies assessing reaction times under different emotions, such as fear [43, 44].

In this review, our main objective is to summarize the different neuropsychological tests that assess human attentive function and to investigate the evolution of attention tests over time. We searched the PubMed and Google Scholar databases for articles and studies evaluating attention assessment methods and tests, using the keyword "attention tests". We considered those relevant to the objectives at hand and reviewed the included studies.

Results

In this review, we have summarized the available cognitive tests for assessing attentive function. Table 1 briefly presents the results of this review. The table is sorted by the date each test was established and contains brief information on the setting in which each test is undertaken. The duration of each test, the domain(s) assessed, and the age range each test targets are also noted. The details of the tests are described afterward.

Table 1 Summary of available attention tests

Digit Span Test (DGS)

The Digit Span Test, along with the reaction time test, is thought to be among the very first tests of cognition and attention used by psychologists. DGS was first introduced and put to work in 1716 by Gottfried Leibniz [45]. The most current version, in use since 2008, is embedded in the Wechsler Adult Intelligence Scale IV [46,47,48]. The test is also part of the Wechsler Intelligence Scale for Children and the Working Memory Index [47, 49, 50]. DGS is largely a measure of working memory but is also described as valuable for evaluating everyday simple attention [51,52,53]. In its latest version, DGS includes three subtasks: Digit Span Forward, Digit Span Backward, and Digit Span Sequencing [47]. The subject is presented with a series of random digits and is then required to repeat the digits in the same sequence or backward [54].

Simple and choice reaction time (RT)

This test has been used since the nineteenth century to measure an individual's reaction time to a presented stimulus. It has two major parts: one in which the participant is timed responding to a predictable single stimulus (simple RT) and another in which the participant must respond appropriately to one of several stimuli (choice RT) [55, 56]. The RT task is a common test of a subject's central processing speed, which correlates with cognitive efficiency, namely sustained attention and alertness [57, 58]. The computerized version (RTI) combines a simple single-choice and a multiple-choice RT task and consists of five stages, each a more complex chain of reactions than the previous one. In each scenario, the participants have to respond whenever a yellow dot shows up. The dot may appear in one of five designated places, and the subjects respond either by pressing a pad, touching the screen, or sometimes both. The RTI yields four outcome measures, categorized into reaction time and movement time, each including a simple and a five-choice task [56].

Stroop Color and Word Test

The Stroop Color and Word Test was originally designed to measure the ability to inhibit cognitive interference [21], but studies have also suggested the Stroop test as a measure of attentive function, especially selective attention [59,60,61,62,63,64]. The standard Stroop Test consists of color names printed in different ink colors. Participants are initially asked to read the printed words and are timed doing so. Following this step, they are asked to identify the ink color of each word. Normally, naming the ink color takes longer than reading the word.

The emotional Stroop test

The emotional Stroop, known as the E-Stroop test, employs target and control lists that incorporate variably colored words [65]. The E-Stroop is the most commonly utilized test for measuring attentional biases [66]; attentional bias refers to hyper-attention to threatening material [67]. The E-Stroop is sensitive enough to differentiate between neutral and emotionally charged words. In the E-Stroop test, participants are required to name the ink color of words with emotional or neutral valence, which is the main difference from the traditional Stroop: the words are not names of colors but vary according to their emotional valence [68]. In other words, in the E-Stroop test the emotional meaning of the word tends to capture and hold the participant's attention and slow the reaction time, rather than the word–color conflict of the traditional test [69].

The Wisconsin Card Sorting Test (WCST)

This neuropsychological test was first devised in 1948 and has been used to assess higher-level cognitive processes since 1963 [54, 70]. These processes include attention, perseverance, working memory, abstract thinking, cognitive flexibility, and set-shifting. The test is made up of two card packs, each containing four stimulus cards and 64 response cards. Each card measures 7 × 7 cm and bears various shapes of different colors and numbers. The subjects are required to sort the response cards against the stimulus cards using feedback given according to pre-established instructions. The Heaton technique uses 128 cards; each subject takes the test separately, and 12 scoring indices are applied [71]. Due to the character of the test, carrying out a reliability study was not possible; however, validity studies on a Turkish sample confirmed the credibility of this test [54]. The test has been used to assess cognitive function in subjects suffering from various disorders, including TBI, schizophrenia, chronic alcoholism, anorexia nervosa, autism spectrum disorder, obsessive–compulsive disorder, and ADHD [70, 72,73,74,75].

The Mackworth Clock Test

This test was first devised in 1948 to assess vigilance in radar technicians of the British Air Force [76,77,78,79,80]. It assesses sustained attention and was studied in two different periods, 1962 to 1964 and 1980 to 1984 [77, 81,82,83]. The effects of alcohol hangovers, fatigue, and selective serotonin reuptake inhibitors on test results have been studied [84,85,86,87]. The apparatus is a metal box with a white face 30.5 cm in diameter, bearing a black pointer 15.25 cm long. The face is divided into 100 steps of 3.6 degrees each, and the pointer jumps one step per second. The target stimuli, called double jumps, occur when the pointer travels 7.2 degrees in one second. These stimuli occur randomly at varying intervals, and each trial takes 62 min to carry out. The box, or 'the clock' as it was referred to, was placed at a distance of 1.22 m, which could be reduced to a minimum of 0.31 m if the subject had difficulty seeing the pointer sharply [76]. A computerized version of the test has been available since 2000 [76, 88].

Match to sample visual search task

This task is a benchmark measure of the speed/accuracy trade-off and assesses how well the subject matches visual samples. A complex visual pattern appears in the center of the screen; after a short pause, a number of similar patterns appear at the edge of the screen. Of all these patterns, only one matches that in the center. Efficient performance is defined as the ability to reject all unintended patterns and to recognize and point out the correct one. Percent correct is the variable of interest in this task; a subject with better performance gets a better score [89, 90].

The rapid visual information processing (RVP)

The RVP task is a 4-min-long visual continuous performance task (CPT). The test was devised by modifying and simplifying the Wesnes and Warburton task and is designed to analyze sustained attention [56]. It has been utilized in schizophrenic patients and subjects with ADHD and social anxiety disorder [56, 91,92,93]. During the test, digits from 2 to 9 appear in the center of a screen, in no specific arrangement, at a rate of 100 digits/min. Subjects are required to press a specific button whenever one of the patterns 2–4–6, 3–5–7, or 4–6–8 appears. Seven factors are quantified: (1) number of misses (times when there is no response despite the pattern being shown); (2) probability of hits (h, times when the subject responds accurately), calculated by dividing the number of hits by the number of hits plus misses; (3) total correct rejections (times when the subject correctly ignores other patterns); (4) probability of false alarms (f, times when the subject responds inaccurately), calculated by dividing false alarms by false alarms plus correct rejections; (5) A′ (calculated as \(0.5+[(h-f)+(h-f)^{2}]/[4h(1-f)]\)), a signal detection measure of sensitivity to the target, regardless of response tendency; (6) B″ (calculated as \([(h-h^{2})-(f-f^{2})]/[(h-h^{2})+(f-f^{2})]\)), a signal detection measure of the strength of trace needed to cause a response (Sahgal, 1987); and (7) mean latency (mean time needed to react in correct responses) [56].
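As a concrete illustration, the signal-detection measures above can be computed directly from raw response counts. The sketch below is not part of any test software; the counts passed to the hypothetical helper `rvp_measures` are made-up example values.

```python
# Sketch of the RVP signal-detection measures described above.
# The response counts used in the example call are hypothetical.

def rvp_measures(hits, misses, false_alarms, correct_rejections):
    """Return (h, f, A', B'') from raw RVP response counts."""
    h = hits / (hits + misses)                              # probability of hits
    f = false_alarms / (false_alarms + correct_rejections)  # probability of false alarms
    a_prime = 0.5 + ((h - f) + (h - f) ** 2) / (4 * h * (1 - f))       # sensitivity A'
    b_pp = ((h - h**2) - (f - f**2)) / ((h - h**2) + (f - f**2))       # response bias B''
    return h, f, a_prime, b_pp

h, f, a_prime, b_pp = rvp_measures(hits=24, misses=8,
                                   false_alarms=3, correct_rejections=290)
print(round(h, 3), round(f, 3), round(a_prime, 3), round(b_pp, 3))
```

An A′ near 1 indicates high sensitivity to the target sequences, while B″ ranges from −1 to 1 and reflects response tendency rather than sensitivity.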

Dichotic listening test

This test was first designed to assess auditory function and measure hemispheric asymmetry [94]. It also serves as the standard basis for all tests of selective auditory attention. In this test, subjects are presented with two sound sequences. In one instance the sequences are played in the same ear, while in the other each sequence is played in a different ear. The subject is required to focus on only one of the sequences while rejecting the other. In the dichotic instance, a male voice is presented to one ear and a female voice to the other; in the diotic instance, both ears are presented with both sequences [95]. This test has been used to evaluate the effect of attentional deficits in elderly patients with AD [96].

The continuous performance test

The continuous performance test, first introduced in 1956, is administered to assess sustained attention and selective visual attention [97,98,99]. In the first round of testing, three tests were undertaken: the Continuous Performance Test, Identical Pairs version (CPT-IP), the Attention Span Task, and the Information Overload Task. The entire round took 1.5 h to carry out. In the initial home visit, the Quick Test, a demographic questionnaire, and an inventory of perceived attentional styles were also administered. In the second run, executed to confirm the reliability of the first run over time, only the information-processing aspects are re-evaluated [98]. This test has been used on subjects with schizophrenia, ADHD, mania, and alcoholism [100,101,102,103,104].

Motor screening (MOT)

This task screens for difficulties in the visual, movement, and comprehension domains. Results are stated as two indices: the standard score of mean latency (MOT ML) and mean error (MOT ME) [105].

The Cambridge Neuropsychological Test Automated Battery (CANTAB)

The Cambridge Neuropsychological Test Automated Battery (CANTAB) is a series of highly sensitive and objective cognitive tests developed at the University of Cambridge. Its cultural neutrality and the fact that it requires no technical knowledge or prior familiarity with computers make CANTAB suitable for use in a wide range of studies. CANTAB also offers Web-based testing. Reaction Time (CANTAB-RTI), Rapid Visual Information Processing (CANTAB-RVP), the Motor Screening Task (CANTAB-MOT), and Match to Sample Visual Search (CANTAB-MTS) are the CANTAB subtests that assess attention and psychomotor speed [106, 107].

The Ruff 2 and 7 test

This test was designed to assess sustained attention and visual selective attention [108, 109]. Using pen and paper, sustained attention and its voluntary, intentional aspects are evaluated with varying distractors. Assessments were done on four groups of patients, each suffering from a single lesion limited to the right or left anterior or right or left posterior region of the cerebrum. A greater decline in processing rate, independent of the processing mode (serial or parallel), was noticed in patients with a right-hemisphere lesion than in subjects with a left-hemisphere lesion. Moreover, patients with anterior lesions showed a more noticeable contrast between serial and parallel processing, as predicted; this difference was most prominent in subjects with a right frontal lesion [110]. The test has also been utilized in patients suffering from post-concussion syndrome [109]. Changes in its indices with aging and depression have been studied [111, 112], and its validity has been evaluated for outpatient schizophrenic subjects [113].

The Test of Variables of Attention (TOVA)

The Test of Variables of Attention (TOVA) is a derivative of the CPTs and is commonly administered to assess sustained attention [114,115,116,117]. Subjects are presented with a screen on which one of two signals is shown: a square in the upper half or a square in the lower half. The subjects are expected to react based on the location of the square, responding if the square is in the upper half (the 'go' signal) and ignoring it if the square is in the lower half (the 'no-go' signal). Each task includes two constellations of 'go' and 'no-go' trials, differing only in the ratio of the two kinds. In the first constellation, the majority of the trials are 'go' trials, assessing whether the subject is capable of withholding responses when presented with an occasional 'no-go' signal. A higher ratio of 'no-go' trials appears in the latter constellation, which assesses whether the subject is capable of a swift response to the infrequent 'go' signals [114, 118,119,120]. TOVA has been used to diagnose ADHD and TBI [121,122,123].

Test of everyday attention (TEA)

The test of everyday attention (TEA) includes eight subtests, which are standardized to have an age-adjusted mean of 10 with a standard deviation of 3 for ages between 18 and 80; this makes it comparable to the Third Edition of the Wechsler Adult Intelligence Scale and the Wechsler Memory Scale–III [124, 125]. The test is designed to quantify sustained, selective, and divided attention as well as attentional switching [20, 126, 127]. The subtests of the TEA are as follows:
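The age-adjusted standardization mentioned above amounts to a simple linear transformation onto a scale with mean 10 and SD 3. The sketch below illustrates it with hypothetical age-band norms (raw mean 42, raw SD 8); these are illustration values, not figures from the TEA manual.

```python
# Convert a raw subtest score to a scaled score with an
# age-adjusted mean of 10 and standard deviation of 3.
# The age-band norms (band_mean=42, band_sd=8) are hypothetical.

def scaled_score(raw, band_mean, band_sd, scale_mean=10, scale_sd=3):
    z = (raw - band_mean) / band_sd   # standardize within the age band
    return scale_mean + scale_sd * z  # rescale to mean 10, SD 3

print(scaled_score(50, band_mean=42, band_sd=8))  # one SD above average -> 13.0
```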

TEA—map search

This subtest is devised to quantify visual selective attention: the subject looks at a tourist map of a city and searches for symbols representing different services; for example, a knife and fork represent eating facilities.

TEA—elevator counting

This subtest, devised to quantify the sustained aspect of attention, asks subjects to pretend they are in an elevator whose floor indicator is not functioning; by listening to a set of pre-recorded tapes they must work out which floor they have arrived at.

TEA—elevator counting with distraction

This variant of Elevator Counting requires the subject to differentiate between high-tone and low-tone sounds as a means to evaluate auditory selective attention.

TEA—visual elevator

Subjects are required to count presented doors as a means to evaluate visual attentional switching.

TEA—auditory elevator with reversal

The same as the visual elevator subtest, but evaluating auditory attentional switching.

TEA—telephone search

Subjects are required to look for symbols in a phone directory. This subtest assesses visual selective attention.

TEA—telephone search dual task

Subjects are required to do the same as in the telephone search subtest while simultaneously counting presented tones. This subtest assesses divided attention.

TEA—lottery task

Subjects are required to listen to the numbers announced on an audiotape, then write down the two letters preceding a specified number [124, 128,129,130,131,132,133].

The psychomotor vigilance task (PVT)

The psychomotor vigilance task (PVT), undertaken on a computer, is a reaction-timed test widely used to measure fatigue-related changes in alertness due to sleep loss [134]. The PVT is considered a validated task for measuring sustained attention [135,136,137], one of the primary components of attentional processing [138]. During the PVT, the participant reacts to a particular minimal change in a labile environment [139]. In detail, participants monitor a fixed point on a display for the appearance of a millisecond counter and are instructed to press a key on the computer as soon as the signal appears. Any error, whether missing a signal or pressing the key without a signal, is registered. The significant results, called PVT performance lapses, are those in which the subject reacts in more than 500 ms or does not respond at all [140]. The standard 10-min PVT is often considered impractical in applied contexts, so a modified brief 3-min version (PVT-B) has been suggested as a useful alternative [141].
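The lapse criterion above (a reaction slower than 500 ms, or no response at all) is straightforward to score. In this sketch, the reaction-time list is a hypothetical example, with `None` standing in for a missed signal:

```python
# Count PVT performance lapses: reactions slower than 500 ms,
# plus missed signals (represented here as None).
LAPSE_THRESHOLD_MS = 500

def count_lapses(reaction_times_ms):
    return sum(1 for rt in reaction_times_ms
               if rt is None or rt > LAPSE_THRESHOLD_MS)

trials = [245, 310, None, 620, 280, 505, 190]  # hypothetical RTs in ms
print(count_lapses(trials))  # 3: one miss and two slow responses
```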

Test of everyday attention for children (TEA-Ch)

The Test of Everyday Attention for Children (TEA-Ch), designed for ages 6–16, measures different aspects of attention in children. The test compares a subject's performance to the average score of their age group [128]. It has been utilized to assess children with ADHD and traumatic brain injury [19, 142,143,144] and provides a means of evaluating selective attention, sustained attention, and executive control [145]. The TEA-Ch tests oriented toward sustained attention are the following:

TEA-Ch—Score!

Children are asked to silently count the tones they hear on a 15-min audiotape. Ten trials are undertaken, and the number of tones in each trial ranges between 9 and 15. The intervals between tones vary, and each successfully completed trial counts as one point.

TEA-Ch—Sky Search dual task

A 'dual-task' in which children are required to do Sky Search and Score! subtests at the same time. Therefore, the scores obtained in this subtest measure both sustained and selective attention.

TEA-Ch—Score dual task

A 'dual-task' in which the child is required to do the Score! subtest combined with another auditory task: the child listens to a news report on an audiotape while listening for an animal's name. On completing the 10 trials, the child has to report the number of tones heard and the animal's name.

TEA-Ch—Walk, Do Not Walk

Children are required to make a mark along a path on a piece of paper with a pen every time a tone is heard on the audiotape, but to withhold the mark if a second tone immediately follows the first. There are 20 trials in this subtest, and the rate of the tones increases as the child progresses through them. Each point represents one successfully completed trial.

TEA-Ch—code transmission

An audiotape is played for the children, and they are required to listen for two 5's in a row. After the two 5's are heard, the child must report the number heard just before them.

TEA-Ch tests oriented toward selective attention are the following:

TEA-Ch—Sky Search

Children are instructed to look for specific spaceships on a large piece of paper filled with similar distractor spaceships. In the second part there are no distractor spaceships, which rules out motor dysfunction.

TEA-Ch—map mission

Children have 1 min to look for target symbols on a map full of distractors.

The TEA-Ch tests oriented toward attentional control/switching are the following:

TEA-Ch—Creature Counting

Children count upward until a visual stimulus, a creature in a tunnel, is shown. Upon seeing that stimulus, the child must start counting backward. This process is repeated, and both speed and accuracy are scored.

TEA-Ch—Opposite Worlds

The 'Same Worlds' subtest requires the children to say 'one' and 'two' when they see their respective characters. In the later 'Opposite Worlds' subtest children are required to say 'one' when the character for 'two' is visualized and vice versa [128].

AX-Continuous Performance Test (AX-CPT)

Like TOVA, the AX-Continuous Performance Test (AX-CPT) is also a derivative of the CPTs and is commonly administered to assess sustained attention [114,115,116,117]. In the AX-CPT, subjects are required to respond as swiftly as possible to a stimulus, or to ignore it, depending on the prior stimulus [114, 146].

The Conners Continuous Performance Test—2

Attention problems, most notably in the sustained domain of attention, are assessed with the computer-administered Conners Continuous Performance Test 2 [147,148,149]. The Conners' CPT is a helpful measure in diagnosing ADHD and has been studied in other disorders such as schizophrenia, TBI, and idiopathic epilepsy [147, 150,151,152,153,154,155]. A total of 360 letter trials are presented on the screen, with inter-stimulus intervals (ISIs) of 1, 2, or 4 s. The trials are divided into 18 blocks of 20, and the ISIs are evened out across these blocks. Participants are instructed to press the spacebar or the appropriate mouse key for any letter that appears, except the letter "X". Aside from the suggested practice test, the CPT takes fourteen minutes to administer. Non-X (target) stimuli appear 324 times, and the letter "X" (the nontarget) appears 36 times during the test. One of the prominent advantages of this paradigm is the high number of targets, which ensures a larger response database for producing the statistical output [148].
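The trial arithmetic above (18 blocks × 20 trials = 360, with 324 targets and 36 "X" nontargets) can be checked with a small sketch. The per-block composition and shuffling used here are simplifying assumptions for illustration, not the official Conners sequencing:

```python
# Sketch of the Conners CPT 2 trial structure described above:
# 360 letter trials in 18 blocks of 20, totaling 324 targets
# (non-X) and 36 "X" nontargets. Placing exactly 2 "X" trials
# per block is an illustrative assumption.
import random

def build_trials(blocks=18, block_size=20, x_per_block=2, seed=0):
    rng = random.Random(seed)
    trials = []
    for _ in range(blocks):
        block = ["X"] * x_per_block + ["non-X"] * (block_size - x_per_block)
        rng.shuffle(block)  # randomize target positions within the block
        trials.extend(block)
    return trials

trials = build_trials()
print(len(trials), trials.count("non-X"), trials.count("X"))  # 360 324 36
```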

The Madrid Card Sorting Test

This test is a simplified, computerized version of the WCST [156] and also allows event-related potential (ERP) recording. The stimulus battery utilizes 24 of the original 64 WCST response cards, and the subject is required to match these cards with the 4 stimulus cards based on only one criterion: the color, shape, or number on the card. A sensitive measure of sorting errors and set-shifting ability was possible only by using unequivocal cards. These cards were used in 137 trials arranged into 18 series in a semi-random manner. The intended sorting rule differed between series and was initially unknown to the subjects. The stimuli were designed with the STIM package developed by NeuroScan Inc., but can be presented in any format supporting at least 256 colors and a resolution of 40 pixels/cm. Trials began with a compound stimulus displaying the 4 key cards above one choice card on a 1024 × 768 NEC monitor. The cards subtended a horizontal visual angle of 4° and a vertical angle of 3.5° at a viewing distance of 1.5 m from the monitor. It was not proven that using smaller stimuli would make a meaningful change in the EEG considering eye-movement artifacts; moreover, smaller stimuli are thought to degrade visual acuity in the elderly or in subjects with neurological issues. The ideal contrast was achieved by using a black outline for shapes on a completely white background, while the card stimuli were illustrated on a dark one; the brightness of both the cards and the background was fixed at all times. Sequence control of the stimuli was provided by the STIM package but was tested successfully with other similar software, such as MEL v1.0 and Presentation v0.50. An IBM computer was used for carrying out the different parts of the trials [157].

The Cognitive Assessment Battery (CAB)

In recent years, online mobile-based cognitive assessments have been developed. One of them is the CogniFit General Cognitive Assessment Battery (CAB). Although the CAB does not identify the presence or absence of clinical diagnoses, studies have found it a useful tool for distinguishing controls from MCI, and MCI from dementia [158]. The Concentration Cognitive Assessment (CAB-AT) is a tool for assessing attention. Its subtests (the Resolution Test REST-SPER, Processing Test REST-INH, Equivalencies Test INH-REST, and Simultaneity Test DIAT-SHIF) were inspired by the CPT, the classic Stroop test, the TOVA, and the Hooper Visual Organization Task, which was originally developed as a screening instrument for neurological dysfunction but has since been used as a test of visuospatial skills [159].

Table 1 summarizes the characteristics of cognitive tests of attentive function in humans. All the tests mentioned were developed for assessing attention in patients with stable conditions and without significant disabilities. The conventional tests, which typically require the ability to read or use a computer along with the ability to comprehend verbal instructions and executive tasks, are incompatible with some populations who, owing to significant intelligence deficits or low cognitive and social functioning, have been considered "untestable" [160]. Such characteristics are common in people with intellectual disability or autism spectrum disorder (ASD), who have limited abilities to understand instructions and perform tasks [161]. Impaired cooperation and communication are two significant factors in the difficulty of testing people with intellectual disability or ASD, who have trouble understanding and responding to instructions [162, 163]. Hence, in some previous studies, examiners invited examinees to engage in a game to improve their motivation by creating a playful test environment [164,165,166,167].

Evaluating untestable patients

It is noteworthy that only a few tests are compatible with the needs of individuals who exhibit such characteristics. According to a systematic review, three tests and three batteries are appropriate for assessing people with intellectual disability or ASD: the Test of Auditory Discrimination, the Integrated Visual and Auditory Continuous Performance Test (IVA-CPT), and Posner's paradigm, along with the Cognitive Assessment System (CAS), the TEA, and the Leiter International Performance Scale–Revised (Leiter-R) [161].

Integrated Visual and Auditory Continuous Performance Test (IVA-CPT)

The Integrated Visual and Auditory (IVA) CPT comprises 22 subscales that provide information on inattention, inhibition, response consistency, attention diversity, and overall reaction time discrimination. It assesses inattention and impulsivity in the visual and auditory domains within the same task: the individual must hit a button when the computer screen displays a target stimulus, and must withhold the response when non-target stimuli flash. The IVA can be administered to individuals aged 5 and above. It was created to aid the measurement of ADHD symptoms but has since been used to assess attention and impulsivity in several neurodevelopmental disorders, evaluating sustained attention, processing speed, and attention consistency over time [168,169,170,171].

Test of auditory discrimination

This test assesses auditory distractibility, attention, and discrimination by asking the participant to detect speech sounds in both quiet and distracting (background noise) listening environments. It can be used with people aged from 44 months (3 years and 8 months) to over 70 years [170, 172].

Posner's paradigm

The Posner paradigm is a computer task that takes roughly 45 min to complete. It entails detecting a target (for example, a star, a letter, or another symbol). The target is preceded by a cue that is valid (indicates the target location correctly), invalid (indicates the target location incorrectly), or neutral (gives no indication of target location). Invalid trials require the participant to disengage from the incorrectly cued location and shift to the correct one [173, 174].
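
Performance in this paradigm is commonly summarized as a validity effect, the mean reaction-time cost of invalid relative to valid cues. A minimal sketch with hypothetical RT data:

```python
def validity_effect(trials):
    """Mean RT cost of invalid vs. valid cues (Posner cueing effect).
    `trials` is a list of (cue_type, rt_ms) pairs, where cue_type is
    "valid", "invalid", or "neutral"."""
    def mean_rt(kind):
        rts = [rt for cue, rt in trials if cue == kind]
        return sum(rts) / len(rts)
    return mean_rt("invalid") - mean_rt("valid")

trials = [("valid", 320), ("valid", 340), ("invalid", 410),
          ("invalid", 430), ("neutral", 370)]
print(validity_effect(trials))  # 90.0: reorienting cost in milliseconds
```

A larger validity effect indicates greater difficulty disengaging from the incorrectly cued location.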

Cognitive Assessment System (CAS)

CAS consists of three attention subtests: expressive attention, number detection, and receptive attention. In expressive attention, children aged 5 to 7 years see pictures of common animals and must say whether the depicted animal is large or small in real life, regardless of its relative size on the screen, while those aged 8 to 18 are asked to name the color of the ink a word is printed in rather than read the word. In number detection, the aim is to underline specific numbers on a page (ages 5 to 7) or specific numbers printed in a particular font (ages 8 to 18). In receptive attention, the task is to underline pairs of objects or letters that are either identical in appearance or the same from a lexical perspective [175, 176].

Test of everyday attention (TEA)

As previously stated, the TEA is used to evaluate the capacity for selective or focused attention. The test involves quickly scanning a telephone directory page for specific symbols. The summary score is calculated by dividing the time taken to search the page by the number of symbols correctly identified [177, 178].
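
The summary score is thus a time-per-target ratio (lower is better). A sketch, with illustrative numbers rather than normative values:

```python
def telephone_search_score(search_time_s, symbols_correct):
    """TEA telephone-search summary score: search time divided by the
    number of symbols correctly identified (seconds per symbol)."""
    if symbols_correct == 0:
        raise ValueError("score undefined when no symbols are identified")
    return search_time_s / symbols_correct

print(telephone_search_score(80.0, 16))  # 5.0 seconds per correct symbol
```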

Leiter International Performance Scale–Revised (Leiter-R)

The Leiter International Performance Scale–Revised (Leiter-R) includes 20 subtests in the visual function and reasoning battery (measuring nonverbal intelligence in the form of visual functioning, reasoning, and visuospatial skills) and the memory and attention battery (evaluating performance in the memory and attention domains). The Leiter-R is approved for people aged 2 years and 0 months to 20 years and 11 months [179].

In addition, there are procedures and adaptations that make the test setting more compatible with this population, including training sessions that involve more direct interaction between examiner and examinee, extra testing time, and the development of devices that allow the gradual discrimination of each test element. How instructions are communicated is also a significant factor in how well they are comprehended: many authors point to the value of short, simple, regularly repeated sentences delivered enthusiastically and encouragingly [161].

From another point of view, since performance on conventional neuropsychological tests is multifaceted and often requires intact upper limb function, their administration and interpretation are limited in patients with upper limb disability and compromised motor function who are referred to rehabilitation settings and outpatient follow-up clinics. Upper limb dysfunction can range from complete immobility, as in limb amputation or post-stroke hemiparesis, to the more subtle motor impairment seen in neurological disorders such as multiple sclerosis (MS) and Parkinson's disease [180]. Therefore, two modified tests have been developed and widely used in this population group:

The Oral Trail Making Test (OTMT)

In the oral TMT, the subject counts as swiftly as possible from 1 to 25 (OTMT-A) or alternates between numbers and letters (OTMT-B: 1-A-2-B-3-C, and so on); in OTMT-B, timing is stopped when the subject reaches the number 13. The time to completion is the outcome measure for both the OTMT-A and OTMT-B [181]. An oral TMT paradigm could potentially serve in clinical settings as an alternative measure of cognitive flexibility.
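
The expected OTMT-B response sequence can be generated programmatically; this sketch illustrates the alternation rule and is not part of the published test materials:

```python
import string

def otmt_b_sequence(stop_at=13):
    """Expected OTMT-B responses: alternate numbers and letters
    (1-A-2-B-3-C, ...), stopping when the number 13 is reached."""
    seq = []
    for n in range(1, stop_at + 1):
        seq.append(str(n))
        if n < stop_at:  # no letter follows the final number
            seq.append(string.ascii_uppercase[n - 1])
    return seq

print("-".join(otmt_b_sequence()))  # 1-A-2-B-3-C-...-12-L-13
```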

The Symbol Digit Modalities Test-oral (SDMT-oral)

In the oral format of the SDMT, participants are given a written key of symbol–digit pairs, with digits ranging from 1 to 9. A list of 110 symbols is then presented, and the participant must say aloud the number corresponding to as many of the symbols as possible within a time limit of 90 s [182]. Since its inception, the SDMT-oral has been used with varying regularity, most notably as part of neuropsychological assessment batteries in MS, and it is the version more recommended for clinical application [180].
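
The scoring rule (correct responses within the 90 s limit) can be sketched as follows; the symbols and key below are hypothetical placeholders, not the published SDMT materials:

```python
def sdmt_oral_score(responses, key, limit_s=90):
    """SDMT-oral score: number of correct symbol-to-digit answers spoken
    within the time limit. `responses` holds (symbol, spoken_digit,
    elapsed_s) tuples; `key` maps each symbol to its digit (1-9)."""
    return sum(1 for symbol, digit, elapsed in responses
               if elapsed <= limit_s and key.get(symbol) == digit)

key = {"+": 1, "o": 2, "^": 3}
responses = [("+", 1, 5.0), ("o", 2, 10.0), ("^", 9, 15.0), ("+", 1, 95.0)]
print(sdmt_oral_score(responses, key))  # 2: one wrong answer, one too late
```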

It should also be mentioned that, thanks to the wide variety of available psychological tests, selecting an appropriate test for particular clinical groups is rarely challenging. For example, the WCST or CANTAB, described above, appear appropriate for children with developmental language disorder, who are classified as having functional language abnormalities in the absence of a definite etiology [183, 184].

Discussion and conclusion

There are multiple batteries for evaluating different aspects of attention, including, but not limited to, the Stroop test, the continuous performance test, the Test of Everyday Attention for Children, the psychomotor vigilance test, and the Wisconsin Card Sorting Test. Although the scientific study of attention began in psychology, debate about the underlying mechanisms of these behavioral patterns is still ongoing. This review can serve as a basis for designing future studies on this domain of human cognition.

Beyond the historical definition of attention as the selection of wanted messages from unwanted ones (a filter mechanism), another proposed mechanism is the amplification of the most important arriving signal [185,186,187]. Attention is vital for the proper functioning of the rest of our cognitive capacities. The application of attention in machine learning has added to its importance in recent years [188]. Improving input–output models and adopting new primitives in deep learning methods parallel what happens in the brain during the attention process; modeling complex systems with attention mechanisms therefore has multiple benefits, such as the ability to focus on a subset of elements and temporal dimensions [189].
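
The parallel with deep learning can be made concrete: the core of a neural attention mechanism is scaled dot-product attention, which weights incoming values by how well their keys match the current query, so the most relevant signal dominates the output. A dependency-free sketch:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query: score each key
    against the query, normalize with softmax, and return the
    weighted sum of the values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The value whose key aligns with the query receives most of the weight.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0], [0.0]])
print(out)  # approximately [6.7]
```

The softmax acts like the proposed biological mechanism of boosting the most important arriving signal while suppressing, rather than completely filtering out, the rest.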

Limitations and future directions

This study's limitations should be taken into consideration when interpreting its results. First, the strategy used to identify journals covering untestable patients may have excluded studies in which these terms do not appear in the title, abstract, or scope; future studies should address this with a more focused search. Second, we consulted one specialist database to locate papers and searched the reference lists of the included papers as well as those of literature reviews discovered during screening. Additional web searches may yield more thorough descriptions of patients who have difficulty with traditional tests. Properly establishing the search protocol and utilizing numerous pertinent databases can reduce such bias.

Availability of data and materials

Not applicable.

Abbreviations

ADHD:

Attention deficit hyperactivity disorder

AD:

Alzheimer's disease

MCI:

Mild cognitive impairment

DGS:

Digit Span Test

RT:

Reaction time

WCST:

Wisconsin Card Sorting Test

RVP:

Rapid Visual Information Processing

MOT:

Motor screening

CANTAB:

Cambridge Neuropsychological Test Automated Battery

TOVA:

Test of Variables of Attention

TEA:

Test of everyday attention

PVT:

Psychomotor vigilance task

AX-CPT:

AX-Continuous Performance Test

CAB:

Cognitive Assessment Battery

ASD:

Autism spectrum disorder

IVA:

Integrated Visual and Auditory

CAS:

Cognitive Assessment System

MS:

Multiple sclerosis

OTMT:

Oral Trail Making Test

SDMT-oral:

Symbol Digit Modalities Test-oral

References

  1. Brewer JB, Gabrieli JDE, Preston AR, Vaidya CJ, Rosen AC. Chapter 5—memory. In: Goetz CG, editor. Textbook of clinical neurology. 3rd ed. Philadelphia: W.B. Saunders; 2007. p. 63–78.

    Chapter  Google Scholar 

  2. Bushnell PJ, Tilson HA. 13.21—Cognitive function. In: McQueen CA, editor. Comprehensive toxicology. 2nd ed. Oxford: Elsevier; 2010. p. 367–81.

    Chapter  Google Scholar 

  3. Chun MM, Turk-Browne NB. Interactions between attention and memory. Curr Opin Neurobiol. 2007;17(2):177–84.

    Article  CAS  PubMed  Google Scholar 

  4. Burk JA, Blumenthal SA, Maness EB. Neuropharmacology of attention. Eur J Pharmacol. 2018;835:162–8.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  5. Ben Azouz O, Dellagi L, Kebir O, Tabbane K. The concept of attention. Tunis Med. 2009;87(10):680–4.

    PubMed  Google Scholar 

  6. Wilterdink J. The mental status examination in adults. UpToDate. 2019. https://www.uptodate.com/contents/the-mental-status-examination-in-adults.

  7. Hommel B, Chapman CS, Cisek P, Neyedli HF, Song J-H, Welsh TN. No one knows what attention is. Atten Percept Psychophys. 2019;81(7):2288–303.

    Article  PubMed  PubMed Central  Google Scholar 

  8. Degutis JM, Van Vleet TM. Tonic and phasic alertness training: a novel behavioral therapy to improve spatial and non-spatial attention in patients with hemispatial neglect. Front Hum Neurosci. 2010. https://doi.org/10.3389/fnhum.2010.00060.

    Article  PubMed  PubMed Central  Google Scholar 

  9. Valdez P, Ramírez C, García A, Talamantes J, Armijo P, Borrani J. Circadian rhythms in components of attention. Biol Rhythm Res. 2005;36(1–2):57–65.

    Article  Google Scholar 

  10. Ko L-W, Komarov O, Hairston WD, Jung T-P, Lin C-T. Sustained attention in real classroom settings: an EEG study. Front Hum Neurosci. 2017;11:388.

    Article  PubMed  PubMed Central  Google Scholar 

  11. Johnston WA, Dark VJ. Selective attention. Ann Rev Psychol. 1986;37(1):43–75.

    Article  Google Scholar 

  12. Katsuki F, Constantinidis C. Bottom-up and top-down attention. Neuroscientisty. 2013. https://doi.org/10.3389/fnhum.2017.00388.

    Article  Google Scholar 

  13. Buschman TJ, Miller EK. Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science. 2007;315(5820):1860–2.

    Article  CAS  PubMed  Google Scholar 

  14. Bowling JT, Friston KJ, Hopfinger JB. Top-down versus bottom-up attention differentially modulate frontal-parietal connectivity. Hum Brain Mapp. 2020;41(4):928–42.

    Article  PubMed  Google Scholar 

  15. Hopfinger JB, Slotnick SD. Attentional control and executive function. Cogn Neurosci. 2020;11(1–2):1–4.

    PubMed  Google Scholar 

  16. Koziol LF, Joyce AW, Wurglitz G. The neuropsychology of attention: revisiting the “Mirsky model.” Appl Neuropsychol Child. 2014;3(4):297–307.

    Article  PubMed  Google Scholar 

  17. Long N, Kuhl B. Bottom-up and top-down factors differentially influence stimulus representations across large-scale attentional networks. J Neurosci. 2018;38:2724–817.

    Article  Google Scholar 

  18. Finney GR, Minagar A, Heilman KM. Assessment of mental status. Neurol Clin. 2016. https://doi.org/10.1016/j.ncl.2015.08.001.

    Article  PubMed  Google Scholar 

  19. Heaton SC, Reader SK, Preston AS, Fennell EB, Puyana OE, Gill N, et al. The Test of Everyday Attention for Children (TEA-Ch): patterns of performance in children with ADHD and clinical controls. Child Neuropsychol. 2001;7(4):251–64.

    Article  CAS  PubMed  Google Scholar 

  20. Robertson I, Ward T, Ridgeway V, Nimmo-Smith I, Anespie C. The test of everyday attention. Manual. 1994. https://www.researchgate.net/profile/Tony-Ward-11/publication/267552527_The_Test_of_Everyday_Attention_Manual/links/5469b4a80cf2f5eb180500f3/The-Test-of-Everyday-Attention-Manual.pdf.

  21. Stroop JR. Studies of interference in serial verbal reactions. J Exp Psychol. 1935;18(6):643.

    Article  Google Scholar 

  22. Heaton RK, Staff P. Wisconsin card sorting test: computer version 2, vol. 4. Odessa: Psychological Assessment Resources; 1993. p. 1–4.

    Google Scholar 

  23. Chelune GJ, Baer RA. Developmental norms for the Wisconsin Card Sorting test. J Clin Exp Neuropsychol. 1986;8(3):219–28.

    Article  CAS  PubMed  Google Scholar 

  24. Halperin JM. The clinical assessment of attention. Int J Neurosci. 1991;58(3–4):171–82.

    Article  CAS  PubMed  Google Scholar 

  25. Talebi M, Sadigh-Eteghad S, Talebi M, Naseri A, Zafarani F. Predominant domains and associated demographic and clinical characteristics in multiple sclerosis-related cognitive impairment in mildly disabled patients. Egypt J Neurol Psychiatr Neurosurg. 2022;58(1):48.

    Article  Google Scholar 

  26. Carter JD, Bizzell J, Kim C, Bellion C, Carpenter KLH, Dichter G, et al. Attention deficits in schizophrenia–preliminary evidence of dissociable transient and sustained deficits. Schizophr Res. 2010;122(1–3):104–12.

    Article  PubMed  PubMed Central  Google Scholar 

  27. Najmi S, Kuckertz JM, Amir N. Attentional impairment in anxiety: inefficiency in expanding the scope of attention. Depress Anxiety. 2012;29(3):243–9.

    Article  PubMed  Google Scholar 

  28. Keller AS, Leikauf JE, Holt-Gosselin B, Staveland BR, Williams LM. Paying attention to attention in depression. Transl Psychiatry. 2019;9(1):279.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Vasterling JJ, Brailey K, Constans JI, Sutker PB. Attention and memory dysfunction in posttraumatic stress disorder. Neuropsychology. 1998;12(1):125–33.

    Article  CAS  PubMed  Google Scholar 

  30. Dunn DW, Kronenberger WG. Childhood epilepsy, attention problems, and ADHD: review and practical considerations. Semin Pediatr Neurol. 2005;12(4):222–8.

    Article  PubMed  Google Scholar 

  31. McGuinness B, Barrett SL, Craig D, Lawson J, Passmore AP. Attention deficits in Alzheimer’s disease and vascular dementia. J Neurol Neurosurg Psychiatry. 2010;81(2):157–9.

    Article  PubMed  Google Scholar 

  32. Tharpe AM, Ashmead D, Sladen DP, Ryan HA, Rothpletz AM. Visual attention and hearing loss: past and current perspectives. J Am Acad Audiol. 2008;19(10):741–7.

    Article  PubMed  Google Scholar 

  33. Gazzellini S, Dettori M, Amadori F, Paoli B, Napolitano A, Mancini F, et al. Association between attention and heart rate fluctuations in pathological worriers. Front Hum Neurosci. 2016;10:648.

    Article  PubMed  PubMed Central  Google Scholar 

  34. Islam K, Seth S, Saha S, Roy A, Das R, Datta AK. A study on association of iron deficiency with attention deficit hyperactivity disorder in a tertiary care center. Indian J Psychiatry. 2018;60(1):131–4.

    Article  PubMed  PubMed Central  Google Scholar 

  35. Malhotra PA. Impairments of attention in Alzheimer’s disease. Curr Opin Psychol. 2019;29:41–8.

    Article  PubMed  Google Scholar 

  36. Mendonca F, Sudo FK, Santiago-Bravo G, Oliveira N, Assuncao N, Rodrigues F, et al. Mild cognitive impairment or attention-deficit/hyperactivity disorder in older adults? A cross sectional study. Front Psychiatry. 2021;12: 737357.

    Article  PubMed  PubMed Central  Google Scholar 

  37. Lyon RE, Rizeq J, Flora DB, Martinussen R, Andrade BF, Toplak ME. Age-related variance in performance versus ratings of attention and impulse regulation in children: implications for the assessment of ADHD. Brain Sci. 2022;12(8):1033.

    Article  PubMed  PubMed Central  Google Scholar 

  38. Martos D, Tuka B, Tanaka M, Vécsei L, Telegdy G. Memory enhancement with kynurenic acid and its mechanisms in neurotransmission. Biomedicines. 2022;10(4):849.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  39. Katsuki F, Constantinidis C. Bottom-up and top-down attention: different processes and overlapping neural systems. Neuroscientist. 2014;20(5):509–21.

    Article  PubMed  Google Scholar 

  40. Saccaro LF, Schilliger Z, Perroud N, Piguet C. Inflammation, anxiety, and stress in attention-deficit/hyperactivity disorder. Biomedicines. 2021. https://doi.org/10.3390/biomedicines9101313.

    Article  PubMed  PubMed Central  Google Scholar 

  41. Ahmed GK, Darwish AM, Khalifa H, Haridy NA. Relationship between Attention Deficit Hyperactivity Disorder and epilepsy: a literature review. Egypt J Neurol Psychiatr Neurosurg. 2022;58(1):52.

    Article  Google Scholar 

  42. Joormann J, Arditte KA. The relational theory of attention: Implications for the processing of emotional stimuli in psychological disorders. Aust Psychol. 2013;48(6):399–401.

    Article  Google Scholar 

  43. Battaglia S, Cardellicchio P, Di Fazio C, Nazzi C, Fracasso A, Borgomaneri S. Stopping in (e)motion: reactive action inhibition when facing valence-independent emotional stimuli. Front Behav Neurosci. 2022;16: 998714.

    Article  PubMed  PubMed Central  Google Scholar 

  44. Battaglia S, Cardellicchio P, Di Fazio C, Nazzi C, Fracasso A, Borgomaneri S. The influence of vicarious fear-learning in “infecting” reactive action inhibition. Front Behav Neurosci. 2022;16: 946263.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  45. Wambach D, Lamar M, Swenson R, Penney DL, Kaplan E, Libon DJ. Digit span. In: Kreutzer JS, DeLuca J, Caplan B, editors. Encyclopedia of clinical neuropsychology. New York: Springer, New York; 2011. p. 844–9.

    Chapter  Google Scholar 

  46. Bilder R, Sabb F, Parker D, Kalar D, Chu W, Fox J, et al. Cognitive ontologies for neuropsychiatric phenomics research. Cogn Neuropsychiatry. 2009;14:419–50.

    Article  PubMed  PubMed Central  Google Scholar 

  47. Holdnack JA. 4—The development, expansion, and future of the WAIS-IV as a cornerstone in comprehensive cognitive assessments. In: Goldstein G, Allen DN, DeLuca J, editors. Handbook of psychological assessment. 4th ed. San Diego: Academic Press; 2019. p. 103–39.

    Chapter  Google Scholar 

  48. Raiford SE, Coalson DL, Saklofske DH, Weiss LG. CHAPTER 2—practical issues in WAIS-IV administration and scoring. In: Weiss LG, Saklofske DH, Coalson DL, Raiford SE, editors. WAIS-IV clinical use and interpretation. San Diego: Academic Press; 2010. p. 25–59.

    Chapter  Google Scholar 

  49. Weiss LG, Saklofske DH, Holdnack JA, Prifitera A. Chapter 1 - WISC-V: Advances in the assessment of intelligence. In: Weiss LG, Saklofske DH, Holdnack JA, Prifitera A, editors. WISC-V. 2nd ed. San Diego: Academic Press; 2019. p. 1–21.

    Google Scholar 

  50. Wahlstrom D, Weiss LG, Saklofske DH. Chapter 2—practical issues in WISC-V administration and scoring. In: Weiss LG, Saklofske DH, Holdnack JA, Prifitera A, editors. WISC-V Assessment and Interpretation. San Diego: Academic Press; 2016. p. 25–62.

    Chapter  Google Scholar 

  51. Groth-Marnat G, Baker S. Digit Span as a measure of everyday attention: a study of ecological validity. Percept Mot Skills. 2003;97(3 Pt 2):1209–18.

    Article  PubMed  Google Scholar 

  52. Cullum CM. 4.11—neuropsychological assessment of adults. In: Bellack AS, Hersen M, editors. Comprehensive clinical psychology. Oxford: Pergamon; 1998. p. 303–47.

    Chapter  Google Scholar 

  53. Cullum CM, Larrabee GJ. CHAPTER 6—WAIS-IV use in neuropsychological assessment. In: Weiss LG, Saklofske DH, Coalson DL, Raiford SE, editors. WAIS-IV clinical use and interpretation. San Diego: Academic Press; 2010. p. 167–87.

    Chapter  Google Scholar 

  54. Coulacoglou C, Saklofske DH. Chapter 5—executive function, theory of mind, and adaptive behavior. In: Coulacoglou C, Saklofske DH, editors. Psychometrics and psychological assessment. San Diego: Academic Press; 2017. p. 91–130.

    Chapter  Google Scholar 

  55. Deary IJ, Liewald D, Nissan J. A free, easy-to-use, computer-based simple and four-choice reaction time programme: the Deary-Liewald reaction time task. Behav Res Methods. 2011;43(1):258–68.

    Article  PubMed  Google Scholar 

  56. Gau SSF, Huang WL. Rapid visual information processing as a cognitive endophenotype of attention deficit hyperactivity disorder. Psychol Med. 2014;44(2):435–46.

    Article  PubMed  Google Scholar 

  57. Zeinalzadeh A, Nazary-Moghadam S, Sayyed Hoseinian SH, Ebrahimzadeh MH, Imani E, Karimpour S. Intra- and inter-session reliability of methods for measuring reaction time in participants with and without patellofemoral pain syndrome. Arch Bone Jt Surg. 2021;9(1):102–9.

    PubMed  PubMed Central  Google Scholar 

  58. Rigoli M, Facchin A, Cardile D, Beschin N, Luzzatti C. Open-source open-access reaction time test (OORTT): an easy tool to assess reaction times. Neurol Sci. 2021;42(6):2461–9.

    Article  PubMed  Google Scholar 

  59. Jensen AR, Rohwer WD. The stroop color-word test: a review. Acta Physiol (Oxf). 1966;25:36–93.

    CAS  Google Scholar 

  60. Barbarotto R, Laiacona M, Frosio R, Vecchio M, Farinato A, Capitani E. A normative study on visual reaction times and two Stroop colour-word tests. Ital J Neurol Sci. 1998;19(3):161–70.

    Article  CAS  PubMed  Google Scholar 

  61. Valgimigli S, Padovani R, Budriesi C, Leone M, Lugli D, Nichelli P. The Stroop test: a normative Italian study on a paper version for clinical use. G Ital Psicol. 2010;37:945–56.

    Google Scholar 

  62. Scarpina F, Tagini S. The stroop color and word test. Front Psychol. 2017;8:557.

    Article  PubMed  PubMed Central  Google Scholar 

  63. Assef EC, Capovilla AG, Capovilla FC. Computerized stroop test to assess selective attention in children with attention deficit hyperactivity disorder. Span J Psychol. 2007;10(1):33–40.

    Article  PubMed  Google Scholar 

  64. Wright BC. What Stroop tasks can tell us about selective attention from childhood to adulthood. Br J Psychol. 2017;108(3):583–607.

    Article  PubMed  Google Scholar 

  65. Williams JMG, Mathews A, MacLeod C. The emotional Stroop task and psychopathology. Psychol Bull. 1996;120(1):3.

    Article  CAS  PubMed  Google Scholar 

  66. Peach N, Jovev M, Foster A, Jackson H. Testing the Stroop effect in a nonclinical sample: hypervigilance or difficulty to disengage? J Exp Psychopathol. 2012;3(3):496–510.

    Article  Google Scholar 

  67. Muris P, Merckelbach H. 620—Specific phobias. In: Bellack AS, Hersen M, editors. Comprehensive clinical psychology. Oxford: Pergamon; 1998. p. 461–74.

    Chapter  Google Scholar 

  68. De Angelis J, Ricciardelli P. Emotional Stroop task. In: Zeigler-Hill V, Shackelford TK, editors. Encyclopedia of personality and individual differences. Cham: Springer International Publishing; 2017. p. 1–4.

    Google Scholar 

  69. McKenna FP, Sharma D. Reversing the emotional Stroop effect reveals that it is not what it seems: the role of fast and slow components. J Exp Psychol Learn Mem Cogn. 2004;30(2):382.

    Article  PubMed  Google Scholar 

  70. Gómez-de-Regil L. Assessment of executive function in patients with traumatic brain injury with the Wisconsin card-sorting test. Brain Sci. 2020;10(10):699.

    Article  PubMed  PubMed Central  Google Scholar 

  71. Kafadar H, Orhan İB. The relationship between wisconsin card sorting test and raven standard progressive matrices: a latent variable analysis. Int Online J Educ Sci. 2016;8(1). https://doi.org/10.15345/iojes.2016.01.005.

  72. Sullivan EV, Mathalon DH, Zipursky RB, Kersteen-Tucker Z, Knight RT, Pfefferbaum A. Factors of the Wisconsin Card Sorting Test as measures of frontal-lobe function in schizophrenia and in chronic alcoholism. Psychiatry Res. 1993;46(2):175–99.

    Article  CAS  PubMed  Google Scholar 

  73. Westwood H, Stahl D, Mandy W, Tchanturia K. The set-shifting profiles of anorexia nervosa and autism spectrum disorder using the Wisconsin Card Sorting Test: a systematic review and meta-analysis. Psychol Med. 2016;46(9):1809–27.

    Article  CAS  PubMed  Google Scholar 

  74. Henry JD. A meta-analytic review of Wisconsin Card Sorting Test and verbal fluency performance in obsessive-compulsive disorder. Cogn Neuropsychiatry. 2006;11(2):156–76.

    Article  PubMed  Google Scholar 

  75. Arán Filippetti V, Krumm GL, Raimondi W. Computerized versus manual versions of the Wisconsin Card Sorting Test: implications with typically developing and ADHD children. Appl Neuropsychol Child. 2020;9(3):230–45.

    Article  PubMed  Google Scholar 

  76. Lichstein K, Riedel B, Richman S. The Mackworth clock test: a computerized version. J Psychol. 2000;134:153–61.

    Article  CAS  PubMed  Google Scholar 

  77. Veksler B, Gunzelmann G. Functional equivalence of sleep loss and time on task effects in sustained attention. Cogn Sci. 2018;42(2):600–32. https://doi.org/10.1111/cogs.12489.

    Article  PubMed  Google Scholar 

  78. Somma A, Becker S, Leitner C, Fossati A. Reliability, factor validity, and neuropsychological correlates of the child concentration inventory–2 in a community sample of Italian adolescents. Assessment. 2021. https://doi.org/10.1177/10731911211033349.

    Article  PubMed  Google Scholar 

  79. Westwood S, Criaud M, Lam S-L, Lukito S, Wallace-Hanlon S, Kowalczyk O, et al. Transcranial direct current stimulation (tDCS) combined with cognitive training in adolescent boys with ADHD: a double-blind, randomised, sham-controlled trial. Psychol Med. 2021. https://doi.org/10.1017/S0033291721001859.

    Article  PubMed  PubMed Central  Google Scholar 

  80. Jerison HJ. Performance on a simple vigilance task in noise and quiet. J Acoustical Soc Am. 1957;29(11):1163–5.

    Article  Google Scholar 

  81. Giambra L, Quilter R. Sustained attention in adulthood: a unique, large-sample, longitudinal and multicohort analysis using the Mackworth clock-test. Psychol Aging. 1988;3:75–83.

    Article  CAS  PubMed  Google Scholar 

  82. Giambra LM, Quilter RE, Phillips PB. The relationship of age and extraversion to arousal and performance on a sustained attention task: a cross-sectional investigation using the Mackworth clock-test. Personality Individ Differ. 1988;9(2):225–30.

    Article  Google Scholar 

  83. Whittenburg JA, Ross S, Andrews TG. Sustained perceptual efficiency as measured by the Mackworth “Clock” Test. Percept Mot Skills. 1956;6(3):109–16.

    Article  Google Scholar 

  84. Verster JC, van Duin D, Volkerts ER, Schreuder AH, Verbaten MN. Alcohol hangover effects on memory functioning and vigilance performance after an evening of binge drinking. Neuropsychopharmacology. 2003;28(4):740–6.

    Article  CAS  PubMed  Google Scholar 

  85. Williamson AM, Feyer A-M, Mattick RP, Friswell R, Finlay-Brown S. Developing measures of fatigue using an alcohol comparison to validate the effects of fatigue on performance. Accid Anal Prev. 2001;33(3):313–26.

    Article  CAS  PubMed  Google Scholar 

  86. Riedel WJ, Eikmans K, Heldens A, Schmitt JAJ. Specific serotonergic reuptake inhibition impairs vigilance performance acutely and after subchronic treatment. J Psychopharmacol. 2005;19(1):12–20.

    Article  CAS  PubMed  Google Scholar 

  87. Schmitt JAJ, Ramaekers JG, Kruizinga MJ, van Boxtel MPJ, Vuurman EFPM, Riedel WJ. Additional dopamine reuptake inhibition attenuates vigilance impairment induced by serotonin reuptake inhibition in man. J Psychopharmacol. 2002;16(3):207–14.

    Article  CAS  PubMed  Google Scholar 

  88. Johns M. The assessment of sleepiness in human drug trials: a new perspective. Curr Psychopharmacol. 2019;8(1):5–26.

    Article  Google Scholar 

  89. Green R, Till C, Al-Hakeem H, Cribbie R, Téllez-Rojo MM, Osorio E, et al. Assessment of neuropsychological performance in Mexico City youth using the Cambridge Neuropsychological Test Automated Battery (CANTAB). J Clin Exp Neuropsychol. 2019;41(3):246–56.

    Article  PubMed  Google Scholar 

  90. Levaux MN, Potvin S, Sepehry AA, Sablier J, Mendrek A, Stip E. Computerized assessment of cognition in schizophrenia: promises and pitfalls of CANTAB. Eur Psychiatry. 2007;22(2):104–15.

    Article  PubMed  Google Scholar 

  91. Cattapan-Ludewig K, Hilti CC, Ludewig S, Vollenweider FX, Feldon J. Rapid visual information processing in schizophrenic patients: the impact of cognitive load and duration of stimulus presentation. A pilot study. Neuropsychobiology. 2005;52(3):130–4.

    Article  PubMed  Google Scholar 

  92. Hilti CC, Hilti LM, Heinemann D, Robbins T, Seifritz E, Cattapan-Ludewig K. Impaired performance on the Rapid Visual Information Processing task (RVIP) could be an endophenotype of schizophrenia. Psychiatry Res. 2010;177(1–2):60–4.

  93. Tetik D, Gica S, Bestepe EE, Buyukavsar A, Gulec H. Emotional information processing and assessment of cognitive functions in social anxiety disorder: an event-related potential study. Clin EEG Neurosci. 2020. https://doi.org/10.1177/1550059420981506.

  94. Eskicioglu E, Taslica S, Narin B, Guducu C, Oniz A, Ozgoren M. Brain asymmetry in directing attention during dichotic listening test: an fNIRS study. Laterality. 2019;24(4):377–92.

  95. Werner LA, et al. 350—Human auditory development. In: Masland RH, Albright TD, Albright TD, Masland RH, Dallos P, Oertel D, et al., editors. The senses: a comprehensive reference. New York: Academic Press; 2008. p. 871–93.

  96. Bouma A, Gootjes L. Effects of attention on dichotic listening in elderly and patients with dementia of the Alzheimer type. Brain Cogn. 2011;76(2):286–93.

  97. Kropotov JD. Chapter 12—Psychometrics and neuropsychological assessment. In: Kropotov JD, editor. Functional neuromarkers for psychiatry. San Diego: Academic Press; 2016. p. 9–16.

  98. Cornblatt B, Risch N, Faris G, Friedman D, Erlenmeyer-Kimling L. The continuous performance test, identical pairs version (CPT-IP): I. New findings about sustained attention in normal families. Psychiatry Res. 1988;26:223–38.

  99. Gervais NJ, Brake WG, Lacreuse A. 1.16—ovarian hormones and prefrontal cortex-related cognition. In: Pfaff DW, Joëls M, editors. Hormones, Brain and Behavior. 3rd ed. Oxford: Academic Press; 2017. p. 439–51.

  100. Kurtz MM, Ragland JD, Bilker W, Gur RC, Gur RE. Comparison of the continuous performance test with and without working memory demands in healthy controls and patients with schizophrenia. Schizophr Res. 2001;48(2):307–16.

  101. Losier BJ, McGrath PJ, Klein RM. Error patterns on the continuous performance test in non-medicated and medicated samples of children with and without ADHD: a meta-analytic review. J Child Psychol Psychiatry. 1996;37(8):971–87.

  102. Epstein JN, Erkanli A, Conners CK, Klaric J, Costello JE, Angold A. Relations between Continuous Performance Test performance measures and ADHD behaviors. J Abnorm Child Psychol. 2003;31(5):543–54.

  103. Sax KW, Strakowski SM, Zimmerman ME, DelBello MP, Keck PE, Hawkins JM. Frontosubcortical neuroanatomy and the Continuous Performance Test in Mania. Am J Psychiatry. 1999;156(1):139–41.

  104. Dougherty DM, Moeller FG, Steinberg JL, Marsh DM, Hines SE, Bjork JM. Alcohol increases commission error rates for a Continuous Performance Test. Alcoholism. 1999;23(8):1342–51.

  105. Bogaczewicz A, Sobow T, Kowalski J, Ząbek J, Woźniacka A, Bogaczewicz J. Cambridge Neuropsychological Test Automated Battery in assessment of cognitive parameters in patients with systemic lupus erythematosus in relation to autoantibody profile. Reumatologia. 2015;53(3):131–8.

  106. Robbins T, editor. Cambridge Neuropsychological Test Automated Battery (CANTAB): utility and validation. In: IEE Colloquium on Computer-Aided Tests of Drug Effectiveness. IET; 1994.

  107. Sabahi Z, Farhoudi M, Naseri A, Talebi M. Working memory assessment using Cambridge neuropsychological test automated battery can help in the diagnosis of mild cognitive impairment: a systematic review and meta-analysis. Dement Neuropsychol. 2022;16(4):444–56.

  108. Messinis L, Kosmidis MH, Tsakona I, Georgiou V, Aretouli E, Papathanasopoulos P. Ruff 2 and 7 Selective Attention Test: normative data, discriminant validity and test–retest reliability in Greek adults. Arch Clin Neuropsychol. 2007;22(6):773–85.

  109. Cicerone KD, Azulay J. Diagnostic utility of attention measures in postconcussion syndrome. Clin Neuropsychol. 2002;16(3):280–9.

  110. Ruff RM, Niemann H, Allen CC, Farrow CE, Wylie T. The Ruff 2 and 7 Selective Attention Test: a neuropsychological application. Percept Mot Skills. 1992;75(3 suppl):1311–9.

  111. Knight RG, McMahon J, Skeaff CM, Green TJ. Reliable change indices for the Ruff 2 and 7 Selective Attention Test in older adults. Appl Neuropsychol. 2010;17(4):239–45.

  112. Ruff RM. What role does depression play on the performance of the Ruff 2 and 7 Selective Attention Test? Percept Mot Skills. 1994;78(1):63–6.

  113. Lee P, Li P-C, Liu C-H, Lin H-Y, Huang C-Y, Hsieh C-L. Practice effects, test-retest reliability, and minimal detectable change of the Ruff 2 and 7 Selective Attention Test in patients with schizophrenia. Int J Environ Res Public Health. 2021;18(18):9440.

  114. Gorman TE, Green CS. Chapter 6—young minds on video games. In: Blumberg FC, Brooks PJ, editors. Cognitive development in digital contexts. San Diego: Academic Press; 2017. p. 121–43.

  115. Lewandowski L, Wood W, Miller LA. Chapter 3—technological applications for individuals with learning disabilities and ADHD. In: Luiselli JK, Fischer AJ, editors. Computer-assisted and web-based innovations in psychology, special education, and health. San Diego: Academic Press; 2016. p. 61–93.

  116. Leark RA, Greenberg LM, Kindschi C, Dupuy T, Hughes SJ. TOVA professional manual. Los Alamitos: TOVA Company; 2008.

  117. Mitko A, Rothlein D, Poole V, Robinson M, McGlinchey R, DeGutis J, et al. Individual differences in sustained attention are associated with cortical thickness. Hum Brain Mapp. 2019. https://doi.org/10.1002/hbm.24594.

  118. Crow A. Associations between neuroticism and executive function outcomes: response inhibition and sustained attention on a continuous performance test. Percept Mot Skills. 2019;126:623–38.

  119. Greenberg L, Waldman I. Developmental normative data on the test of variables of attention (T.O.V.A.). J Child Psychol Psychiatry. 1993;34:1019–30.

  120. Selander H, Strand N, Almberg M, Lidestam B. Ready for a learner’s permit? Clinical Neuropsychological off-road tests and driving behaviors in a simulator among adolescents with ADHD and ASD. Dev Neurorehabil. 2021;24(4):256–65. https://doi.org/10.1080/17518423.2020.1869339.

  121. Forbes GB. Clinical utility of the test of variables of attention (TOVA) in the diagnosis of attention-deficit/hyperactivity disorder. J Clin Psychol. 1998;54(4):461–76.

  122. Lambez B, Harwood-Gross A, Zion-Golumbic E, Rassovsky Y. Non-pharmacological interventions for cognitive difficulties in ADHD: a systematic review and meta-analysis. J Psychiatr Res. 2019. https://doi.org/10.1016/j.jpsychires.2018.09.009.

  123. Welsh MB. May I have your attention? Testing a subjective attention scale. In: Denison S, Mack M, Xu Y, Armstrong BC, editors. Proceedings of the 42nd Annual Conference of the Cognitive Science Society. Cognitive Science Society; 2020. p. 2535–41.

  124. Hall J, O’Carroll RE, Frith CD. 7—Neuropsychology. In: Johnstone EC, Owens DC, Lawrie SM, McIntosh AM, Sharpe M, editors. Companion to Psychiatric studies. 8th ed. St. Louis: Churchill Livingstone; 2010. p. 121–40.

  125. Gourlay C, Collin P, D’Auteuil C, Jacques M, Caron P-O, Scherzer P. Age differences in social-cognitive abilities across the stages of adulthood and path model investigation of adult social cognition. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn. 2022;29(6):1033–67.

  126. Robertson IH, Ward T, Ridgeway V, Nimmo-Smith I. The structure of normal human attention: The Test of Everyday Attention. J Int Neuropsychol Soc. 1996;2(6):525–34.

  127. Chan RC, Lai MK, Robertson IH. Latent structure of the Test of Everyday Attention in a non-clinical Chinese sample. Arch Clin Neuropsychol. 2006;21(5):477–85.

  128. Gillen G. Chapter 8—managing attention deficits to optimize function. In: Gillen G, editor. Cognitive and perceptual rehabilitation. Saint Louis: Mosby; 2009. p. 184–209.

  129. Jones S, Davies-Thompson J, Tree J. Can machines find the bilingual advantage? Machine learning algorithms find no evidence to differentiate between lifelong bilingual and monolingual cognitive profiles. Front Hum Neurosci. 2021;15:621772.

  130. Pinto J, Dores A, Peixoto B, Geraldo A, Barbosa F. Systematic review of sensory stimulation programs in the rehabilitation of acquired brain injury. Eur Psychol. 2020. https://doi.org/10.1027/1016-9040/a000421.

  131. Kourtesis P, Collina S, Doumas L, MacPherson S. Validation of the Virtual Reality Everyday Assessment Lab (VR-EAL): an immersive virtual reality neuropsychological battery with enhanced ecological validity. J Int Neuropsychol Soc. 2021;27(2):181–96. https://doi.org/10.1017/S1355617720000764.

  132. Moran C, McGovern D, Warren G, Grálaigh R, Kenney J, Smeaton A, et al. Young and restless, old and focused: age-differences in mind-wandering frequency and phenomenology. Psychol Aging. 2020. https://doi.org/10.1037/pag0000526.

  133. David D, Fleminger S, Kopelman M, Lovestone S, Mellers J. Lishman’s organic psychiatry: a textbook of neuropsychiatry. New York: Wiley; 2012.

  134. Grant DA, Honn KA, Layton ME, Riedy SM, Van Dongen HPA. 3-minute smartphone-based and tablet-based psychomotor vigilance tests for the assessment of reduced alertness due to sleep deprivation. Behav Res Methods. 2017;49(3):1020–9.

  135. D’Rozario AL, Field CJ, Hoyos CM, Naismith SL, Dungan GC, Wong KKH, et al. Impaired neurobehavioural performance in untreated obstructive sleep apnea patients using a novel standardised test battery. Front Surg. 2018;5:35.

  136. Wong KF, Teng J, Chee MWL, Doshi K, Lim J. Positive effects of mindfulness-based training on energy maintenance and the EEG correlates of sustained attention in a cohort of nurses. Front Hum Neurosci. 2018;12:80.

  137. Di Muzio M, Diella G, Di Simone E, Novelli L, Alfonsi V, Scarpelli S, et al. Nurses and night shifts: poor sleep quality exacerbates psychomotor performance. Front Neurosci. 2020;14:1050.

  138. Cohen RA. Sustained attention. In: Kreutzer JS, DeLuca J, Caplan B, editors. Encyclopedia of clinical neuropsychology. New York: Springer, New York; 2011. p. 2440–3.

  139. Reifman J, Kumar K, Khitrov MY, Liu J, Ramakrishnan S. PC-PVT 2.0: an updated platform for psychomotor vigilance task testing, analysis, prediction, and visualization. J Neurosci Methods. 2018;304:39–45.

  140. Lee I-S, Bardwell WA, Ancoli-Israel S, Dimsdale JE. Number of lapses during the psychomotor vigilance task as an objective measure of fatigue. J Clin Sleep Med. 2010;6(2):163–8.

  141. Basner M, Mollicone D, Dinges DF. Validity and sensitivity of a Brief Psychomotor Vigilance Test (PVT-B) to total and partial sleep deprivation. Acta Astronaut. 2011;69(11–12):949–59.

  142. Araujo G, Antonini T, Anderson V, Vannatta K, Salley C, Bigler E, et al. Profiles of executive function across children with distinct brain disorders: traumatic brain injury, stroke, and brain tumor. J Int Neuropsychol Soc. 2017;23:1–10.

  143. Manly T, Anderson V, Nimmo-Smith I, Turner A, Watson P, Robertson IH. The differential assessment of children’s attention: the Test of Everyday Attention for Children (TEA-Ch), normative sample and ADHD performance. J Child Psychol Psychiatry. 2001;42(8):1065–81.

  144. Malegiannaki AC, Aretouli E, Metallidou P, Messinis L, Zafeiriou D, Kosmidis MH. Test of Everyday Attention for Children (TEA-Ch): Greek normative data and discriminative validity for children with combined type of attention deficit-hyperactivity disorder. Dev Neuropsychol. 2019;44(2):189–202.

  145. Chan RC, Wang L, Ye J, Leung WW, Mok MY. A psychometric study of the test of everyday attention for Children in the Chinese setting. Arch Clin Neuropsychol. 2008;23(4):455–66.

  146. Gonthier C, Macnamara BN, Chow M, Conway ARA, Braver TS. Inducing proactive control shifts in the AX-CPT. Front Psychol. 2016;7:1822.

  147. Conners CK, Staff M, Connelly V, Campbell S, MacLean M, Barnes J. Conners’ Continuous Performance Test II (CPT II v5). Multi-Health Syst Inc. 2000;29:175–96.

  148. Conners CK, Sitarenios G. Conners’ Continuous Performance Test (CPT). In: Kreutzer JS, DeLuca J, Caplan B, editors. Encyclopedia of Clinical Neuropsychology. New York: Springer, New York; 2011. p. 681–3.

  149. Sanchez L, Chronis-Tuscano A, Hunter S. Improving compliance with diabetes management in young adolescents with attention-deficit/hyperactivity disorder using behavior therapy. Cogn Behav Pract. 2006;13:134–56.

  150. McGee RA, Clark SE, Symons DK. Does the Conners’ Continuous Performance Test aid in ADHD diagnosis? J Abnorm Child Psychol. 2000;28(5):415–24.

  151. Solanto MV, Etefia K, Marks DJ. The utility of self-report measures and the Continuous Performance Test in the diagnosis of ADHD in adults. CNS Spectr. 2004;9(9):649–59.

  152. Sanz JC, Gómez V, Vargas ML, Marín JJ. Dimensions of attention impairment and negative symptoms in schizophrenia: a multidimensional approach using the Conners Continuous Performance Test in a Spanish population. Cogn Behav Neurol. 2012;25(2):63–71.

  153. Zane KL, Gfeller JD, Roskos PT, Bucholz RD. The clinical utility of the conners’ Continuous Performance Test-II in traumatic brain injury. Arch Clin Neuropsychol. 2016;31(8):996–1005.

  154. Ord JS, Boettcher AC, Greve KW, Bianchini KJ. Detection of malingering in mild traumatic brain injury with the Conners’ Continuous Performance Test–II. J Clin Exp Neuropsychol. 2010;32(4):380–7.

  155. Borgatti R, Piccinelli P, Montirosso R, Donati G, Rampani A, Molteni L, et al. Study of attentional processes in children with idiopathic epilepsy by Conners’ Continuous Performance Test. J Child Neurol. 2004;19(7):509–15.

  156. Periáñez JA, Barceló F. Madrid adaptation of the Wisconsin card sorting test: a comparative study of internal consistency. Rev Neurol. 2001;33(7):611–8.

  157. Barceló F. The Madrid card sorting test (MCST): a task switching paradigm to study executive attention with event-related potentials. Brain Res Brain Res Protoc. 2003;11(1):27–37.

  158. Nordlund A, Påhlsson L, Holmberg C, Lind K, Wallin A. The Cognitive Assessment Battery (CAB): a rapid test of cognitive domains. Int Psychogeriatr. 2011;23(7):1144–51.

  159. Hooper HE. Hooper visual organization test manual. Los Angeles: Western Psychological Services; 1983.

  160. Alpern GD. Measurement of “untestable” autistic children. J Abnorm Psychol. 1967;72(6):478.

  161. Rosário V, Gomes C, Loureiro C. Systematic review of attention testing in allegedly “Untestable” populations. Int J Psychol Res Rev. 2019. https://escipub.com/ijprr-2019-07-1905/.

  162. Wolf-Schein EG. Considerations in assessment of children with severe disabilities including deaf-blindness and autism. Int J Disabil Dev Educ. 1998;45(1):35–55. https://doi.org/10.1080/1034912980450104.

  163. Bathurst K, Gottfried AW. Untestable subjects in child development research: developmental implications. Child Dev. 1987;58(4):1135–44.

  164. Areces D, Dockrell J, García T, González-Castro P, Rodríguez C. Analysis of cognitive and attentional profiles in children with and without ADHD using an innovative virtual reality tool. PLoS ONE. 2018;13(8): e0201039.

  165. Díaz-Orueta U, Garcia-López C, Crespo-Eguílaz N, Sánchez-Carpintero R, Climent G, Narbona J. AULA virtual reality test as an attention measure: convergent validity with Conners’ Continuous Performance Test. Child Neuropsychol. 2014;20(3):328–42.

  166. Oliveira J, Gamito P, Alghazzawi DM, Fardoun HM, Rosa PJ, Sousa T, et al. Performance on naturalistic virtual reality tasks depends on global cognitive functioning as assessed via traditional neurocognitive tests. Appl Neuropsychol Adult. 2018;25(6):555–61.

  167. McPherson J, Burns NR. Assessing the validity of computer-game-like tests of processing speed and working memory. Behav Res Methods. 2008;40(4):969–81.

  168. Corbett BA, Constantine LJ. Autism and attention deficit hyperactivity disorder: assessing attention and response control with the integrated visual and auditory continuous performance test. Child Neuropsychol. 2006;12(4–5):335–48.

  169. Thompson L, Thompson M, Reid A. Neurofeedback outcomes in clients with Asperger’s syndrome. Appl Psychophysiol Biofeedback. 2010;35(1):63–81.

  170. Matson JL, Rieske RD, Williams LW. The relationship between autism spectrum disorders and attention-deficit/hyperactivity disorder: an overview. Res Dev Disabil. 2013;34(9):2475–84.

  171. Sandford JA, Turner A. Integrated visual and auditory continuous performance test manual. Richmond, VA: BrainTrain Inc; 1995.

  172. Reynolds CR, Fletcher-Janzen E, editors. Goldman-Fristoe-Woodcock Test of Auditory Discrimination. In: Encyclopedia of special education. 2008. https://doi.org/10.1002/9780470373699.speced0937.

  173. Klinger LG, Renner P. Performance-based measures in autism: implications for diagnosis, early detection, and identification of cognitive profiles. J Clin Child Psychol. 2000;29(4):479–92.

  174. Posner M. Orienting of attention. Q J Exp Psychol. 1980;32:3–25.

  175. Naglieri JA, Otero TM. The Cognitive Assessment System—second edition: from theory to practice. In: Contemporary intellectual assessment: theories, tests, and issues. 4th ed. New York: The Guilford Press; 2018. p. 452–85.

  176. Taddei S, Contena B. Brief report: cognitive performance in autism and Asperger’s syndrome: what are the differences? J Autism Dev Disord. 2013;43(12):2977–83.

  177. de Vries PJ, Watson P. Attention deficits in tuberous sclerosis complex (TSC): rethinking the pathways to the endstate. J Intellect Disabil Res. 2008;52(Pt 4):348–57.

  178. Rose E, Bramham J, Young S, Paliokostas E, Xenitidis K. Neuropsychological characteristics of adults with comorbid ADHD and borderline/mild intellectual disability. Res Dev Disabil. 2009;30(3):496–502.

  179. Borgwardt L, Thuesen AM, Olsen KJ, Fogh J, Dali CI, Lund AM. Cognitive profile and activities of daily living: 35 patients with alpha-mannosidosis. J Inherit Metab Dis. 2015;38(6):1119–27.

  180. Jaywant A, Barredo J, Ahern DC, Resnik L. Neuropsychological assessment without upper limb involvement: a systematic review of oral versions of the Trail Making Test and Symbol-Digit Modalities Test. Neuropsychol Rehabil. 2018;28(7):1055–77.

  181. Ricker JH, Axelrod BN. Analysis of an oral paradigm for the trail making test. Assessment. 1994;1(1):47–52.

  182. Smith A. Symbol Digit Modalities Test (SDMT). Manual (Revised). Los Angeles: Western Psychological Services; 1982.

  183. Aljahlan Y, Spaulding TJ. Attentional shifting in children with developmental language disorder: a meta-analysis. J Commun Disord. 2021;91: 106105.

  184. Bishop DV. What causes specific language impairment in children? Curr Dir Psychol Sci. 2006;15(5):217–21.

  185. Deutsch JA, Deutsch D. Attention: some theoretical considerations. Psychol Rev. 1963;70(1):80.

  186. Kaldas A. Attention: a descriptive taxonomy. Hist Philos Life Sci. 2022;44(4):63.

  187. Wickens C. Attention: theory, principles, models and applications. Int J Hum Comput Interact. 2021;37(5):403–17.

  188. Lindsay GW. Attention in psychology, neuroscience, and machine learning. Front Comput Neurosci. 2020;14:29.

  189. Hernández A, Amigó JM. Attention mechanisms and their applications to complex systems. Entropy (Basel). 2021. https://doi.org/10.3390/e23030283.

Acknowledgements

The research protocol was reviewed and approved by the Student Research Committee of Tabriz University of Medical Sciences (grant number: 68480).

Funding

This research was supported by a grant from the Student Research Committee of Tabriz University of Medical Sciences (grant number: 68480) to AN.

Author information

Authors and Affiliations

Authors

Contributions

EN, MK, ZH, SFY, AI: investigation; methodology; writing—original draft. SS-E: investigation; supervision; validation; writing—review and editing. AN*: conceptualization; methodology; project administration; funding acquisition; writing—original draft. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Saeed Sadigh-Eteghad or Amirreza Naseri.

Ethics declarations

Ethics approval and consent to participate

Ethical approval code: IR.TBZMED.VCR.REC.1400.577.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Nasiri, E., Khalilzad, M., Hakimzadeh, Z. et al. A comprehensive review of attention tests: can we assess what we exactly do not understand?. Egypt J Neurol Psychiatry Neurosurg 59, 26 (2023). https://doi.org/10.1186/s41983-023-00628-4
