PERSPECTIVE article

Front. Psychol., 24 April 2023

Sec. Cognitive Science

Volume 14 - 2023 | https://doi.org/10.3389/fpsyg.2023.1160554

The necessary, albeit belated, transition to computerized cognitive assessment

  • 1. Facultad de Lenguas y Educación, Universidad Nebrija, Madrid, Spain

  • 2. AcqVA Aurora Center, UiT The Arctic University of Norway, Tromsø, Norway


Abstract

Cognitive assessment is a common, everyday process in educational, clinical, and research settings, among others. Currently, most professionals use classic pencil-and-paper screenings, tests, and assessment batteries. However, as the SARS-CoV-2 health crisis has shown, the pencil-and-paper format is becoming increasingly outdated, and a transition to new technologies in the form of computerized cognitive assessments (CCA) is necessary. This article discusses the advantages, disadvantages, and implications of this necessary transition, which professionals should face in the immediate future, and encourages its careful adoption to ensure a smooth changeover.

1. Introduction: the current state of cognitive assessment

Cognitive assessment refers to a set of techniques and procedures used to ascertain the status of one or more aspects of a person’s cognitive profile. It is an essential process that is performed daily in clinical, academic, and research settings around the world. It is often used to identify behavioral markers of processes involving cognitive impairment (Wild et al., 2008) or neurodegeneration (Choi, 2008; Davis et al., 2015; Daroische et al., 2021), although it is also used with healthy individuals (White et al., 2012; Bertelli et al., 2018). These tests can take many forms, consist of different activities, follow different administration rules, and require different types of answers from the test-taker. However, to a greater or lesser extent, they can all be applied in different formats, the main versions being paper-and-pencil and computerized.

Mainly for reasons of tradition and accessibility, most of the widely applied tests are paper-and-pencil tests. Classical examples include the Montreal Cognitive Assessment (MoCA; Nasreddine et al., 2005), the Stroop Test (Stroop, 1935), Raven’s Standard Progressive Matrices (RSPM; Raven, 1938), the Rey Auditory Verbal Learning Test (RAVLT; Rey, 1941; Schmidt, 1996), the Trail Making Test (TMT; Reitan, 1955), the Corsi Block-Tapping Test (Corsi, 1972), the Mini-Mental State Examination (MMSE; Folstein et al., 1975), the Wisconsin Card Sorting Test (WCST; Heaton, 1981), the Boston Naming Test (Kaplan et al., 1983), and the Wechsler scales (Wechsler, 2008; Wechsler, 2014). While quite a few of these individual tests have been digitized (Hannukkala et al., 2020), most of the assessment batteries that contain them have not completed the step into the digital modality, entering the field only tentatively through digitized scoring without moving fully online. In addition, it should be noted that tests vary considerably in how feasible they are to digitize. On the one hand, the translation to digital format is straightforward for tests that require selecting one or more answers from an array of options on paper. On the other hand, tests that are manipulative or visuo-constructive, or that require drawing or complex verbal responses, represent a challenge or imply complex adaptation processes for their implementation on digital devices.

In contrast, more recently developed tests and assessment batteries are being created fully digital from the outset, to the point that some of them have never been distributed in physical format. These are generally called computerized cognitive assessments (CCA), and some of the most noteworthy examples include the CogniFit General Cognitive Assessment Battery (CAB; Haimov et al., 2008) and the Cambridge Neuropsychological Test Automated Battery (Sandberg, 2011), in addition to those that Zygouris and Tsolaki (2014) already pointed out in their review. This article outlines the advantages, disadvantages, and implications of this transition to digital.

2. Reasons for the transition

Before the 2000s, it did not make sense to use digitized tests, because the required equipment was cumbersome, expensive, and scarce. However, nowadays children are educated in the use of technology (Lee et al., 2014; Wojcik et al., 2022), adults work with it every day, and older people are gaining experience and confidence in the use of these devices (Demiris et al., 2004; Kim and Preis, 2015; Kim et al., 2016). Thus, technological advances and easier access to technology have made CCAs feasible for everyday use.

The transition to CCAs is supported by different cognitive theories and psychometric approaches. For instance, computer-based assessments allow tailoring the tests to individual test-takers following the principles of Item Response Theory (Mead and Drasgow, 1993; Huebner, 2010). In addition, Cognitive Load Theory (Sweller, 1988) suggests that the computer-based format decreases ineffective cognitive load with respect to the paper-based format, thus optimizing performance (Khalil et al., 2010; for further discussion, see Taylor, 1994).
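To illustrate the kind of tailoring that Item Response Theory makes possible, the following Python sketch implements a minimal adaptive loop under a two-parameter logistic (2PL) model. The item bank, parameter values, and function names are illustrative assumptions for this article, not the implementation of any particular CCA.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a test-taker with
    ability theta answers an item with discrimination a and difficulty b
    correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information the item provides at ability level theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def eap_estimate(responses):
    """Expected-a-posteriori ability estimate over a coarse theta grid,
    assuming a standard-normal prior on ability. `responses` is a list
    of ((a, b), correct) pairs."""
    grid = [g / 10.0 for g in range(-40, 41)]  # theta in [-4, 4]
    posterior = []
    for theta in grid:
        weight = math.exp(-0.5 * theta * theta)  # N(0, 1) prior
        for (a, b), correct in responses:
            p = p_correct(theta, a, b)
            weight *= p if correct else (1.0 - p)
        posterior.append(weight)
    total = sum(posterior)
    return sum(t * w for t, w in zip(grid, posterior)) / total

def next_item(theta, item_bank, administered):
    """Pick the unused item carrying the most information at theta."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *item_bank[i]))
```

In an adaptive session, the assessment platform would alternate `next_item` and `eap_estimate` after each response, so each test-taker receives the items that are most informative about their own ability level rather than a fixed sequence.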

Given this paradigm shift, it seems reasonable to foster a coexistence between classic cognitive assessment tools and CCA, so as to always offer the option that best fits the circumstances of each evaluation session, evaluator, and person being assessed. The clearest demonstration of the need to implement CCA as a regular tool, and of how classical tests can become inoperative, was unintentionally provided by the lockdowns resulting from the SARS-CoV-2 health crisis (Larner, 2021). Many health and education professionals, as well as researchers who needed to assess the cognitive status of the people they serve, had to either pause their activity or migrate to new technologies, because the classic cognitive assessment tools they used to apply were no longer suited to the needs of the situation. In fact, these circumstances changed the modality of assessments, but not the frequency with which they are used (Webb et al., 2021). With this in mind, a selective literature review of the status and needs in the field of cognitive assessment has been conducted, and the pros and cons of transitioning to CCAs have been analyzed.

3. The advantages of CCA

Numerous studies address the advantages of computerized cognitive assessments over classic pencil-and-paper tests (Sternin et al., 2019). The most basic advantage is familiarity with the delivery device, since digital technology is deeply integrated into our lives. Most people not only use computers, tablets, and smartphones daily, but have also mastered these devices.

Digital platforms are also well known for their accessibility, and the possibility of modifying the contrast, the size or color of the text, the volume, or the response requirements for people with reduced mobility or motor difficulties is a clear-cut advantage of CCA. Adapting the test to the specific needs of the assessed person is straightforward in CCA, including changing the language when it is necessary to serve a student, patient, or user of another language (see Frances et al., 2020, for a discussion of the impact of this factor). Since COVID-19, ubiquity has taken on an important role, and the need to assess people remotely has skyrocketed (Larner, 2021). Along these lines, several studies point to the feasibility of remote cognitive assessments (Settle et al., 2015; Geddes et al., 2020; Zeghari et al., 2021).

Usability and user experience can also be taken as a clear advantage of CCA over traditional assessments. Interactive activities, automatic visual and auditory feedback, and colorful stimuli result in assessments that are more engaging, motivating, and user-friendly for the test-taker and easier for the evaluator (Soto-Ruiz, 2020). Importantly, CCAs do not have to lose psychometric value because of their customizable digital format. In fact, psychometric characteristics such as discriminant validity or test–retest reliability of CCAs are usually well documented, as in the case of conventional paper-and-pencil resources (Zygouris and Tsolaki, 2014). Moreover, the ability of the device itself to record reaction times, to identify user responses with extreme accuracy, to provide the same feedback to all users equally, to deliver instant results, and to support the incorporation of more sophisticated techniques such as eye movement detection are among the major advantages of CCA.
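As a minimal sketch of the reaction-time recording mentioned above, a digital test can timestamp each trial with a monotonic, high-resolution clock. The trial structure and function names below are illustrative assumptions, not the implementation of any specific battery.

```python
import time

def timed_trial(present_stimulus, collect_response):
    """Run one trial: show a stimulus, then measure response latency in
    milliseconds using a monotonic high-resolution clock, which is
    unaffected by wall-clock adjustments (unlike time.time())."""
    present_stimulus()
    t0 = time.perf_counter()
    answer = collect_response()
    rt_ms = (time.perf_counter() - t0) * 1000.0
    return answer, rt_ms
```

This millisecond-level precision is something a stopwatch-wielding evaluator applying a paper test cannot match, and it is what makes device-recorded reaction times usable as a measurement in their own right.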

Another advantage that may not be as obvious, but that stands as a critical one, is the updateable nature of the norms. If test distributors regularly update the reference norms against which scores are interpreted, there is no need to buy new versions of a test to avoid comparing test-takers against outdated scales. This is a relatively natural process for CCA, given that the update may take place on a server without imposing any burden on the evaluator. In contrast, any update to the reference norms of a paper-and-pencil test may require renewal purchases on the side of the evaluator.
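As a sketch of why server-side norm updates are cheap, interpreting a raw score against the current norms reduces to a standardization step. The normative values used below are invented for illustration, under the simplifying assumption that scores are approximately normally distributed.

```python
import math

def standardize(raw_score, norm_mean, norm_sd):
    """Convert a raw test score into a z-score and a percentile under
    the currently published normative mean and standard deviation,
    assuming an approximately normal score distribution."""
    z = (raw_score - norm_mean) / norm_sd
    percentile = 50.0 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return z, percentile
```

When the distributor publishes updated normative values (say, a mean of 54 instead of 50), every subsequent administration is immediately interpreted against them, with no action required on the evaluator's side; a paper test would instead need a reprinted manual.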

4. On the way to the transition

What keeps professionals attached to analog cognitive assessments? CCAs have some limitations that cannot be ignored and that will make it preferable on certain occasions to use analog tools instead of digital ones. There are still people who are digitally illiterate or who have low technological proficiency, and they may manage as well or better with paper and pencil. This may be particularly important in certain pathologies, and as Witt et al. (2013) pointed out, CCA cannot completely replace a comprehensive paper-and-pencil neuropsychological assessment in certain circumstances.

However, one of the main problems preventing professionals from making the transition is more contextual and cultural. On the one hand, professionals are habituated to using analog screenings, tests, and cognitive assessment batteries. They have been trained with them, have applied them hundreds of times, and any change to a digital tool can be cumbersome. On the other hand, the prior investment of a professional needs to be considered. If professionals have spent large sums of money on analog assessment batteries, they can be expected to be reluctant to change until that investment has been recouped.

Moreover, it should be considered that access to the Internet is usually a mandatory technical requirement for CCA. While there is typically a fast and stable Internet connection in urban spaces, there are places where the connection does not allow for assessments that require heavy or agile data transfer (Gerli and Whalley, 2021). The same happens with access to electricity, which is not always stable or available, limiting access to this type of test. In addition, there is another critical aspect related to the specific testing conditions that must be taken into account when assessing remotely. Most tests are intended to be applied in laboratory-like clinical settings, with good control of potentially interfering environment-related factors (Robillard et al., 2018). When a person performs the assessment from home without the guidance of a professional, the chances increase that uncontrolled environmental conditions could interfere with the process (e.g., distractions, facilitations, or other elements that may affect the validity of the assessment). For this reason, a correct CCA necessarily implies advising on the need to prepare and secure the environment and conditions in which the assessment is to be performed, and implementing mechanisms to assess the validity of the data.

In a broader sense, it should also be noted that not all tests can be directly translated into a digital format, and that some, if not all, will require adaptations or modifications. In these circumstances, it becomes especially important to ensure the psychometric qualities of CCA (Gates and Kochan, 2015) and to develop adequate quality assurance tests. Furthermore, all modifications need to be endorsed by scientific evidence, similar to the procedure currently being developed to adapt classic laboratory 2D assessments to 3D virtual reality assessments (see Rocabado and Duñabeitia, 2022).

5. Discussion: implications of these changes

Health measures such as the lockdowns adopted during the COVID-19 pandemic have made the importance of CCA clear (Hsu et al., 2021). Nowadays digital tools are integrated into the daily life of professional and non-professional individuals at schools, healthcare centers, workplaces, and social contexts (Tomasik et al., 2020; Nordin et al., 2021; Paganin and Simbula, 2021). They are well-known and familiar tools for most users. The design and implementation of CCA based on these widespread digital tools could offer a better service to students, patients, or research participants, making the activities more attractive and dynamic, individualizing the tests, and updating the reference standards.

CCA is increasingly integrated into the daily lives of professionals and has a great and growing potential. Without sacrificing the quality of the tests, CCA allows for adapting assessment characteristics to the specific needs of the test-takers, increasing the accuracy of the measurements, assessing remotely on common devices available to a wide portion of society, and giving immediate and motivating feedback. However, CCA also has human limitations, such as users who are not proficient with digital tools or professionals who want to amortize the investment made in classic tests. Besides, certain environmental conditions may limit the degree of generalization of CCA, such as the need to have access to electricity and connectivity or the relative lack of control over the assessment environment. Thus, proper adaptations to digital format with good psychometric characteristics are essential to spread the use of CCA. Nevertheless, the transition to digital assessments is an inevitable evolution.

The debate is not about using only digitized tools or only classic tools, but about whether we are adapting classic paper-and-pencil tools to the context in which we live at the right speed. Classic assessment tools will and should continue to be used for some time or, perhaps, always. But the parallel migration to digital tools is indisputable, and it must be carried out thoroughly to facilitate professionals’ work should we again face adverse circumstances in which in-person paper-and-pencil assessments cannot be used.

Funding

This research has been partially funded by grant PID2021-126884NB-I00 from the Spanish Government. The APC was funded by the publication fund of UiT, The Arctic University of Norway.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Statements

Data availability statement

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.

Author contributions

DA and JD contributed to the conceptualization. DA wrote the original draft. JD reviewed, edited the text, and supervised the project. All authors contributed to the article and approved the submitted version.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

1. Bertelli, M. O., Cooper, S. A., and Salvador-Carulla, L. (2018). Intelligence and specific cognitive functions in intellectual disability. Curr. Opin. Psychiatry 31, 88–95. doi: 10.1097/yco.0000000000000387

2. Choi, S. H. (2008). Cognitive assessment in traumatic brain injury. Brain Neurorehabil. 1:148. doi: 10.12786/bn.2008.1.2.148

3. Corsi, P. M. (1972). Human memory and the medial temporal region of the brain. PhD thesis. Montreal, QC: McGill University.

4. Daroische, R., Hemminghyth, M. S., Eilertsen, T. H., Breitve, M. H., and Chwiszczuk, L. J. (2021). Cognitive impairment after COVID-19—a review on objective test data. Front. Neurol. 12:699582. doi: 10.3389/fneur.2021.699582

5. Davis, D. H., Creavin, S. T., Yip, J. L., Noel-Storr, A. H., Brayne, C., and Cullum, S. (2015). Montreal Cognitive Assessment for the diagnosis of Alzheimer’s disease and other dementias. Cochrane Database Syst. Rev. 2015:CD010775. doi: 10.1002/14651858.cd010775.pub2

6. Demiris, G., Rantz, M. J., Aud, M. A., Marek, K. D., Tyrer, H. W., Skubic, M., et al. (2004). Older adults’ attitudes towards and perceptions of ‘smart home’ technologies: a pilot study. Med. Inform. Internet Med. 29, 87–94. doi: 10.1080/14639230410001684387

7. Folstein, M., Folstein, S. E., and McHugh, P. R. (1975). “Mini-mental state”: a practical method for grading the cognitive state of patients for the clinician. J. Psychiatr. Res. 12, 189–198. doi: 10.1016/0022-3956(75)90026-6

8. Frances, C., Pueyo, S., Anaya, V., and Duñabeitia, J. A. (2020). Interpreting foreign smiles: language context and type of scale in the assessment of perceived happiness and sadness. Psicológica 41, 21–38. doi: 10.2478/psicolj-2020-0002

9. Gates, N. J., and Kochan, N. A. (2015). Computerized and on-line neuropsychological testing for late-life cognition and neurocognitive disorders. Curr. Opin. Psychiatry 28, 165–172. doi: 10.1097/yco.0000000000000141

10. Geddes, M. R., O’Connell, M. E., Fisk, J. D., Gauthier, S., Camicioli, R., and Ismail, Z. (2020). Remote cognitive and behavioral assessment: report of the Alzheimer Society of Canada task force on dementia care best practices for COVID-19. Alzheimer’s Dement. Diagn. Assess. Dis. Monit. 12:e12111. doi: 10.1002/dad2.12111

11. Gerli, P., and Whalley, J. (2021). Fibre to the countryside: a comparison of public and community initiatives tackling the rural digital divide in the UK. Telecommun. Policy 45:102222. doi: 10.1016/j.telpol.2021.102222

12. Haimov, I., Hanuka, E., and Horowitz, Y. (2008). Chronic insomnia and cognitive functioning among older adults. Behav. Sleep Med. 6, 32–54. doi: 10.1080/15402000701796080

13. Hannukkala, M. S., Mikkonen, K., Laitinen, E., and Tuononen, T. (2020). Staying on the digitalized trail. Health Technol. 10, 1257–1263. doi: 10.1007/s12553-020-00425-6

14. Heaton, R. K. (1981). A Manual for the Wisconsin Card Sorting Test. Melton South, VIC: Western Psychological Services.

15. Hsu, N., Monasterio, E., and Rolin, O. (2021). Telehealth in pediatric rehabilitation. Phys. Med. Rehabil. Clin. N. Am. 32, 307–317. doi: 10.1016/j.pmr.2020.12.010

16. Huebner, A. (2010). An overview of recent developments in cognitive diagnostic computer adaptive assessments. Pract. Assess. Res. Eval. 15:3. doi: 10.7275/7fdd-6897

17. Kaplan, E., Goodglass, H., and Weintraub, S. (1983). Boston Naming Test. Philadelphia, PA: Lea & Febiger.

18. Khalil, M., Mansour, M. A., and Wilhite, D. R. (2010). Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies. J. Vet. Med. Educ. 37, 353–357. doi: 10.3138/jvme.37.4.353

19. Kim, M. J., Kim, W. G., Kim, J. M., and Kim, C. (2016). Does knowledge matter to seniors’ usage of mobile devices? Focusing on motivation and attachment. Int. J. Contemp. Hosp. Manag. 28, 1702–1727. doi: 10.1108/ijchm-01-2015-0031

20. Kim, M. J., and Preis, M. W. (2015). Why seniors use mobile devices: applying an extended model of goal-directed behavior. J. Travel Tour. Mark. 33, 404–423. doi: 10.1080/10548408.2015.1064058

21. Larner, A. J. (2021). Cognitive testing in the COVID-19 era: can existing screeners be adapted for telephone use? Neurodegener. Dis. Manag. 11, 77–82. doi: 10.2217/nmt-2020-0040

22. Lee, S. M., Seo, H. A., and Han, H. J. (2014). Use of smart devices of infants and preschool-children and their mothers’ perceptions. J. Child Care Educ. 10, 111–131. doi: 10.14698/jkcce.2014.10.2.111

23. Mead, A. D., and Drasgow, F. (1993). Equivalence of computerized and paper-and-pencil cognitive ability tests: a meta-analysis. Psychol. Bull. 114, 449–458. doi: 10.1037/0033-2909.114.3.449

24. Nasreddine, Z. S., Phillips, N. A., Bédirian, V., Charbonneau, S., Whitehead, V., Collin, I., et al. (2005). The Montreal Cognitive Assessment, MoCA: a brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 53, 695–699. doi: 10.1111/j.1532-5415.2005.53221.x

25. Nordin, S., Sturge, J., Ayoub, M., Jones, A., McKee, K., Dahlberg, L., et al. (2021). The role of information and communication technology (ICT) for older adults’ decision-making related to health, and health and social care services in daily life—a scoping review. Int. J. Environ. Res. Public Health 19:151. doi: 10.3390/ijerph19010151

26. Paganin, G., and Simbula, S. (2021). New technologies in the workplace: can personal and organizational variables affect the employees’ intention to use a work-stress management app? Int. J. Environ. Res. Public Health 18:9366. doi: 10.3390/ijerph18179366

27. Raven, J. C. (1938). Standard Progressive Matrices: A Perceptual Test of Intelligence. London, UK: H. K. Lewis.

28. Reitan, R. M. (1955). The relation of the trail making test to organic brain damage. J. Consult. Psychol. 19, 393–394. doi: 10.1037/h0044509

29. Rey, A. (1941). L’examen psychologique dans les cas d’encéphalopathie traumatique. Arch. Psychol. 28, 286–340.

30. Robillard, J. M., Lai, J., Wu, J., Feng, T. L., and Hayden, S. (2018). Patient perspectives of the experience of a computerized cognitive assessment in a clinical setting. Alzheimer’s Dement. Transl. Res. Clin. Interv. 4, 297–303. doi: 10.1016/j.trci.2018.06.003

31. Rocabado, F., and Duñabeitia, J. A. (2022). Assessing inhibitory control in the real world is virtually possible: a virtual reality demonstration. Behav. Sci. 12:444. doi: 10.3390/bs12110444

32. Sandberg, M. A. (2011). “Cambridge Neuropsychological Testing Automated Battery,” in Encyclopedia of Clinical Neuropsychology. eds. J. S. Kreutzer, J. DeLuca, and B. Caplan. New York, NY: Springer.

33. Schmidt, M. (1996). Rey Auditory Verbal Learning Test: A Handbook. Los Angeles, CA: Western Psychological Services.

34. Settle, J. R., Robinson, S. A., Kane, R., Maloni, H. W., and Wallin, M. T. (2015). Remote cognitive assessments for patients with multiple sclerosis: a feasibility study. Mult. Scler. 21, 1072–1079. doi: 10.1177/1352458514559296

35. Soto-Ruiz, K. M. (2020). Digital neurocognitive testing. Biomarkers Trauma. Brain Inj., 355–365. doi: 10.1016/b978-0-12-816346-7.00024-5

36. Sternin, A., Burns, A., and Owen, A. M. (2019). Thirty-five years of computerized cognitive assessment of aging—where are we now? Diagnostics 9:114. doi: 10.3390/diagnostics9030114

37. Stroop, J. R. (1935). Studies of interference in serial verbal reactions. J. Exp. Psychol. 18, 643–662. doi: 10.1037/h0054651

38. Sweller, J. (1988). Cognitive load during problem solving: effects on learning. Cogn. Sci. 12, 257–285. doi: 10.1207/s15516709cog1202_4

39. Taylor, T. R. (1994). A review of three approaches to cognitive assessment, and a proposed integrated approach based on a unifying theoretical framework. S. Afr. J. Psychol. 24, 184–193. doi: 10.1177/008124639402400403

40. Tomasik, M. J., Helbling, L. A., and Moser, U. (2020). Educational gains of in-person vs. distance learning in primary and secondary schools: a natural experiment during the COVID-19 pandemic school closures in Switzerland. Int. J. Psychol. 56, 566–576. doi: 10.1002/ijop.12728

41. Webb, S. S., Kontou, E., and Demeyere, N. (2021). The COVID-19 pandemic altered the modality, but not the frequency, of formal cognitive assessment. Disabil. Rehabil. 44, 6365–6373. doi: 10.1080/09638288.2021.1963855

42. Wechsler, D. (2008). Wechsler Adult Intelligence Scale. 4th ed. San Antonio, TX: Pearson/PsychCorp.

43. Wechsler, D. (2014). Wechsler Intelligence Scale for Children. 5th ed. San Antonio, TX: Pearson/PsychCorp.

44. White, A. J., Batchelor, J., Pulman, S., and Howard, D. (2012). The role of cognitive assessment in determining fitness to stand trial. Int. J. Forensic Ment. Health 11, 102–109. doi: 10.1080/14999013.2012.688091

45. Wild, K., Howieson, D., Webbe, F., Seelye, A., and Kaye, J. (2008). Status of computerized cognitive testing in aging: a systematic review. Alzheimers Dement. 4, 428–437. doi: 10.1016/j.jalz.2008.07.003

46. Witt, J. A., Alpherts, W., and Helmstaedter, C. (2013). Computerized neuropsychological testing in epilepsy: overview of available tools. Seizure 22, 416–423. doi: 10.1016/j.seizure.2013.04.004

47. Wojcik, E. H., Prasad, A., Hutchinson, S. P., and Shen, K. (2022). Children prefer to learn from smart devices, but do not trust them more than humans. Int. J. Child Comput. Interact. 32:100406. doi: 10.1016/j.ijcci.2021.100406

48. Zeghari, R., Guerchouche, R., Tran Duc, M., Bremond, F., Lemoine, M. P., Bultingaire, V., et al. (2021). Pilot study to assess the feasibility of a mobile unit for remote cognitive screening of isolated elderly in rural areas. Int. J. Environ. Res. Public Health 18:6108. doi: 10.3390/ijerph18116108

49. Zygouris, S., and Tsolaki, M. (2014). Computerized cognitive testing for older adults. Am. J. Alzheimers Dis. Other Dement. 30, 13–28. doi: 10.1177/1533317514522852


Keywords

cognitive assessment, cognition, digital tools, computerized cognitive assessment, paper and pencil test

Citation

Asensio D and Duñabeitia JA (2023) The necessary, albeit belated, transition to computerized cognitive assessment. Front. Psychol. 14:1160554. doi: 10.3389/fpsyg.2023.1160554

Received

07 February 2023

Accepted

06 April 2023

Published

24 April 2023


Edited by

Manpreet Kaur Bagga, Partap College of Education, India

Reviewed by

Maura Pilotti, Prince Mohammad bin Fahd University, Saudi Arabia; Natanael Karjanto, Sungkyunkwan University, Republic of Korea; Luis Carlos Jaume, University of Buenos Aires, Argentina; Rozel Balmores-Paulino, University of the Philippines Baguio, Philippines; Murat Tezer, Near East University, Cyprus


Copyright

*Correspondence: Jon Andoni Duñabeitia,

