
OPINION article

Front. Psychol., 21 November 2025

Sec. Health Psychology

Volume 16 - 2025 | https://doi.org/10.3389/fpsyg.2025.1699320

Cognitive offloading or cognitive overload? How AI alters the mental architecture of coping


Ginto Chirayath1, K. Premamalini1 and Jeena Joseph2*
  • 1Department of Social Work, Bishop Appasamy College of Arts and Science, Coimbatore, Tamil Nadu, India
  • 2Department of Computer Applications, Marian College Kuttikkanam Autonomous, Kuttikkanam, Kerala, India

Introduction

Artificial intelligence (AI) has moved from being a specialized technological tool to an intimate presence in everyday life. Smart assistants organize our schedules, predictive systems anticipate our needs, and therapeutic chatbots promise to listen when no human is available (Zhang and Wang, 2024). The diffusion of AI into mental health care is often framed in highly optimistic terms: technologies that reduce stigma, democratize access, and provide affordable, always-on support (Shankar Ganesh and Venkateswaramurthy, 2025; Sivasubramanian Balasubramanian, 2023). From meditation applications and emotion-tracking wearables to conversational agents offering cognitive-behavioral interventions, AI is marketed as an efficient and reliable partner in the pursuit of wellbeing (Balcombe, 2023).

Beneath this enthusiasm lies a paradox. On one hand, AI enables cognitive offloading: the use of external aids to reduce mental effort and conserve resources for more meaningful activities (Grinschgl and Neubauer, 2022; Risko and Gilbert, 2016). This process can support adaptive coping, helping individuals regulate stress and sustain mental health. On the other hand, the same tools may create cognitive overload: an erosion of introspection, over-reliance on algorithmic feedback, and anxiety induced by hyper-monitoring and optimization (Grinschgl et al., 2021; Skulmowski, 2023). The question is not whether AI is good or bad for mental health but how it is reshaping the very architecture of coping (Gerlich, 2025).

This article examines the psychology of this paradox. First, it situates coping within mainstream psychological theory. It then considers the promise of AI to lighten burdens through cognitive offloading, followed by the risks of cognitive overload and the theoretical perspectives that illuminate this tension. Through practical illustrations, it demonstrates the double-sided role of AI as a coping partner on the one hand and a destabilizer on the other. Finally, it identifies principles for design, clinical practice, and policy needed to ensure that AI complements, rather than undermines, human resilience.

Coping in psychological science

Coping is central to psychological health. It encompasses the thoughts, behaviors, and emotional strategies individuals use to manage stressful events, regulate affect, and adapt to changing environments (Marroquín et al., 2017). Coping strategies have been categorized in multiple ways:

• Problem-focused coping, which seeks to address the stressor directly through actions such as planning or problem-solving (Gruber and Martic Biocina, 2024).

• Emotion-focused coping, which aims to regulate emotional responses through reappraisal, distraction, or relaxation (Oostvogels et al., 2018; Trudel-Fitzgerald et al., 2024).

• Avoidant coping, which minimizes distress through withdrawal or denial, sometimes maladaptive in the long term (Lin, 2022; Marr et al., 2022).

Coping is not a static trait but a dynamic process shaped by situational demands, personal resources, and social contexts (Depoorter et al., 2025). Historically, coping has always been supported by external aids. Religious rituals, cultural practices, and communal gatherings provided frameworks for resilience (Butler et al., 2025; Whitehead and Bergeman, 2020). Personal tools such as diaries, letters, or calendars extended introspection and memory, allowing individuals to regulate emotions and track personal growth.

AI represents a profound shift in this lineage. Unlike static tools or human communities, AI systems are adaptive, predictive, and personalized. They do not merely offer neutral support; they actively reshape the environment in which coping occurs (Shi, 2025). A diary records what one chooses to write; a mood-tracking app interprets and quantifies feelings, often presenting its interpretation as more authoritative than subjective experience (Brandsema, 2023; Yang and Zhao, 2024). This transformation raises new questions. How does the delegation of introspection to machines affect self-awareness? Does reliance on algorithmic recommendations weaken intrinsic coping mechanisms? And can AI be designed to scaffold rather than substitute the human capacity for resilience?

The promise of cognitive offloading

Cognitive offloading is a well-established phenomenon in cognitive science. It refers to the delegation of cognitive tasks to external resources in order to reduce mental demand. People use notebooks, calculators, and smartphones to extend memory and problem-solving capacities. AI magnifies this process dramatically, providing not only storage but active analysis and prediction (León-Domínguez, 2024).

• AI in Mental Health Tracking: Mental health applications now integrate biometric sensors and self-reporting tools to track sleep, exercise, and mood. By aggregating data and presenting trends, these apps reduce the burden of self-monitoring (Bakker and Rickard, 2018). Instead of recalling fluctuations over weeks, users can visualize emotional patterns instantly. Such tools facilitate problem-focused coping, allowing individuals to identify triggers, monitor progress, and adjust behaviors accordingly (Blease and Torous, 2023; Lopes et al., 2024).

• Therapeutic Chatbots: AI-driven conversational agents, such as Woebot and Wysa, deliver micro-interventions based on cognitive-behavioral therapy. They can prompt reappraisal, encourage behavioral activation, and provide coping strategies in real time (Beatty et al., 2022). For individuals facing barriers to traditional therapy—such as cost, stigma, or geographic isolation—these tools lower thresholds to care. They exemplify emotion-focused coping, offering comfort and strategies at the moment they are needed (Coghlan et al., 2023; Inkster et al., 2018).

• Guided Reflection: AI-guided meditation programs and journaling assistants scaffold introspection. By generating prompts, suggesting breathing exercises, or helping articulate feelings, they encourage engagement with practices that might otherwise feel overwhelming. They reduce decision fatigue, making coping rituals easier to adopt and sustain (Park et al., 2024).

• Benefits for Coping: In these ways, AI promotes adaptive coping by:

○ Minimizing cognitive load associated with self-monitoring.

○ Reducing decision fatigue in stressful moments.

○ Providing immediate, context-sensitive interventions.

○ Enhancing self-awareness through data visualization.

Viewed positively, AI can function as a resilience amplifier. By offloading mundane or effortful processes, it frees mental resources for growth, creativity, and deeper social engagement.
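To make the data aggregation described in the tracking example above more concrete, the short sketch below shows how a hypothetical mood-tracking tool might condense daily self-reports into weekly averages and flag contexts associated with lower mood. It is a minimal illustration using made-up data and an assumed entry format; it does not describe the implementation of any application cited in this article.

```python
# Illustrative sketch only: a hypothetical mood-tracking aggregation,
# not the implementation of any specific app discussed in the article.
from statistics import mean
from collections import defaultdict

# Hypothetical daily self-reports: (ISO week number, mood rating 1-10, context tag)
entries = [
    (36, 6, "work"), (36, 5, "work"), (36, 7, "rest"),
    (37, 4, "work"), (37, 3, "work"), (37, 6, "rest"),
]

# Aggregate by week so the user sees a trend instead of recalling weeks of detail.
by_week = defaultdict(list)
for week, mood, _tag in entries:
    by_week[week].append(mood)

for week in sorted(by_week):
    avg = mean(by_week[week])
    bar = "#" * round(avg)  # crude text visualization of the weekly average
    print(f"week {week}: {avg:.1f} {bar}")

# Flag possible triggers: context tags whose average mood falls below the overall mean.
overall = mean(m for _, m, _ in entries)
by_tag = defaultdict(list)
for _, mood, tag in entries:
    by_tag[tag].append(mood)
flagged = [tag for tag, moods in by_tag.items() if mean(moods) < overall]
print("possible triggers:", flagged)
```

Even in this toy form, the sketch captures the offloading benefit described above: the user no longer has to recall weeks of fluctuation, because the tool summarizes the pattern and points to a candidate trigger.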

The perils of cognitive overload

The promise of offloading is shadowed by the risks of overload. When reliance on external aids discourages intrinsic engagement, coping may be weakened rather than strengthened.

• Erosion of Introspection: One risk is the erosion of introspection. Traditional coping often depends on reflective practices such as journaling, meditation, or conversation. When mood-tracking applications or predictive systems dictate interpretations of feelings, individuals may defer to the machine's account over their own experience (Brand et al., 2023). Complex emotions are reduced to numerical scores—“stress index: 75%”—flattening nuance and discouraging self-discovery.

• Outsourcing Resilience: A second risk is the outsourcing of resilience. If every moment of distress is met with algorithmic suggestions—“take three deep breaths,” “reframe your thought”—individuals may lose opportunities to cultivate independent strategies. Over time, this reliance may erode psychological immunity, leaving people ill-equipped to manage stress in contexts where technology is absent or unavailable (Gilboa and Nahum, 2022).

• Anxiety from Hyper-Monitoring: Finally, hyper-monitoring may itself generate anxiety. Wearable devices that continuously provide biometric feedback can create pressure to optimize every aspect of life. A poor “sleep score” may undermine subjective feelings of rest, while constant reminders to reduce stress can paradoxically heighten it. Instead of relieving burdens, AI may impose new demands that exacerbate distress (Inamdar et al., 2025).

These dynamics illustrate that AI is not a neutral partner but an active shaper of coping. Its interventions may scaffold resilience or foster dependence, depending on how they are designed and used.

Theoretical perspectives

Several psychological frameworks help explain this paradox.

• Cognitive Load Theory: Cognitive Load Theory distinguishes extraneous load (unnecessary effort), intrinsic load (the inherent difficulty of a task), and germane load (effort devoted to learning). Reducing extraneous load is beneficial, but technology that reduces germane load undermines deeper engagement. Extended to coping with stress, AI may alleviate immediate burdens while eroding the potential for long-term resilience building (Evans et al., 2024; Sultanova, 2025).

• Self-Determination Theory: Self-Determination Theory posits that autonomy, competence, and relatedness are basic psychological needs (Evans et al., 2024). AI may enhance competence by offering strategies but threaten autonomy by making choices on behalf of users. It may also compromise relatedness if machine interactions displace human relationships.

• Resilience Frameworks: Resilience research emphasizes both external supports and internal capacities. External aids are valuable but cannot substitute for the cultivation of intrinsic coping skills (Brockbank and Feldon, 2024). Overuse of AI may act as a crutch, preventing the strengthening of these internal resources.

• Social Cognition and Bias: Finally, research on social cognition shows that external interpretations shape self-perception. If individuals consistently rely on algorithmic interpretations of their emotions, they may internalize biased or limiting narratives (Le Cunff et al., 2025). For example, a system that over-detects anxiety may reinforce an anxious self-concept, even when variability exists.

Integrating the theoretical frameworks: toward a unified model of AI-coping dynamics

Taken together, these four psychological frameworks form an interconnected system that explains how AI reshapes the mental architecture of coping. Cognitive Load Theory offers the foundational mechanism, describing how AI redistributes mental effort by externalizing cognitive tasks. This redistribution directly influences motivational dynamics described by Self-Determination Theory, as reductions in cognitive demand enhance coping only when individuals retain a sense of autonomy, competence, and relatedness. When these motivational needs are met, the process strengthens the internal capacities emphasized by Resilience Frameworks—promoting flexibility, persistence, and adaptive recovery over time. Conversely, if offloading undermines autonomy or fosters overreliance, resilience may deteriorate rather than grow. Overlaying these layers, Social Cognition and Bias Theory captures the reflective loop through which users internalize algorithmic interpretations of emotion and self, shaping how they perceive and evaluate their coping abilities. In this integrated view, AI's influence follows a coherent psychological sequence: cognitive redistribution affects motivation, motivation determines resilience outcomes, and feedback from algorithmic interpretation continually reshapes self-concept. This synthesis clarifies the theoretical thread connecting the frameworks and positions them not as parallel explanations but as interacting components of a unified model of AI-mediated coping.

AI as a double-edged coping partner

The paradox of cognitive offloading vs. cognitive overload highlights the dual nature of AI as a partner in coping. Unlike static tools such as a notebook or calendar, AI is dynamic: it adapts, predicts, and responds in ways that make it both supportive and potentially disruptive (Grinschgl and Neubauer, 2022). Its role in mental health is therefore best conceived not in binary terms but as a continuum ranging from healthy offloading to risky overload.

At one end of this continuum lies healthy offloading. In this mode, AI technologies function as scaffolds—temporary supports that facilitate reflection, encourage adaptive strategies, and promote personal growth. For instance, an app that prompts users to pause and identify their emotions at stressful moments does not replace emotional regulation but nudges individuals toward engaging with their own cognitive and affective processes. Similarly, a chatbot that guides users through a breathing exercise can reduce acute anxiety while simultaneously teaching a transferable skill (Yang et al., 2025). In these cases, the technology acts like training wheels, helping individuals stabilize until they are capable of maintaining balance on their own (Grinschgl and Neubauer, 2022).

At the opposite end of the continuum lies risky overload. Here, AI tools begin to dictate experiences rather than facilitate them. Instead of scaffolding reflection, they may prescribe emotions (“you are stressed”), deliver rigid coping instructions without space for adaptation, or condition users to rely on external prompts rather than developing internal regulation strategies. For example, if a mood-tracking device consistently tells a user that they are “anxious” based on biometric indicators, the individual may come to trust the algorithm's interpretation over their own felt experience. This risks undermining self-awareness and fostering dependence on the device (Gerlich, 2025). In such contexts, AI ceases to be a supportive partner and becomes a substitute for coping itself, replacing human agency with algorithmic authority.

The critical distinction, then, is whether AI operates as a scaffold or a substitute. Scaffolding is characterized by temporariness, adaptability, and empowerment: the goal is to strengthen internal capacities so that the technology becomes progressively less necessary (Britten-Neish, 2025). A meditation app that gradually reduces guided instructions, encouraging independent practice, exemplifies scaffolding. Substitution, by contrast, is characterized by permanence and dependency: the technology assumes responsibility for regulation in ways that diminish intrinsic skills. An AI system that requires continuous engagement for relief, without cultivating transferable strategies, exemplifies substitution (Diaz Alfaro et al., 2024).

Importantly, whether a given technology scaffolds or substitutes depends less on its technical sophistication than on its design philosophy, integration context, and patterns of use. Designers can encourage scaffolding by building features that promote reflection, variability, and user agency (Lahlou, 2025). Clinicians can integrate AI in ways that complement, rather than replace, therapeutic relationships. Users themselves can engage critically, treating AI as a tool rather than a determinant of their psychological states.

Understanding AI as a double-edged coping partner shifts the focus from whether AI is “good” or “bad” for mental health to how it is designed, deployed, and experienced. It invites researchers, clinicians, and policymakers to ask: Does this technology empower individuals to cope more effectively, or does it cope on their behalf? The answer to this question will determine whether AI becomes an ally in resilience or an architect of dependency.

Real-world illustrations

The paradox is visible in multiple real-world contexts. Meditation applications such as Calm and Headspace provide structured routines that make mindfulness accessible (Huberty et al., 2020; Taylor et al., 2022). Yet dependence on guided sessions can leave individuals unable to practice independently, undermining self-sufficiency. Therapeutic chatbots offer another example. Evidence suggests that Woebot can reduce symptoms of depression and anxiety in young adults (Li et al., 2025). Yet prolonged use risks cultivating “pseudo-intimacy,” where trust is invested in a machine rather than fostered in authentic human relationships (Wasil et al., 2022). Wearable devices illustrate the risks of hyper-monitoring. Smartwatches that prompt stress reduction exercises can help prevent escalation. Yet constant biometric feedback may trigger obsessive self-monitoring, leading to heightened anxiety and detachment from subjective experience (Clarke and Draper, 2020). These cases show that AI can scaffold coping or create new vulnerabilities, depending on usage and context.

While the examples above illustrate how AI tools shape coping, their comparative analysis also reveals a systematic pattern across technologies. Meditation applications primarily reduce extraneous cognitive load, fostering short-term calm but risking dependence if self-regulation does not internalize. Therapeutic chatbots extend this mechanism into emotion-focused coping, enhancing accessibility and competence yet potentially weakening relatedness when human empathy is replaced by simulated interaction. Wearables, in turn, externalize physiological awareness, offering early warning and data-driven control but often amplifying anxiety through continuous self-surveillance. Viewed through the integrated theoretical model, these cases collectively demonstrate that AI-mediated coping outcomes depend on the balance between cognitive relief and motivational autonomy. When technology scaffolds reflection and supports agency, it strengthens resilience; when it substitutes intrinsic effort or distorts self-perception, it contributes to cognitive overload. Thus, the real-world evidence aligns with the theoretical continuum proposed in this paper, grounding the discussion in systematic, cross-case reasoning rather than isolated illustrations.

Social and ethical implications

Beyond individual psychology, AI's effects on coping are also social and ethical. On the positive side, AI lowers access barriers to mental health resources: automated tools reduce the stigma of help-seeking through anonymity and extend reach where professional resources are scarce (Li et al., 2025). Conversely, AI risks normalizing a culture of self-surveillance. Distress becomes individualized into data points rather than expressed within a community, and the social dimension of coping (support from family, friends, and cultural rituals) may be crowded out by individualized digital management.

Algorithmic bias compounds inequities. Mental health AI trained on narrow datasets may misinterpret expressions of distress across cultures, genders, or age groups. For example, emotional expression varies widely across societies (Chen, 2025). If AI tools fail to recognize this diversity, they may deliver interventions that are ineffective or even harmful for marginalized populations. Finally, the privatization of coping risks weakening collective resilience. Communities have historically sustained resilience through shared rituals, storytelling, and mutual care (Jackson, 2020). AI interactions, by isolating coping within individualized digital exchanges, may erode these communal resources.

Pathways forward

To reduce these risks, strategies are needed at multiple levels. Designers should favor scaffolding over substitution: useful features include reflective prompts rather than directives, built-in pauses that create space for independent reflection, and transparency about how recommendations are generated (Martinez-Martin, 2021). Training datasets must be diverse enough to reduce the risk of bias. Clinicians should integrate AI within stepped-care models, in which technology provides low-intensity support and escalates to human intervention as complexity increases (Hoose and Králiková, 2024). Training for mental health professionals should emphasize critical engagement with AI outputs rather than uncritical acceptance. Policy safeguards must protect privacy, accountability, and inclusivity; regulation should require transparent data-protection procedures and the testing of AI systems for fairness and efficacy (Elendu et al., 2023). Finally, individuals need to learn to use AI with intention. Treating technology as a partner rather than a replacement preserves agency, setting aside "AI-free" reflection time sustains intrinsic introspection, and engaging in practices such as journaling, mindfulness, and relational conversation ensures that AI complements rather than replaces human coping (Balcombe, 2023).

Conclusion

AI is reshaping not only how people live but also how they cope. The paradox of cognitive offloading vs. cognitive overload captures both the promise and the peril of this transformation. On one hand, AI can democratize access, reduce stigma, and scaffold resilience. On the other hand, it risks eroding introspection, diminishing autonomy, and fostering dependency.

The challenge for psychology is not to determine whether AI is beneficial or harmful in absolute terms. It is to ensure that AI empowers individuals to cope rather than coping on their behalf. This requires thoughtful design, careful clinical integration, robust policy safeguards, and intentional individual practices.

The future of mental health in the AI era will depend on maintaining this delicate balance. If technology can be designed and used to scaffold rather than substitute, it has the potential to become a genuine partner in wellbeing. If not, it risks becoming an architect of dependency. The task for researchers, practitioners, and policymakers is to ensure that AI strengthens, rather than weakens, the mental architecture of coping.

Author contributions

GC: Conceptualization, Writing – review & editing, Writing – original draft. KP: Writing – review & editing, Conceptualization, Writing – original draft. JJ: Conceptualization, Writing – review & editing, Writing – original draft.

Funding

The author(s) declare that no financial support was received for the research and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declare that Gen AI was used in the creation of this manuscript. Generative artificial intelligence (AI) tools were used in the preparation of this manuscript to assist with language editing and text polishing.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Bakker, D., and Rickard, N. (2018). Engagement in mobile phone app for self-monitoring of emotional wellbeing predicts changes in mental health: MoodPrism. J. Affect. Disord. 227, 432–442. doi: 10.1016/j.jad.2017.11.016

Balcombe, L. (2023). AI chatbots in digital mental health. Informatics 10:82. doi: 10.3390/informatics10040082

Beatty, C., Malik, T., Meheli, S., and Sinha, C. (2022). Evaluating the therapeutic alliance with a free-text CBT conversational agent (Wysa): a mixed-methods study. Front. Digital Health 4:847991. doi: 10.3389/fdgth.2022.847991

Blease, C., and Torous, J. (2023). ChatGPT and mental healthcare: balancing benefits with risks of harms. BMJ Mental Health 26:e300884. doi: 10.1136/bmjment-2023-300884

Brand, N., Odom, W., and Barnett, S. (2023). "Envisioning and understanding orientations to introspective AI: exploring a design space with meta aware," in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (New York, NY: ACM), 1–18.

Brandsema, M. J. (2023). Advanced mood tracking using waveform statistical signal processing techniques. Measurement 218:113152. doi: 10.1016/j.measurement.2023.113152

Britten-Neish, G. (2025). Cognitive offloading and the causal structure of human action. Synthese 205:47. doi: 10.1007/s11229-024-04887-3

Brockbank, R. B., and Feldon, D. F. (2024). Cognitive reappraisal: the bridge between cognitive load and emotion. Educ. Sci. 14:870. doi: 10.3390/educsci14080870

Butler, C., Michael, N., and Kissane, D. (2025). Reclaiming ritual in palliative care: a hermeneutic narrative review. Palliat. Support. Care 23:e49. doi: 10.1017/S1478951524001767

Chen, X. (2025). Differences in emotional expression among college students: a study on integrating psychometric methods and algorithm optimization. BMC Psychol. 13:280. doi: 10.1186/s40359-025-02506-5

Clarke, J., and Draper, S. (2020). Intermittent mindfulness practice can be beneficial, and daily practice can be harmful. An in depth, mixed methods study of the "Calm" app's (mostly positive) effects. Internet Interven. 19:100293. doi: 10.1016/j.invent.2019.100293

Coghlan, S., Leins, K., Sheldrick, S., Cheong, M., Gooding, P., D'Alfonso, S., et al. (2023). To chat or bot to chat: ethical issues with using chatbots in mental health. Digital Health 9:20552076231183542. doi: 10.1177/20552076231183542

Depoorter, J., De Raedt, R., Berking, M., and Hoorelbeke, K. (2025). Specificity of emotion regulation processes in depression: a network analysis. Cognit. Ther. Res. 49, 312–323. doi: 10.1007/s10608-024-10530-9

Diaz Alfaro, G., Fiore, S. M., and Oden, K. (2024). Externalized and extended cognition: cognitive offloading for human-machine teaming. Proc. Human Fact. Ergon. Soc. Ann. Meet. 68, 290–293. doi: 10.1177/10711813241275506

Elendu, C., Amaechi, D. C., Elendu, T. C., Jingwa, K. A., Okoye, O. K., John Okah, M., et al. (2023). Ethical implications of AI and robotics in healthcare: a review. Medicine 102:e36671. doi: 10.1097/MD.0000000000036671

Evans, P., Vansteenkiste, M., Parker, P., Kingsford-Smith, A., and Zhou, S. (2024). Cognitive load theory and its relationships with motivation: a self-determination theory perspective. Educ. Psychol. Rev. 36:7. doi: 10.1007/s10648-023-09841-2

Gerlich, M. (2025). AI tools in society: impacts on cognitive offloading and the future of critical thinking. Societies 15:6. doi: 10.3390/soc15010006

Gilboa, Y., and Nahum, M. (2022). Mobile ecological tracking of mood as a predictor for resilience among male and female Israeli combatants. Eur. Psychiat. 65, S235. doi: 10.1192/j.eurpsy.2022.609

Grinschgl, S., and Neubauer, A. C. (2022). Supporting cognition with modern technology: distributed cognition today and in an AI-enhanced future. Front. Artif. Intell. 5:908261. doi: 10.3389/frai.2022.908261

Grinschgl, S., Papenmeier, F., and Meyerhoff, H. S. (2021). Consequences of cognitive offloading: boosting performance but diminishing memory. Q. J. Exp. Psychol. 74, 1477–1496. doi: 10.1177/17470218211008060

Gruber, E. N., and Martic Biocina, S. (2024). Problem focused coping strategies and high self-compassion can be seen as protective factors to lower stress, negative emotional reactions to job and anxiety. Eur. Psychiat. 67, S568–S569. doi: 10.1192/j.eurpsy.2024.1182

Hoose, S., and Králiková, K. (2024). Artificial intelligence in mental health care: management implications, ethical challenges, and policy considerations. Adm. Sci. 14:227. doi: 10.3390/admsci14090227

Huberty, J., Puzia, M. E., Larkey, L., Irwin, M. R., and Vranceanu, A.-M. (2020). Use of the consumer-based meditation app Calm for sleep disturbances: cross-sectional survey study. JMIR Format. Res. 4:e19508. doi: 10.2196/19508

Inamdar, A., Gurupadayya, B., Gautam, M., Sharma, A., Pathak, R., Sharma, H., et al. (2025). AI-driven innovations in assessing stress, anxiety, and mental health. Curr. Psychiat. Res. Rev. 21. doi: 10.2174/0126660822334997241216062002

Inkster, B., Sarda, S., and Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: real-world data evaluation mixed-methods study. JMIR Mhealth Uhealth 6:e12106. doi: 10.2196/12106

Jackson, L. (2020). Must children sit still? The dark biopolitics of mindfulness and yoga in education. Educ. Philos. Theor. 52, 120–125. doi: 10.1080/00131857.2019.1595326

Lahlou, S. (2025). Mitigating Societal Cognitive Overload in the Age of AI: Challenges and Directions (Version 1). arXiv [Preprint]. arXiv: 2501.12345. Available online at: https://arxiv.org/abs/2501.12345

Le Cunff, A.-L., Martis, B.-L., Glover, C., Ahmed, E., Ford, R., Giampietro, V., et al. (2025). Cognitive load and neurodiversity in online education: a preliminary framework for educational research and policy. Front. Educ. 9:1437673. doi: 10.3389/feduc.2024.1437673

León-Domínguez, U. (2024). Potential cognitive risks of generative transformer-based AI chatbots on higher order executive functions. Neuropsychology 38, 293–308. doi: 10.1037/neu0000948

Li, Y., Chung, T. Y., Lu, W., Li, M., Ho, Y. W. B., He, M., et al. (2025). Chatbot-based mindfulness-based stress reduction program for university students with depressive symptoms: intervention development and pilot evaluation. J. Am. Psychiatr. Nurses Assoc. 31, 398–411. doi: 10.1177/10783903241302092

Lin, M. (2022). Avoidance/emotion-focused coping mediates the relationship between distress tolerance and problematic Internet use in a representative sample of adolescents in Taiwan: one-year follow-up. J. Adolesc. 94, 600–610. doi: 10.1002/jad.12049

Lopes, R. M., Silva, A. F., Rodrigues, A. C. A., and Melo, V. (2024). Chatbots for well-being: exploring the impact of artificial intelligence on mood enhancement and mental health. Eur. Psychiat. 67, S550–S551. doi: 10.1192/j.eurpsy.2024.1143

Marr, N. S., Zainal, N. H., and Newman, M. G. (2022). Focus on and venting of negative emotion mediates the 18-year bi-directional relations between major depressive disorder and generalized anxiety disorder diagnoses. J. Affect. Disord. 303, 10–17. doi: 10.1016/j.jad.2022.01.079

Marroquín, B., Tennen, H., and Stanton, A. L. (2017). "Coping, emotion regulation, and well-being: intrapersonal and interpersonal processes," in The Happy Mind: Cognitive Contributions to Well-Being, eds. M. D. Robinson and M. Eid (Berlin: Springer International Publishing), 253–274.

Martinez-Martin, N. (2021). "Minding the AI: ethical challenges and practice for AI mental health care tools," in Artificial Intelligence in Brain and Mental Health: Philosophical, Ethical and Policy Issues, eds. F. Jotterand and M. Ienca (Berlin: Springer International Publishing), 111–125.

Oostvogels, I., Bongers, I. L., and Willems, A. (2018). The role of emotion regulation, coping, self-reflection and insight in staff interaction with patients with a diagnosis of personality disorder in forensic settings. J. Psychiatr. Ment. Health Nurs. 25, 582–600. doi: 10.1111/jpm.12506

Park, B.-J., Choi, Y., Lee, J.-S., Ahn, Y.-C., Lee, E.-J., Son, C.-G., et al. (2024). Effectiveness of meditation for fatigue management: insight from a comprehensive systematic review and meta-analysis. Gen. Hosp. Psychiatry 91, 33–42. doi: 10.1016/j.genhosppsych.2024.08.001

Risko, E. F., and Gilbert, S. J. (2016). Cognitive offloading. Trends Cogn. Sci. 20, 676–688. doi: 10.1016/j.tics.2016.07.002

Shankar Ganesh, M., and Venkateswaramurthy, N. (2025). Artificial intelligence (AI) generated health counseling for mental illness patients. Curr. Psychiat. Res. Rev. 21, 269–283. doi: 10.2174/0126660822277500240109050359

Shi, L. (2025). The integration of advanced AI-enabled emotion detection and adaptive learning systems for improved emotional regulation. J. Educ. Comput. Res. 63, 173–201. doi: 10.1177/07356331241296890

Sivasubramanian Balasubramanian, et al. (2023). AI-enabled mental health assessment and intervention: bridging gaps in access and quality of care. Power Syst. Technol. 47, 85–92. doi: 10.52783/pst.159

Skulmowski, A. (2023). The cognitive architecture of digital externalization. Educ. Psychol. Rev. 35:101. doi: 10.1007/s10648-023-09818-1

Sultanova, G. (2025). Introducing non-cognitive load to the educational discourse. Front. Psychol. 15:1411102. doi: 10.3389/fpsyg.2024.1411102

Taylor, H., Cavanagh, K., Field, A. P., and Strauss, C. (2022). Health care workers' need for headspace: findings from a multisite definitive randomized controlled trial of an unguided digital mindfulness-based self-help app to reduce healthcare worker stress. JMIR mHealth uHealth 10:e31744. doi: 10.2196/31744

Trudel-Fitzgerald, C., Boucher, G., Morin, C., Mondragon, P., Guimond, A.-J., Nishimi, K., et al. (2024). Coping and emotion regulation: a conceptual and measurement scoping review. Can. Psychol./Psychologie Canadienne 65, 149–162. doi: 10.1037/cap0000377

Wasil, A. R., Palermo, E. H., Lorenzo-Luaces, L., and DeRubeis, R. J. (2022). Is there an app for that? A review of popular apps for depression, anxiety, and well-being. Cogn. Behav. Pract. 29, 883–901. doi: 10.1016/j.cbpra.2021.07.001

Whitehead, B. R., and Bergeman, C. S. (2020). Daily religious coping buffers the stress–affect relationship and benefits overall metabolic health in older adults. Psycholog. Relig. Spiritual. 12, 393–399. doi: 10.1037/rel0000251

Yang, C., Wei, M., and Liu, Q. (2025). Intersections between cognitive-emotion regulation, critical thinking and academic resilience with academic motivation and autonomy in EFL learners: contributions of AI-mediated learning environments. Br. Educ. Res. J. 51:4140. doi: 10.1002/berj.4140

Yang, L., and Zhao, S. (2024). AI-induced emotions in L2 education: exploring EFL students' perceived emotions and regulation strategies. Comput. Human Behav. 159:108337. doi: 10.1016/j.chb.2024.108337

Zhang, Z., and Wang, J. (2024). Can AI replace psychotherapists? Exploring the future of mental health care. Front. Psychiatry 15:1444382. doi: 10.3389/fpsyt.2024.1444382

Keywords: artificial intelligence, cognitive offloading, cognitive overload, coping strategies, mental health, resilience

Citation: Chirayath G, Premamalini K and Joseph J (2025) Cognitive offloading or cognitive overload? How AI alters the mental architecture of coping. Front. Psychol. 16:1699320. doi: 10.3389/fpsyg.2025.1699320

Received: 04 September 2025; Revised: 31 October 2025; Accepted: 05 November 2025;
Published: 21 November 2025.

Edited by:

Runxi Zeng, Chongqing University, China

Reviewed by:

David Eugene Vance, University of Alabama at Birmingham, United States

Copyright © 2025 Chirayath, Premamalini and Joseph. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jeena Joseph, jeenajoseph005@gmail.com
