Semantics and syntax are core components of language. The prevailing view was that the processing of word meaning and of syntax happens in isolation from other systems. In light of demonstrated interactions between language and other systems, especially perception, action, and emotion, this view became untenable. This article reviews event-related potential (ERP) studies conducted at the Donders Centre for Cognition that explore the interplay between language comprehension and a person's emotional state. The research program aimed to investigate the online effects of emotional state on semantic and syntactic processing. To this aim, we manipulated mood via film fragments (happy vs. sad) before participants read neutral sentences while their EEG was recorded. Part 1 shows that mood impacts online semantic processing (as indicated by the N400) and the processing of syntactic violations (as indicated by the P600). Part 2 probed the mechanisms underlying these interactions. The role of heuristics was examined by investigating the effects of mood on the P600 to semantic reversals; the results revealed that mood affects heuristic processing. The next step assessed the role of attention in the mood-by-semantics and mood-by-syntax interactions. This was accomplished by recording EEG while manipulating attention via task in addition to emotional state: participants performed a semantic or syntactic judgment task vs. a letter-size judgment task. The main ERP results were as follows: (i) attention interacts with the mood effect on semantic and syntactic processing, respectively; (ii) the effects of mood on semantic and syntactic processing are reliable; and (iii) the mood effects on semantic processing are not fixed but context-dependent. Part 3 presents the effects of mood on the processing of script knowledge and general world knowledge.
Part 4 closes with a discussion of the mechanisms involved in the mood-by-language interactions and recommendations for future research. Regarding the underlying mechanism, we propose that heuristics based on semantic or syntactic expectancies play a key role in the mood-by-language interactions. The results support the view that language processing takes place in continuous interaction with other (non-language) systems.
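To make concrete how ERP components such as the N400 and P600 are typically quantified, the sketch below shows the standard pipeline in minimal form: epoch the EEG around word onset, baseline-correct each trial, average across trials, and take the mean amplitude in a component time window. The data, window bounds, and noise levels are illustrative placeholders, not the parameters used in the reviewed studies.

```python
import numpy as np

def erp_mean_amplitude(epochs, times, baseline=(-0.2, 0.0), window=(0.3, 0.5)):
    """Baseline-correct single-trial epochs, average them into an ERP,
    and return the mean amplitude in a component time window.

    epochs : (n_trials, n_samples) EEG at one electrode (microvolts)
    times  : (n_samples,) time points in seconds (0 = word onset)
    """
    base = (times >= baseline[0]) & (times < baseline[1])
    corrected = epochs - epochs[:, base].mean(axis=1, keepdims=True)
    erp = corrected.mean(axis=0)                      # average over trials
    win = (times >= window[0]) & (times < window[1])
    return erp[win].mean()

# Synthetic demo: a negative deflection around 400 ms (an N400-like wave)
times = np.arange(-0.2, 0.8, 0.002)
signal = -5.0 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))  # microvolts
rng = np.random.default_rng(0)
epochs = signal + rng.normal(0, 2.0, size=(40, times.size))
amp = erp_mean_amplitude(epochs, times)
print(amp)  # clearly negative: the component survives averaging over noise
```

A condition effect such as the mood-by-semantics interaction would then be tested by comparing this mean amplitude between conditions (e.g., happy vs. sad mood) across participants.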
As a newly emerging field, connectomics has greatly advanced our understanding of the wiring diagram and organizational features of the human brain. Generative modeling-based connectome analysis, in particular, plays a vital role in deciphering the neural mechanisms of cognitive functions in health and their dysfunction in disease. Here we review the foundation and development of major generative modeling approaches for functional magnetic resonance imaging (fMRI) and survey their applications to cognitive and clinical neuroscience problems. We argue that conventional structural and functional connectivity (FC) analysis alone is not sufficient to reveal the complex circuit interactions underlying observed neuroimaging data and should be supplemented with generative modeling-based effective connectivity and simulation, a fruitful practice that we term the “mechanistic connectome.” The transformation from a descriptive to a mechanistic connectome will open up promising avenues for gaining mechanistic insight into the delicate operating principles of the human brain and their potential impairment in disease, facilitating the development of effective personalized treatments for neurological and psychiatric disorders.
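The distinction between descriptive functional connectivity and generative effective connectivity can be illustrated with a toy model in the spirit of dynamic causal modeling: neural states evolve as dx/dt = A·x + C·u, where A encodes directed coupling between regions and C routes an external input into a region. All parameter values below are illustrative, not fitted to data.

```python
import numpy as np

# Toy generative model of effective connectivity: two regions, with region 0
# driving region 1 through the directed coupling A[1, 0]. The asymmetry of A
# is what plain correlation-based FC cannot express.
A = np.array([[-1.0, 0.0],    # self-decay of region 0; no feedback from 1
              [ 0.8, -1.0]])  # region 0 drives region 1 (effective connection)
C = np.array([1.0, 0.0])      # external input enters region 0 only

dt, T = 0.01, 10.0
steps = int(T / dt)
x = np.zeros(2)
trace = np.zeros((steps, 2))
for t in range(steps):
    u = 1.0 if t * dt < 2.0 else 0.0   # brief boxcar stimulus
    x = x + dt * (A @ x + C * u)       # forward-Euler integration
    trace[t] = x

# The driven region responds first; activity propagates to region 1 via A[1, 0]
peak0, peak1 = trace[:, 0].max(), trace[:, 1].max()
print(peak0 > peak1 > 0.0)  # prints True
```

In a real generative-modeling analysis, the entries of A would be estimated from observed fMRI time series (with a hemodynamic forward model), yielding directed, mechanistic coupling estimates rather than symmetric correlations.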
Functional near-infrared spectroscopy (fNIRS) has gained increasing interest as a practical mobile functional brain imaging technology for understanding the neural correlates of social cognition and emotional processing in the human prefrontal cortex (PFC). Considering the cognitive complexity of human-robot interactions, the aim of this study was to explore the neural correlates of emotional processing of congruent and incongruent pairs of human and robot audio-visual stimuli in the human PFC with fNIRS methodology. Hemodynamic responses from the PFC region of 29 subjects were recorded with fNIRS during an experimental paradigm consisting of auditory and visual presentations of human and robot stimuli. Distinct neural responses to human and robot stimuli were detected in the dorsolateral prefrontal cortex (DLPFC) and orbitofrontal cortex (OFC). Presentation of the robot voice elicited a significantly smaller hemodynamic response than presentation of the human voice in a left OFC channel. In addition, processing of human faces elicited significantly higher hemodynamic activity than processing of robot faces in two left DLPFC channels and a left OFC channel. A significant correlation between the hemodynamic and behavioral responses for the face-voice mismatch effect was found in the left OFC. Our results highlight the potential of fNIRS for unraveling the neural processing of human and robot audio-visual stimuli, which might enable optimization of social robot designs and contribute to elucidating the neural processing of human and robot stimuli in the PFC under naturalistic conditions.
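For readers unfamiliar with how fNIRS yields hemodynamic responses: measured changes in optical density at two near-infrared wavelengths are converted into oxy- and deoxy-hemoglobin concentration changes via the modified Beer-Lambert law, delta_OD(λ) = (ε_HbO(λ)·ΔHbO + ε_HbR(λ)·ΔHbR)·d·DPF, solved as a 2×2 linear system. The extinction coefficients and differential pathlength factor (DPF) below are illustrative placeholders, not calibrated constants.

```python
import numpy as np

def mbll(delta_od, eps, distance_cm, dpf):
    """Invert the modified Beer-Lambert law for one fNIRS channel.

    delta_od    : (2,) optical density changes at two wavelengths
    eps         : (2, 2) extinction matrix, rows = wavelengths, cols = (HbO, HbR)
    distance_cm : source-detector separation
    dpf         : differential pathlength factor (scattering correction)
    Returns (dHbO, dHbR) concentration changes.
    """
    path = distance_cm * dpf                   # effective optical path length
    return np.linalg.solve(eps * path, delta_od)

eps = np.array([[1.5, 3.8],    # ~760 nm: HbR absorbs more (illustrative values)
                [2.5, 1.8]])   # ~850 nm: HbO absorbs more (illustrative values)

# Forward-simulate a typical activation (HbO up, HbR slightly down), then invert
true_dc = np.array([0.8, -0.3])
delta_od = (eps * 3.0 * 6.0) @ true_dc         # distance 3 cm, DPF 6
dhbo, dhbr = mbll(delta_od, eps, distance_cm=3.0, dpf=6.0)
print(np.allclose([dhbo, dhbr], true_dc))      # prints True: changes recovered
```

The channel-level hemodynamic responses compared across conditions in the study are time series of exactly such concentration changes.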