<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2022.884242</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Searching for Best Predictors of Paralinguistic Comprehension and Production of Emotions in Communication in Adults With Moderate Intellectual Disability</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Cali&#x0107;</surname> <given-names>Gordana</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<xref ref-type="author-notes" rid="fn002"><sup>&#x2020;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1538533/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Glumbi&#x0107;</surname> <given-names>Nenad</given-names></name>
<xref ref-type="author-notes" rid="fn002"><sup>&#x2020;</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Petrovi&#x0107;-Lazi&#x0107;</surname> <given-names>Mirjana</given-names></name>
<xref ref-type="author-notes" rid="fn002"><sup>&#x2020;</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>&#x0110;or&#x0111;evi&#x0107;</surname> <given-names>Mirjana</given-names></name>
<xref ref-type="author-notes" rid="fn002"><sup>&#x2020;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/384356/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Mentus</surname> <given-names>Tatjana</given-names></name>
<xref ref-type="author-notes" rid="fn002"><sup>&#x2020;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1536409/overview"/>
</contrib>
</contrib-group>
<aff><institution>Faculty of Special Education and Rehabilitation, University of Belgrade</institution>, <addr-line>Belgrade</addr-line>, <country>Serbia</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Petar &#x010C;olovi&#x0107;, University of Novi Sad, Serbia</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Fasih Haider, University of Edinburgh, United Kingdom; Sara Filipiak, Marie Curie-Sklodowska University, Poland</p></fn>
<corresp id="c001">&#x002A;Correspondence: Gordana Cali&#x0107;, <email>calicgordana@yahoo.com</email></corresp>
<fn fn-type="equal" id="fn002"><p><sup>&#x2020;</sup>These authors have contributed equally to this work</p></fn>
<fn fn-type="other" id="fn004"><p>This article was submitted to Personality and Social Psychology, a section of the journal Frontiers in Psychology</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>08</day>
<month>07</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>13</volume>
<elocation-id>884242</elocation-id>
<history>
<date date-type="received">
<day>25</day>
<month>02</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>06</day>
<month>06</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2022 Cali&#x0107;, Glumbi&#x0107;, Petrovi&#x0107;-Lazi&#x0107;, &#x0110;or&#x0111;evi&#x0107; and Mentus.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Cali&#x0107;, Glumbi&#x0107;, Petrovi&#x0107;-Lazi&#x0107;, &#x0110;or&#x0111;evi&#x0107; and Mentus</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Paralinguistic comprehension and production of emotions in communication include the skills of recognizing and interpreting emotional states with the help of facial expressions, prosody, and intonation. In the relevant scientific literature, these skills are related primarily to receptive language abilities, although some authors have also found correlations with intellectual abilities and acoustic features of the voice. Therefore, the aim of this study was to investigate which of these variables (receptive language ability, acoustic features of the voice, intellectual ability, and socio-demographic characteristics) is the most relevant predictor of paralinguistic comprehension and paralinguistic production of emotions in communication in adults with moderate intellectual disability (MID). The sample included 41 adults with MID, 20&#x2013;49 years of age (<italic>M</italic> = 34.34, SD = 7.809), 29 of whom had MID of unknown etiology, while 12 had Down syndrome. All participants were native speakers of Serbian. Two subscales from The Assessment Battery for Communication &#x2013; Paralinguistic comprehension of emotions in communication and Paralinguistic production of emotions in communication &#x2013; were used to assess paralinguistic comprehension and production skills. To assess the participants on the assumed predictor variables, the following instruments were used: the Peabody Picture Vocabulary Test for receptive language abilities, the Computerized Speech Lab (&#x201C;Kay Elemetrics&#x201D; Corp., model 4300) for acoustic features of the voice, and Raven&#x2019;s Progressive Matrices for intellectual ability. 
Hierarchical regression analysis was applied to investigate to what extent the proposed variables actually predict paralinguistic comprehension and production of emotions in communication as dependent variables. The results of this analysis showed that only receptive language skills had statistically significant predictive value for paralinguistic comprehension of emotions (&#x03B2; = 0.468, <italic>t</italic> = 2.236, <italic>p</italic> &#x003C; 0.05), while the factor related to voice frequency and interruptions, from the domain of acoustic voice characteristics, had predictive value for paralinguistic production of emotions (&#x03B2; = 0.280, <italic>t</italic> = 2.076, <italic>p</italic> &#x003C; 0.05). Consequently, in the adult population with MID, this study evidenced the greater importance of voice and language, relative to intellectual abilities, in understanding and producing emotions.</p>
</abstract>
<kwd-group>
<kwd>intellectual ability</kwd>
<kwd>moderate intellectual disability</kwd>
<kwd>paralinguistic comprehension</kwd>
<kwd>paralinguistic production</kwd>
<kwd>receptive language ability</kwd>
<kwd>acoustic features of voice</kwd>
<kwd>Serbian language</kwd>
</kwd-group>
<counts>
<fig-count count="0"/>
<table-count count="5"/>
<equation-count count="0"/>
<ref-count count="75"/>
<page-count count="11"/>
<word-count count="9305"/>
</counts>
</article-meta>
</front>
<body>
<sec id="S1" sec-type="intro">
<title>Introduction</title>
<p>According to DSM-5 (<xref ref-type="bibr" rid="B5">American Psychiatric Association, 2013</xref>), intellectual disability (ID) is defined as a disability characterized by significant limitations both in intellectual functioning and adaptive behavior, manifested in conceptual, social, and practical adaptive skills. The social aspect of adaptive skills, among other things, includes empathy, social-emotional reasoning, establishing interpersonal interactions, etc. (<xref ref-type="bibr" rid="B5">American Psychiatric Association, 2013</xref>). The ability to recognize and produce emotions contributes to successful social adaptation and facilitates coping in a social environment (<xref ref-type="bibr" rid="B32">Kashani-Vahid et al., 2018</xref>). On the other hand, difficulties in understanding emotions may affect social adaptation and integration of people with ID (<xref ref-type="bibr" rid="B73">Trentacosta and Fine, 2010</xref>) and cause aggressive outbursts (<xref ref-type="bibr" rid="B43">Matheson and Jahoda, 2005</xref>). People with ID usually have difficulties in recognizing emotions, which persists in adulthood (<xref ref-type="bibr" rid="B67">Scotland et al., 2015</xref>, <xref ref-type="bibr" rid="B68">2016</xref>). The persisting problems in understanding emotions also affect finding and keeping a job, self-esteem, and thus the overall quality of life of these people (<xref ref-type="bibr" rid="B9">Banks et al., 2010</xref>). Since there are various adverse effects of reduced emotion recognition abilities in people with ID, research studies aimed at detecting the predictors of these abilities may be useful, as they may result in guidelines for targeted and effective interventions. 
Nowadays, this kind of research is increasingly gaining importance as support for training models for automatic emotion recognition, i.e., for the creation of technological assistive devices that facilitate the treatment of children and adults with neurodevelopmental disorders (<xref ref-type="bibr" rid="B41">Ma et al., 2019</xref>; <xref ref-type="bibr" rid="B40">Martinez-Martin et al., 2020</xref>; <xref ref-type="bibr" rid="B33">Landowska et al., 2022</xref>).</p>
<p>Paralinguistic elements include various sounds, tones, crying, laughing, sobbing, speech speed and pitch, rhythm and intonation (<xref ref-type="bibr" rid="B58">Rot, 2004</xref>), as well as facial expressions (e.g., in the eyes and lips region &#x2013; eyes widening) (<xref ref-type="bibr" rid="B6">Angeleri et al., 2012</xref>; <xref ref-type="bibr" rid="B72">Tomi&#x0107;, 2014</xref>). They answer the question: &#x201C;How was something said?&#x201D; Therefore, paralinguistic comprehension of emotions could be viewed as the skill of precisely recognizing and interpreting an emotional state based on non-verbal clues, such as facial expressions, prosody, and/or body language (<xref ref-type="bibr" rid="B6">Angeleri et al., 2012</xref>, <xref ref-type="bibr" rid="B7">2016</xref>). On the other hand, paralinguistic production of emotions involves expressing emotions by using non-linguistic aspects of speech, such as mime and prosody (<xref ref-type="bibr" rid="B6">Angeleri et al., 2012</xref>, <xref ref-type="bibr" rid="B7">2016</xref>). Many studies show that people with ID perform worse on tasks involving paralinguistic recognition, comprehension, and production of emotions than typically developing people (<xref ref-type="bibr" rid="B29">Hippolyte et al., 2008</xref>; <xref ref-type="bibr" rid="B67">Scotland et al., 2015</xref>, <xref ref-type="bibr" rid="B68">2016</xref>; <xref ref-type="bibr" rid="B49">Murray et al., 2019</xref>; <xref ref-type="bibr" rid="B45">McKenzie et al., 2021</xref>). There are several hypotheses that can potentially explain these difficulties (e.g., <xref ref-type="bibr" rid="B2">Albanese et al., 2010</xref>; <xref ref-type="bibr" rid="B13">Beck et al., 2012</xref>; <xref ref-type="bibr" rid="B44">Mazzoni et al., 2020</xref>). However, it is still not completely clear what these skills are most closely related to, both in typically developing people and in people with ID.</p>
<p>A deficit in language skills, especially receptive skills, in people with ID may be related to difficulties in understanding emotions. Emotional recognition includes knowledge of emotions acquired from life experience, just as receptive vocabulary includes knowledge of concepts acquired from experience (<xref ref-type="bibr" rid="B13">Beck et al., 2012</xref>). <xref ref-type="bibr" rid="B30">Joyce et al. (2006)</xref> show that adults with ID are better at recognizing than producing emotions, and that these abilities are related to receptive language skills. Also, the results of some studies indicate a significant positive correlation between receptive language skills and recognizing certain emotions from facial expressions in adults with Down syndrome (DS<sup><xref ref-type="fn" rid="footnote1">1</xref></sup>), while such correlation was not found in typically developing adults (<xref ref-type="bibr" rid="B29">Hippolyte et al., 2008</xref>).</p>
<p>Compared to language skills, the literature on prosodic features of the voice in people with ID is not so extensive. Still, people with ID exhibit certain changes in acoustic voice characteristics compared to typically developing people, both in childhood and in adulthood (<xref ref-type="bibr" rid="B36">Lee et al., 2009</xref>; <xref ref-type="bibr" rid="B4">Amad&#x00F3; et al., 2016</xref>; <xref ref-type="bibr" rid="B14">Becker et al., 2017</xref>; <xref ref-type="bibr" rid="B51">O&#x2019; Leary et al., 2019</xref>). Various research studies on typically developing people show that acoustic voice characteristics [e.g., Pitch, Fundamental frequency (F0), F0 contour, Jitter, Intensity, Attack, Pauses, etc.] are significant factors in recognizing emotions (<xref ref-type="bibr" rid="B65">Scherer et al., 1991</xref>; <xref ref-type="bibr" rid="B10">Banse and Scherer, 1996</xref>; <xref ref-type="bibr" rid="B31">Juslin and Laukka, 2003</xref>). Numerous studies have examined how a listener recognizes emotions from acoustic voice characteristics in situations where trained actors express emotional states while reading syllables or meaningless texts (e.g., <xref ref-type="bibr" rid="B10">Banse and Scherer, 1996</xref>). Basic emotions (happiness, sadness, anger, surprise, fear, and disgust) are best recognized when the actors who express them belong to the same culture as the listeners (<xref ref-type="bibr" rid="B31">Juslin and Laukka, 2003</xref>; <xref ref-type="bibr" rid="B52">Pell et al., 2009</xref>). In addition, research shows that non-verbal vocal signals (laughter, a scream, an exclamation, etc.) contribute to recognizing different emotions in given speech situations (<xref ref-type="bibr" rid="B62">Sauter and Scott, 2007</xref>; <xref ref-type="bibr" rid="B69">Simon-Thomas et al., 2009</xref>; <xref ref-type="bibr" rid="B61">Sauter et al., 2010</xref>; <xref ref-type="bibr" rid="B34">Laukka et al., 2013</xref>). 
Based on these findings from studies on the typical population, the question arises as to whether and to what extent these acoustic characteristics affect paralinguistic recognition and production in people with ID.</p>
<p>Limitations in overall intellectual functioning may also contribute to difficulties in understanding and producing emotions in people with ID, as indicated by some authors in this field (<xref ref-type="bibr" rid="B48">Moore, 2001</xref>). Also, IQ is negatively correlated with processing emotional information in people with borderline intellectual functioning (<xref ref-type="bibr" rid="B70">Smirni et al., 2019</xref>). One explanation for these correlations is that emotional and cognitive intelligence share a neuro-functional anatomical network (<xref ref-type="bibr" rid="B11">Barbey et al., 2014</xref>, according to <xref ref-type="bibr" rid="B70">Smirni et al., 2019</xref>). Higher non-verbal IQ was also associated with better emotional understanding in a population of participants with autism with different levels of cognitive functioning (<xref ref-type="bibr" rid="B60">Salomone et al., 2019</xref>). The relationship between non-verbal intelligence and understanding of emotions certainly exists, but it is not simple, and age mediates it (<xref ref-type="bibr" rid="B2">Albanese et al., 2010</xref>). <xref ref-type="bibr" rid="B43">Matheson and Jahoda (2005)</xref> show that emotion recognition is associated with receptive language skills but not IQ. However, some authors (<xref ref-type="bibr" rid="B13">Beck et al., 2012</xref>) suggest that intellectual abilities could also account for the relation between receptive vocabulary and recognizing emotions. The same authors propose further examination of the role of intelligence in the development of emotional competencies.</p>
<p>Age is another significant variable in recognizing emotions. It has been shown that older typically developing adults perform worse on emotion recognition tasks than younger adults (<xref ref-type="bibr" rid="B71">Sullivan et al., 2015</xref>). A meta-analysis that examined the relationship between intelligence and emotion recognition in adults of the typical population found that the relationship exists, but that age also plays a significant role in it: in the elderly, the stronger connection between intelligence and emotion recognition can be explained by the hypothesis of cognitive dedifferentiation, which claims that the structure of individuals&#x2019; cognitive abilities becomes less differentiated in old age (<xref ref-type="bibr" rid="B66">Schlegel et al., 2020</xref>). Furthermore, some research studies show that older adults produce emotions more slowly than younger adults (<xref ref-type="bibr" rid="B25">Dupuis and Pichora-Fuller, 2010</xref>). Apart from age, gender differences have also been found: women are generally more successful in recognizing emotions than men (<xref ref-type="bibr" rid="B71">Sullivan et al., 2015</xref>; <xref ref-type="bibr" rid="B28">Gon&#x00E7;alves et al., 2018</xref>). Similarly, the ability to recognize emotions decreases with age in adults with ID (<xref ref-type="bibr" rid="B43">Matheson and Jahoda, 2005</xref>). An older study (<xref ref-type="bibr" rid="B46">McKenzie et al., 2000</xref>) on a sample of adults with mild and moderate ID showed that younger participants and those with milder ID were better at recognizing emotions.</p>
<p>The literature shows that so far researchers have focused more on examining people with autism spectrum disorder than people with ID with regard to paralinguistic comprehension and production. Compared to language and cognitive abilities, the field of social-emotional skills in this population has been less frequently examined (<xref ref-type="bibr" rid="B55">Pochon et al., 2017</xref>). Research has mainly examined emotion recognition abilities based on facial expressions and prosodic elements of speech in people with ID compared to typically developing population, mostly at a younger age (<xref ref-type="bibr" rid="B26">Fern&#x00E1;ndez-Alcaraz et al., 2010</xref>; <xref ref-type="bibr" rid="B17">Cebula et al., 2017</xref>; <xref ref-type="bibr" rid="B55">Pochon et al., 2017</xref>; <xref ref-type="bibr" rid="B39">Mart&#x00ED;nez-Gonz&#x00E1;lez and Veas, 2019</xref>), but also in the elderly (<xref ref-type="bibr" rid="B57">Roch et al., 2020</xref>; <xref ref-type="bibr" rid="B8">Andr&#x00E9;s-Roqueta et al., 2021</xref>).</p>
<p>Some studies deal with paralinguistic recognition and production of emotions in relation to certain cognitive aspects in children and adolescents (<xref ref-type="bibr" rid="B30">Joyce et al., 2006</xref>; <xref ref-type="bibr" rid="B4">Amad&#x00F3; et al., 2016</xref>; <xref ref-type="bibr" rid="B20">Djordjevic et al., 2016a</xref>; <xref ref-type="bibr" rid="B55">Pochon et al., 2017</xref>; <xref ref-type="bibr" rid="B12">Barisnikov et al., 2020</xref>). However, we are not familiar with studies examining which of these aspects predict mentioned abilities in people with ID. The authors of previous research studies associate the obtained results, which show difficulties in understanding and producing emotions in people with moderate ID (MID<sup><xref ref-type="fn" rid="footnote2">2</xref></sup>), with deficits in language and cognitive abilities (<xref ref-type="bibr" rid="B20">Djordjevic et al., 2016a</xref>). However, we did not find any manuscripts that examined the predictors of these abilities in adults with MID.</p>
<p>Based on this literature survey, we considered it worthwhile to examine the relation between these skills and acoustic features of the voice, intellectual ability, and receptive language skills in adults with ID. The topicality of this subject is also reflected in our everyday reality, in which the clinical population faces the global effects of the pandemic through social distancing and limited social contacts. This makes the practical value of this manuscript even greater for this vulnerable population.</p>
<p>The aim of this research was to determine which of the mentioned variables &#x2013; receptive language skills, voice acoustics, intelligence, and socio-demographic variables (gender, age, etiology, level of intellectual functioning) &#x2013; best predicted paralinguistic comprehension and production of emotions in communication in adults with MID.</p>
</sec>
<sec id="S2" sec-type="materials|methods">
<title>Materials and Methods</title>
<sec id="S2.SS1">
<title>Sample</title>
<p>The sample included 41 adults with MID, 20&#x2013;49 years of age (<italic>M</italic> = 34.34, SD = 7.809). There were 17 (41.5%) male and 24 (58.5%) female participants in the sample. With regard to etiology, the sample included participants with MID of unknown etiology (<italic>N</italic> = 29) and participants with DS (<italic>N</italic> = 12). These two subgroups did not differ significantly in gender [<italic>t</italic>(39) = 0.016, <italic>p</italic> &#x003E; 0.05] or age [<italic>t</italic>(39) = &#x2212;2.018, <italic>p</italic> &#x003E; 0.05]. All participants were native speakers of Serbian. After the initial consent from the parents, we obtained data on each participant&#x2019;s level of intellectual functioning (i.e., the information that the participant functioned at the MID level) from their medical records. These records also showed that different psychiatrists had used different assessment tests at different times and referred to different diagnostic classifications. Therefore, we used only the level of ID, not the IQ data, for the purpose of this study.</p>
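The subgroup comparisons reported above are independent-samples t-tests with df = 29 + 12 &#x2212; 2 = 39. A minimal sketch of such a comparison in Python; the ages below are synthetic, not the study's data:

```python
import numpy as np
from scipy import stats

# Synthetic ages (NOT the study's data): unknown-etiology group (n = 29)
# vs. Down syndrome group (n = 12), compared as in the sample description.
rng = np.random.default_rng(1)
age_unknown = rng.normal(loc=33.0, scale=8.0, size=29)
age_ds = rng.normal(loc=37.0, scale=7.0, size=12)

# Independent-samples t-test with pooled variance; df = 29 + 12 - 2 = 39
t_stat, p_value = stats.ttest_ind(age_unknown, age_ds, equal_var=True)
print(f"t(39) = {t_stat:.3f}, p = {p_value:.3f}")
```

With the study's actual data, this is the computation that produced t(39) = 0.016 for gender and t(39) = &#x2212;2.018 for age.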
</sec>
<sec id="S2.SS2">
<title>Procedure</title>
<p>Prior to conducting the research, we obtained the approval of the Ethics Committee of the Faculty of Special Education and Rehabilitation, University of Belgrade (no. 109/1). After obtaining the approval, a request for conducting the research was sent to managers of three day-care centers for adults with disabilities in Belgrade. The request included a detailed explanation of the aim, sample, instruments, and procedure of the research. After obtaining their consent, the request was forwarded to special educators employed in day-care centers, and they contacted the user&#x2019;s parents.</p>
<p>Contacted participants&#x2019; parents or guardians voluntarily signed informed consent for the participants to take part in the research, after which the persons with MID also gave their consent. Only those participants who gave their consent were included in the research. It was explained to the participants that they could leave the research at any time. First, a triage assessment of the participants was conducted. On the basis of anamnestic data, the inclusion criteria were the following: diagnosed moderate ID, absence of autism spectrum disorder characteristics, absence of associated psychiatric disorders and/or illnesses which could affect voice characteristics, and age between 18 and 55 (absence of aging-voice influence).</p>
<p>The research was conducted in day-care centers<sup><xref ref-type="fn" rid="footnote3">3</xref></sup> for adults with disabilities in Belgrade, whose services the participants used. The assessment was conducted in a speech therapy room, isolated from noise and distractors.</p>
<p>The participants were first given an explanation about the nature of the task. The Peabody scale and Raven scale were presented to participants using pictures.</p>
<p>The Paralinguistic scale was presented to the participants on a computer. On the basis of the presented videos and the questions asked, the participants gave their answers, and the examiner recorded them on a separate answer sheet.</p>
<p>Phonation of the vowel /a/ for the acoustic analysis was recorded directly on the computer through a microphone placed at a specific distance from the participant&#x2019;s mouth.</p>
</sec>
<sec id="S2.SS3">
<title>Research Instruments</title>
<sec id="S2.SS3.SSS1">
<title>Assessment of Paralinguistic Recognition of Emotions</title>
<p>The Paralinguistic scale from The Assessment Battery for Communication (<xref ref-type="bibr" rid="B59">Sacco et al., 2008</xref>), which consists of Paralinguistic comprehension of emotions and Paralinguistic production of emotions, was used to assess the paralinguistic ability to recognize emotions. The complete battery was translated from Italian into Serbian using a back-translation method: a professor translated the Italian original into Serbian, and a court interpreter then translated the Serbian version back into Italian. The two versions were then compared and the translation was finally corrected. Of the complete translated battery, we used these two subscales.</p>
<p>Paralinguistic comprehension of emotions includes videos in which actors express emotions by speaking in a fictional language, and participants have to recognize those emotions (e.g., the actor makes angry hand gestures and grimaces while speaking in a fictional language, and the question is: &#x201C;In your opinion, which emotion is the actor expressing, how does he/she feel?&#x201D;). The participant then chooses one of five given answers (e.g., &#x201C;happy, surprised, angry, sad, something else&#x201D;). The videos, which lasted between 20 and 25 s, were presented via laptop. Each task was scored 1 for a correct answer and 0 for an incorrect one. This subscale includes eight items, so the maximum total score is eight points. Paralinguistic production of emotions includes eight items in which participants should react to the examiner&#x2019;s request (e.g., &#x201C;Ask me what time it is. Ask me as if you were upset.&#x201D;). The items we used included requests involving different emotions (bored, upset, disturbed, happy, angry, sad, and scared). The same scoring principle applied to this subscale, with a maximum total score of eight points (see <xref ref-type="supplementary-material" rid="DS1">Supplementary Material</xref>).</p>
<p>The total value of the Cronbach&#x2019;s alpha coefficient for the Paralinguistic scale is 0.70 according to the authors of this scale (<xref ref-type="bibr" rid="B15">Bosco et al., 2012</xref>). The Paralinguistic scale has previously been used in the Serbian-speaking area in the population of people with ID, and it had a satisfactory value of the Cronbach&#x2019;s alpha coefficient, which was 0.94 for the subscale of paralinguistic production, and 0.79 for the subscale of paralinguistic comprehension (<xref ref-type="bibr" rid="B21">Djordjevic et al., 2016b</xref>). In our research it was 0.617 for the subscale of paralinguistic comprehension, and 0.767 for the subscale of paralinguistic production.</p>
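The subscale reliabilities reported above can be computed from raw item scores with Cronbach's alpha. A minimal sketch in Python; the 0/1 score matrix below is illustrative, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 0/1 scores for 5 respondents on a 4-item subscale
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 0],
])
print(round(cronbach_alpha(scores), 3))  # -> 0.727
```

On the study's two 8-item subscales, the same computation yields the 0.617 and 0.767 values reported above.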
</sec>
<sec id="S2.SS3.SSS2">
<title>Assessment of Receptive Language Skills</title>
<p><italic>Peabody Picture Vocabulary Test</italic>, PPVT &#x2013; 4 (<xref ref-type="bibr" rid="B24">Dunn and Dunn, 2007</xref>), was used to assess receptive language skills. This test includes 227 items, divided into 19 sets of 12 words. Testing is conducted by presenting the participants with four different pictures and instructing them to point to the picture which corresponds to the word said by the examiner. Two practice trials are given first, followed by testing in sets. An electronic version of the test was used in this research, and the pictures were shown on a computer screen. Testing was stopped when a participant gave eight incorrect answers in one set. The raw score on this test was obtained by subtracting the total number of incorrect answers from the total number of items.</p>
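The discontinue rule and raw-score computation described above can be sketched as follows, under the assumption that "total number of items" refers to the items actually administered (the official PPVT-4 manual is authoritative; this is only an illustration):

```python
# Sketch of the scoring rule as described in the text (assumption: the raw
# score counts only administered items). Testing stops once eight incorrect
# answers occur within a single 12-item set.

def ppvt_raw_score(responses, set_size=12, stop_errors=8):
    """responses: booleans in administration order (True = correct)."""
    administered = incorrect = errors_in_set = 0
    for i, correct in enumerate(responses):
        if i % set_size == 0:   # a new set begins
            errors_in_set = 0
        administered += 1
        if not correct:
            incorrect += 1
            errors_in_set += 1
            if errors_in_set == stop_errors:
                break           # discontinue rule reached
    return administered - incorrect

# A participant who answers one full set correctly, then fails out:
responses = [True] * 12 + [False] * 8
print(ppvt_raw_score(responses))  # 20 administered - 8 incorrect -> 12
```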
<p>The Peabody test has high internal reliability, ranging from 0.92 to 0.98 (<xref ref-type="bibr" rid="B24">Dunn and Dunn, 2007</xref>). This instrument has previously been used in the Serbian-speaking area in the population of people with ID, where it had satisfactory values (<xref ref-type="bibr" rid="B22">Djordjevic et al., 2018</xref>). In our research, the internal reliability was 0.995.</p>
</sec>
<sec id="S2.SS3.SSS3">
<title>Assessment of Acoustic Voice Characteristics</title>
<p>Computerized Speech Lab (&#x201C;<italic>Kay Elemetrics&#x201D; Corp.</italic>, model 4300), Multi-Dimensional Voice Program (MDVP) software was used for the analysis of acoustic voice characteristics. This program provides a detailed graphical and numerical presentation of 33 parameters. The examiner instructed the participants to phonate the vowel <italic>a</italic> calmly and spontaneously for 3&#x2013;4 s. In accordance with the author&#x2019;s recommendations, this procedure was repeated three times in order to select the voice sample of the best quality. A Sony ECM-T150 microphone was placed at a distance of 5 cm from the participant&#x2019;s mouth, and the signal was recorded directly on the computer. In this research, the following acoustic parameters were analyzed: frequency variability parameters (F0, Jitt, and PPQ), amplitude variability parameters (Shim, vAm, and APQ), a voice interruption parameter (DVB), and noise and tremor estimation parameters (NHR, VTI, and SPI).</p>
<p>Multi-Dimensional Voice Program is often used for voice assessment in different populations of participants in the Serbian-speaking area (<xref ref-type="bibr" rid="B53">Petrovic-Lazic et al., 2009</xref>, <xref ref-type="bibr" rid="B54">2014</xref>).</p>
<p>Factor analysis with Promax rotation was used in this research to single out latent dimensions which summarize the parameters of acoustic voice characteristics (<xref ref-type="table" rid="T1">Table 1</xref>). Principal component analysis, retaining only those dimensions with eigenvalues greater than 1, singled out three factors. Together, they explain approximately 80% of the variance in individual voice differences.</p>
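The retention step above (the Kaiser criterion: keep components with eigenvalues greater than 1) can be sketched with plain NumPy; the data matrix below is synthetic, standing in for the study's 41 participants by 10 MDVP parameters, and the Promax rotation itself is omitted (it is available in, e.g., the `factor_analyzer` package):

```python
import numpy as np

# Synthetic stand-in for the study's data: 41 participants x 10 acoustic
# parameters, with one pair of correlated parameters induced.
rng = np.random.default_rng(0)
X = rng.normal(size=(41, 10))
X[:, 1] = X[:, 0] + 0.3 * rng.normal(size=41)

R = np.corrcoef(X, rowvar=False)       # 10 x 10 correlation matrix
eigvals = np.linalg.eigvalsh(R)[::-1]  # eigenvalues in descending order

retained = int((eigvals > 1.0).sum())  # Kaiser criterion: components kept
explained = eigvals / eigvals.sum()    # proportion of variance per component
print(retained, explained[:retained].round(3))
```

The eigenvalues of a correlation matrix sum to the number of variables, so `explained` directly gives the "% variance" figures of the kind reported in Table 1.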
<table-wrap position="float" id="T1">
<label>TABLE 1</label>
<caption><p>Pattern matrix of the principal component analysis of acoustic voice characteristics with Promax rotation.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="center" colspan="3">Factors<hr/></td>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="center">1</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">3</td>
<td valign="top" align="center">Total</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Shim</td>
<td valign="top" align="center"><bold>1.033</bold></td>
<td valign="top" align="center">&#x2212;0.223</td>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">APQ</td>
<td valign="top" align="center"><bold>1.014</bold></td>
<td valign="top" align="center">&#x2212;0.173</td>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Jitt</td>
<td valign="top" align="center"><bold>0.811</bold></td>
<td/>
<td valign="top" align="center">0.326</td>
<td/>
</tr>
<tr>
<td valign="top" align="left">PPQ</td>
<td valign="top" align="center"><bold>0.796</bold></td>
<td valign="top" align="center">0.122</td>
<td valign="top" align="center">0.328</td>
<td/>
</tr>
<tr>
<td valign="top" align="left">VTI</td>
<td valign="top" align="center"><bold>0.754</bold></td>
<td valign="top" align="center">0.110</td>
<td valign="top" align="center">&#x2212;0.503</td>
<td/>
</tr>
<tr>
<td valign="top" align="left">NHR</td>
<td valign="top" align="center"><bold>0.628</bold></td>
<td valign="top" align="center">0.429</td>
<td valign="top" align="center">&#x2212;0.125</td>
<td/>
</tr>
<tr>
<td valign="top" align="left">F0</td>
<td valign="top" align="center">&#x2212;0.296</td>
<td valign="top" align="center"><bold>1.017</bold></td>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">DVB</td>
<td valign="top" align="center">0.171</td>
<td valign="top" align="center"><bold>0.598</bold></td>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">SPI</td>
<td/>
<td/>
<td valign="top" align="center"><bold>0.869</bold></td>
<td/>
</tr>
<tr>
<td valign="top" align="left">vAm</td>
<td valign="top" align="center">0.330</td>
<td/>
<td valign="top" align="center"><bold>0.649</bold></td>
<td/>
</tr>
<tr>
<td valign="top" align="left">% variance</td>
<td valign="top" align="center">50.103</td>
<td valign="top" align="center">17.224</td>
<td valign="top" align="center">12.010</td>
<td valign="top" align="center">79.337</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>1 &#x2013; voice perturbations, 2 &#x2013; voice frequency and interruptions, 3 &#x2013; noise and tremor in voice. Shim, amplitude variation of the sound wave; APQ, amplitude perturbation quotient; Jitt, frequency variation; PPQ, pitch period perturbation quotient; VTI, voice turbulence index; NHR, noise-to-harmonic ratio; F0, mean fundamental frequency; DVB, degree of voice break; SPI, soft phonation index; vAm, peak amplitude variation. Bold values are factor loadings &#x003E;0.50.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>The obtained results are in accordance with the scree plot, which also suggests retaining three dimensions. The first factor includes the following variables: Shim, APQ, Jitt, PPQ, VTI, and NHR (voice perturbations); it explains 50% of the variance. The second factor includes the F0 and DVB variables (voice frequency and interruptions) and explains an additional 17% of the variance. The third factor includes the SPI and vAm variables (noise and tremor in voice) and explains an additional 12% of the variance. These three factors (voice perturbations; voice frequency and interruptions; noise and tremor in voice) are used in further analyses.</p>
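<p>The extraction procedure described above can be sketched as follows: standardize the measures, take the eigenvalues of their correlation matrix, and retain the components with eigenvalues greater than 1 (the Kaiser criterion). The data below are synthetic stand-ins, not the study&#x2019;s MDVP measures; in practice the Promax rotation of the retained components would be obtained with a dedicated factor-analysis package.</p>

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 41 speakers x 10 acoustic parameters (invented,
# not the study's MDVP measurements).
x = rng.normal(size=(41, 10))
# Add shared variance so a strong common component emerges.
x[:, :6] += rng.normal(size=(41, 1)) * 2.0

z = (x - x.mean(0)) / x.std(0, ddof=1)                # standardize
eigvals = np.linalg.eigvalsh(np.corrcoef(z.T))[::-1]  # sorted descending
n_keep = int(np.sum(eigvals > 1.0))                   # Kaiser criterion
explained = eigvals[:n_keep].sum() / eigvals.sum()
print(n_keep, round(100 * explained, 1))
```

Because the trace of a correlation matrix equals the number of variables, the proportion of explained variance is simply the retained eigenvalues divided by the variable count.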
<p>The Cronbach&#x2019;s alpha coefficient for the acoustic parameters used in our research was 0.856.</p>
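<p>Cronbach&#x2019;s alpha, reported above, is the variance-ratio reliability estimate sketched below. The score matrix is an invented toy example, not the study data.</p>

```python
import numpy as np

def cronbach_alpha(items):
    # items: n_participants x n_items score matrix.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy example: 5 participants x 3 items (hypothetical scores).
scores = [[2, 3, 3], [4, 4, 5], [1, 2, 2], [3, 3, 4], [5, 4, 5]]
print(round(cronbach_alpha(scores), 3))
```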
</sec>
<sec id="S2.SS3.SSS4">
<title>Assessment of Intellectual Functioning</title>
<p>Raven&#x2019;s Progressive Matrices (<xref ref-type="bibr" rid="B56">Raven and Raven, 1998</xref>) were used to assess intellectual functioning. This instrument consists of non-verbal tasks that measure the general intelligence factor. Each task presents a pattern with one segment missing; the participant has to discover the rule according to which the pattern is arranged and choose the missing segment from several given options. The matrix consists of 60 tasks arranged in five sets. Tasks are ordered by difficulty, and the sets are organized by topic (supplementing, finding analogy, changing, permutation, and division). An electronic version of the matrix was used for the purpose of this research. Participants were first told that an element was missing at the top of the page, while the possible answers were given at the bottom of the page. Participants pointed with their finger at the chosen answer, and its number was recorded. Each participant first completed a practice trial, after which the assessment began. All correct answers were added up into a total raw score, which indicates the level of intellectual functioning.</p>
<p>The Cronbach&#x2019;s alpha coefficient for this scale in our research was 0.815.</p>
</sec>
</sec>
</sec>
<sec id="S3" sec-type="results">
<title>Results</title>
<sec id="S3.SS1">
<title>Participants&#x2019; Results on All Applied Scales</title>
<p><xref ref-type="table" rid="T2">Table 2</xref> shows descriptive statistics of the participants&#x2019; achievements for all analyzed variables. The rows show the variables, and the columns show the number of participants who answered, the score range, mean, standard deviation, skewness and its standard error, and kurtosis and its standard error. Raven&#x2019;s progressive matrices scores were within the expected range given the examined population. Large deviations of skewness and kurtosis relative to their standard errors, which indicate that the conditions for a normal distribution of results are not met, were observed for certain variables from the acoustic voice characteristics domain: Shim, DVB, NHR, and PPQ (<xref ref-type="table" rid="T2">Table 2</xref>).</p>
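<p>The normality screen used here compares skewness and kurtosis to their standard errors, which depend only on sample size. The sketch below reproduces the standard-error formulas (for n = 41 they give the 0.369 and 0.724 reported in Table 2); the data vector itself is synthetic, a stand-in for one acoustic variable.</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for one right-skewed acoustic variable
# measured on 41 participants.
x = rng.lognormal(mean=0.0, sigma=0.8, size=41)

sk = stats.skew(x, bias=False)       # sample skewness
ku = stats.kurtosis(x, bias=False)   # excess kurtosis
n = len(x)
# Standard errors of skewness and kurtosis under normality.
se_sk = np.sqrt(6 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
se_ku = 2 * se_sk * np.sqrt((n * n - 1) / ((n - 3) * (n + 5)))
# Rule of thumb: |statistic| > 2 * SE suggests non-normality.
print(round(se_sk, 3), round(se_ku, 3), abs(sk) > 2 * se_sk)
```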
<table-wrap position="float" id="T2">
<label>TABLE 2</label>
<caption><p>Descriptive measures of all variables used in the manuscript.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="center"><italic>N</italic></td>
<td valign="top" align="center">Min</td>
<td valign="top" align="center">Max</td>
<td valign="top" align="center"><italic>M</italic></td>
<td valign="top" align="center">SD</td>
<td valign="top" align="center" colspan="2">Sk</td>
<td valign="top" align="center" colspan="2">Ku</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">(1)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">20</td>
<td valign="top" align="center">49</td>
<td valign="top" align="center">34.34</td>
<td valign="top" align="center">7.809</td>
<td valign="top" align="center">0.195</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.636</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left">(2)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">32</td>
<td valign="top" align="center">172</td>
<td valign="top" align="center">106.15</td>
<td valign="top" align="center">36.241</td>
<td valign="top" align="center">&#x2212;0.191</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.511</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left">(3)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">8</td>
<td valign="top" align="center">24</td>
<td valign="top" align="center">13.78</td>
<td valign="top" align="center">3.560</td>
<td valign="top" align="center">0.815</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">0.744</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left">(4)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">8</td>
<td valign="top" align="center">2.61</td>
<td valign="top" align="center">2.312</td>
<td valign="top" align="center">0.580</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.585</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left">(5)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">3</td>
<td valign="top" align="center">15</td>
<td valign="top" align="center">9.37</td>
<td valign="top" align="center">2.791</td>
<td valign="top" align="center">&#x2212;0.151</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.005</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left">(6)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">3</td>
<td valign="top" align="center">19</td>
<td valign="top" align="center">11.98</td>
<td valign="top" align="center">4.083</td>
<td valign="top" align="center">&#x2212;0.313</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.737</td>
<td valign="top" align="center">0.724</td>
</tr>
<tr>
<td valign="top" align="left"><bold>Acoustic features of voice</bold></td>
<td/>
<td/>
<td/>
<td/>
<td/>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">(7)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">3.42</td>
<td valign="top" align="center">22.48</td>
<td valign="top" align="center">11.12</td>
<td valign="top" align="center">4.592</td>
<td valign="top" align="center">0.514</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">&#x2212;0.116</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(8)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">6.60</td>
<td valign="top" align="center">82.58</td>
<td valign="top" align="center">23.83</td>
<td valign="top" align="center">14.203</td>
<td valign="top" align="center">2.068</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">6.657</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(9)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">2.84</td>
<td valign="top" align="center">15.98</td>
<td valign="top" align="center">8.25</td>
<td valign="top" align="center">3.416</td>
<td valign="top" align="center">0.623</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">&#x2212;0.169</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(10)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.172</td>
<td valign="top" align="center">13.52</td>
<td valign="top" align="center">4.13</td>
<td valign="top" align="center">3.311</td>
<td valign="top" align="center">0.875</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">0.091</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(11)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.10</td>
<td valign="top" align="center">8.31</td>
<td valign="top" align="center">2.66</td>
<td valign="top" align="center">2.208</td>
<td valign="top" align="center">0.880</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">&#x2212;0.218</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(12)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.00</td>
<td valign="top" align="center">52.79</td>
<td valign="top" align="center">2.83</td>
<td valign="top" align="center">9.689</td>
<td valign="top" align="center">4.320</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">19.848</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(13)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.17</td>
<td valign="top" align="center">1.67</td>
<td valign="top" align="center">0.39</td>
<td valign="top" align="center">0.265</td>
<td valign="top" align="center">3.005</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">13.597</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(14)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.039</td>
<td valign="top" align="center">1.29</td>
<td valign="top" align="center">0.27</td>
<td valign="top" align="center">0.260</td>
<td valign="top" align="center">2.673</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">7.802</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(15)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">0.26</td>
<td valign="top" align="center">4.642</td>
<td valign="top" align="center">1.57</td>
<td valign="top" align="center">0.967</td>
<td valign="top" align="center">1.241</td>
<td valign="top" align="center">0.374</td>
<td valign="top" align="center">1.908</td>
<td valign="top" align="center">0.733</td>
</tr>
<tr>
<td valign="top" align="left">(16)</td>
<td valign="top" align="center">41</td>
<td valign="top" align="center">14</td>
<td valign="top" align="center">77</td>
<td valign="top" align="center">41.83</td>
<td valign="top" align="center">14.874</td>
<td valign="top" align="center">0.061</td>
<td valign="top" align="center">0.369</td>
<td valign="top" align="center">&#x2212;0.183</td>
<td valign="top" align="center">0.724</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>N - number of participants; Min - participant&#x2019;s lowest achievement; Max - participant&#x2019;s highest achievement; M - arithmetic mean; SD - standard deviation; Sk - Skewness; Ku - Kurtosis. (1) Age; (2) Peabody; (3) Raven score; (4) Paralinguistic production of emotion; (5) Paralinguistic comprehension of emotion; (6) Total Paralinguistic score; (7) F0; (8) Shim; (9) vAm; (10) APQ; (11) Jitt; (12) PPQ; (13) DVB; (14) NHR; (15) VTI; (16) SPI.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p><xref ref-type="table" rid="T3">Table 3</xref> shows the correlation coefficients of all used variables. The rows show the numbered variables, whose names are given in the table footnote. Spearman&#x2019;s correlation coefficients are shown in the matrix cells. Variables from the same domain correlate highly with each other.</p>
<table-wrap position="float" id="T3">
<label>TABLE 3</label>
<caption><p>Intercorrelation of all used variables.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"></td>
<td/>
<td valign="top" align="center">(1)</td>
<td valign="top" align="center">(2)</td>
<td valign="top" align="center">(3)</td>
<td valign="top" align="center">(4)</td>
<td valign="top" align="center">(5)</td>
<td valign="top" align="center">(6)</td>
<td valign="top" align="center">(7)</td>
<td valign="top" align="center">(8)</td>
<td valign="top" align="center">(9)</td>
<td valign="top" align="center">(10)</td>
<td valign="top" align="center">(11)</td>
<td valign="top" align="center">(12)</td>
<td valign="top" align="center">(13)</td>
<td valign="top" align="center">(14)</td>
<td valign="top" align="center">(15)</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">(1)</td>
<td valign="top" align="center">(1)</td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(2)</td>
<td valign="top" align="center">(2)</td>
<td valign="top" align="center">&#x2013;0.11</td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(3)</td>
<td valign="top" align="center">(3)</td>
<td valign="top" align="center">&#x2013;0.01</td>
<td valign="top" align="center">0.464<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(4)</td>
<td valign="top" align="center">(4)</td>
<td valign="top" align="center">0.01</td>
<td valign="top" align="center">0.506<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.339<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(5)</td>
<td valign="top" align="center">(5)</td>
<td valign="top" align="center">&#x2013;0.04</td>
<td valign="top" align="center">0.457<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.24</td>
<td valign="top" align="center">0.28</td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(6)</td>
<td valign="top" align="center">(6)</td>
<td valign="top" align="center">&#x2013;0.02</td>
<td valign="top" align="center">0.599<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.354<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.754<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.839<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(7)</td>
<td valign="top" align="center">(7)</td>
<td valign="top" align="center">0.26</td>
<td valign="top" align="center">&#x2013;0.15</td>
<td valign="top" align="center">&#x2013;0.24</td>
<td valign="top" align="center">0.09</td>
<td valign="top" align="center">&#x2013;0.06</td>
<td valign="top" align="center">0.01</td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(8)</td>
<td valign="top" align="center">(8)</td>
<td valign="top" align="center">0.13</td>
<td valign="top" align="center">0.02</td>
<td valign="top" align="center">&#x2013;0.2</td>
<td valign="top" align="center">&#x2013;0.04</td>
<td valign="top" align="center">0.17</td>
<td valign="top" align="center">0.09</td>
<td valign="top" align="center">&#x2013;0.02</td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(9)</td>
<td valign="top" align="center">(9)</td>
<td valign="top" align="center">0.1</td>
<td valign="top" align="center">&#x2013;0.28</td>
<td valign="top" align="center">&#x2013;0.21</td>
<td valign="top" align="center">&#x2013;0.23</td>
<td valign="top" align="center">&#x2212;0.358<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">&#x2212;0.366<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.11</td>
<td valign="top" align="center">0.331<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(10)</td>
<td valign="top" align="center">(10)</td>
<td valign="top" align="center">0.16</td>
<td valign="top" align="center">0.04</td>
<td valign="top" align="center">&#x2013;0.16</td>
<td valign="top" align="center">0.01</td>
<td valign="top" align="center">0.16</td>
<td valign="top" align="center">0.11</td>
<td valign="top" align="center">0.02</td>
<td valign="top" align="center">0.981<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.370<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(11)</td>
<td valign="top" align="center">(11)</td>
<td valign="top" align="center">0.13</td>
<td valign="top" align="center">&#x2013;0.07</td>
<td valign="top" align="center">&#x2212;0.327<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">&#x2013;0.13</td>
<td valign="top" align="center">&#x2013;0.07</td>
<td valign="top" align="center">&#x2013;0.12</td>
<td valign="top" align="center">0.21</td>
<td valign="top" align="center">0.772<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.583<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.796<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(12)</td>
<td valign="top" align="center">(12)</td>
<td valign="top" align="center">0.19</td>
<td valign="top" align="center">&#x2013;0.06</td>
<td valign="top" align="center">&#x2212;0.323<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">&#x2013;0.1</td>
<td valign="top" align="center">&#x2013;0.04</td>
<td valign="top" align="center">&#x2013;0.09</td>
<td valign="top" align="center">0.23</td>
<td valign="top" align="center">0.792<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.556<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.826<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.982<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(13)</td>
<td valign="top" align="center">(13)</td>
<td valign="top" align="center">0.2</td>
<td valign="top" align="center">0.03</td>
<td valign="top" align="center">&#x2013;0.29</td>
<td valign="top" align="center">0.16</td>
<td valign="top" align="center">0.11</td>
<td valign="top" align="center">0.17</td>
<td valign="top" align="center">0.374<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.340<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.03</td>
<td valign="top" align="center">0.377<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.27</td>
<td valign="top" align="center">0.395<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(14)</td>
<td valign="top" align="center">(14)</td>
<td valign="top" align="center">&#x2013;0.03</td>
<td valign="top" align="center">&#x2013;0.07</td>
<td valign="top" align="center">&#x2013;0.25</td>
<td valign="top" align="center">0.01</td>
<td valign="top" align="center">&#x2013;0.02</td>
<td valign="top" align="center">&#x2013;0.01</td>
<td valign="top" align="center">0.462<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.607<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.29</td>
<td valign="top" align="center">0.603<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.619<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.624<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.417<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(15)</td>
<td valign="top" align="center">(15)</td>
<td valign="top" align="center">&#x2013;0.11</td>
<td valign="top" align="center">0.02</td>
<td valign="top" align="center">&#x2013;0.1</td>
<td valign="top" align="center">0.12</td>
<td valign="top" align="center">0.14</td>
<td valign="top" align="center">0.16</td>
<td valign="top" align="center">0.22</td>
<td valign="top" align="center">0.590<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.13</td>
<td valign="top" align="center">0.580<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.459<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.433<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center">0.24</td>
<td valign="top" align="center">0.775<xref ref-type="table-fn" rid="t3fns1">&#x002A;&#x002A;</xref></td>
<td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left">(16)</td>
<td valign="top" align="center">(16)</td>
<td valign="top" align="center">&#x2013;0.15</td>
<td valign="top" align="center">0.1</td>
<td valign="top" align="center">0.06</td>
<td valign="top" align="center">0.04</td>
<td valign="top" align="center">&#x2013;0.03</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">&#x2013;0.09</td>
<td valign="top" align="center">0.03</td>
<td valign="top" align="center">0.356<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
<td valign="top" align="center">0.05</td>
<td valign="top" align="center">0.2</td>
<td valign="top" align="center">0.21</td>
<td valign="top" align="center">&#x2013;0.07</td>
<td valign="top" align="center">&#x2013;0.06</td>
<td valign="top" align="center">&#x2212;0.358<xref ref-type="table-fn" rid="t3fns1">&#x002A;</xref></td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="t3fns1"><p><italic>&#x002A;&#x002A;p &#x003C; 0.01, &#x002A;p &#x003C; 0.05. (1) Age; (2) Peabody; (3) Raven score; (4) Paralinguistic production of emotion; (5) Paralinguistic comprehension of emotion; (6) Total Paralinguistic score; (7) F0; (8) Shim; (9) vAm; (10) APQ; (11) Jitt; (12) PPQ; (13) DVB; (14) NHR; (15) VTI; (16) SPI.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>The table shows that the acoustic voice characteristics correlate highly with one another. Paralinguistic production and paralinguistic comprehension of emotions are highly interrelated, as are the Raven and Peabody scores.</p>
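<p>A correlation screen of the kind reported in Table 3 can be sketched as follows. Spearman&#x2019;s rank correlation is used because several variables depart from normality; the variables below are synthetic stand-ins (a Raven-like score, a related Peabody-like score, and an unrelated age variable), not the study data.</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic stand-in scores for 41 participants.
raven = rng.integers(8, 25, size=41).astype(float)
peabody = raven * 4 + rng.normal(0, 15, size=41)  # related to Raven
age = rng.uniform(20, 49, size=41)                # unrelated

for name, v in [("Peabody", peabody), ("Age", age)]:
    rho, p = stats.spearmanr(raven, v)
    flag = "**" if p < 0.01 else "*" if p < 0.05 else ""
    print(f"Raven vs {name}: rho={rho:.3f}{flag} (p={p:.3f})")
```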
</sec>
<sec id="S3.SS2">
<title>Predictors of Paralinguistic Comprehension of Emotions</title>
<p>Hierarchical regression analysis was used to determine the predictors of paralinguistic comprehension of emotions (<xref ref-type="table" rid="T4">Table 4</xref>). This analysis evaluated the extent to which paralinguistic comprehension of emotions can be predicted by the set of variables used in the research (gender, age, etiology, level of intellectual functioning, and receptive language skills), as well as by the three factors of acoustic voice characteristics (voice perturbations; voice frequency and interruptions; noise and tremor in voice). The predictors were introduced hierarchically in blocks. The first block included gender, age, and etiology. The second block added receptive language skills and level of intellectual functioning to the variables from the first block. The third block added the acoustic voice characteristics grouped according to the results of the factor analysis (<xref ref-type="table" rid="T1">Table 1</xref>).</p>
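<p>The block structure of a hierarchical regression can be sketched as fitting nested ordinary-least-squares models and tracking R&#xB2; and its increment as each block of predictors is added. All variables below are synthetic stand-ins mirroring the first two blocks (the real scores are not reproduced here); because the models are nested, R&#xB2; can only increase from one block to the next.</p>

```python
import numpy as np

rng = np.random.default_rng(3)
n = 41
# Synthetic stand-in predictors mirroring the first two blocks.
gender = rng.integers(0, 2, n).astype(float)
age = rng.uniform(20, 49, n)
etiology = rng.integers(0, 2, n).astype(float)
raven = rng.normal(14, 3.5, n)
peabody = rng.normal(106, 36, n)
# Outcome: a comprehension-like score driven by the Peabody stand-in.
y = 0.03 * peabody + rng.normal(0, 2, n)

def r_squared(y, *cols):
    # OLS fit with intercept; returns the coefficient of determination.
    X = np.column_stack([np.ones_like(y), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

blocks = [
    [gender, age, etiology],                  # Block I
    [gender, age, etiology, raven, peabody],  # Block II
]
prev = 0.0
for i, cols in enumerate(blocks, 1):
    r2 = r_squared(y, *cols)
    print(f"Block {i}: R2={r2:.3f}, delta R2={r2 - prev:.3f}")
    prev = r2
```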
<table-wrap position="float" id="T4">
<label>TABLE 4</label>
<caption><p>Results of hierarchical regression analysis for predicting paralinguistic comprehension of emotions.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Block</td>
<td/>
<td valign="top" align="center">B</td>
<td valign="top" align="center"><italic>t</italic></td>
<td valign="top" align="center"><italic>p</italic></td>
<td valign="top" align="center"><italic>R</italic></td>
<td valign="top" align="center"><italic>R</italic><sup>2</sup></td>
<td valign="top" align="center"><italic>F</italic>(3/37)</td>
<td valign="top" align="center"><italic>P</italic></td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Block I</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">&#x2013;0.037</td>
<td valign="top" align="center">&#x2013;0.219</td>
<td valign="top" align="center">0.828</td>
<td valign="top" align="center">0.208</td>
<td valign="top" align="center">0.043</td>
<td valign="top" align="center">0.558</td>
<td valign="top" align="center">0.646</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.169</td>
<td valign="top" align="center">&#x2013;1.000</td>
<td valign="top" align="center">0.324</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">0.108</td>
<td valign="top" align="center">0.668</td>
<td valign="top" align="center">0.508</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Block II</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">0.015</td>
<td valign="top" align="center">0.089</td>
<td valign="top" align="center">0.929</td>
<td valign="top" align="center">0.420</td>
<td valign="top" align="center">0.176</td>
<td valign="top" align="center">1.213</td>
<td valign="top" align="center">0.323</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.082</td>
<td valign="top" align="center">&#x2013;0.467</td>
<td valign="top" align="center">0.644</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">0.145</td>
<td valign="top" align="center">0.733</td>
<td valign="top" align="center">0.468</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Raven score</td>
<td valign="top" align="center">&#x2013;0.100</td>
<td valign="top" align="center">&#x2013;0.477</td>
<td valign="top" align="center">0.636</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Peabody</td>
<td valign="top" align="center">0.468</td>
<td valign="top" align="center">2.236</td>
<td valign="top" align="center"><bold>0.032</bold></td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Block III</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">0.008</td>
<td valign="top" align="center">0.046</td>
<td valign="top" align="center">0.963</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.109</td>
<td valign="top" align="center">&#x2013;0.629</td>
<td valign="top" align="center">0.534</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">0.093</td>
<td valign="top" align="center">0.465</td>
<td valign="top" align="center">0.645</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Raven score</td>
<td valign="top" align="center">0.058</td>
<td valign="top" align="center">0.260</td>
<td valign="top" align="center">0.796</td>
<td valign="top" align="center">0.524</td>
<td valign="top" align="center">0.275</td>
<td valign="top" align="center">1.306</td>
<td valign="top" align="center">0.274</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Peabody</td>
<td valign="top" align="center">0.436</td>
<td valign="top" align="center">2.110</td>
<td valign="top" align="center"><bold>0.043</bold></td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Voice perturbations</td>
<td valign="top" align="center">0.217</td>
<td valign="top" align="center">1.212</td>
<td valign="top" align="center">0.235</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Voice frequency and interruptions</td>
<td valign="top" align="center">&#x2013;0.064</td>
<td valign="top" align="center">&#x2013;0.380</td>
<td valign="top" align="center">0.707</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Noise and tremor in voice</td>
<td valign="top" align="center">0.201</td>
<td valign="top" align="center">1.151</td>
<td valign="top" align="center">0.259</td>
<td/>
<td/>
<td/>
<td/>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>&#x03B2; &#x2013; standardized regression coefficient; t &#x2013; significance parameter of standardized regression coefficient; p &#x2013; statistical significance of standardized regression coefficient; R &#x2013; regression model for predicting paralinguistic comprehension of emotions with hierarchical introduction of gender, age, and etiology (first level), Raven and Peabody (second level), and factor scores of acoustic voice characteristics (third level); R<sup>2</sup> &#x2013; variance percentage of paralinguistic comprehension of emotions explained with the hierarchical regression model; F &#x2013; parameter of statistical significance of regression model; P &#x2013; statistical significance of regression model. Bold values p &#x003C; 0.05.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>This regression analysis yielded two models, neither of which reached statistical significance. The first explains about 4% of the criterion variance [R<sup>2</sup> = 0.043, F(3/37) = 0.558, <italic>P</italic> &#x003E; 0.05], and the second explains about 17% [R<sup>2</sup> = 0.176, F(3/37) = 1.213, <italic>P</italic> &#x003E; 0.05]. Nevertheless, receptive language skills proved to be a significant individual predictor of paralinguistic comprehension of emotions (&#x03B2; = 0.468, <italic>t</italic> = 2.236, <italic>p</italic> &#x003C; 0.05).</p>
</sec>
<sec id="S3.SS3">
<title>Predictors of Paralinguistic Production of Emotions</title>
<p>Hierarchical regression analysis was applied to determine the predictors of paralinguistic production of emotions (<xref ref-type="table" rid="T5">Table 5</xref>). This analysis evaluated whether paralinguistic production of emotions could be predicted by the set of variables used in the research (gender, age, etiology, level of intellectual functioning, receptive language skills), as well as by three factors of acoustic voice characteristics (voice perturbation; voice frequency and interruptions; noise and tremor in voice). The predictors were introduced hierarchically in blocks, in the same way as in the analysis of paralinguistic comprehension of emotions.</p>
<table-wrap position="float" id="T5">
<label>TABLE 5</label>
<caption><p>Results of hierarchical regression analysis for predicting paralinguistic production of emotions.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Block</td>
<td/>
<td valign="top" align="center">&#x03B2;</td>
<td valign="top" align="center"><italic>t</italic></td>
<td valign="top" align="center"><italic>p</italic></td>
<td valign="top" align="center"><italic>R</italic></td>
<td valign="top" align="center"><italic>R</italic><sup>2</sup></td>
<td valign="top" align="center">F (3/37)</td>
<td valign="top" align="center"><italic>P</italic></td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Block I</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">0.089</td>
<td valign="top" align="center">0.535</td>
<td valign="top" align="center">0.596</td>
<td valign="top" align="center">0.270</td>
<td valign="top" align="center">0.073</td>
<td valign="top" align="center">0.972</td>
<td valign="top" align="center">0.416</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.173</td>
<td valign="top" align="center">&#x2013;1.043</td>
<td valign="top" align="center">0.304</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">0.228</td>
<td valign="top" align="center">1.433</td>
<td valign="top" align="center">0.160</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Block II</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">0.067</td>
<td valign="top" align="center">0.476</td>
<td valign="top" align="center">0.637</td>
<td valign="top" align="center">0.639</td>
<td valign="top" align="center">0.409</td>
<td valign="top" align="center">3.918</td>
<td valign="top" align="center"><bold>0.004</bold></td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.092</td>
<td valign="top" align="center">&#x2013;0.621</td>
<td valign="top" align="center">0.539</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">&#x2013;0.072</td>
<td valign="top" align="center">&#x2013;0.431</td>
<td valign="top" align="center">0.670</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Raven score</td>
<td valign="top" align="center">0.213</td>
<td valign="top" align="center">1.200</td>
<td valign="top" align="center">0.238</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Peabody</td>
<td valign="top" align="center">0.303</td>
<td valign="top" align="center">1.709</td>
<td valign="top" align="center">0.097</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Block III</td>
<td valign="top" align="left">Gender</td>
<td valign="top" align="center">&#x2013;0.006</td>
<td valign="top" align="center">&#x2013;0.041</td>
<td valign="top" align="center">0.968</td>
<td valign="top" align="center">0.691</td>
<td valign="top" align="center">0.477</td>
<td valign="top" align="center">4.301</td>
<td valign="top" align="center"><bold>0.002</bold></td>
</tr>
<tr>
<td/>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">&#x2013;0.106</td>
<td valign="top" align="center">&#x2013;0.743</td>
<td valign="top" align="center">0.463</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Etiology</td>
<td valign="top" align="center">&#x2013;0.057</td>
<td valign="top" align="center">&#x2013;0.357</td>
<td valign="top" align="center">0.724</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Raven score</td>
<td valign="top" align="center">0.270</td>
<td valign="top" align="center">1.578</td>
<td valign="top" align="center">0.124</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Peabody</td>
<td valign="top" align="center">0.298</td>
<td valign="top" align="center">1.763</td>
<td valign="top" align="center">0.087</td>
<td/>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td valign="top" align="left">Voice frequency and interruptions</td>
<td valign="top" align="center">0.280</td>
<td valign="top" align="center">2.076</td>
<td valign="top" align="center"><bold>0.046</bold></td>
<td/>
<td/>
<td/>
<td/>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>&#x03B2; &#x2013; standardized regression coefficient; t &#x2013; significance parameter of standardized regression coefficient; p &#x2013; statistical significance of standardized regression coefficient; R &#x2013; regression model for predicting paralinguistic production of emotions with hierarchical introduction of gender, age, and etiology (first level), Raven and Peabody (second level), and factor scores of acoustic voice characteristics (third level); R<sup>2</sup> &#x2013; variance percentage of paralinguistic production of emotions explained with the hierarchical regression model; F &#x2013; parameter of statistical significance of regression model; P &#x2013; statistical significance of regression model. Bold values p &#x003C; 0.05 and P &#x003C; 0.05.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>Regression analysis yielded three models. The first, which was not statistically significant, explains about 7% of the criterion variance [R<sup>2</sup> = 0.073, F(3/37) = 0.972, <italic>P</italic> &#x003E; 0.05]; the second explains about 41% [R<sup>2</sup> = 0.409, F(3/37) = 3.918, <italic>P</italic> &#x003C; 0.01]; and the third about 48% [R<sup>2</sup> = 0.477, F(3/37) = 4.301, <italic>P</italic> &#x003C; 0.01]. In this analysis, the second factor from the domain of acoustic voice characteristics (voice frequency and interruptions) proved to be a significant predictor of paralinguistic production of emotions (&#x03B2; = 0.280, <italic>t</italic> = 2.076, <italic>p</italic> &#x003C; 0.05).</p>
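For reference, the model-level F statistic follows directly from R^2 and the degrees of freedom: F = (R^2 / k) / ((1 - R^2) / df_res). A quick check against the first model's reported values, assuming k = 3 predictors and 37 residual degrees of freedom as reported; the small discrepancy comes from rounding of R^2 in the table:

```python
def f_from_r2(r2, k, df_resid):
    """Model F statistic computed from R^2, the number of predictors k,
    and the residual degrees of freedom."""
    return (r2 / k) / ((1.0 - r2) / df_resid)

# First model of Table 5: R^2 = 0.073 with 3 and 37 degrees of freedom
f = f_from_r2(0.073, 3, 37)
print(round(f, 2))  # prints 0.97, close to the reported F(3/37) = 0.972
```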
</sec>
</sec>
<sec id="S4" sec-type="discussion">
<title>Discussion</title>
<p>Earlier studies on paralinguistic comprehension and production of emotions in people with ID assumed that these abilities were related to language and cognitive deficits. However, we are not aware of studies that determine more precisely which language and cognitive domains best predict these abilities. Our study examined whether specific acoustic voice characteristics, receptive language skills, gender, age, etiology, and the level of intellectual functioning had a predictive value for paralinguistic comprehension and production of emotions in adults with MID.</p>
<p>Regression analysis showed that, of the three hierarchical groups of predictors, only receptive language skills from the second group were a significant predictor of paralinguistic comprehension of emotions. Interestingly, receptive language skills predicted the understanding of emotions even though the actors in our study spoke a non-existent language while expressing basic emotions. In addition to the required answers, the researchers also noted down the comments of the participants (not included in this analysis), who tried to identify the language the actor spoke (e.g., &#x201C;I know, he is speaking Japanese&#x201D;) or to &#x201C;translate&#x201D; the spoken message (e.g., &#x201C;He said that he was sad and that he was crying because of that&#x201D;). All this indicates that the people with MID in our sample largely relied on the meaning of the spoken messages, even when the messages carried no meaning. Receptive language skills measured by the Peabody test are not limited by expressive language abilities or working memory limitations (<xref ref-type="bibr" rid="B38">Loveall et al., 2016</xref>). Thus, we can assume that receptive language skills measured by the Peabody test reflect lower, perceptual cognitive abilities, whereas the level of intellectual functioning measured by Raven&#x2019;s progressive matrices reflects higher, more complex forms of cognition. <xref ref-type="bibr" rid="B55">Pochon et al. (2017)</xref> showed that the measures of intellectual functioning obtained by Raven&#x2019;s progressive matrices did not predict the ability to understand emotions in adolescents with DS, unlike in typically developing adolescents. According to <xref ref-type="bibr" rid="B13">Beck et al. (2012)</xref>, receptive language skills refer to coded verbal concepts gained from experience in relation to the environment. 
The same authors suggest that recognizing emotions may be closely related to receptive language skills, since the conceptualization of emotions may arise from lexical-semantic differentiation. Another possible explanation is that a common conceptualization mechanism connects receptive language skills and emotion recognition (<xref ref-type="bibr" rid="B13">Beck et al., 2012</xref>). The fact that these abilities are shaped by learning and experience may also be the reason why we did not obtain significant results for Raven&#x2019;s matrices: they are non-verbal cognitive measures of general intelligence (<xref ref-type="bibr" rid="B56">Raven and Raven, 1998</xref>) unrelated to experience and learning (<xref ref-type="bibr" rid="B16">Cattell, 1971</xref>). Furthermore, <xref ref-type="bibr" rid="B30">Joyce et al. (2006)</xref> showed that adults with ID were better at recognizing emotions than at producing them, and that these abilities were related to receptive language skills. These findings confirmed previous ones on the importance of receptive language skills (<xref ref-type="bibr" rid="B19">Dagnan et al., 2000</xref>).</p>
<p>With regard to paralinguistic production, the participants in our research were required to utter specific content and show a specific emotion in the way they speak (e.g., &#x201C;Tell me to close the door. Say it angrily.&#x201D;; &#x201C;Ask me where the doctor is. Do it sadly&#x201D;). The results of regression analysis in predicting paralinguistic production of emotions showed that the second factor in the acoustic voice characteristics domain (voice frequency and interruptions) from the third group was a significant predictor of paralinguistic production of emotions. This factor refers to the mean value of fundamental frequency and the percentage of parts with voice interruptions. In one study (<xref ref-type="bibr" rid="B47">Mendhakar et al., 2019</xref>), these parameters proved to be significantly different in premature babies compared to term babies. DVB (degree of voice breaks) represents the ratio of the total duration of the parts with voice interruptions to the duration of the complete voice sample. It was shown that high-risk babies had a higher degree of voice interruptions; a higher degree of interruptions indicates longer pauses and a greater distribution of non-vocal components in a child&#x2019;s cry (<xref ref-type="bibr" rid="B47">Mendhakar et al., 2019</xref>). The authors attribute these characteristics to immature larynx innervation in premature babies (<xref ref-type="bibr" rid="B47">Mendhakar et al., 2019</xref>). Research into acoustic voice characteristics in adults with ID of unknown etiology and DS shows that these people have a smaller range of frequency variability, which indicates voice monotony (<xref ref-type="bibr" rid="B27">Gautam and Singh, 2016</xref>; <xref ref-type="bibr" rid="B18">Corrales-Astorgano et al., 2018</xref>). As an element of prosody, F0 has a significant paralinguistic function (<xref ref-type="bibr" rid="B50">N&#x00ED; Chasaide and Gobl, 2004</xref>). 
Difficulties in producing emotions can be associated with monotonous and less melodic speech in people with ID. Certain intonation-melodic variability and a larger range of fundamental voice frequency are necessary to express emotions. It is assumed that structural differences in the larynx level become more pronounced in adulthood (<xref ref-type="bibr" rid="B51">O&#x2019; Leary et al., 2019</xref>) and can be related to emotion production.</p>
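The degree-of-voice-breaks measure discussed above can be illustrated with a short sketch. The frame-based representation is a simplifying assumption for illustration, not the study's measurement procedure; with equal-length frames, the duration ratio reduces to a frame-count ratio.

```python
def degree_of_voice_breaks(voiced_flags):
    """Percentage of the analysed voice sample occupied by voice breaks.

    voiced_flags: one boolean per equal-length analysis frame,
    True where voicing was detected.
    """
    breaks = sum(1 for voiced in voiced_flags if not voiced)
    return 100.0 * breaks / len(voiced_flags)

# Example: 2 of 10 frames unvoiced -> 20% of the sample is voice breaks
print(degree_of_voice_breaks([True] * 8 + [False] * 2))  # prints 20.0
```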
<p>Emotion production is a more complex ability than emotion comprehension and, apart from perception, requires the use of non-linguistic (prosodic) elements of speech (<xref ref-type="bibr" rid="B64">Scherer, 2003</xref>). The assessed receptive language skills and the level of intellectual functioning were measured by scales that require answering based on recognition only, without expressive production. Therefore, it was expected that they would not prove to be significant predictors. This was also assumed by the authors whose research showed no significant correlation between paralinguistic production of emotions and the level of intellectual functioning measured by Raven&#x2019;s progressive matrices (<xref ref-type="bibr" rid="B20">Djordjevic et al., 2016a</xref>).</p>
<p>According to <xref ref-type="bibr" rid="B13">Beck et al. (2012)</xref>, people with more developed lexical abilities are more successful in the general conceptualization of verbal concepts, and thus probably in the conceptualization of emotions. Many studies show that people with DS have significantly more pronounced deficits in the linguistic domain compared to their cognitive capacities (<xref ref-type="bibr" rid="B35">Laws and Bishop, 2004</xref>; <xref ref-type="bibr" rid="B75">Zampini et al., 2016</xref>). Therefore, the sample size in this study may be the reason why etiology did not significantly predict the criterion variables. This is consistent with studies such as <xref ref-type="bibr" rid="B74">Wishart et al. (2007)</xref>, who compared the ability to recognize emotions in participants with ID of different etiology and found that only children with DS had worse results than typically developing children. A significant positive correlation was also found between receptive language skills measured by the Peabody picture test and the recognition of certain emotions from facial expressions in adults with Down syndrome (DS), as opposed to the control group of typically developing adults (<xref ref-type="bibr" rid="B29">Hippolyte et al., 2008</xref>). The authors assumed that some emotional expressions required more complex semantic representation, but preferred to explain this by a specific emotional deficit characteristic of participants with DS.</p>
<p>An older study by <xref ref-type="bibr" rid="B37">Leung and Singh (1998)</xref> showed that adults with ID had a poorer ability to recognize emotions than typically developing children. Also, in a sample of adults with mild and moderate ID, younger participants and people with milder ID recognized emotions better (<xref ref-type="bibr" rid="B46">McKenzie et al., 2000</xref>). This was also confirmed by <xref ref-type="bibr" rid="B43">Matheson and Jahoda (2005)</xref>, who found that the ability to recognize emotions decreased with age in adults with ID. A more balanced age structure of the participants and a larger sample might have influenced the obtained results, which could be a recommendation for future research.</p>
</sec>
<sec id="S5">
<title>Limitations</title>
<p>This research has several limitations. One refers to sample size: future studies should include many more participants so that the results can be generalized. Apart from its size, another sample-related limitation is the participants&#x2019; structure; future studies should include more participants with known etiology, classified into groups with different syndromes. The application of tests that require different levels of cognitive information processing could be expanded in future research by using additional tests at the same level of processing. In addition to verbal tasks, accompanying pictographic material should also be included when assessing people with ID, to support the comprehension of instructions.</p>
<p>Only one instrument was used in this research to assess comprehension and production of emotions. Thus, conclusions should be drawn with caution and future studies should verify these results using additional instruments.</p>
</sec>
<sec id="S6" sec-type="conclusion">
<title>Conclusion</title>
<p>The results obtained in our study showed that receptive language skills had a predictive value in paralinguistic comprehension of emotions, and voice frequency and interruptions, from the domain of acoustic voice characteristics, predicted paralinguistic production of emotions.</p>
<p>Since recognizing emotions is the basis of social interaction, as pointed out by many authors, it is important to encourage the skills which are most closely related to comprehension and production of emotions so that they could develop as well. Much more research in this area is needed before more precise educational and therapeutic guidelines can be established. With regard to this, future studies could go in the direction of longitudinal monitoring of these abilities in people with MID. Also, more precise measures of the level of intellectual functioning, in addition to Raven&#x2019;s Progressive Matrices, should be included in further analyses.</p>
</sec>
<sec id="S7" sec-type="data-availability">
<title>Data Availability Statement</title>
<p>The original contributions presented in this study are included in the article/<xref ref-type="supplementary-material" rid="DS1">Supplementary Material</xref>; further inquiries can be directed to the corresponding author.</p>
</sec>
<sec id="S8">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by Ethics Committee of the Faculty of Special Education and Rehabilitation, University of Belgrade (no. 109/1). The patients/participants provided their written informed consent to participate in this study.</p>
</sec>
<sec id="S9">
<title>Author Contributions</title>
<p>All authors listed have made a substantial, direct, and intellectual contribution to the work, and approved it for publication.</p>
</sec>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="pudiscl1" sec-type="disclaimer">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<sec id="S10" sec-type="funding-information">
<title>Funding</title>
<p>This manuscript was a result of research within the projects of the Ministry of Education, Science and Technological Development of the Republic of Serbia (no. 451-03-68/2022-14).</p>
</sec>
<sec id="S11" sec-type="supplementary-material">
<title>Supplementary Material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/fpsyg.2022.884242/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/fpsyg.2022.884242/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Data_Sheet_1.PDF" id="DS1" mimetype="application/pdf" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Agarwal Gupta</surname> <given-names>N.</given-names></name> <name><surname>Kabra</surname> <given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>Diagnosis and management of down syndrome.</article-title> <source><italic>Indian J. Pediatr.</italic></source> <volume>81</volume> <fpage>560</fpage>&#x2013;<lpage>567</lpage>. <pub-id pub-id-type="doi">10.1007/s12098-013-1249-7</pub-id> <pub-id pub-id-type="pmid">24127006</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Albanese</surname> <given-names>O.</given-names></name> <name><surname>De Stasio</surname> <given-names>S.</given-names></name> <name><surname>Chiacchio</surname> <given-names>C. D.</given-names></name> <name><surname>Fiorilli</surname> <given-names>C.</given-names></name> <name><surname>Pons</surname> <given-names>F.</given-names></name></person-group> (<year>2010</year>). <article-title>Emotion comprehension: the impact of nonverbal intelligence.</article-title> <source><italic>J. Genet. Psychol.</italic></source> <volume>171</volume> <fpage>101</fpage>&#x2013;<lpage>115</lpage>. <pub-id pub-id-type="doi">10.1080/00221320903548084</pub-id> <pub-id pub-id-type="pmid">20486399</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Albertini</surname> <given-names>G.</given-names></name> <name><surname>Bonassi</surname> <given-names>S.</given-names></name> <name><surname>Dall&#x2019;Armi</surname> <given-names>V.</given-names></name> <name><surname>Giachetti</surname> <given-names>I.</given-names></name> <name><surname>Giaquinto</surname> <given-names>S.</given-names></name> <name><surname>Mignano</surname> <given-names>M.</given-names></name></person-group> (<year>2010</year>). <article-title>Spectral analysis of the voice in down syndrome.</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>31</volume> <fpage>995</fpage>&#x2013;<lpage>1001</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2010.04.024</pub-id> <pub-id pub-id-type="pmid">20488659</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amad&#x00F3;</surname> <given-names>A.</given-names></name> <name><surname>Serrat</surname> <given-names>E.</given-names></name> <name><surname>Vall&#x00E8;s-Majoral</surname> <given-names>E.</given-names></name></person-group> (<year>2016</year>). <article-title>The role of executive functions in social cognition among children with Down syndrome: relationship patterns</article-title>. <source><italic>Front. Psychol.</italic></source> <volume>7</volume>:<issue>1363</issue>. <pub-id pub-id-type="doi">10.3389/fpsyg.2016.01363</pub-id> <pub-id pub-id-type="pmid">27679588</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><collab>American Psychiatric Association</collab> (<year>2013</year>). <source><italic>Diagnostic and Statistical Manual of Mental Disorders</italic></source>, <edition>5th Edn</edition>. <publisher-loc>Washington, D.C</publisher-loc>: <publisher-name>American Psychiatric Association</publisher-name>, <pub-id pub-id-type="doi">10.1176/appi.books.9780890425596</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Angeleri</surname> <given-names>R.</given-names></name> <name><surname>Bosco</surname> <given-names>F. M.</given-names></name> <name><surname>Gabbatore</surname> <given-names>I.</given-names></name> <name><surname>Bara</surname> <given-names>B. G.</given-names></name> <name><surname>Sacco</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>Assessment battery for communication (ABaCo): normative data.</article-title> <source><italic>Behav. Res. Methods</italic></source> <volume>44</volume> <fpage>845</fpage>&#x2013;<lpage>861</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-011-0174</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Angeleri</surname> <given-names>R.</given-names></name> <name><surname>Gabbatore</surname> <given-names>I.</given-names></name> <name><surname>Bosco</surname> <given-names>F. M.</given-names></name> <name><surname>Sacco</surname> <given-names>K.</given-names></name> <name><surname>Colle</surname> <given-names>L.</given-names></name></person-group> (<year>2016</year>). <article-title>Pragmatic abilities in children and adolescents with autism spectrum disorder: a study with the ABaCo battery.</article-title> <source><italic>Minerva Psichiatr.</italic></source> <volume>57</volume> <fpage>93</fpage>&#x2013;<lpage>103</lpage>.</citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Andr&#x00E9;s-Roqueta</surname> <given-names>C.</given-names></name> <name><surname>Soria-Izquierdo</surname> <given-names>E.</given-names></name> <name><surname>G&#x00F3;rriz-Plumed</surname> <given-names>A. B.</given-names></name></person-group> (<year>2021</year>). <article-title>Exploring different aspects of emotion understanding in adults with down syndrome.</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>114</volume>:<issue>103962</issue>. <pub-id pub-id-type="doi">10.1016/j.ridd.2021.103962</pub-id> <pub-id pub-id-type="pmid">33932849</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Banks</surname> <given-names>P.</given-names></name> <name><surname>Jahoda</surname> <given-names>A.</given-names></name> <name><surname>Dagnan</surname> <given-names>D.</given-names></name> <name><surname>Kemp</surname> <given-names>J.</given-names></name> <name><surname>Williams</surname> <given-names>V.</given-names></name></person-group> (<year>2010</year>). <article-title>Supported employment for people with intellectual disability: The effects of job breakdown on psychological well-being.</article-title> <source><italic>J. Appl. Res. Intellect. Disabil.</italic></source> <volume>23</volume> <fpage>344</fpage>&#x2013;<lpage>354</lpage>. <pub-id pub-id-type="doi">10.1111/j.1468-3148.2009.00541.x</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Banse</surname> <given-names>R.</given-names></name> <name><surname>Scherer</surname> <given-names>K. R.</given-names></name></person-group> (<year>1996</year>). <article-title>Acoustic profiles in vocal emotion expression.</article-title> <source><italic>J. Pers. Soc. Psychol.</italic></source> <volume>70</volume> <fpage>614</fpage>&#x2013;<lpage>636</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.70.3.614</pub-id> <pub-id pub-id-type="pmid">8851745</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barbey</surname> <given-names>A. K.</given-names></name> <name><surname>Colom</surname> <given-names>R.</given-names></name> <name><surname>Grafman</surname> <given-names>J.</given-names></name></person-group> (<year>2014</year>). <article-title>Distributed neural system for emotional intelligence revealed by lesion mapping.</article-title> <source><italic>Soc. Cogn. Affect. Neurosci.</italic></source> <volume>9</volume> <fpage>265</fpage>&#x2013;<lpage>272</lpage>. <pub-id pub-id-type="doi">10.1093/scan/nss124</pub-id> <pub-id pub-id-type="pmid">23171618</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barisnikov</surname> <given-names>K.</given-names></name> <name><surname>Thomasson</surname> <given-names>M.</given-names></name> <name><surname>Stutzmann</surname> <given-names>J.</given-names></name> <name><surname>Lejeune</surname> <given-names>F.</given-names></name></person-group> (<year>2020</year>). <article-title>Relation between processing facial identity and emotional expression in typically developing school-age children and those with Down syndrome.</article-title> <source><italic>Appl. Neuropsychol. Child</italic></source> <volume>9</volume> <fpage>179</fpage>&#x2013;<lpage>192</lpage>. <pub-id pub-id-type="doi">10.1080/21622965.2018.1552867</pub-id> <pub-id pub-id-type="pmid">30646753</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beck</surname> <given-names>L.</given-names></name> <name><surname>Kumschick</surname> <given-names>I. R.</given-names></name> <name><surname>Eid</surname> <given-names>M.</given-names></name> <name><surname>Klann-Delius</surname> <given-names>G.</given-names></name></person-group> (<year>2012</year>). <article-title>Relationship between language competence and emotional competence in middle childhood.</article-title> <source><italic>Emotion</italic></source> <volume>12</volume> <fpage>503</fpage>&#x2013;<lpage>514</lpage>. <pub-id pub-id-type="doi">10.1037/a0026320</pub-id> <pub-id pub-id-type="pmid">22148995</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Becker</surname> <given-names>S.</given-names></name> <name><surname>Nonn</surname> <given-names>K.</given-names></name> <name><surname>Graessel</surname> <given-names>E.</given-names></name> <name><surname>Becker</surname> <given-names>A. M.</given-names></name> <name><surname>Schuster</surname> <given-names>M.</given-names></name></person-group> (<year>2017</year>). <article-title>Voice disorders of adults with intellectual disability.</article-title> <source><italic>JSM Commun. Disord.</italic></source> <volume>1</volume>:<issue>1002</issue>.</citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bosco</surname> <given-names>F. M.</given-names></name> <name><surname>Angeleri</surname> <given-names>R.</given-names></name> <name><surname>Zuffranieri</surname> <given-names>M.</given-names></name> <name><surname>Bara</surname> <given-names>B. G.</given-names></name> <name><surname>Sacco</surname> <given-names>K.</given-names></name></person-group> (<year>2012</year>). <article-title>Assessment battery for communication: development of two equivalent forms.</article-title> <source><italic>J. Commun. Dis.</italic></source> <volume>45</volume> <fpage>290</fpage>&#x2013;<lpage>303</lpage>. <pub-id pub-id-type="doi">10.1016/j.jcomdis.2012.03.002</pub-id> <pub-id pub-id-type="pmid">22483360</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cattell</surname> <given-names>R. B.</given-names></name></person-group> (<year>1971</year>). <source><italic>Abilities: Their Structure, Growth, and Action.</italic></source> <publisher-loc>Boston, MA</publisher-loc>: <publisher-name>Houghton Mifflin</publisher-name>.</citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cebula</surname> <given-names>K. R.</given-names></name> <name><surname>Wishart</surname> <given-names>J. G.</given-names></name> <name><surname>Willis</surname> <given-names>D. S.</given-names></name> <name><surname>Pitcairn</surname> <given-names>T. K.</given-names></name></person-group> (<year>2017</year>). <article-title>Emotion recognition in children with Down syndrome: influence of emotion label and expression intensity.</article-title> <source><italic>Am. J. Intellect. Dev. Disabil.</italic></source> <volume>122</volume> <fpage>138</fpage>&#x2013;<lpage>155</lpage>. <pub-id pub-id-type="doi">10.1352/1944-7558-122.2.138</pub-id> <pub-id pub-id-type="pmid">28257244</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Corrales-Astorgano</surname> <given-names>M.</given-names></name> <name><surname>Escudero-Mancebo</surname> <given-names>D.</given-names></name> <name><surname>Gonzalez-Ferreras</surname> <given-names>C.</given-names></name></person-group> (<year>2018</year>). <article-title>Acoustic characterization and perceptual analysis of the relative importance of prosody in speech of people with Down syndrome.</article-title> <source><italic>Speech Commun.</italic></source> <volume>99</volume> <fpage>90</fpage>&#x2013;<lpage>100</lpage>. <pub-id pub-id-type="doi">10.1016/j.specom.2018.03.006</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dagnan</surname> <given-names>D.</given-names></name> <name><surname>Chadwick</surname> <given-names>P.</given-names></name> <name><surname>Proudlove</surname> <given-names>J.</given-names></name></person-group> (<year>2000</year>). <article-title>Toward an assessment of suitability of people with mental retardation for cognitive therapy.</article-title> <source><italic>Cogn. Ther. Res.</italic></source> <volume>24</volume> <fpage>627</fpage>&#x2013;<lpage>636</lpage>. <pub-id pub-id-type="doi">10.1023/a:1005531226519</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Djordjevic</surname> <given-names>M.</given-names></name> <name><surname>Glumbic</surname> <given-names>N.</given-names></name> <name><surname>Brojcin</surname> <given-names>B.</given-names></name></person-group> (<year>2016a</year>). <article-title>Paralinguistic abilities of adults with intellectual disability.</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>48</volume> <fpage>211</fpage>&#x2013;<lpage>219</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2015.11.001</pub-id> <pub-id pub-id-type="pmid">26625206</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Djordjevic</surname> <given-names>M.</given-names></name> <name><surname>Glumbic</surname> <given-names>N.</given-names></name> <name><surname>Brojcin</surname> <given-names>B.</given-names></name></person-group> (<year>2016b</year>). <article-title>Relation between paralinguistic skills and social skills in adults with mild and moderate intellectual disability.</article-title> <source><italic>Spec. Edukac. Rehabil.</italic></source> <volume>15</volume> <fpage>265</fpage>&#x2013;<lpage>285</lpage>. <pub-id pub-id-type="doi">10.5937/specedreh15-11313</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Djordjevic</surname> <given-names>M.</given-names></name> <name><surname>Glumbic</surname> <given-names>N.</given-names></name> <name><surname>Brojcin</surname> <given-names>B.</given-names></name></person-group> (<year>2018</year>). <article-title>Differences between pragmatic abilities of adults with Down syndrome and persons with intellectual disability of unknown etiology - preliminary research.</article-title> <source><italic>Belgrade Sch. Spec. Educ. Rehabil.</italic></source> <volume>24</volume> <fpage>29</fpage>&#x2013;<lpage>40</lpage>.</citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Djordjevic</surname> <given-names>M.</given-names></name> <name><surname>Glumbic</surname> <given-names>N.</given-names></name> <name><surname>Memisevic</surname> <given-names>H.</given-names></name></person-group> (<year>2020</year>). <article-title>Socialization in adults with intellectual disability: the effects of gender, mental illness, setting type, and level of intellectual disability.</article-title> <source><italic>J. Mental Health Res. Intellect. Disabil.</italic></source> <volume>13</volume> <fpage>364</fpage>&#x2013;<lpage>383</lpage>. <pub-id pub-id-type="doi">10.1080/19315864.2020.1815914</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dunn</surname> <given-names>L. M.</given-names></name> <name><surname>Dunn</surname> <given-names>D. M.</given-names></name></person-group> (<year>2007</year>). <source><italic>Peabody Picture Vocabulary Test, (PPVT-4).</italic></source> <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Pearson Assessments</publisher-name>, <pub-id pub-id-type="doi">10.1037/t15144-000</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dupuis</surname> <given-names>K.</given-names></name> <name><surname>Pichora-Fuller</surname> <given-names>M. K.</given-names></name></person-group> (<year>2010</year>). <article-title>Use of affective prosody by young and older adults.</article-title> <source><italic>Psychol. Aging</italic></source> <volume>25</volume> <fpage>16</fpage>&#x2013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1037/a0018777</pub-id> <pub-id pub-id-type="pmid">20230124</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fern&#x00E1;ndez-Alcaraz</surname> <given-names>C.</given-names></name> <name><surname>Extremera</surname> <given-names>M. R.</given-names></name> <name><surname>Garc&#x00ED;a-Andres</surname> <given-names>E.</given-names></name> <name><surname>Fernando</surname></name> <name><surname>Molina</surname> <given-names>C.</given-names></name></person-group> (<year>2010</year>). <article-title>Emotion recognition in Down&#x2019;s syndrome adults: neuropsychology approach.</article-title> <source><italic>Procedia Soc. Behav. Sci.</italic></source> <volume>5</volume> <fpage>2072</fpage>&#x2013;<lpage>2076</lpage>. <pub-id pub-id-type="doi">10.1016/j.sbspro.2010.07.415</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gautam</surname> <given-names>S.</given-names></name> <name><surname>Singh</surname> <given-names>L.</given-names></name></person-group> (<year>2016</year>). &#x201C;<article-title>A comparative study: Spectral parameter in speech of intellectually disabled and normal population</article-title>,&#x201D; in <source><italic>Proceedings of the 3rd International Conference on Computing for Sustainable Global Development</italic></source>, (<publisher-loc>New Delhi</publisher-loc>), <fpage>4009</fpage>&#x2013;<lpage>4013</lpage>.</citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gon&#x00E7;alves</surname> <given-names>A. R.</given-names></name> <name><surname>Fernandes</surname> <given-names>C.</given-names></name> <name><surname>Pasion</surname> <given-names>R.</given-names></name> <name><surname>Ferreira-Santos</surname> <given-names>F.</given-names></name> <name><surname>Barbosa</surname> <given-names>F.</given-names></name> <name><surname>Marques-Teixeira</surname> <given-names>J.</given-names></name></person-group> (<year>2018</year>). <article-title>Effects of age on the identification of emotions in facial expressions: a meta-analysis.</article-title> <source><italic>PeerJ.</italic></source> <volume>6</volume>:<issue>e5278</issue>. <pub-id pub-id-type="doi">10.7717/peerj.5278</pub-id> <pub-id pub-id-type="pmid">30065878</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hippolyte</surname> <given-names>L.</given-names></name> <name><surname>Barisnikov</surname> <given-names>K.</given-names></name> <name><surname>Van der Linden</surname> <given-names>M.</given-names></name></person-group> (<year>2008</year>). <article-title>Face processing and facial emotion recognition in adults with Down syndrome.</article-title> <source><italic>Am. J. Mental Retard.</italic></source> <volume>113</volume> <fpage>292</fpage>&#x2013;<lpage>306</lpage>. <pub-id pub-id-type="doi">10.1352/0895-80172008113</pub-id> <pub-id pub-id-type="pmid">15067480</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Joyce</surname> <given-names>T.</given-names></name> <name><surname>Globe</surname> <given-names>A.</given-names></name> <name><surname>Moody</surname> <given-names>C.</given-names></name></person-group> (<year>2006</year>). <article-title>Assessment of the component skills for cognitive therapy in adults with intellectual disability.</article-title> <source><italic>J. Appl. Res. Intellect. Disabil.</italic></source> <volume>19</volume> <fpage>17</fpage>&#x2013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1111/j.1468-3148.2005.00287.x</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Juslin</surname> <given-names>P. N.</given-names></name> <name><surname>Laukka</surname> <given-names>P.</given-names></name></person-group> (<year>2003</year>). <article-title>Communication of emotions in vocal expression and music performance: different channels, same code?</article-title> <source><italic>Psychol. Bull.</italic></source> <volume>129</volume> <fpage>770</fpage>&#x2013;<lpage>814</lpage>. <pub-id pub-id-type="doi">10.1037/0033-2909.129.5.770</pub-id> <pub-id pub-id-type="pmid">12956543</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kashani-Vahid</surname> <given-names>L.</given-names></name> <name><surname>Mohajeri</surname> <given-names>M.</given-names></name> <name><surname>Moradi</surname> <given-names>H.</given-names></name> <name><surname>Irani</surname> <given-names>A.</given-names></name></person-group> (<year>2018</year>). &#x201C;<article-title>Effectiveness of computer games of emotion regulation on social skills of children with intellectual disability</article-title>,&#x201D; in <source><italic>Proceedings of the 2018 2nd National and 1st International Digital Games Research Conference: Trends, Technologies, and Applications (DGRC)</italic></source>, (<publisher-loc>Tehran</publisher-loc>), <pub-id pub-id-type="doi">10.1109/DGRC.2018.8712024</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Landowska</surname> <given-names>A.</given-names></name> <name><surname>Karpus</surname> <given-names>A.</given-names></name> <name><surname>Zawadzka</surname> <given-names>T.</given-names></name> <name><surname>Robins</surname> <given-names>B.</given-names></name> <name><surname>Erol Barkana</surname> <given-names>D.</given-names></name> <name><surname>Kose</surname> <given-names>H.</given-names></name><etal/></person-group> (<year>2022</year>). <article-title>Automatic emotion recognition in children with autism: a systematic literature review.</article-title> <source><italic>Sensors</italic></source> <volume>22</volume>:<issue>1649</issue>. <pub-id pub-id-type="doi">10.3390/s22041649</pub-id> <pub-id pub-id-type="pmid">35214551</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Laukka</surname> <given-names>P.</given-names></name> <name><surname>Elfenbein</surname> <given-names>H. A.</given-names></name> <name><surname>S&#x00F6;der</surname> <given-names>N.</given-names></name> <name><surname>Nordstr&#x00F6;m</surname> <given-names>H.</given-names></name> <name><surname>Althoff</surname> <given-names>J.</given-names></name> <name><surname>Chui</surname> <given-names>W.</given-names></name><etal/></person-group> (<year>2013</year>). <article-title>Cross-cultural decoding of positive and negative non-linguistic emotion vocalizations.</article-title> <source><italic>Front. Psychol.</italic></source> <volume>4</volume>:<issue>353</issue>. <pub-id pub-id-type="doi">10.3389/fpsyg.2013.00353</pub-id> <pub-id pub-id-type="pmid">23914178</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Laws</surname> <given-names>G.</given-names></name> <name><surname>Bishop</surname> <given-names>D. V. M.</given-names></name></person-group> (<year>2004</year>). <article-title>Pragmatic language impairment and social deficits in Williams syndrome: a comparison with Down&#x2019;s syndrome and specific language impairment.</article-title> <source><italic>Int. J. Lang. Commun. Disord.</italic></source> <volume>39</volume> <fpage>45</fpage>&#x2013;<lpage>64</lpage>. <pub-id pub-id-type="doi">10.1080/13682820310001615797</pub-id> <pub-id pub-id-type="pmid">14660186</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>M. T.</given-names></name> <name><surname>Thorpe</surname> <given-names>J.</given-names></name> <name><surname>Verhoeven</surname> <given-names>J.</given-names></name></person-group> (<year>2009</year>). <article-title>Intonation and phonation in young adults with Down syndrome.</article-title> <source><italic>J. Voice</italic></source> <volume>23</volume> <fpage>82</fpage>&#x2013;<lpage>87</lpage>. <pub-id pub-id-type="doi">10.1016/j.jvoice.2007.04.006</pub-id> <pub-id pub-id-type="pmid">17658722</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leung</surname> <given-names>J. P.</given-names></name> <name><surname>Singh</surname> <given-names>N. N.</given-names></name></person-group> (<year>1998</year>). <article-title>Recognition of facial expressions of emotion by Chinese adults with mental retardation.</article-title> <source><italic>Behav. Modif.</italic></source> <volume>22</volume> <fpage>205</fpage>&#x2013;<lpage>216</lpage>. <pub-id pub-id-type="doi">10.1177/01454455980222008</pub-id> <pub-id pub-id-type="pmid">9563293</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Loveall</surname> <given-names>S. J.</given-names></name> <name><surname>Channell</surname> <given-names>M. M.</given-names></name> <name><surname>Phillips</surname> <given-names>B. A.</given-names></name> <name><surname>Abbeduto</surname> <given-names>L.</given-names></name> <name><surname>Conners</surname> <given-names>F. A.</given-names></name></person-group> (<year>2016</year>). <article-title>Receptive vocabulary analysis in Down syndrome.</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>55</volume> <fpage>161</fpage>&#x2013;<lpage>172</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2016.03.018</pub-id> <pub-id pub-id-type="pmid">27084992</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mart&#x00ED;nez-Gonz&#x00E1;lez</surname> <given-names>A. E.</given-names></name> <name><surname>Veas</surname> <given-names>A.</given-names></name></person-group> (<year>2019</year>). <article-title>Identification of emotions and physiological response in individuals with moderate intellectual disability.</article-title> <source><italic>Int. J. Dev. Disabil.</italic></source> <volume>67</volume> <fpage>397</fpage>&#x2013;<lpage>402</lpage>. <pub-id pub-id-type="doi">10.1080/20473869.2019.1651142</pub-id> <pub-id pub-id-type="pmid">34925769</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Martinez-Martin</surname> <given-names>E.</given-names></name> <name><surname>Escalona</surname> <given-names>F.</given-names></name> <name><surname>Cazorla</surname> <given-names>M.</given-names></name></person-group> (<year>2020</year>). <article-title>Socially assistive robots for older adults and people with autism: an overview.</article-title> <source><italic>Electronics</italic></source> <volume>9</volume> <fpage>367</fpage>&#x2013;<lpage>383</lpage>. <pub-id pub-id-type="doi">10.3390/electronics9020367</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ma</surname> <given-names>K.</given-names></name> <name><surname>Wang</surname> <given-names>X.</given-names></name> <name><surname>Yang</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>M.</given-names></name> <name><surname>Girard</surname> <given-names>J. M.</given-names></name> <name><surname>Morency</surname> <given-names>L. P.</given-names></name></person-group> (<year>2019</year>). &#x201C;<article-title>ElderReact: a multimodal dataset for recognizing emotional response in aging adults</article-title>,&#x201D; in <source><italic>Proceedings of the 2019 International Conference on Multimodal Interaction</italic></source>, (<publisher-loc>New York, NY</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>349</fpage>&#x2013;<lpage>357</lpage>. <pub-id pub-id-type="doi">10.1145/3340555.3353747</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Martin</surname> <given-names>G. E.</given-names></name> <name><surname>Klusek</surname> <given-names>J.</given-names></name> <name><surname>Estigarribia</surname> <given-names>B.</given-names></name> <name><surname>Roberts</surname> <given-names>J. E.</given-names></name></person-group> (<year>2009</year>). <article-title>Language characteristics of individuals with Down Syndrome.</article-title> <source><italic>Top. Lang. Disord.</italic></source> <volume>29</volume> <fpage>112</fpage>&#x2013;<lpage>132</lpage>. <pub-id pub-id-type="doi">10.1097/TLD.0b013e3181a71fe1</pub-id> <pub-id pub-id-type="pmid">20428477</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Matheson</surname> <given-names>E.</given-names></name> <name><surname>Jahoda</surname> <given-names>A.</given-names></name></person-group> (<year>2005</year>). <article-title>Emotional understanding in aggressive and nonaggressive individuals with mild or moderate mental retardation.</article-title> <source><italic>Am. J. Mental Retard.</italic></source> <volume>110</volume> <fpage>57</fpage>&#x2013;<lpage>67</lpage>. <pub-id pub-id-type="doi">10.1352/0895-80172005110</pub-id> <pub-id pub-id-type="pmid">15067480</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mazzoni</surname> <given-names>N.</given-names></name> <name><surname>Landi</surname> <given-names>I.</given-names></name> <name><surname>Ricciardelli</surname> <given-names>P.</given-names></name> <name><surname>Actis-Grosso</surname> <given-names>R.</given-names></name> <name><surname>Venuti</surname> <given-names>P.</given-names></name></person-group> (<year>2020</year>). <article-title>Motion or emotion? Recognition of emotional bodily expressions in children with autism spectrum disorder with and without intellectual disability.</article-title> <source><italic>Front. Psychol.</italic></source> <volume>11</volume>:<issue>478</issue>. <pub-id pub-id-type="doi">10.3389/fpsyg.2020.00478</pub-id> <pub-id pub-id-type="pmid">32269539</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McKenzie</surname> <given-names>K.</given-names></name> <name><surname>Murray</surname> <given-names>G.</given-names></name> <name><surname>Murray</surname> <given-names>A.</given-names></name> <name><surname>Whelan</surname> <given-names>K.</given-names></name> <name><surname>Cossar</surname> <given-names>J.</given-names></name> <name><surname>Murray</surname> <given-names>K.</given-names></name><etal/></person-group> (<year>2021</year>). <article-title>Emotion recognition and processing style in children with an intellectual disability.</article-title> <source><italic>Learn. Disabil. Pract.</italic></source> <volume>22</volume> <fpage>20</fpage>&#x2013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.7748/LDP.2019.E1982</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>McKenzie</surname> <given-names>K.</given-names></name> <name><surname>Matheson</surname> <given-names>E.</given-names></name> <name><surname>McKaskie</surname> <given-names>K.</given-names></name> <name><surname>Hamilton</surname> <given-names>L.</given-names></name> <name><surname>Murray</surname> <given-names>G. C.</given-names></name></person-group> (<year>2000</year>). <article-title>Impact of group training on emotion recognition in individuals with a learning disability.</article-title> <source><italic>Br. J. Learning Disabil.</italic></source> <volume>28</volume> <fpage>143</fpage>&#x2013;<lpage>147</lpage>. <pub-id pub-id-type="doi">10.1046/j.1468-3156.2000.00061.x</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mendhakar</surname> <given-names>A. M.</given-names></name> <name><surname>Sreedevi</surname> <given-names>N.</given-names></name> <name><surname>Arunraj</surname> <given-names>K.</given-names></name> <name><surname>Shanbal</surname> <given-names>J. C.</given-names></name></person-group> (<year>2019</year>). <article-title>Infant screening system based on cry analysis.</article-title> <source><italic>Int. Ann. Sci.</italic></source> <volume>6</volume> <fpage>1</fpage>&#x2013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.21467/ias.6.1.1-7</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moore</surname> <given-names>D. G.</given-names></name></person-group> (<year>2001</year>). <article-title>Reassessing emotion recognition performance in people with mental retardation: a review.</article-title> <source><italic>Am. J. Mental Retard.</italic></source> <volume>106</volume> <fpage>481</fpage>&#x2013;<lpage>502</lpage>. <pub-id pub-id-type="doi">10.1352/0895-8017(2001)106&#x003C;0481:RERPIP&#x003E;2.0.CO;2</pub-id> <pub-id pub-id-type="pmid">15067480</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Murray</surname> <given-names>G.</given-names></name> <name><surname>McKenzie</surname> <given-names>K.</given-names></name> <name><surname>Murray</surname> <given-names>A.</given-names></name> <name><surname>Whelan</surname> <given-names>K.</given-names></name> <name><surname>Cossar</surname> <given-names>J.</given-names></name> <name><surname>Murray</surname> <given-names>K.</given-names></name><etal/></person-group> (<year>2019</year>). <article-title>The impact of contextual information on the emotion recognition of children with an intellectual disability.</article-title> <source><italic>J. Appl. Res. Intellect. Disabil.</italic></source> <volume>32</volume> <fpage>152</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1111/jar.12517</pub-id> <pub-id pub-id-type="pmid">30014564</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>N&#x00ED; Chasaide</surname> <given-names>A.</given-names></name> <name><surname>Gobl</surname> <given-names>C.</given-names></name></person-group> (<year>2004</year>). &#x201C;<article-title>Voice quality and f0 in prosody: towards a holistic account</article-title>,&#x201D; in <source><italic>Proceedings of the 2nd International Conference on Speech Prosody</italic></source>, (<publisher-loc>Nara</publisher-loc>), <fpage>189</fpage>&#x2013;<lpage>196</lpage>.</citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>O&#x2019;Leary</surname> <given-names>D.</given-names></name> <name><surname>Lee</surname> <given-names>A.</given-names></name> <name><surname>O&#x2019;Toole</surname> <given-names>C.</given-names></name> <name><surname>Gibbon</surname> <given-names>F.</given-names></name></person-group> (<year>2019</year>). <article-title>Perceptual and acoustic evaluation of speech production in Down syndrome: a case series.</article-title> <source><italic>Clin. Linguist. Phon.</italic></source> <volume>34</volume> <fpage>1</fpage>&#x2013;<lpage>2</lpage>. <pub-id pub-id-type="doi">10.1080/02699206.2019.1611925</pub-id> <pub-id pub-id-type="pmid">31345071</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pell</surname> <given-names>M. D.</given-names></name> <name><surname>Monetta</surname> <given-names>L.</given-names></name> <name><surname>Paulmann</surname> <given-names>S.</given-names></name> <name><surname>Kotz</surname> <given-names>S. A.</given-names></name></person-group> (<year>2009</year>). <article-title>Recognizing emotions in a foreign language.</article-title> <source><italic>J. Nonverbal Behav.</italic></source> <volume>33</volume> <fpage>107</fpage>&#x2013;<lpage>120</lpage>. <pub-id pub-id-type="doi">10.1007/s10919-008-0065-7</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Petrovic-Lazic</surname> <given-names>M.</given-names></name> <name><surname>Babac</surname> <given-names>S.</given-names></name> <name><surname>Ivankovic</surname> <given-names>Z.</given-names></name> <name><surname>Kosanovic</surname> <given-names>R.</given-names></name></person-group> (<year>2009</year>). <article-title>Multidimensional acoustic analysis of pathological voice.</article-title> <source><italic>Serb. Arch. Med.</italic></source> <volume>137</volume> <fpage>234</fpage>&#x2013;<lpage>238</lpage>. <pub-id pub-id-type="doi">10.2298/SARH0906234P</pub-id> <pub-id pub-id-type="pmid">19594063</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Petrovic-Lazic</surname> <given-names>M.</given-names></name> <name><surname>Jovanovic</surname> <given-names>N.</given-names></name> <name><surname>Kulic</surname> <given-names>N.</given-names></name> <name><surname>Babac</surname> <given-names>S.</given-names></name> <name><surname>Jurisic</surname> <given-names>V.</given-names></name></person-group> (<year>2014</year>). <article-title>Acoustic and perceptual characteristics of the voice in patients with vocal polyps after surgery and voice therapy.</article-title> <source><italic>J. Voice</italic></source> <volume>29</volume> <fpage>241</fpage>&#x2013;<lpage>246</lpage>. <pub-id pub-id-type="doi">10.1016/j.jvoice.2014.07.009</pub-id> <pub-id pub-id-type="pmid">25301300</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pochon</surname> <given-names>R.</given-names></name> <name><surname>Touchet</surname> <given-names>C.</given-names></name> <name><surname>Ibernon</surname> <given-names>L.</given-names></name></person-group> (<year>2017</year>). <article-title>Emotion recognition in adolescents with Down Syndrome: a nonverbal approach.</article-title> <source><italic>Brain Sci.</italic></source> <volume>7</volume>:<issue>55</issue>. <pub-id pub-id-type="doi">10.3390/brainsci7060055</pub-id> <pub-id pub-id-type="pmid">28545237</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Raven</surname> <given-names>J.</given-names></name> <name><surname>Raven</surname> <given-names>J. C.</given-names></name></person-group> (<year>1998</year>). <source><italic>Manual for Raven&#x2019;s Progressive Matrices and Vocabulary Scales Standard Progressive Matrices.</italic></source> <publisher-loc>Perth</publisher-loc>: <publisher-name>Naklada Slap</publisher-name>.</citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Roch</surname> <given-names>M.</given-names></name> <name><surname>Pesciarelli</surname> <given-names>F.</given-names></name> <name><surname>Leo</surname> <given-names>I.</given-names></name></person-group> (<year>2020</year>). <article-title>How individuals with Down syndrome process faces and words conveying emotions? Evidence from a priming paradigm.</article-title> <source><italic>Front. Psychol.</italic></source> <volume>11</volume>:<issue>692</issue>. <pub-id pub-id-type="doi">10.3389/fpsyg.2020.00692</pub-id> <pub-id pub-id-type="pmid">32362859</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rot</surname> <given-names>N.</given-names></name></person-group> (<year>2004</year>). <source><italic>Znakovi i zna&#x010D;enja.</italic></source> <publisher-loc>Beograd</publisher-loc>: <publisher-name>Plato</publisher-name>.</citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sacco</surname> <given-names>K.</given-names></name> <name><surname>Angeleri</surname> <given-names>R.</given-names></name> <name><surname>Bosco</surname> <given-names>F. M.</given-names></name> <name><surname>Colle</surname> <given-names>L.</given-names></name> <name><surname>Mate</surname> <given-names>D.</given-names></name> <name><surname>Bara</surname> <given-names>B. G.</given-names></name></person-group> (<year>2008</year>). <article-title>Assessment battery for communication - ABaCo: a new instrument for the evaluation of pragmatic abilities.</article-title> <source><italic>J. Cogn. Sci.</italic></source> <volume>9</volume> <fpage>111</fpage>&#x2013;<lpage>157</lpage>. <pub-id pub-id-type="doi">10.17791/jcs.2008.9.2.111</pub-id></citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Salomone</surname> <given-names>E.</given-names></name> <name><surname>Bulgarelli</surname> <given-names>D.</given-names></name> <name><surname>Thommen</surname> <given-names>E.</given-names></name> <name><surname>Rossini</surname> <given-names>E.</given-names></name> <name><surname>Molina</surname> <given-names>P.</given-names></name></person-group> (<year>2019</year>). <article-title>Role of age and IQ in emotion understanding in autism spectrum disorder: implications for educational interventions.</article-title> <source><italic>Eur. J. Spec. Needs Educ.</italic></source> <volume>34</volume> <fpage>383</fpage>&#x2013;<lpage>392</lpage>. <pub-id pub-id-type="doi">10.1080/08856257.2018.1451292</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sauter</surname> <given-names>D. A.</given-names></name> <name><surname>Eisner</surname> <given-names>F.</given-names></name> <name><surname>Calder</surname> <given-names>A. J.</given-names></name> <name><surname>Scott</surname> <given-names>S. K.</given-names></name></person-group> (<year>2010</year>). <article-title>Perceptual cues in nonverbal vocal expressions of emotion.</article-title> <source><italic>Q. J. Exp. Psychol.</italic></source> <volume>63</volume> <fpage>2251</fpage>&#x2013;<lpage>2272</lpage>. <pub-id pub-id-type="doi">10.1080/17470211003721642</pub-id> <pub-id pub-id-type="pmid">20437296</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sauter</surname> <given-names>D. A.</given-names></name> <name><surname>Scott</surname> <given-names>S. K.</given-names></name></person-group> (<year>2007</year>). <article-title>More than one kind of happiness: can we recognize vocal expressions of different positive states?</article-title> <source><italic>Motiv. Emot.</italic></source> <volume>31</volume> <fpage>192</fpage>&#x2013;<lpage>199</lpage>. <pub-id pub-id-type="doi">10.1007/s11031-007-9065-x</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scherer</surname> <given-names>K. R.</given-names></name></person-group> (<year>2007</year>). &#x201C;<article-title>Componential emotion theory can inform models of emotional competence</article-title>,&#x201D; in <source><italic>The Science of Emotional Intelligence: Knowns and Unknowns</italic></source>, <role>eds</role> <person-group person-group-type="editor"><name><surname>Matthews</surname> <given-names>G.</given-names></name> <name><surname>Zeidner</surname> <given-names>M.</given-names></name> <name><surname>Roberts</surname> <given-names>R. D.</given-names></name></person-group> (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>), <fpage>101</fpage>&#x2013;<lpage>126</lpage>.</citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scherer</surname> <given-names>K.</given-names></name></person-group> (<year>2003</year>). <article-title>Vocal communication of emotion: a review of research paradigms.</article-title> <source><italic>Speech Commun.</italic></source> <volume>40</volume> <fpage>227</fpage>&#x2013;<lpage>256</lpage>. <pub-id pub-id-type="doi">10.1016/S0167-6393(02)00084-5</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scherer</surname> <given-names>K. R.</given-names></name> <name><surname>Banse</surname> <given-names>R.</given-names></name> <name><surname>Wallbott</surname> <given-names>H. G.</given-names></name> <name><surname>Goldbeck</surname> <given-names>T.</given-names></name></person-group> (<year>1991</year>). <article-title>Vocal cues in emotion encoding and decoding.</article-title> <source><italic>Motiv. Emot.</italic></source> <volume>15</volume> <fpage>123</fpage>&#x2013;<lpage>148</lpage>. <pub-id pub-id-type="doi">10.1007/bf00995674</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schlegel</surname> <given-names>K.</given-names></name> <name><surname>Palese</surname> <given-names>T.</given-names></name> <name><surname>Mast</surname> <given-names>M.</given-names></name> <name><surname>Rammsayer</surname> <given-names>T. H.</given-names></name> <name><surname>Hall</surname> <given-names>J. A.</given-names></name> <name><surname>Murphy</surname> <given-names>N.</given-names></name></person-group> (<year>2020</year>). <article-title>A meta-analysis of the relationship between emotion recognition ability and intelligence.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>34</volume> <fpage>329</fpage>&#x2013;<lpage>351</lpage>. <pub-id pub-id-type="doi">10.1080/02699931.2019.1632801</pub-id> <pub-id pub-id-type="pmid">31221021</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scotland</surname> <given-names>J.</given-names></name> <name><surname>Cossar</surname> <given-names>J.</given-names></name> <name><surname>McKenzie</surname> <given-names>K.</given-names></name></person-group> (<year>2015</year>). <article-title>The ability of adults with an intellectual disability to recognise facial expressions of emotion in comparison with typically developing individuals: a systematic review.</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>41-42</volume> <fpage>22</fpage>&#x2013;<lpage>39</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2015.05.007</pub-id> <pub-id pub-id-type="pmid">26057835</pub-id></citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scotland</surname> <given-names>J.</given-names></name> <name><surname>McKenzie</surname> <given-names>K.</given-names></name> <name><surname>Cossar</surname> <given-names>J.</given-names></name> <name><surname>Murray</surname> <given-names>A. L.</given-names></name> <name><surname>Michie</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>Recognition of facial expressions of emotion by adults with intellectual disability: is there evidence for the emotion specificity hypothesis?</article-title> <source><italic>Res. Dev. Disabil.</italic></source> <volume>48</volume> <fpage>69</fpage>&#x2013;<lpage>78</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2015.10.018</pub-id> <pub-id pub-id-type="pmid">26546741</pub-id></citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Simon-Thomas</surname> <given-names>E. R.</given-names></name> <name><surname>Keltner</surname> <given-names>D. J.</given-names></name> <name><surname>Sauter</surname> <given-names>D.</given-names></name> <name><surname>Sinicropi-Yao</surname> <given-names>L.</given-names></name> <name><surname>Abramson</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>The voice conveys specific emotions: evidence from vocal burst displays.</article-title> <source><italic>Emotion</italic></source> <volume>9</volume> <fpage>838</fpage>&#x2013;<lpage>846</lpage>. <pub-id pub-id-type="doi">10.1037/a0017810</pub-id> <pub-id pub-id-type="pmid">20001126</pub-id></citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Smirni</surname> <given-names>D.</given-names></name> <name><surname>Smirni</surname> <given-names>P.</given-names></name> <name><surname>Di Martino</surname> <given-names>G.</given-names></name> <name><surname>Operto</surname> <given-names>F.</given-names></name> <name><surname>Carotenuto</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>Emotional awareness and cognitive performance in borderline intellectual functioning young adolescents.</article-title> <source><italic>J. Nerv. Ment. Dis.</italic></source> <volume>207</volume> <fpage>365</fpage>&#x2013;<lpage>370</lpage>. <pub-id pub-id-type="doi">10.1097/NMD.0000000000000972</pub-id> <pub-id pub-id-type="pmid">30932986</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sullivan</surname> <given-names>S.</given-names></name> <name><surname>Campbell</surname> <given-names>A.</given-names></name> <name><surname>Hutton</surname> <given-names>S. B.</given-names></name> <name><surname>Ruffman</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>What&#x2019;s good for the goose is not good for the gander: age and gender differences in scanning emotion faces.</article-title> <source><italic>J. Gerontol. Ser. B. Psychol. Sci. Soc. Sci.</italic></source> <volume>72</volume> <fpage>441</fpage>&#x2013;<lpage>447</lpage>. <pub-id pub-id-type="doi">10.1093/geronb/gbv033</pub-id> <pub-id pub-id-type="pmid">25969472</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tomi&#x0107;</surname> <given-names>Z.</given-names></name></person-group> (<year>2014</year>). <source><italic>Razumevanje i Nesporazumi.</italic></source> <publisher-loc>Belgrade</publisher-loc>: <publisher-name>&#x010C;igoja &#x0161;tampa</publisher-name>.</citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Trentacosta</surname> <given-names>C. J.</given-names></name> <name><surname>Fine</surname> <given-names>S. E.</given-names></name></person-group> (<year>2010</year>). <article-title>Emotion knowledge, social competence, and behavior problems in childhood and adolescence: a meta-analytic review.</article-title> <source><italic>Soc. Dev.</italic></source> <volume>19</volume> <fpage>1</fpage>&#x2013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-9507.2009.00543.x</pub-id> <pub-id pub-id-type="pmid">21072259</pub-id></citation></ref>
<ref id="B74"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wishart</surname> <given-names>J. G.</given-names></name> <name><surname>Cebula</surname> <given-names>K. R.</given-names></name> <name><surname>Willis</surname> <given-names>D. S.</given-names></name> <name><surname>Pitcairn</surname> <given-names>T. K.</given-names></name></person-group> (<year>2007</year>). <article-title>Understanding of facial expressions of emotion by children with intellectual disabilities of differing aetiology.</article-title> <source><italic>J. Intellect. Disabil. Res.</italic></source> <volume>51</volume> <fpage>551</fpage>&#x2013;<lpage>563</lpage>. <pub-id pub-id-type="doi">10.1111/j.1365-2788.2006.00947.x</pub-id> <pub-id pub-id-type="pmid">17537169</pub-id></citation></ref>
<ref id="B75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zampini</surname> <given-names>L.</given-names></name> <name><surname>Fasolo</surname> <given-names>M.</given-names></name> <name><surname>Spinelli</surname> <given-names>M.</given-names></name> <name><surname>Zanchi</surname> <given-names>P.</given-names></name> <name><surname>Suttora</surname> <given-names>C.</given-names></name> <name><surname>Salerni</surname> <given-names>N.</given-names></name></person-group> (<year>2016</year>). <article-title>Prosodic skills in children with Down syndrome and in typically developing children.</article-title> <source><italic>Int. J. Lang. Commun. Disord.</italic></source> <volume>51</volume> <fpage>74</fpage>&#x2013;<lpage>83</lpage>.</citation></ref>
</ref-list>
<fn-group>
<fn id="footnote1">
<label>1</label>
<p>Down syndrome (DS) is the most common genetic cause of intellectual disability, with an overall prevalence varying from 1 in 800 to 1 in 1,200. Most people with DS function within the mild to moderate ID range. Their language and communication skills are underdeveloped, primarily because of speech delays, poor articulation associated with a narrow oral cavity, and difficulties with language comprehension (<xref ref-type="bibr" rid="B1">Agarwal Gupta and Kabra, 2013</xref>).</p></fn>
<fn id="footnote2">
<label>2</label>
<p>There are four basic ID levels: mild, moderate (MID), severe, and profound. About 10% of people with ID function at the MID level. According to the DSM-IV, persons with moderate ID have an IQ in the range of 35&#x2013;49. These people can take care of themselves, travel to familiar places in their community, and learn basic skills related to safety and health, although their self-care requires moderate support. In other words, people with MID can perform most daily activities independently with occasional support (<xref ref-type="bibr" rid="B5">American Psychiatric Association, 2013</xref>). Although they can communicate successfully at a basic level, they may have difficulty understanding non-verbal and subtle signals in social situations.</p></fn>
<fn id="footnote3">
<label>3</label>
<p>Day-care services in Serbia are social protection services for children and young people with physical impairment and intellectual disability. According to the current Policy, the purpose of day-care services is to improve the quality of life of users in their social environment by maintaining and developing social, psychological, and physical functions and skills, so as to enable them to live as independently as possible (the Policy was published in the &#x201C;Official Gazette of RS,&#x201D; No. 42/2013, on 14th May 2013, and became effective on 22nd May 2013).</p></fn>
</fn-group>
</back>
</article>