

Brain mechanisms of eye contact during verbal communication predict autistic traits in neurotypical individuals

Abstract

Atypical eye contact in communication is a common characteristic in autism spectrum disorders. Autistic traits vary along a continuum extending into the neurotypical population. The relation between autistic traits and brain mechanisms underlying spontaneous eye contact during verbal communication remains unexplored. Here, we used simultaneous functional magnetic resonance imaging and eye tracking to investigate this relation in neurotypical people within a naturalistic verbal context. Using multiple regression analyses, we found that brain response in the posterior superior temporal sulcus (pSTS) and its connectivity with the fusiform face area (FFA) during eye contact with a speaker predicted the level of autistic traits measured by the Autism-Spectrum Quotient (AQ). Further analyses for different AQ subclusters revealed that these two predictors were negatively associated with attention to detail. The relation between FFA–pSTS connectivity and the attention to detail ability was mediated by individuals’ looking preferences for the speaker’s eyes. This study identified the role of an individual eye contact pattern in the relation between brain mechanisms underlying natural eye contact during verbal communication and autistic traits in neurotypical people. The findings may help to increase our understanding of the mechanisms of atypical eye contact behavior during natural communication.

Introduction

Eye contact is an important non-verbal component of social interaction (for reviews, see Refs.1,2,3). It occurs spontaneously and frequently during face-to-face communication4,5. An atypical pattern of eye contact is often observed in people on the autism spectrum6, and an atypical pattern of eye contact is also part of the criteria for diagnosing autism spectrum disorder (ASD)7. Autistic traits are not confined to the clinical population but also vary within the general population7,8,9. Autistic traits are associated with particular eye contact behaviors10,11. However, the brain mechanisms that link autistic traits and a particular eye contact pattern are unknown.

A network of brain regions is involved in eye contact processing in non-verbal situations (for reviews, see Refs.1,2) as well as in verbal communication12. The fast-track modulator model2 of eye contact has been developed mainly on findings from studies that investigated eye contact in non-verbal situations. It assumes that this network is composed of subcortical [including superior colliculus (SC), pulvinar (Pulv) and amygdala (Amy)] and cortical visual areas [including lateral occipital cortex (LOC) and inferior temporal cortex (ITC)] interacting with brain regions of the so-called ‘social brain network’ [including Amy and orbitofrontal cortex (OFC) for emotion, pSTS and medial prefrontal cortex (mPFC) for intentionality, right anterior STS (aSTS) for gaze direction, and fusiform gyrus (FG) for face identity]. The regions within the social brain network are thought to interact with each other and to be modulated by dorsolateral prefrontal cortex (dlPFC). For verbal communication, a similar network has been identified in our previous study12. It includes increased responses in the right pSTS, left mPFC, and right dlPFC as well as visual cortices, and widespread enhanced connectivity between these regions and all other regions involved in the fast-track modulator model2. Several recent neuroimaging studies have shown that responses in parts of this eye contact network, for example in the right pSTS, were associated with the amount of autistic traits in neurotypical individuals13,14,15. This relation was found in non-verbal situations in which participants passively watched faces with a direct or averted gaze. It remains unexplored whether a similar relation is also present between autistic traits and any regions and/or connectivity in the network for eye contact processing during verbal communication that we identified in our previous study12.
Addressing this question is important, because atypical eye contact is especially prominent in natural, complex, and cognitively demanding situations (e.g., when watching a speaker talking) in both ASD16,17,18,19,20 and neurotypical people with high autistic traits21.

Furthermore, neurotypical people show large inter-individual differences in looking preference for the eyes in both audiovisual speech perception22 and static face viewing23,24. These individual differences in looking behavior are very stable over time23,24. They have been associated with the level of overall autistic traits (Refs.11,25, but see also Ref.26) or subclusters (“Social” and “Attention-to-detail”) of autistic traits10,27. In addition, findings from neuroimaging studies in neurotypical28 and autistic29 individuals both suggested that variation in eye movement pattern influences brain responses. Therefore, it is currently unclear whether the association between brain responses and autistic traits found in previous studies13,14,15 is due to different types and amounts of visual input caused by individuals’ looking preferences or to a processing mechanism itself that is independent of visual input. To clarify this, it is crucial to distinguish the brain mechanisms of eye contact (when participants are looking at the eyes) from those of looking at other facial features. It is also of great importance to consider the level of each participant’s looking preference for the eyes when investigating the relation between brain mechanisms of eye contact and autistic traits. It is likely that the brain mechanisms of eye contact are related to individual looking preference, which is in turn related to the overall and/or a specific subcluster of autistic traits.

Therefore, we had two goals in our study: First, to investigate whether the brain mechanisms underlying spontaneous eye contact during verbal communication are related to the overall and/or specific aspects of autistic traits in neurotypical individuals. Second, to examine whether an individual’s looking preference for the eyes mediates the relation between brain mechanisms of eye contact and the overall and/or specific aspects of autistic traits.

To address our goals, we conducted novel analyses based on fMRI data collected in our previous study12 and combined them with measures of AQ in the same subject group. In the fMRI study we created a naturalistic situation in which the listener made spontaneous eye contact while listening and watching another person talking. To do that, we presented participants with pre-recorded videos of speakers directly gazing at the camera while talking about daily life topics. The participants were instructed to listen to the speakers carefully and were allowed to freely look at different regions of the speaker’s face, as they would in daily communication. We simultaneously recorded fMRI and eye tracking data from the participants and used the fixations obtained from eye tracking data to define events for the fMRI analyses. This so-called fixation-based event-related fMRI (FIBER fMRI)30,31 allowed us to separate brain response and connectivity when participants spontaneously looked at the speaker’s eyes (Eyes events), from when they looked at the speaker’s mouth or elsewhere (Mouth or Off events)12.

We used the AQ8 to assess individuals’ autistic traits. The AQ has good internal consistency32, predictive validity33, and test–retest reliability34 across different cultures in both people with ASD and the neurotypical population9,35,36. Previous studies have shown that factor analyses of the AQ resulted in two to five factors9,37,38,39. All studies reported two key factors/subclusters (“Social” and “Attention-to-detail”). These two factors have also been found to reliably and validly capture individual differences in autistic traits in a study with nearly a thousand people9. Importantly for the present study, these subclusters are associated with individual differences in face recognition in the general population10,40,41. Therefore, we adopted the two-factor/subcluster classification of the AQ to examine whether the two different subclusters of autistic traits as measured by the AQ are related to neural mechanisms of eye contact. We used multiple regression analyses to address whether brain response and effective connectivity during eye contact as compared to mouth fixation could predict an individual’s AQ total score and/or AQ subcluster score7. We then used mediation analyses to examine whether participants’ looking preferences for the speaker’s eyes could mediate the relation between brain mechanisms for eye contact and the AQ total scores and/or AQ subcluster scores.

Results

BOLD response and effective connectivity during eye contact predicted autistic traits

To address our first goal, whether changes in the brain response and/or effective connectivity predict variation in autistic traits, we conducted a multiple linear regression (MLR) analysis using the stepwise method (p < 0.05 to enter, p > 0.1 to remove). Parameter estimates (beta weights) extracted from the brain regions that showed increased BOLD response and from the region pairs that showed enhanced effective connectivity (PPI analysis) in the Eyes vs. Mouth contrast identified in our previous study12 (for details see “Methods” section) were entered as independent variables (IVs), and the AQ total scores as the dependent variable (DV) (Table 1). For the AQ total scores we identified 2 significant predictors (adjusted R2 = 0.47, F (2, 16) = 8.87, p = 0.003, Cohen's f2 = 1.13) (Table 2): the brain response in the pSTS (β =  − 0.68, t =  − 3.79, p = 0.002) and the FFA–pSTS effective connectivity (β = − 0.50, t = − 2.77, p = 0.014) in the Eyes vs. Mouth contrast. These 2 predictors showed a significant negative correlation with the AQ total scores across participants. Namely, decreased pSTS response and reduced FFA–pSTS effective connectivity predicted higher autistic trait scores (Fig. 1A). The collinearity statistics showed that the tolerance was 0.93 and the variance inflation factor (VIF) was 1.08 for all the predictors, indicating no multicollinearity between the predictors.
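The regression diagnostics reported here (adjusted R² and variance inflation factors) can be illustrated with a short Python sketch on simulated data. The data and the plain-NumPy implementation (in place of the stepwise procedure in standard statistics software) are assumptions for illustration only:

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares with an intercept; returns (coefficients, R^2)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
    return beta, r2

def adjusted_r2(r2, n, k):
    """Adjusted R^2 for n observations and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

def vif(X):
    """Variance inflation factor per predictor: 1 / (1 - R^2_j), where R^2_j
    comes from regressing predictor j on the remaining predictors."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        _, r2_j = ols_fit(others, X[:, j])
        out.append(1.0 / (1.0 - r2_j))
    return out

# Toy data standing in for the two predictors (pSTS response, FFA-pSTS
# connectivity) and AQ total scores; the values are illustrative only.
rng = np.random.default_rng(0)
n = 19
X = rng.normal(size=(n, 2))
y = -0.7 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n)

beta, r2 = ols_fit(X, y)
print("adjusted R^2:", round(adjusted_r2(r2, n, k=2), 3))
print("VIF:", [round(v, 2) for v in vif(X)])
```

A VIF near 1 for both predictors, as reported above, means the two predictors carry largely independent information.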

Table 1 AQ scores, subscale scores and subcluster scores for each participant.
Table 2 Stepwise multiple linear regression results for AQ and AQ subcluster.
Figure 1

Partial regression scatter plots for BOLD response and connectivity predicting autistic traits. (A) Partial regression plots for the AQ total scores. (B) Partial regression plots for the “Attention to detail” subcluster scores. Significant predictors for both AQ total scores and “Attention to detail” subcluster scores were the BOLD response in the pSTS and the effective connectivity of FFA–pSTS in the Eyes vs. Mouth contrast.

To check which subcluster of autistic traits can be specifically predicted by the brain response and/or effective connectivity identified for overall autistic traits, we also conducted an MLR analysis for each AQ subcluster separately. We used the significant predictors (pSTS response and FFA–pSTS effective connectivity) identified for the AQ total scores as IVs and the scores of the “Social” or “Attention to detail” subcluster as the respective DV. We found these 2 predictors specifically predicted the scores on the “Attention to detail” subcluster (adjusted R2 = 0.47, F (2, 16) = 9.09, p = 0.004 after Bonferroni correction (n = 2), Cohen's f2 = 1.13) (Table 2). The pSTS response (β =  − 0.63, t =  − 3.53, p = 0.003) and FFA–pSTS effective connectivity (β =  − 0.57, t =  − 3.24, p = 0.005) in the Eyes vs. Mouth contrast both showed a significant negative correlation with the “Attention to detail” subcluster scores (Fig. 1B). There was no multicollinearity between the predictors (tolerance = 0.93, VIF = 1.08). We did not find any significant predictor for the “Social” subcluster. In the following text, we refer to this set of analyses and results for AQ total scores and AQ subcluster scores as MLR I.

Looking preference for eyes mediated the relation between brain mechanisms of eye contact and “Attention to detail” aspects of autistic traits

To address our second goal, whether the looking preference for the eyes of speakers (mediator variable, MV) mediates the relation between brain response/connectivity during eye contact (IV) and AQ scores (DV), we conducted mediation analyses following the procedures recommended by Baron and Kenny42 (see “Methods” section). First, as shown in MLR I presented above, the IVs (the pSTS response and FFA–pSTS effective connectivity) were significantly related to the DV (the overall autistic traits and the “Attention to detail” subcluster of autistic traits), meeting the first criterion (Fig. 1, Table 2). Second, in MLR II, we found that one IV (the FFA–pSTS connectivity) was significantly and positively related to the MV (Eyes Preference) (adjusted R2 = 0.36, F (1, 17) = 11.16, p = 0.004; β = 0.63, t = 3.34, p = 0.004) (Fig. 2, path a), meeting the second criterion. Third, in MLR III, we found that (i) the MV (Eyes Preference) was significantly related only to the “Attention to detail” subcluster (adjusted R2 = 0.32, F (1, 17) = 9.33, p = 0.007; β =  − 0.60, t =  − 3.06, p = 0.007) (Fig. 2, path b), and (ii) the initially significant correlation between the IV (the FFA–pSTS effective connectivity) and the DV (the “Attention to detail” subcluster scores) (β =  − 0.57, t =  − 3.24, p = 0.005) (Fig. 2, path c) in MLR I was reduced and became non-significant (β =  − 0.06, t =  − 0.21, p = 0.835) (Fig. 2, path c′), meeting the third criterion for a mediation effect by the looking preference for the eyes.
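The three Baron and Kenny criteria amount to three regressions, which can be sketched as follows. The simulated data below are illustrative only (not the study's data) and build in full mediation, so the total effect c shrinks toward zero (c′) once the mediator enters the model:

```python
import numpy as np

def ols(columns, y):
    """OLS with intercept; returns coefficients [intercept, b1, b2, ...]."""
    A = np.column_stack([np.ones(len(y))] + list(columns))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Simulated data in which the IV affects the DV only through the MV,
# i.e. full mediation is built in by construction.
rng = np.random.default_rng(1)
n = 200
iv = rng.normal(size=n)                         # e.g. FFA-pSTS connectivity
mv = 0.6 * iv + rng.normal(scale=0.4, size=n)   # path a: IV -> MV
dv = -0.6 * mv + rng.normal(scale=0.4, size=n)  # path b: MV -> DV, no direct path

c = ols([iv], dv)[1]               # criterion 1: IV predicts DV (total effect)
a = ols([iv], mv)[1]               # criterion 2: IV predicts MV
_, c_prime, b = ols([iv, mv], dv)  # criterion 3: with MV in the model,
                                   # the direct effect c' shrinks toward zero
print(f"c = {c:.2f}, a = {a:.2f}, b = {b:.2f}, c' = {c_prime:.2f}")
```

In the study's data, path c′ was non-significant while paths a and b remained significant, the signature of full mediation.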

Figure 2

Mediation role of the looking preference for eyes (MV) in the relation between the FFA–pSTS connectivity (IV) and the “Attention to detail” subcluster (DV). Path c shows the direct correlation between the IV and the DV when considered alone. Path c′ shows the correlation between the IV and the DV when the MV is added. Paths a and b together indicate that the IV predicted the DV through the mediation of the MV.

The mediation effect was also confirmed by an additional analysis using the Sobel test (see Supplementary Materials). Taken together, these results indicate that the looking preference for the eyes fully mediated the relation between the FFA–pSTS effective connectivity during eye contact and the “Attention to detail” subcluster.
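The Sobel test checks whether the indirect effect a·b differs from zero. In this sketch the path coefficients are the standardized betas reported above, while the standard errors are approximated as β/t from the reported statistics, so the resulting z value is only an illustrative approximation, not the supplement's exact figure:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel z-statistic for the indirect effect a*b, given the two path
    coefficients and their standard errors."""
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

# Path a (IV -> MV): beta = 0.63, t = 3.34  => SE ~ 0.19
# Path b (MV -> DV): beta = -0.60, t = -3.06 => SE ~ 0.20
z = sobel_z(a=0.63, se_a=0.19, b=-0.60, se_b=0.20)
print(round(z, 2))  # -> -2.22
```

An |z| above 1.96 corresponds to a two-tailed p below 0.05 for the indirect effect.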

Discussion

In this study, we recorded individuals’ spontaneous eye contact in the scanner and investigated the relation between the brain mechanisms of natural eye contact during verbal communication and autistic traits in neurotypical individuals, and the role of individual eye contact preference in this relation. We found that the brain response in the pSTS and the effective connectivity between FFA and pSTS during eye contact, in contrast to mouth fixation, were negatively associated with the AQ total scores and specifically the “Attention to detail” subcluster scores. In addition, the relation between the FFA–pSTS connectivity and the “Attention to detail” subcluster scores was mediated via the looking preference for the speaker’s eyes. These results provide the first evidence that part of the brain mechanisms engaged during spontaneous eye contact when watching a speaker talking is related to the overall, and especially the attention to detail, aspects of autistic traits. In addition, the individual eye contact preference mediates this relation.

We found a relation between the response in the right pSTS during spontaneous eye contact and the AQ total scores, especially the “Attention to detail” subcluster scores, in a verbal context. Nummenmaa et al.13 found that neurotypical individuals’ AQ scores positively correlated with the response in the right pSTS when viewing faces with variable gaze, but negatively correlated with it when viewing constant gaze. In another study, both white matter volume and the BOLD response to a Stroop task in the pSTS were negatively correlated with AQ scores15. Interestingly, Nummenmaa et al.13 and von dem Hagen et al.15 also found that response and white matter volume in the pSTS were negatively related to “Attention to detail” subcluster scores. Consistent with these findings, our study showed that reduced pSTS response when viewing speakers with constant gaze is associated with increased AQ total scores and specifically “Attention to detail” subcluster scores. This indicates that the overall and especially the attention to detail aspects of autistic traits can be predicted by the pSTS response during eye contact in both non-verbal and verbal communication situations. Importantly, the contrast (Eyes vs. Mouth events) in our study was controlled for the type of visual input by recording participants’ natural looking behavior. In addition, the association between AQ total and “Attention to detail” scores and the pSTS response was not mediated by the individual looking preference for the eyes. Thus, this association is unlikely to be due to different amounts of visual input caused by individuals’ eye contact preferences that may correlate with autistic traits. Instead, it is probably due to differences in the brain mechanisms per se used by neurotypical people with different levels of autistic traits during eye contact while watching a speaker talking.

Besides the pSTS response, we additionally found a negative correlation between the FFA–pSTS connectivity during spontaneous eye contact and the AQ scores, and specifically the “Attention to detail” subcluster scores. Correlations between AQ scores and brain connectivity were not investigated in previous studies13,14,15. Furthermore, the looking preference for the eyes mediated the relation between the FFA–pSTS connectivity and the “Attention to detail” subcluster scores. We found that individuals’ looking preference for the speaker’s eyes positively correlated with the FFA–pSTS connectivity in the brain. Currently, there is an ongoing debate about the interaction between the FFA and pSTS. On the one hand, the FFA and pSTS are two main brain regions specialized for face perception43,44. The FFA is considered to be involved more in processing relatively invariant aspects of the face, such as identity43,44,45,46, while the pSTS is thought to be important in processing dynamic changes in the face, such as gaze, expression, and facial movement44,47,48,49. Thus, these two regions have been regarded as working independently in parallel pathways, although they respond simultaneously in face processing44. On the other hand, FFA–pSTS connectivity has been found when participants viewed eye gaze shifted towards them as compared to away from them50 and when they perceived changes in facial expression and gaze on faces with the same identity compared with those with different identities51. The FFA and pSTS are also intrinsically connected in the functional resting state52. In the present study, the FFA–pSTS connectivity was obtained in the Eyes vs. Mouth contrast in an emotionally neutral context. The peak coordinate of the pSTS (x = 51, y =  − 48, z = 9) is very close (distance ca. 7 mm) to that of the pSTS for gaze processing found in previous studies49,53.
Thus, the interaction between FFA and pSTS might be due to the integration of gaze information with identity or other more invariant face information. The pSTS has also been proposed to serve a role in mentalizing, that is, inferring another person’s mental state or intentions2,54,55,56. Connectivity between the FFA and pSTS could therefore also indicate a process of inferring information from the eyes about the speaker’s intention. However, the coordinate of the pSTS revealed in previous meta-analyses tends to be more dorsal57,58 than that of the current study. We therefore speculate that the FFA–pSTS connectivity more likely represents the integration of gaze and identity information.

We also found that the looking preference for the eyes was negatively correlated with the “Attention to detail” subcluster scores. This seems contrary to a previous study, which found that looking more at the eyes was associated with higher “Attention to detail” scores in a face identity learning task10. We attribute this discrepancy to the specific tasks used in the two studies. During face identity learning, the eye region is considered the most important detail of the face (for reviews, see Refs.1,59,60). A stronger looking preference for the eyes in people with higher “Attention to detail” scores in such a task therefore seems reasonable10. However, in our study participants performed a speech recognition task. In this situation, non-eye regions (i.e., orofacial movement) usually provide more detail for speech recognition than the eye region does22,61,62. This might be the reason we found a negative correlation between the Eyes preference and the “Attention to detail” subcluster scores.

The “Social” subcluster scores were not predicted by any brain response or effective connectivity during eye contact. von dem Hagen et al.15 found that white matter volume and the BOLD response to a Stroop task in the pSTS were negatively correlated with “Social” subcluster scores. Yet, no study has reported that any brain response or connectivity to eye gaze was related to “Social” subcluster scores. The “Social” subcluster scores in our study were also not related to the individual looking preference for the eyes. One possible reason is that the pre-recorded videos used in the current study did not provide enough social information. Recent studies showed that neurotypical participants looked significantly less at a live person (or a pre-recorded person they believed to be “live”) than at a pre-recorded person (Refs.21,63, but see also Ref.25). In live interaction, individuals with higher autistic traits looked significantly less at the experimenter (Ref.21, but see also Ref.25). Participants’ social skills scores measured by the AQ correlated with the looking preference towards live but not videotaped persons63. However, whether looking preference is always correlated with social skills or subclusters in live interactions is unclear from other studies21,25, as they did not investigate different subscales or subclusters of the AQ. Thus, it is currently an open question whether the lack of correlations between the “Social” subcluster scores and brain response/connectivity and eye contact pattern is due to the pre-recorded nature of the videos used in our experimental paradigm.

An alternative explanation could be the lack of variance in the “Social” subcluster scores. There was little variation between participants on 3 of the 4 subscales in the “Social” subcluster, namely “social skills,” “communication,” and “imagination”; only “attention switching” varied more (Table 1). In contrast, scores on “Attention to detail” varied more between participants. These results are in agreement with previous studies on large samples (n > 1,000) of neurotypical participants regardless of culture8,35. We therefore performed a supplementary MLR analysis for each subscale in the “Social” subcluster (see Supplementary Materials) and found that both pSTS response and FFA–pSTS connectivity predicted “attention switching” scores, which showed larger variation between participants (0–9), but not the scores of the other 3 subscales with little variation between participants. Thus, the lack of correlation between the “Social” subcluster and brain response or connectivity might be due to the lack of variance in the “Social” subcluster scores measured by the AQ. Another possibility is that pSTS response and FFA–pSTS connectivity to eye contact may be particularly relevant for attention-related aspects of autistic traits.

Individuals with ASD have repeatedly been reported to show less pSTS response to eye gaze in non-verbal contexts (e.g. Refs.64,65), and autism severity was negatively related to pSTS response64. These findings are similar to our finding that stronger autistic traits were associated with reduced pSTS response, albeit in a verbal context and in neurotypical participants. Whether these similar findings represent the same underlying brain mechanisms is an open question. Our imaging findings are based on eye tracking data, that is, on response and connectivity when participants are spontaneously looking at the eye region, whereas previous studies on ASD did not consider this important factor (e.g. Refs.64,65). Furthermore, there are large individual differences in both eye gaze patterns6 and autistic traits (e.g. social aspects) in people with ASD (Ref.66; for a review see67). Thus, individual eye gaze patterns in ASD may also relate to different aspects of autistic traits, as in the neurotypical people in the present study. Unfortunately, none of the previous studies on eye contact in ASD have examined subclusters of autistic traits. Therefore, including people across the whole autism spectrum in the same context, and considering the effect of the individual eye contact pattern and the subclusters of autistic traits, will yield a better understanding of the relationship between autistic traits, eye contact pattern, and brain mechanisms of eye contact in face-to-face communication.

A limitation of the present study is its relatively small sample size. Increasing the sample size is difficult in studies with a complex paradigm, e.g., because of the difficulty of obtaining complete and valid eye tracking data for every recruited participant in the scanner (see “Participants” in “Methods” section). Small sample sizes are not ideal when testing individual differences in autistic traits, though the subclusters of autistic traits showed patterns similar to those in studies with large sample sizes, as discussed above. The effect of small sample size on the replicability of studies is controversially discussed (see68 but also69). In our study, we have a very large effect size, indicated by Cohen’s f2 = 1.13 and R2 = 0.53 in both MLR I and II. Given this effect size, a desired probability level of 0.05, 2 predictors in the present MLR models, and a desired statistical power of 0.8, the minimum required sample size is 11 according to a G*Power analysis70. Our sample is larger than this minimum required size.
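The relation between a regression model's R² and Cohen's f², which underlies such power calculations, is f² = R²/(1 − R²); a minimal sketch:

```python
def cohens_f2(r_squared):
    """Cohen's f^2 effect size for a regression model: R^2 / (1 - R^2)."""
    return r_squared / (1.0 - r_squared)

# R^2 = 0.53, as reported for the MLR models above.
print(round(cohens_f2(0.53), 2))  # -> 1.13
```

By the conventional benchmarks (0.02 small, 0.15 medium, 0.35 large), an f² above 1 is a very large effect.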

Previous studies reported that intelligence quotient (IQ) relates to eye gaze processing in an emotional context (e.g., Peterson and Miller71). In the present study we did not acquire measures of intelligence. It would be interesting to test whether IQ is also associated with the brain mechanisms involved in eye gaze processing in a non-emotional context, such as the verbal communication in the present study.

In summary, using an ecologically valid paradigm, our study presents initial evidence that the pSTS response and its connectivity to the FFA are related to autistic traits in neurotypical individuals and that part of this relation is mediated by the individual eye contact preference. The present study is the first to show that this relation holds in a verbal communication context and that it is also present when controlling for the potentially different sensory input due to the different spontaneous eye contact patterns related to autistic traits. In addition, this study demonstrates a link between brain mechanisms underlying eye contact and a subcluster of autistic traits in neurotypical people. Evidence is emerging that the “Social” and “Attention to detail” subclusters of autistic traits are differently associated with brain structures in both neurotypical people15 and people with ASD72. Thus, the present study also highlights the importance of studying different subcluster aspects of autistic traits, rather than only overall autistic traits, when exploring the neurobiology of autistic traits.

Methods

Participants

Thirty healthy native German adults (15 female, 15 male; 27.5 years old ± 3.6 SD) participated in the experiment. They reported no history of psychiatric or neurological disease. All were right-handed73 and had normal vision without correction. All participants provided written informed consent. The study protocol and all methods performed were approved by the Research Ethics Committee of the University of Leipzig (AZ: 192-14-14042014) and were in accordance with its relevant guidelines and regulations. Nine participants were excluded due to difficulties in obtaining eye tracking data (e.g. difficulties with calibration before the experiment or with eye tracking during the experiment). Another two participants were excluded because of excessive head movement (> 3 mm) in the MRI scanner. Thus, eye tracking and fMRI data analyses were based on 19 subjects (11 female, 8 male; 26.0 years old ± 2.6 SD).

Autistic traits

To measure autistic traits, all participants filled in a German translation of the AQ8 provided by Freitag et al.74. The AQ is a widely used questionnaire to quickly and easily measure variation in autistic traits in both clinical and general populations8. It consists of 5 subscales. The subscales of the AQ were validated in studies with large population samples, e.g., the original paper reporting the design of the AQ by Baron-Cohen et al.8 (see “Item Analysis and Internal Consistency”) and the revalidation paper by Hoekstra et al.9 (see “Internal Consistency and Test–Retest Reliability”). We classified the 5 subscales into 2 subclusters as in most recent studies (e.g. Refs.9,10,75): the “Social” subcluster (combining the “communication [items 7, 17, 18, 26, 27, 31, 33, 35, 38, 39],” “social skills [items 1, 11, 13, 15, 22, 36, 44, 45, 47, 48],” “imagination [items 3, 8, 14, 20, 21, 24, 40, 41, 42, 50],” and “attention switching [items 2, 4, 10, 16, 25, 32, 34, 37, 43, 46]” subscales; higher scores indicate a higher likelihood of a deficit in social ability) and the “Attention to detail” subcluster (referring to the “attention to detail [items 5, 6, 9, 12, 19, 23, 28, 29, 30, 49]” subscale; higher scores indicate a higher likelihood of exceptional attention to detail).

Each subscale contains 10 items. The participants rated to what extent they agree or disagree with the items on a 4-point Likert scale (“definitely agree,” “slightly agree,” “slightly disagree,” and “definitely disagree”). Each item scores 1 point if the response represents autistic-like behavior (e.g. exceptional attention to detail, poor attention switching). The AQ total scores and the scores in each subcluster were summed separately. The AQ total score for each participant was below the cut-off value (32) that is indicative of a manifestation of autistic traits typical for ASD (17.95 ± 4.99 SD) (Table 1). A test of normality (Shapiro–Wilk test, p > 0.05, n = 19) showed that both the AQ total scores and the subcluster scores were normally distributed.
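The scoring rule above can be sketched as follows. Responses are simplified here to one boolean per item (True = answer on the autistic-like side of the 4-point scale); the official response key is not reproduced, and the item set shown is the “Attention to detail” grouping listed above:

```python
# Item numbers for the "Attention to detail" subcluster, as listed above.
ATTENTION_TO_DETAIL_ITEMS = {5, 6, 9, 12, 19, 23, 28, 29, 30, 49}

def score_aq(autistic_like):
    """autistic_like: dict mapping item number (1-50) -> bool, where True
    means the response fell on the autistic-like side of the scale.
    Returns (AQ total score, "Attention to detail" subcluster score)."""
    total = sum(1 for endorsed in autistic_like.values() if endorsed)
    detail = sum(1 for item, endorsed in autistic_like.items()
                 if endorsed and item in ATTENTION_TO_DETAIL_ITEMS)
    return total, detail

# Hypothetical respondent endorsing the autistic-like option on items 5, 9, 17.
total, detail = score_aq({i: i in {5, 9, 17} for i in range(1, 51)})
print(total, detail)  # -> 3 2
```

The AQ total ranges from 0 to 50 and each subscale from 0 to 10, with 32 the conventional total-score cut-off mentioned above.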

Stimuli

The stimuli were eight ca. 6-min-long monologue videos of 4 German speakers (for details see Supplementary Materials).

Experimental procedures

The fMRI experiment consisted of 4 sessions implemented in Presentation software (version 14.5, Neurobehavioral Systems Inc., USA). Each session had one Normal and one Noise video from speakers of different sexes. The orders of speakers’ sexes and video conditions were counterbalanced across sessions and participants. We instructed the participants to carefully watch and listen to the speaker talking; they were allowed to freely look at different regions of the speaker’s face. At specific time points the videos were stopped, and participants performed a speech recognition task by answering “What was the last word you heard?” from 3 choices shown on the screen. They chose the answer by pressing one of 3 corresponding buttons on a response box. The video continued when a button was pressed or after 4 s without a response.

Eye tracking

We used a 120 Hz monocular MR-compatible eye tracker (EyeTrac 6, ASL, USA) to record participants’ eye movements during the experiment. Prior to the fMRI experiment, the eye tracking system was calibrated using a standard nine-point calibration procedure for each participant. Before each session, the accuracy of eye tracking was checked. If necessary, the eye tracking system was recalibrated.

Imaging data acquisition

Functional images and structural T1-weighted images were obtained using a 3T Siemens Tim Trio MR scanner (Siemens Healthcare, Erlangen, Germany), equipped with a 12-channel head coil. For other scanning details see Supplementary Materials.

Eye tracking analysis

Fixation events for fMRI analyses

We used EyeNal software (ASL, USA) and customized MATLAB scripts for the eye tracking data analysis. A fixation was defined as having a minimum duration of 100 ms and a maximum change in visual angle of 1 degree. Because natural speaking is often accompanied by head movements, we corrected the positions of participants’ fixations for the speakers’ head movements in the videos using the Tracker software (https://www.cabrillo.edu/~dbrown/tracker/) (for details see Supplementary Materials). We labeled fixations within the areas of interest (AOIs) of the eyes and mouth as “Eyes” and “Mouth,” respectively, and fixations outside these AOIs as “Off” (for details, see Ref.12). Consecutive fixations within the same AOI were concatenated into one fixation, resulting in one event for the fMRI analyses. The event onset was the start time of the first fixation within the corresponding AOI.
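The concatenation step above can be sketched as follows. Fixation detection itself (done here in EyeNal) is assumed to have already happened; the input format of `(onset_seconds, aoi_label)` pairs is hypothetical.

```python
# Sketch: merge consecutive fixations in the same AOI into one event, keeping
# the onset of the first fixation as the event onset. Input is a chronological
# list of (onset_seconds, aoi_label) tuples with labels "Eyes", "Mouth", "Off".

def fixations_to_events(fixations):
    events = []
    for onset, aoi in fixations:
        if events and events[-1][1] == aoi:
            continue  # same AOI as the running event: concatenate, keep first onset
        events.append((onset, aoi))
    return events
```

For example, `fixations_to_events([(0.0, "Eyes"), (0.15, "Eyes"), (0.4, "Mouth")])` yields two events, with the Eyes event onset at 0.0 s.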

Eye gaze patterns

As reported previously12, the eye gaze patterns made this experiment suitable for fMRI analysis as a rapid event-related design: there was a sufficient number of events (NE, Fig. S1A, Table S2) and suitably long inter-event intervals (IEI, Fig. S1B, Table S3) for the Eyes and Mouth events across participants. Both indices (NE and IEI) were roughly balanced between event types across conditions (Fig. S1A,B). In addition, the IEI was jittered (Fig. S1C) and the events occurred in a variable order (Fig. S1D).
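The two design indices mentioned above can be computed from a list of event onsets; the input format (onset times in seconds for one event type) is an assumption for illustration.

```python
# Sketch of the two design indices: the number of events (NE) and the
# inter-event intervals (IEI) between successive event onsets.

def design_indices(onsets):
    ne = len(onsets)
    iei = [later - earlier for earlier, later in zip(onsets, onsets[1:])]
    return ne, iei
```

Variability in the returned IEI list corresponds to the jitter shown in Fig. S1C.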

Looking preference for the eyes

To quantify each participant’s looking preference for a speaker’s eyes, we computed an Eyes Preference index defined as Neyes/Nnon-eyes, where Neyes is the number of Eyes events and Nnon-eyes is the sum of the numbers of Mouth and Off events. A larger value indicates a stronger looking preference for the eyes.
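As a minimal sketch, the index can be computed directly from the event labels produced above (input format assumed):

```python
# Eyes Preference index = N_eyes / N_non-eyes, with N_non-eyes = N_mouth + N_off.

def eyes_preference(event_labels):
    n_eyes = sum(1 for label in event_labels if label == "Eyes")
    n_non_eyes = sum(1 for label in event_labels if label in ("Mouth", "Off"))
    return n_eyes / n_non_eyes
```

A value above 1 means the participant produced more Eyes events than Mouth and Off events combined.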

fMRI analyses

In the present study, for brain response we used the results reported in our previous study12. For brain connectivity, we reanalyzed the data with region of interest (ROI)-based psychophysiological interaction (PPI) analyses similar to those reported previously12. However, in contrast to the previous analyses, we added a head-movement parameter that was not included in Jiang et al.12.

Analyses of BOLD response

We performed all fMRI analyses using Statistical Parametric Mapping software (SPM8, Wellcome Trust Centre for Neuroimaging, UCL, UK, https://www.fil.ion.ucl.ac.uk/spm) (see Supplementary Materials). The results for BOLD response showed four large clusters involved in eye contact (Eyes > Mouth) (Fig. S2): (1) bilateral visual cortices, including the cuneus (Cun, BA 17/18) and calcarine sulcus (Cal, BA 17/18) covering V1, V2, and V3 and extending to the precuneus (Prec, BA 7), (2) the right temporal-parietal junction, including the angular gyrus and supramarginal gyrus and extending into the pSTS (TPJ&pSTS, BA 39/40) (note that the pSTS region involved in eye contact processing lies more posteriorly than the one involved in mouth movement processing, consistent with previous findings, e.g., Pelphrey et al.64 and Pelphrey et al.49), (3) the left medial prefrontal cortex (mPFC), including the anterior cingulate cortex extending to the medial orbital frontal cortex (BA 10/24/32), and (4) the right dorsolateral prefrontal cortex (dlPFC) (BA 9/46) (FWE cluster-wise corrected, p < 0.05). These results were reported in Jiang et al.12.

Analyses of effective connectivity

For the effective connectivity, we conducted ROI-based psychophysiological interaction (PPI) analyses76. PPIs have been considered simple models of effective connectivity77 (for details about the analysis procedure, see Supplementary Materials). The ROIs were the same as those reported in Jiang et al.12. They included (i) regions that showed a significant BOLD response to eye contact (vs mouth fixation) in verbal communication (Jiang et al.12), and (ii) regions implicated in the predominant model of eye contact processing, i.e., the fast-track modulator model2. We used the ROIs responsive to eye contact (vs mouth fixation) in verbal communication because the main purpose of the present study was to find predictors of autistic traits among regions specifically responsive to eye contact in verbal communication. We additionally used the ROIs implicated in the fast-track modulator model2 because the specific connectivity proposed in this model is based on evidence of eye contact processing in non-verbal contexts, and it is unknown whether this connectivity would be found in the same way in verbal communication (see Jiang et al.12). Therefore, we defined these ROIs anatomically. Most of the regions were defined with the WFU_PickAtlas78 or the SPM Anatomy toolbox (v2.1)79. The few regions not available in these toolboxes were defined via probabilistic maps (FFA and aSTS) or anatomically (SC) with reference to a brain atlas80. All regions were used as both source ROIs and target ROIs (for details see Supplementary Materials).
Note that for regions with an explicit hemispheric prediction in the fast-track modulator model or with lateralized activation in the brain response analyses (the right aSTS, predicted in the model; the right pSTS, predicted in the model and showing significant lateralized activation in the response analyses; the right dlPFC, showing significant lateralized activation in the response analyses), we used the lateralized region as the ROI. For regions without an explicit hemispheric prediction in the model, we found no significant hemispheric effects on activation between the bilateral regions (p > 0.05 after FDR correction for all regions), indicating no strong functional dissimilarity between hemispheres. We therefore merged the bilateral regions into a single ROI to extract a signal reflecting general rather than lateralized information for the connectivity analyses, as done in previous studies (e.g., Admon et al.81,82). We conducted PPI analyses between all ROIs to identify all possible connectivity specific to eye contact in verbal communication. To obtain all potential contributors (connectivity) to the variance in autistic traits, we performed small-volume correction (FWE voxel-wise) for each target ROI in the PPI analyses. All connectivity that survived correction was used as independent variables in the multiple linear regression analyses.

In contrast to the PPI analyses reported on the same data set in Jiang et al.12, here we additionally included a head-movement parameter (framewise displacement, FD)82 to eliminate possible artefacts arising from head movement. After regressing out the head-movement effect, the PPI results (Fig. S3) showed a pattern very similar to that in Jiang et al.12, indicating that the effective connectivity results we found for eye contact were relatively robust to the effect of head movement. The significant effective connectivity during eye contact as compared to mouth fixation is plotted in Fig. S3 (p < 0.05, FWE corrected).
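Framewise displacement in the sense of Power et al.82 is the sum of absolute frame-to-frame differences of the six realignment parameters, with the three rotations (in radians) converted to millimetres of arc on a sphere of radius 50 mm. A minimal sketch, assuming realignment parameters are available as per-volume lists `[tx, ty, tz, rx, ry, rz]`:

```python
# Framewise displacement (FD) sketch following Power et al. (2012).
# Rotations are converted to mm by arc length on a 50 mm radius sphere.

HEAD_RADIUS_MM = 50.0

def framewise_displacement(motion_params):
    fd = [0.0]  # FD is undefined for the first volume; conventionally set to 0
    for prev, curr in zip(motion_params, motion_params[1:]):
        diffs = [abs(c - p) for c, p in zip(curr, prev)]
        translation = sum(diffs[:3])                  # mm
        rotation = HEAD_RADIUS_MM * sum(diffs[3:])    # radians -> mm of arc
        fd.append(translation + rotation)
    return fd
```

The resulting per-volume FD series can then be entered as a nuisance regressor, which is the role it plays in the PPI reanalysis described above.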

Multiple linear regression analyses

We conducted the multiple linear regression analyses in SPSS (Version 20.0, IBM Corp., USA). For the independent variables (IVs), we used the MarsBar toolbox83 to extract each participant’s parameter estimates from the individual Eyes vs. Mouth contrast files generated in the 1st-level analyses for brain response and connectivity. The parameter estimates for brain response were taken from brain regions showing increased BOLD response to the Eyes vs. Mouth contrast (i.e., 10-mm radius spheres centered on the group statistical maximum coordinates; for details, see ‘Functionally defined ROIs’ in Supplementary Materials). The parameter estimates for brain connectivity were taken from target ROIs showing enhanced effective connectivity in the PPI analyses (for details, see ‘Definition for target ROIs’ in Supplementary Materials). The standardized β coefficients were used to evaluate the strength of association between the IVs and the dependent variable (DV). Note that there was no significant difference between males and females in AQ total scores (t = −0.67, p = 0.513) and no significant correlation between AQ and age (r = 0.11, p = 0.650). We therefore did not include sex or age as covariates.

Mediation analyses

For the mediation analyses, following the procedure recommended by Baron and Kenny42, three criteria need to be met to establish a mediation effect. First, the independent variable (IV) must be significantly related to the dependent variable (DV) when considered alone; this was tested in the MLR I described above. Second, the IV must be significantly related to the mediator variable (MV) in an additional MLR analysis (MLR II), in which the IVs were the significant predictors identified in MLR I and the MV served as the DV. Third, in the final MLR model (MLR III), in which both the significant predictors identified in MLR II and the MV were entered as IVs, (i) the MV must be significantly correlated with the DV, and (ii) the correlation between the IV and the DV must decrease in magnitude and significance relative to the level shown in MLR I, where the MV was not included.
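The logic of the three steps can be sketched with toy data, reduced to a single IV and a single MV. This is a schematic pure-Python illustration of the Baron and Kenny regressions, not the study’s actual SPSS analysis; all variable names and data below are invented.

```python
# Baron & Kenny (1986) three-step mediation sketch with ordinary least squares.

def ols(X, y):
    """Least-squares coefficients for y ~ X (X already includes an intercept
    column), via the normal equations solved with Gauss-Jordan elimination."""
    k = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    aug = [r + [b] for r, b in zip(xtx, xty)]
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(k):
            if r != col:
                f = aug[r][col] / aug[col][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    return [aug[i][k] / aug[i][i] for i in range(k)]

def baron_kenny(iv, mv, dv):
    c = ols([[1.0, x] for x in iv], dv)[1]                  # MLR I:   DV ~ IV (total effect)
    a = ols([[1.0, x] for x in iv], mv)[1]                  # MLR II:  MV ~ IV
    step3 = ols([[1.0, x, m] for x, m in zip(iv, mv)], dv)  # MLR III: DV ~ IV + MV
    return {"c": c, "a": a, "c_prime": step3[1], "b": step3[2]}
```

For OLS with a single mediator, the identity c = c′ + a·b holds, so a drop from c to c′ directly reflects the indirect effect a·b carried through the mediator.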

We repeated the same analysis for each AQ subcluster by entering the AQ subcluster scores as separate DVs. To check the validity of the results, we conducted an additional conservative test, namely the Sobel test84 (see Supplementary Materials).
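The Sobel test84 forms a z-statistic for the indirect effect a·b from the unstandardized path coefficients and their standard errors. A minimal sketch, assuming a, b and their standard errors sa, sb are available from the regression output:

```python
import math

# Sobel test: z = a*b / sqrt(b^2 * sa^2 + a^2 * sb^2), where a is the IV->MV
# path, b the MV->DV path (controlling for the IV), and sa, sb their SEs.

def sobel_z(a: float, sa: float, b: float, sb: float) -> float:
    return (a * b) / math.sqrt(b ** 2 * sa ** 2 + a ** 2 * sb ** 2)
```

The z value is referred to the standard normal distribution; |z| > 1.96 corresponds to a two-tailed p < 0.05 for the indirect effect.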

References

  1. Itier, R. J. & Batty, M. Neural bases of eye and gaze processing: The core of social cognition. Neurosci. Biobehav. Rev. 33, 843–863. https://doi.org/10.1016/j.neubiorev.2009.02.004 (2009).


  2. Senju, A. & Johnson, M. H. The eye contact effect: Mechanisms and development. Trends Cogn. Sci. 13, 127–134. https://doi.org/10.1016/j.tics.2008.11.009 (2009).


  3. Schilbach, L. Eye to eye, face to face and brain to brain: Novel approaches to study the behavioral dynamics and neural mechanisms of social interactions. Curr. Opin. Biobehav. Sci. 3, 130–135 (2015).


  4. Lewkowicz, D. J. & Hansen-Tift, A. M. Infants deploy selective attention to the mouth of a talking face when learning speech. Proc. Natl. Acad. Sci. U.S.A. 109, 1431–1436. https://doi.org/10.1073/pnas.1114783109 (2012).


  5. Macdonald, R. G. & Tatler, B. W. Do as eye say: Gaze cueing and language in a real-world social interaction. J. Vis. 13, 6. https://doi.org/10.1167/13.4.6 (2013).


  6. Senju, A. & Johnson, M. H. Atypical eye contact in autism: Models, mechanisms and development. Neurosci. Biobehav. Rev. 33, 1204–1214. https://doi.org/10.1016/j.neubiorev.2009.06.001 (2009).


  7. APA. Diagnostic and Statistical Manual of Mental Disorders (DSM-5®) (American Psychiatric Pub, Washington, DC, 2013).


  8. Baron-Cohen, S., Wheelwright, S., Skinner, R., Martin, J. & Clubley, E. The autism-spectrum quotient (AQ): Evidence from asperger syndrome/high-functioning autism, malesand females, scientists and mathematicians. J. Autism Dev. Disord. 31, 5–17 (2001).


  9. Hoekstra, R. A., Bartels, M., Cath, D. C. & Boomsma, D. I. Factor structure, reliability and criterion validity of the autism-spectrum quotient (AQ): A study in Dutch population and patient groups. J. Autism Dev. Disord. 38, 1555–1566. https://doi.org/10.1007/s10803-008-0538-x (2008).


  10. Davis, J. et al. Social and attention-to-detail subclusters of autistic traits differentially predict looking at eyes and face identity recognition ability. Br. J. Psychol. https://doi.org/10.1111/bjop.12188 (2017).


  11. Chen, F. S. & Yoon, J. M. Brief report: Broader autism phenotype predicts spontaneous reciprocity of direct gaze. J. Autism Dev. Disord. 41, 1131–1134. https://doi.org/10.1007/s10803-010-1136-2 (2011).


  12. Jiang, J., Borowiak, K., Tudge, L., Otto, C. & von Kriegstein, K. Neural mechanisms of eye contact when listening to another person talking. Soc. Cogn. Affect. Neurosci. 12, 319–328. https://doi.org/10.1093/scan/nsw127 (2017).


  13. Nummenmaa, L., Engell, A. D., von dem Hagen, E., Henson, R. N. & Calder, A. J. Autism spectrum traits predict the neural response to eye gaze in typical individuals. Neuroimage 59, 3356–3363. https://doi.org/10.1016/j.neuroimage.2011.10.075 (2012).


  14. Hasegawa, N. et al. Neural activity in the posterior superior temporal region during eye contact perception correlates with autistic traits. Neurosci. Lett. 549, 45–50. https://doi.org/10.1016/j.neulet.2013.05.067 (2013).


  15. von dem Hagen, E. A. et al. Autism spectrum traits in the typical population predict structure and function in the posterior superior temporal sulcus. Cereb Cortex 21, 493–500. https://doi.org/10.1093/cercor/bhq062 (2011).


  16. Klin, A., Jones, W., Schultz, R., Volkmar, F. & Cohen, D. Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Arch. Gen. Psychiatry 59, 809–816. https://doi.org/10.1001/archpsyc.59.9.809 (2002).


  17. Riby, D. & Hancock, P. J. Looking at movies and cartoons: Eye-tracking evidence from Williams syndrome and autism. J. Intell. Disabil. Res. 53, 169–181. https://doi.org/10.1111/j.1365-2788.2008.01142.x (2009).


  18. Jones, W., Carr, K. & Klin, A. Absence of preferential looking to the eyes of approaching adults predicts level of social disability in 2-year-old toddlers with autism spectrum disorder. Arch. Gen. Psychiatry 65, 946–954. https://doi.org/10.1001/archpsyc.65.8.946 (2008).


  19. Klin, A., Jones, W., Schultz, R. & Volkmar, F. The enactive mind, or from actions to cognition: Lessons from autism. Philos. Trans. R Soc. Lond. B Biol. Sci. 358, 345–360. https://doi.org/10.1098/rstb.2002.1202 (2003).


  20. Klin, A. & Jones, W. Altered face scanning and impaired recognition of biological motion in a 15-month-old infant with autism. Dev. Sci. 11, 40–46. https://doi.org/10.1111/j.1467-7687.2007.00608.x (2008).


  21. von dem Hagen, E. A. & Bright, N. High autistic trait individuals do not modulate gaze behaviour in response to social presence but look away more when actively engaged in an interaction. Autism Res. 10, 359 (2016).


  22. Gurler, D., Doyle, N., Walker, E., Magnotti, J. & Beauchamp, M. A link between individual differences in multisensory speech perception and eye movements. Attent. Percep. Psychophys. 77, 1333–1341. https://doi.org/10.3758/s13414-014-0821-1 (2015).


  23. Peterson, M. F. & Eckstein, M. P. Individual differences in eye movements during face identification reflect observer-specific optimal points of fixation. Psychol. Sci. 24, 1216–1225. https://doi.org/10.1177/0956797612471684 (2013).


  24. Mehoudar, E., Arizpe, J., Baker, C. I. & Yovel, G. Faces in the eye of the beholder: Unique and stable eye scanning patterns of individual observers. J. Vis. 14, 6. https://doi.org/10.1167/14.7.6 (2014).


  25. Freeth, M., Foulsham, T. & Kingstone, A. What affects social attention? Social presence, eye contact and autistic traits. PLoS ONE 8, e53286 (2013).


  26. McPartland, J. C., Webb, S. J., Keehn, B. & Dawson, G. Patterns of visual attention to faces and objects in autism spectrum disorder. J. Autism Dev. Disord. 41, 148–157. https://doi.org/10.1007/s10803-010-1033-8 (2011).


  27. Vabalas, A. & Freeth, M. Brief report: Patterns of eye movements in face to face conversation are associated with autistic traits: Evidence from a student sample. J. Autism Dev. Disord. 46, 305–314. https://doi.org/10.1007/s10803-015-2546-y (2016).


  28. Morris, J. P., Pelphrey, K. A. & McCarthy, G. Controlled scanpath variation alters fusiform face activation. Soc. Cogn. Affect. Neurosci. 2, 31–38 (2007).


  29. Dalton, K. M. et al. Gaze fixation and the neural circuitry of face processing in autism. Nat. Neurosci. 8, 519–526 (2005).


  30. Marsman, J. B., Renken, R., Velichkovsky, B. M., Hooymans, J. M. & Cornelissen, F. W. Fixation based event-related fmri analysis: Using eye fixations as events in functional magnetic resonance imaging to reveal cortical processing during the free exploration of visual images. Hum. Brain Map. 33, 307–318. https://doi.org/10.1002/hbm.21211 (2012).


  31. Henderson, J. M. & Choi, W. Neural correlates of fixation duration during real-world scene viewing: Evidence from fixation-related (FIRE) fMRI. J. Cogn. Neurosci. 27, 1137–1145. https://doi.org/10.1162/jocn_a_00769 (2015).


  32. Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y. & Plumb, I. The, “reading the mind in the eyes” test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. J. Child Psychol. Psychiatry 42, 241–251 (2001).


  33. Woodbury-Smith, M. R., Robinson, J., Wheelwright, S. & Baron-Cohen, S. Screening adults for Asperger syndrome using the AQ: A preliminary study of its diagnostic validity in clinical practice. J. Autism Dev. Disord. 35, 331–335 (2005).


  34. Stoesz, B. M., Montgomery, J. M., Smart, S. L. & Hellsten, L.-A.M. Review of five instruments for the assessment of Asperger’s disorder in adults. Clin. Neuropsychol. 25, 376–401 (2011).


  35. Wakabayashi, A., Baron-Cohen, S., Wheelwright, S. & Tojo, Y. The autism-spectrum quotient (AQ) in Japan: A cross-cultural comparison. J. Autism Dev. Disord. 36, 263–270. https://doi.org/10.1007/s10803-005-0061-2 (2006).


  36. Broadbent, J., Galic, I. & Stokes, M. Validation of autism spectrum quotient adult version in an Australian sample. Autism Res. Treat. https://doi.org/10.1155/2013/984205 (2013).


  37. Stewart, M. E. & Austin, E. J. The structure of the autism-spectrum quotient (AQ): Evidence from a student sample in Scotland. Personal. Individ. Differ. 47, 224–228. https://doi.org/10.1016/j.paid.2009.03.004 (2009).


  38. Kloosterman, P. H., Keefer, K. V., Kelley, E. A., Summerfeldt, L. J. & Parker, J. D. A. Evaluation of the factor structure of the autism-spectrum quotient. Personal. Individ. Differ. 50, 310–314. https://doi.org/10.1016/j.paid.2010.10.015 (2011).


  39. Austin, E. J. Personality correlates of the broader autism phenotype as assessed by the autism spectrum quotient (AQ). Personal. Individ. Differ. 38, 451–460. https://doi.org/10.1016/j.paid.2004.04.022 (2005).


  40. Manera, V., Del Giudice, M., Grandi, E. & Colle, L. Individual differences in the recognition of enjoyment smiles: No role for perceptual–attentional factors and autistic-like traits. Front. Psychol. 2, 143 (2011).


  41. Rhodes, G., Jeffery, L., Taylor, L. & Ewing, L. Autistic traits are linked to reduced adaptive coding of face identity and selectively poorer face recognition in men but not women. Neuropsychologia 51, 2702–2708 (2013).


  42. Baron, R. M. & Kenny, D. A. The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. J. Pers. Soc. Psychol. 51, 1173–1182 (1986).


  43. Kanwisher, N., McDermott, J. & Chun, M. M. The fusiform face area: A module in human extrastriate cortex specialized for face perception. J. Neurosci. 17, 4302–4311 (1997).


  44. Haxby, J. V., Hoffman, E. A. & Gobbini, M. I. The distributed human neural system for face perception. Trends Cogn. Sci. 4, 223–233 (2000).


  45. Grill-Spector, K., Knouf, N. & Kanwisher, N. The fusiform face area subserves face perception, not generic within-category identification. Nat. Neurosci. 7, 555–562 (2004).


  46. Rotshtein, P., Henson, R. N., Treves, A., Driver, J. & Dolan, R. J. Morphing Marilyn into Maggie dissociates physical and identity face representations in the brain. Nat. Neurosci. 8, 107–113. https://doi.org/10.1038/nn1370 (2005).


  47. Puce, A., Allison, T., Bentin, S., Gore, J. C. & McCarthy, G. Temporal cortex activation in humans viewing eye and mouth movements. J. Neurosci. 18, 2188–2199 (1998).


  48. O’Toole, A. J., Roark, D. A. & Abdi, H. Recognizing moving faces: A psychological and neural synthesis. Trends Cogn. Sci. 6, 261–266 (2002).


  49. Pelphrey, K. A., Morris, J. P., Michelich, C. R., Allison, T. & McCarthy, G. Functional anatomy of biological motion perception in posterior temporal cortex: An FMRI study of eye, mouth and hand movements. Cereb. Cortex 15, 1866–1876. https://doi.org/10.1093/cercor/bhi064 (2005).


  50. Ethofer, T., Gschwind, M. & Vuilleumier, P. Processing social aspects of human gaze: A combined fMRI-DTI study. Neuroimage 55, 411–419. https://doi.org/10.1016/j.neuroimage.2010.11.033 (2011).


  51. Baseler, H. A., Harris, R. J., Young, A. W. & Andrews, T. J. Neural responses to expression and gaze in the posterior superior temporal sulcus interact with facial identity. Cereb. Cortex 24, 737–744. https://doi.org/10.1093/cercor/bhs360 (2014).


  52. Turk-Browne, N. B., Norman-Haignere, S. V. & McCarthy, G. Face-specific resting functional connectivity between the fusiform gyrus and posterior superior temporal sulcus. Front. Hum. Neurosci. 4, 176. https://doi.org/10.3389/fnhum.2010.00176 (2010).


  53. Grosbras, M. H., Beaton, S. & Eickhoff, S. B. Brain regions involved in human movement perception: A quantitative voxel-based meta-analysis. Hum. Brain Map. 33, 431–454. https://doi.org/10.1002/hbm.21222 (2012).


  54. Frith, C. D. & Frith, U. The neural basis of mentalizing. Neuron 50, 531–534. https://doi.org/10.1016/j.neuron.2006.05.001 (2006).


  55. Schurz, M., Radua, J., Aichhorn, M., Richlan, F. & Perner, J. Fractionating theory of mind: A meta-analysis of functional brain imaging studies. Neurosci. Biobehav. Rev. 42, 9–34. https://doi.org/10.1016/j.neubiorev.2014.01.009 (2014).


  56. Gao, T., Scholl, B. J. & McCarthy, G. Dissociating the detection of intentionality from animacy in the right posterior superior temporal sulcus. J. Neurosci. 32, 14276–14280. https://doi.org/10.1523/JNEUROSCI.0562-12.2012 (2012).


  57. Mar, R. A. The neural bases of social cognition and story comprehension. Annu. Rev. Psychol. 62, 103–134. https://doi.org/10.1146/annurev-psych-120709-145406 (2011).


  58. Van Overwalle, F. & Baetens, K. Understanding others’ actions and goals by mirror and mentalizing systems: A meta-analysis. Neuroimage 48, 564–584. https://doi.org/10.1016/j.neuroimage.2009.06.009 (2009).


  59. Emery, N. J. The eyes have it: The neuroethology, function and evolution of social gaze. Neurosci. Biobehav. Rev. 24, 581–604. https://doi.org/10.1016/S0149-7634(00)00025-7 (2000).


  60. Peterson, M. F. & Eckstein, M. P. Looking just below the eyes is optimal across face recognition tasks. Proc. Natl. Acad. Sci. U.S.A. 109, E3314-3323. https://doi.org/10.1073/pnas.1214269109 (2012).


  61. Yi, A., Wong, W. & Eizenman, M. Gaze patterns and audiovisual speech enhancement. J. Speech Lang. Hear. Res. 56, 471–480. https://doi.org/10.1044/1092-4388(2012/10-0288) (2013).


  62. Nath, A. R. & Beauchamp, M. S. A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion. Neuroimage 59, 781–787. https://doi.org/10.1016/j.neuroimage.2011.07.024 (2012).


  63. Laidlaw, K. E., Foulsham, T., Kuhn, G. & Kingstone, A. Potential social interactions are important to social attention. Proc. Natl. Acad. Sci. U.S.A. 108, 5548–5553. https://doi.org/10.1073/pnas.1017022108 (2011).


  64. Pelphrey, K. A., Morris, J. P. & McCarthy, G. Neural basis of eye gaze processing deficits in autism. Brain 128, 1038–1048. https://doi.org/10.1093/brain/awh404 (2005).


  65. Redcay, E. et al. Atypical brain activation patterns during a face-to-face joint attention game in adults with autism spectrum disorder. Hum. Brain Mapp. 34, 2511–2523. https://doi.org/10.1002/hbm.22086 (2013).


  66. Prior, M. et al. Are there subgroups within the autistic spectrum? A cluster analysis of a group of children with autistic spectrum disorders. J. Child Psychol. Psychiatry 39, 893–902 (1998).


  67. Beglinger, L. J. & Smith, T. H. A review of subtyping in autism and proposed dimensional classification model. J. Autism Dev. Disord. 31, 411–422 (2001).


  68. Button, K. S. et al. Power failure: Why small sample size undermines the reliability of neuroscience. Nat. Rev. Neurosci. 14, 365 (2013).


  69. Bacchetti, P. Small sample size is not the real problem. Nat. Rev. Neurosci. 14, 585 (2013).


  70. Faul, F., Erdfelder, E., Buchner, A. & Lang, A.-G. Statistical power analyses using G* Power 3.1: Tests for correlation and regression analyses. Behav. Res. Methods 41, 1149–1160 (2009).


  71. Peterson, E. & Miller, S. F. The eyes test as a measure of individual differences: How much of the variance reflects verbal IQ?. Front. Psychol. https://doi.org/10.3389/fpsyg.2012.00220 (2012).


  72. Rojas, D. C. et al. Regional gray matter volumetric changes in autism associated with social and repetitive behavior symptoms. BMC Psychiatry 6, 56. https://doi.org/10.1186/1471-244X-6-56 (2006).


  73. Oldfield, R. C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9, 97–113 (1971).


  74. Freitag, C. et al. Evaluation der deutschen Version des Autismus-Spektrum-Quotienten (AQ)-die Kurzversion AQ-k. Z. Klin. Psychol. Psychiatr. Psychother. 36, 280–289 (2007).


  75. Kitazoe, N., Fujita, N., Izumoto, Y., Terada, S. I. & Hatakenaka, Y. Whether the autism spectrum quotient consists of two different subgroups? Cluster analysis of the autism spectrum quotient in general population. Autism Int. J. Res. Pract. 21, 323–332. https://doi.org/10.1177/1362361316638787 (2017).


  76. Friston, K. J. et al. Psychophysiological and modulatory interactions in neuroimaging. Neuroimage 6, 218–229. https://doi.org/10.1006/nimg.1997.0291 (1997).


  77. Friston, K. J. Functional and effective connectivity: A review. Brain Connect. 1, 13–36 (2011).


  78. Maldjian, J. A., Laurienti, P. J., Kraft, R. A. & Burdette, J. H. An automated method for neuroanatomic and cytoarchitectonic atlas-based interrogation of fMRI data sets. Neuroimage 19, 1233–1239 (2003).


  79. Eickhoff, S. B. et al. A new SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. Neuroimage 25, 1325–1335. https://doi.org/10.1016/j.neuroimage.2004.12.034 (2005).


  80. Duvernoy, H. The Human Brain (Springer, New York, 1991).


  81. Hunt, L. T., Dolan, R. J. & Behrens, T. E. Hierarchical competitions subserving multi-attribute choice. Nat. Neurosci. 17, 1613 (2014).


  82. Power, J. D., Barnes, K. A., Snyder, A. Z., Schlaggar, B. L. & Petersen, S. E. Spurious but systematic correlations in functional connectivity MRI networks arise from subject motion. Neuroimage 59, 2142–2154. https://doi.org/10.1016/j.neuroimage.2011.10.018 (2012).


  83. Brett, M., Anton, J.-L., Valabregue, R. & Poline, J.-B. Region of interest analysis using the MarsBar toolbox for SPM 99. Neuroimage 16, S497 (2002).


  84. Sobel, M. Asymptotic confidence intervals for indirect effects in structural equation models. Sociol. Methodol. 13, 290–312 (1982).



Acknowledgements

This work was supported by a Max Planck Research Grant to K.v.K. and a China Scholarship Council (CSC) and German Academic Exchange Service (DAAD) fellowship to J.J.

Author information


Contributions

J.J. collected the data; J.J. and K.v.K. designed the experiment; J.J. and J-F.J. analyzed the data and wrote the manuscript.

Corresponding author

Correspondence to Jing Jiang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Jiang, J., von Kriegstein, K. & Jiang, J. Brain mechanisms of eye contact during verbal communication predict autistic traits in neurotypical individuals. Sci Rep 10, 14602 (2020). https://doi.org/10.1038/s41598-020-71547-0

