Introduction

The effects of climate change on planetary life are among the primary existential threats we face (Kulp and Strauss, 2019). Scientific consensus on the existence, causes and impacts of climate change is well established (Stern, 2016; Goldberg et al., 2019). Despite rising awareness of climate change (Feldman et al., 2017; Moser, 2010; Moser and Dilling, 2012), a considerable percentage of people appear only moderately concerned (Gifford, 2011; Stern, 2012). Some even deny outright that anthropogenic climate change exists or is happening, despite increasing evidence (Capstick and Pidgeon, 2014; Häkkinen and Akrami, 2014; Jessani and Harris, 2018; McCright and Dunlap, 2011).

Information concerning climate change has become an object of ideological polarisation, mobilising political and social attitudes and determining pro- or anti-environmental behaviours in real life (Hornsey et al., 2016). Political affiliation, values and ideologies influence our beliefs about climate change as much as factual knowledge does (Hornsey et al., 2016). For instance, past research suggests that people are more likely to update prior beliefs with new evidence when the given information is desirable and self-confirming; the opposite holds true for undesirable or disconfirming information (Ali et al., 2011; Benabou and Tirole, 2002; Johnson and Fowler, 2011; Sharot and Garrett, 2016). Specifically, the extent to which participants accept or deny the existence of climate change affects how they later update their beliefs regarding climate change: sceptics change their beliefs in response to unexpected good news, while believers change their beliefs in response to bad news (Sunstein et al., 2017).

Concurrently, recent research has emphasised the importance of assessing underlying cognitive processes, such as metacognitive abilities (Rollwage et al., 2018) or cognitive inflexibility (Zmigrod et al., 2019), when studying the emergence of polarised beliefs (Rollwage et al., 2019). Impaired metacognitive sensitivity, that is, poor insight into the (in)correctness of one’s beliefs as measured in a low-level perceptual discrimination task (Rollwage et al., 2018), together with reduced adjustment of confidence when presented with corrective post-decision evidence, predicted the extent to which people hold radical beliefs. This line of research may help explain the persistence of climate change scepticism despite scientific consensus (Thaller and Brudermann, 2020; Ortoleva et al., 2015). Scannell and Grouzet (2010) highlighted the relevance of metacognitive abilities in the study of climate change beliefs, suggesting that accurate metacognitive knowledge about climate change is important for fostering information-seeking behaviour. This relationship is supported by empirical evidence showing that more dogmatic people use their internal confidence signals less efficiently when searching for new information, leading to reduced information seeking overall (Schulz et al., 2020). It has also been shown that domain-specific meta-knowledge predicted climate change beliefs (Fischer and Said, 2021) more effectively than factual knowledge (Drummond and Fischhoff, 2017).

However, while it has been shown that both domain-general and domain-specific metacognition can explain variation in people’s socio-political beliefs, including beliefs about climate change, the more pertinent questions of what it would take to change people’s beliefs, and what role metacognition plays in that process, remain unexplored. Here we set out to explore whether domain-general metacognitive abilities underlie participants’ attitudes towards climate change and further influence the updating of climate change related beliefs.

Participants (US citizens: n = 699; Study 1 n = 364; Study 2 n = 335) were recruited online and tested on the Gorilla platform (https://gorilla.sc/). After providing informed consent and demographic information, they performed a perceptual decision-making task (see Methods; Rollwage et al., 2018) to assess domain-general decision-making and metacognitive abilities, where metacognitive sensitivity is defined as the extent to which confidence ratings track the (in)correctness of perceptual judgements. Metacognitive sensitivity was estimated as meta-d’ within a type 2 signal detection theory framework. Next, participants completed a belief-updating task focused on climate change (see Fig. 1 and Methods) (Sunstein et al., 2017). They then answered questions assessing their climate change related knowledge (Fischer and Said, 2021; Sundblad et al., 2007). Participants’ confidence in their climate change answers was used to compute domain-specific metacognitive abilities (accuracy, confidence, and metacognitive sensitivity). Lastly, we assessed participants’ self-reported attitudes regarding climate change (Capstick and Pidgeon, 2014), their ideological dogmatism (Altemeyer, 2002), their beliefs in social hierarchies (Social Dominance Orientation; Ho et al., 2015), and their political orientation (Rollwage et al., 2018).

Fig. 1: Experimental design.

We carried out two studies with an identical design except for the modality of the belief-updating task: climate change information was communicated to participants either as text only (Study 1) or as text accompanied by visuals (Study 2). Participants were exposed to either good or bad news and had to estimate the impact of climate change in two different scenarios. We measured whether and how they updated their beliefs in response to good or bad news.

Methods

We conducted two online studies that were hosted on the online behavioural studies platform Gorilla Experiment Builder (https://gorilla.sc/) and administered via the crowdsourcing platform Amazon Mechanical Turk (https://www.mturk.com/), where our participants were recruited. Both studies were approved by the Institutional Ethics Committee and were performed in accordance with the Declaration of Helsinki.

Participants

Participants for Study 1 were recruited between December 2019 and January 2020, and for Study 2 in March 2020. Data collection was completed during these periods. There were no inclusion or exclusion criteria other than the ones specified below.

In Study 1, a sample of 424 US participants was recruited. All participants provided their informed consent to participate. Sixty participants were excluded due to either poor performance on the metacognitive task or failed attention checks. The analysis was performed on a sample of 364 participants (133 women; 36.54%). Participants provided their age, gender, country of residence, country of origin, race or ethnicity and years of education. The mean age of our sample was 36.67 years (SD = 11.17), with on average 6.66 years of education (SD = 3.11); the sample consisted of 81.59% White, 7.97% Black, 4.67% Latino, 3.30% Asian, 1.65% Mixed and 0.82% participants who identified as Other.

In Study 2, a random sample of 435 US participants was recruited for an online study. All participants provided their informed consent to participate. Of these, 64 were excluded due to poor performance on the metacognitive task and 36 due to missing trials or failed attention checks. The following analysis was performed on a sample of 335 participants (142 women; 42.39%). The mean age of our sample was 35.86 years (SD = 10.11), with on average 5.52 years of education (SD = 2.62); the sample consisted of 75.82% White, 7.76% Black, 5.07% Latino, 5.67% Asian, 4.78% Mixed and 0.89% participants who identified as Other.

Measures and procedure

Participants took part in the online study, which was only accessible via computers or tablets. After providing demographics, they performed the perceptual metacognition task (Rollwage et al., 2018). Then, participants took part in the belief-updating task (Sunstein et al., 2017), in which they were exposed to good and bad news about climate change and had to provide estimates in two different scenarios (temperature and sea-level rise) in a counterbalanced design. After finishing the tasks, they answered self-report items assessing their climate change knowledge (Fischer et al., 2019; Sundblad et al., 2009), followed by the Dogmatism Scale (Altemeyer, 2002), the Climate Change Scepticism questionnaire (Capstick and Pidgeon, 2014), the Social Dominance Orientation scale (Ho et al., 2015) and political orientation (Rollwage et al., 2018). All items were presented in a randomised order. Attention check items (e.g., “If you have read the question, please choose Agree completely.”) were administered throughout the questionnaires to control for data quality. Finally, participants were provided with accurate information regarding the actual state of climate change science in a debrief and redirected to the Amazon Mechanical Turk completion page. The study lasted approximately 40 min, and participants were compensated for their time at $7.50/h. Study 2 mirrored the first study in experimental design and setup; the only difference was the use of visual material supporting the statements in the belief-updating task (see below). Participants who completed Study 1 were not allowed to take part in the second one.

Perceptual metacognition task

To assess domain-general metacognitive abilities (metacognitive sensitivity, perceptual performance and confidence bias), we used a perceptual decision-making task adapted from Rollwage et al. (2018), programmed in JavaScript using jsPsych (version 5.0.3). Participants were shown a fixation cross for 1 s, followed by two black squares (250 × 250 pixels) on the left and right of the screen, each divided into a grid of 625 cells randomly filled with flickering white dots. One square contained 313 filled cells and the other contained a greater number of filled cells. Judgement difficulty was determined by the difference in dot number between the two squares, and each trial consisted of 5 configurations of randomly filled cells, each shown for 150 ms to create the flickering effect (for more task details, see Rollwage et al., 2018). Participants first underwent a calibration phase in which the black squares were presented for 750 ms and participants then selected the one with more dots, with no time limit on the response; they received feedback on their accuracy for 500 ms. This phase was used to determine, for each individual, the dot difference between the left and right squares (stimulus strength, expressed as the logarithm of the dot difference) that elicited approximately 71% correct responses in the discrimination task, using a 2-down-1-up staircase procedure (García-Pérez, 1998). The calibration phase started with 33 trials to allow the staircase to converge; even after these initial trials, task difficulty was continuously adapted to equate performance across participants.
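
The task itself was implemented in jsPsych; purely for illustration, the following is a minimal R sketch of the 2-down-1-up staircase logic described above. The starting value, step size and the simulated observer's accuracy function are assumptions, not the parameters used in the study.

```r
# Illustrative 2-down-1-up staircase on the log dot difference (not the study code).
simulate_staircase <- function(n_trials = 33, start_log_diff = log(70), step = 0.1) {
  log_diff <- start_log_diff        # current stimulus strength (log dot difference)
  streak   <- 0                     # consecutive correct responses
  history  <- numeric(n_trials)

  for (t in seq_len(n_trials)) {
    # Hypothetical observer: larger dot difference -> higher accuracy
    p_correct <- pnorm(log_diff, mean = log(20), sd = 0.8)
    correct   <- runif(1) < p_correct

    if (correct) {
      streak <- streak + 1
      if (streak == 2) {            # two correct in a row -> make the task harder
        log_diff <- log_diff - step
        streak   <- 0
      }
    } else {                        # one error -> make the task easier
      log_diff <- log_diff + step
      streak   <- 0
    }
    history[t] <- log_diff
  }
  exp(mean(tail(history, 10)))      # rough estimate of the ~71%-correct dot difference
}

set.seed(1)
simulate_staircase()
```

A 2-down-1-up rule converges on roughly 71% correct, which is why the calibration targets that accuracy level.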

This was followed by the confidence task with 80 trials, in which participants chose the square that contained more dots and rated their confidence in each decision (reported as the subjective probability that the decision was correct, on a 9-point scale ranging from 0% (certain the answer was incorrect) to 100% (certain the answer was correct), with a midpoint of 50% (unsure whether the answer was correct)). Participants were incentivised to provide accurate confidence judgements through a reward based on a quadratic scoring rule (Staël von Holstein, 1970). Metacognitive sensitivity, confidence bias and perceptual accuracy were extracted using a hierarchical Bayesian estimation scheme (Fleming, 2017). For more details, see: https://github.com/metacoglab/HMeta-d.
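
As a hedged illustration of the incentive scheme and of the kind of trial-level summaries the hierarchical model is fitted to, the R sketch below applies a quadratic (Brier-type) scoring rule to confidence and accuracy and tallies confidence ratings separately for correct and incorrect trials. The payoff scaling and column names are assumptions for demonstration only.

```r
# Quadratic (Brier-type) scoring: reward is highest when stated confidence
# matches the outcome. The 0-1 scaling of the payoff is an assumption.
quadratic_score <- function(confidence, correct) {
  # confidence: subjective probability correct in [0, 1]; correct: 0 or 1
  1 - (confidence - correct)^2
}

# Hypothetical trial-level data (9-point confidence mapped onto 0-1)
trials <- data.frame(
  correct    = c(1, 1, 0, 1, 0, 1),
  confidence = c(0.9, 0.7, 0.6, 0.5, 0.9, 0.8)
)
trials$reward <- quadratic_score(trials$confidence, trials$correct)

# Confidence tallied by accuracy: counts of this kind are the raw material
# from which meta-d' is estimated.
with(trials, table(correct, conf_bin = cut(confidence, breaks = seq(0, 1, 0.25))))
```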

Belief-updating task

To investigate the updating of climate change related beliefs, we adapted a task by Sunstein et al. (2017). Participants were exposed to two scenarios, a temperature change and a sea-level rise scenario, each describing the effects of climate change together with a scientific projection of how much the climate may have changed by 2100 (“Many scientists have said that by 2100, the average U.S. temperature/sea level will rise at least 6 °F/by approximately 3 feet”). Participants were then asked to estimate how much they themselves believed the temperature (1–12 °F) and, respectively, the sea level (0.5–6 ft) might change, and to rate their confidence from 1 (guessing) to 9 (very confident). After that, participants were confronted, in both scenarios (temperature change/sea-level rise), with either good or bad news projecting lesser (1–5 °F/0.5–2.5 ft) or greater (7–11 °F/3.5–6 ft) climate change, respectively (“In the last weeks, some prominent scientists have reassessed the science and concluded that the situation is far better/far worse than previously thought. Unless further regulatory steps are taken by 2100, the average U.S. temperature/sea level is projected to increase/rise about 1–5/7–11 °F or 0.5–2.5/3.5–6 ft, depending on the emissions scenario and climate model”). Again, participants were asked to estimate how much they believed the change would manifest (1–12 °F/0.5–6 ft) and to rate their confidence (1 (guessing) to 9 (very confident)). The scenarios were counterbalanced.
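
The updating measure itself is not spelled out in this section; one common operationalisation, consistent with the Sunstein et al. paradigm, is the change in a participant's estimate in the direction implied by the news. The R sketch below is an assumption-laden illustration of that idea, not the authors' exact formula.

```r
# Hedged sketch of one way to quantify belief updating (assumed, not the
# authors' exact definition): the signed change in the estimate, scored so
# that positive values mean movement towards the news that was presented.
compute_update <- function(prior_estimate, posterior_estimate, valence) {
  # valence: 0 = bad news (higher projections), 1 = good news (lower projections)
  raw_change <- posterior_estimate - prior_estimate
  ifelse(valence == 0, raw_change, -raw_change)
}

# Temperature scenario examples (prior estimate of 6 degrees F)
compute_update(prior_estimate = 6, posterior_estimate = 8,   valence = 0)  # 2.0: towards bad news
compute_update(prior_estimate = 6, posterior_estimate = 4.5, valence = 1)  # 1.5: towards good news
```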

Mode of communication

In Study 2, we added maps of mainland USA illustrating changes within each scenario (temperature change/sea-level rise) and under each condition (good news/bad news). Maps were adapted using GIMP (https://www.gimp.org/, version 2.10.08). The images showed average annual temperature and the US population at risk of chronic inundation under different climate change scenarios. These representative concentration pathway (RCP) scenarios project the global climate future as a function of the concentration of greenhouse gases in the atmosphere, relative to historical data. The stimuli of both scenarios were matched for size (900 × 600 pixels), and the text for participants’ estimates and confidence remained the same as in Study 1.

For the temperature rise scenario, we adapted maps from the Climate Impact Map webpage (http://www.impactlab.org/) (Hsiang et al., 2017). The initial statement from the belief-updating task (see above) was supplemented with an additional sentence stating: “The maps below show the current and projected states for the temperature distributions across mainland USA”. Below the statement, two coloured maps of absolute temperature in Fahrenheit were presented side by side on a white background, with a temperature legend from 40 °F to 80 °F. On the left side of the screen, a map of the current average annual temperature in the USA was presented (“Current state for 2020”) and on the right side, a map of the future under a moderate scenario (predicting a likely mean rise of global temperature of 2.0–4.5 °F by 2100), labelled “Projected state for 2100”. In the news valence scenarios, we added: “The maps below show the current, initial and recent better/worse projections for temperature distributions across mainland USA”. Under the text, three maps were presented from left to right: “Current state for 2020”; “Projected state for 2100”, depicting the medium temperature rise scenario; and the “Better/Worse projected state for 2100” (average annual temperature in the USA under RCP2.6 for the good news condition (global temperature rise of 0.5–1.3 °F) or under RCP8.5 for the bad news condition (rise of 4.7–8.6 °F)). For the sea-level rise scenario, maps designating the coastal areas of mainland USA at risk of chronic inundation due to climate change were adapted (based on data from the National Oceanic and Atmospheric Administration and economic estimates of damage (Dahl et al., 2017)), sourced from The Union of Concerned Scientists webpage (https://ucsusa.maps.arcgis.com/apps/MapJournal/index.html?appid=b53e9dd7a85a44488466e1a38de87601#). We manually adapted them to be visually comparable to the temperature-increase scenario (for more details see Supplemental Materials).

The following text was added to the initial statement of the belief-updating task (see above): “The maps below show the current and projected states of population affected by sea-level rise and its impact for mainland USA.” Below, two maps were shown on a white background, together with a legend displaying the population affected by sea-level rise, ranging from 1–1,000 to 100,001–300,001 people. The map on the left side showed the current estimate of the population at risk (“Current state for 2020”) and the map on the right side the future estimate (“Projected state for 2100”) under a moderate scenario (predicting a likely rise in global sea level of 1.3–3 feet). After giving their first estimates and their confidence therein, participants were exposed to the bad or good news scenario, respectively, introduced with three maps and the following addition to the initial statement of the belief-updating task (see above): “The maps below show the current, initial and recent, better/worse projections regarding the population affected by sea-level rise and its impact for mainland USA”. Below, three maps were displayed from left to right: “Current state for 2020”; “Projected state for 2100”, depicting the medium sea-level rise scenario; and “Better/Worse projected state for 2100”. The third map displayed the population affected by sea-level rise under RCP2.6 for the good news condition (predicting a global sea-level rise of 1–2.6 ft) and under RCP8.5 for the bad news condition (1.6–3.9 ft). A new variable “mode” was created to code the mode of communication (0 = information via text only and 1 = information via text supported by visual representation).

Climate change knowledge

We used 16 climate change knowledge questions (8 true and 8 false) to measure domain-specific epistemic metacognitive functions. These items were originally developed on the topics of precipitation, air, temperature and health (Sundblad et al., 2009) and were selected and further validated by Fischer et al. (2019). We used eleven of their statements and checked their veracity against the most recent Intergovernmental Panel on Climate Change reports (Arneth et al., 2019; Masson-Delmotte et al., 2018; Pörtner et al., 2019). The correct answers were shared with the participants in the debrief. We designed 5 new items (on the topics of species extinction, desertification, food, wildfires and disease; see Supplemental Materials) to better reflect the emphases of recent reports (Arneth et al., 2019). Participants were asked to select True/False for each statement and to rate their confidence from 1 (guessing) to 5 (very confident). Using 1-0-0 scoring, participants on average responded correctly to M = 7.25 (SD = 2.27) statements in Study 1 and M = 7.50 (SD = 2.17) statements in Study 2. Using the number-right method (see Study Design section), on average M = 7.52 (SD = 2.15) statements in Study 1 and M = 7.80 (SD = 2.04) statements in Study 2 were scored as correct. Participants displayed similar mean confidence in Study 1 (M = 0.88, SD = 0.17) and Study 2 (M = 0.88, SD = 0.17). Further, the metacognitive features (metacognitive sensitivity, confidence bias and accuracy) were extracted for each subject using R (R Development Core Team, 2016). Accuracy (d’) was quantified as the difference between the z-transformed true-positive and false-positive rates. Mean confidence in their knowledge was used as the confidence bias. Metacognitive sensitivity (meta-d’) assesses the degree to which participants’ confidence judgements distinguish accurate from inaccurate responses to the knowledge questions. Meta-d’ was computed using a hierarchical Bayesian estimation scheme (Fleming, 2017). For further details, see: https://github.com/metacoglab/HMeta-d.
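
For concreteness, the R sketch below illustrates a simple count-based score and the d′ and confidence-bias computations described above. The column names, the mapping of confidence onto 0–1, and the small adjustment used to avoid infinite z-scores are assumptions; the 1-0-0 versus number-right distinction of the cited study design is not reproduced here.

```r
# Illustrative knowledge scoring and first-order sensitivity (d').
# Data and the rate adjustment are assumptions for demonstration only.
knowledge <- data.frame(
  truth    = c(1, 1, 0, 0, 1, 0, 1, 0),  # 1 = statement is actually true
  response = c(1, 1, 1, 0, 0, 0, 1, 0),  # 1 = participant answered "True"
  conf     = c(5, 4, 3, 5, 2, 4, 5, 3)   # 1 (guessing) to 5 (very confident)
)

# Simple count of correct True/False judgements
n_correct <- sum(knowledge$truth == knowledge$response)

# d' = z(true-positive rate) - z(false-positive rate), with a small
# adjustment so rates of 0 or 1 do not produce infinite z-scores
hit_rate <- (sum(knowledge$truth == 1 & knowledge$response == 1) + 0.5) /
            (sum(knowledge$truth == 1) + 1)
fa_rate  <- (sum(knowledge$truth == 0 & knowledge$response == 1) + 0.5) /
            (sum(knowledge$truth == 0) + 1)
d_prime  <- qnorm(hit_rate) - qnorm(fa_rate)

# Confidence bias: mean confidence rescaled to 0-1
confidence_bias <- mean((knowledge$conf - 1) / 4)

c(n_correct = n_correct, d_prime = d_prime, confidence_bias = confidence_bias)
```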

Questionnaires

Climate Change Scepticism Scale

We administered a previously established Climate Change Scepticism Scale (Capstick and Pidgeon, 2014) encompassing 20 items. Two factors have been identified, yielding two 10-item subscales: Epistemic scepticism (e.g., “The evidence for climate change is unreliable.”), capturing doubts about the scientific and physical properties of climate change, and Response scepticism (e.g., “The media is often too alarmist about climate change.”), capturing doubts about the efficacy of action undertaken to address it. Items were scored from 1 (strongly disagree) to 5 (strongly agree). The summed Climate Change Scepticism Scale score was M = 52.30, SD = 19.60 (range: 20–92) for the Study 1 sample and M = 50.62, SD = 19.56 (range: 20–92) for Study 2.

Dogmatism

We administered the widely used and validated (Crowson, 2009; Crowson et al., 2008; Moore and Leach, 2016) Dogmatism Scale (Altemeyer, 2002), measuring the inflexibility and unjustified certainty of beliefs with 20 items (e.g., “My opinions are right and will stand the test of time.”), scored from 1 (strongly disagree) to 9 (strongly agree). On average, participants in Study 1 (M = 63.23, SD = 19.81) and Study 2 (M = 60.09, SD = 21.06) tended towards flexibility of beliefs.

Political orientation

A three-item Continuous Visual Scale (Rollwage et al., 2018) was used, asking participants to rate: (1) their overall political attitude on a dimension from 0 (liberal) to 100 (conservative). The mean overall political orientation scores of our samples in Study 1 (M = 48.57, SD = 31.61) and Study 2 (M = 41.77, SD = 31.61) both tended towards more liberal attitudes.

Social dominance orientation

We used an updated 8-item short version of the Social Dominance Orientation Scale, SDO7(s) (Ho et al., 2015). The scale yields two factors: Dominance, a preference for a system in which high-status groups forcefully oppress subordinate ones, and Anti-Egalitarianism, a preference for non-forceful group inequality. Each subscale has 4 items, two pro-trait and two con-trait. Participants were asked to rate statements from 1 (strongly oppose) to 7 (strongly favour). The mean SDO scores of our samples in Study 1 (M = 21.94, SD = 11.03) and Study 2 (M = 21.18, SD = 11.85) tended towards opposing the dominant social position of higher-ranking groups.

Analysis

Correlation

The correlation analysis was performed in R (R Development Core Team, 2016) using the stats package [version 3.6.1] with Spearman correlations. The correlation plot was created with the ggcorrplot package [version 0.1.3].
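
A minimal sketch of this step, with simulated data and assumed variable names:

```r
# Sketch of the Spearman correlation matrix and plot (simulated data,
# assumed variable names).
library(ggcorrplot)

set.seed(1)
df <- data.frame(
  meta_sensitivity      = rnorm(100),
  confidence_bias       = rnorm(100),
  perceptual_accuracy   = rnorm(100),
  scepticism            = rnorm(100),
  dogmatism             = rnorm(100),
  political_orientation = rnorm(100)
)

corr_mat <- cor(df, method = "spearman", use = "pairwise.complete.obs")
ggcorrplot(corr_mat, type = "lower", lab = TRUE)  # lower-triangle plot with coefficients
```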

Linear mixed effects models

The mixed-model analysis was performed in R (R Development Core Team, 2016) using the lme4 package [version 1.1–21] with maximum likelihood estimation; t-tests used Satterthwaite’s method (significance threshold p < 0.05). Multicollinearity was checked using the performance package [version 0.4–4]. The ggeffects package [version 1.1–21] was used to extract confidence intervals for all predictors using bootstrapping. Fixed effects were scaled and centred around the mean.
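
A hedged sketch of this model structure follows (simulated data; the variable names, and the use of lmerTest to obtain the Satterthwaite tests, are assumptions):

```r
# Sketch of the linear mixed model for belief updating (simulated data).
library(lmerTest)     # lme4 models with Satterthwaite-approximated t-tests
library(performance)  # multicollinearity check
library(ggeffects)    # predicted values with confidence intervals

set.seed(2)
long <- data.frame(
  id         = factor(rep(1:100, each = 2)),
  update     = rnorm(200),
  valence    = factor(rep(0:1, 100)),                        # 0 = bad news, 1 = good news
  scepticism = as.numeric(scale(rep(rnorm(100), each = 2)))  # scaled around the mean
)

m <- lmer(update ~ valence * scepticism + (1 | id), data = long, REML = FALSE)
summary(m)                                        # fixed effects with Satterthwaite t-tests
check_collinearity(m)                             # variance inflation factors
ggpredict(m, terms = c("scepticism", "valence"))  # predicted values for plotting
```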

Structural equation models

All models were estimated in R (R Development Core Team, 2016) with the lavaan package [version 0.6–4] using full-information maximum likelihood. Overall model fit was assessed with the χ2 test, the RMSEA and its confidence interval (acceptable: 0.05 to 0.08), the comparative fit index (acceptable: 0.95 to 0.97), and the SRMR (acceptable: 0.05 to 0.10). Nested models were compared using a χ2 difference test, and non-nested models using the AIC (following Burnham and Anderson).
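
The sketch below illustrates, with simulated data and assumed variable names, how such a path model might be specified in lavaan, including labelled indirect effects of the kind reported in the Results; it is a simplified stand-in, not the authors' exact model.

```r
# Simplified lavaan path-model sketch (simulated data, assumed variable names).
library(lavaan)

set.seed(3)
n  <- 300
df <- data.frame(meta_sens_perc = rnorm(n), meta_sens_know = rnorm(n),
                 conf_perc = rnorm(n), conf_know = rnorm(n), perf_perc = rnorm(n))
df$scepticism  <- -0.3 * df$meta_sens_perc - 0.4 * df$meta_sens_know +
                   0.3 * df$conf_perc + rnorm(n)
df$update_bad  <- -0.3 * df$scepticism + rnorm(n)
df$update_good <- -0.1 * df$scepticism + rnorm(n)

model <- '
  # scepticism predicted by metacognitive measures and perceptual performance
  scepticism  ~ a1*meta_sens_perc + a2*meta_sens_know + a3*conf_perc + a4*conf_know + perf_perc
  # updating to bad and good news predicted by scepticism
  update_bad  ~ b1*scepticism
  update_good ~ b2*scepticism
  # indirect effects of perceptual metacognitive sensitivity via scepticism
  ind_bad  := a1*b1
  ind_good := a1*b2
'

fit <- sem(model, data = df, missing = "fiml", estimator = "ML")
fitMeasures(fit, c("chisq", "df", "pvalue", "cfi", "rmsea", "srmr"))
summary(fit, standardized = TRUE)
# anova(fit, fit_alt)   # chi-square difference test for nested models
# AIC(fit, fit_alt)     # AIC comparison otherwise
```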

Results

We first tested, in Study 1, whether climate change scepticism predicted asymmetric updating in response to new evidence about climate change (Sunstein et al., 2017), using a linear mixed model with belief updating as the dependent variable (see Methods), and valence of news (bad news = 0/good news = 1), climate change scepticism (20 items; scaled around the mean) and their interaction as fixed effects, with a random intercept for participant ID. Individuals with greater climate change scepticism showed less belief updating following bad news (β = −0.27, CI = [−0.35, −0.20], p < 0.0001). Moreover, a significant interaction between valence and climate change scepticism (β = 0.14, CI = [0.04, 0.26], p = 0.008; Fig. 2A) showed that whilst participants with higher climate change scepticism updated their beliefs less following bad news, they were more likely to update their beliefs in line with good news. This finding replicates Sunstein and colleagues (Sunstein et al., 2017), who reported that participants updated their beliefs about climate change in a self-affirming fashion: people who already believe in the occurrence of climate change are more likely to update their beliefs after exposure to bad vs. good news, whereas people who doubt climate change are more likely to update their beliefs after exposure to good vs. bad news on climate change.

Fig. 2: Results of Studies 1 and 2.

Predicted values for updating behaviour in Study 1 (panel A) with n = 364 and Study 2 (panel B) with n = 335 participants. x-axis = climate change scepticism (20 items summed, scaled around the mean; higher values indicate more climate change scepticism). y-axis = updating behaviour (positive values indicate greater updating towards new evidence; zero indicates a lack of updating). Valence of new evidence: 0 = red, bad news; 1 = blue, good news. The error bars indicate confidence intervals of the predicted values.

Except for dogmatism (20 items, scaled), no other covariate of interest significantly predicted belief updating. The more participants endorsed dogmatic beliefs, the less likely they were to update following bad news (β = −0.13, CI = [−0.22, −0.03], p = 0.009).

Study 2 also replicated the effect of climate change scepticism (β = −0.28, CI = [−0.37, −0.19], p < 0.0001) and its interaction with valence (β = 0.18, CI = [0.071, 0.31], p = 0.003) on belief updating. Moreover, in Study 2, unlike Study 1, we found a main effect of valence in predicting belief updating (β = −0.13, CI = [−0.25, −0.005], p = 0.03): participants were less likely to update their prior beliefs about climate change when positive evidence was communicated through both visual and textual information.

Next, we tested whether domain-general metacognitive abilities (see Methods) influenced participants’ updating in response to new information on climate change with a linear mixed model. Belief updating was included as the dependent variable; valence and its interactions with perceptual metacognitive abilities (metacognitive sensitivity, perceptual performance, mean confidence; all variables scaled to the mean) were included as fixed effects, with a random intercept for participant. In Study 1, metacognitive sensitivity significantly predicted the extent of updating to bad news, controlling for perceptual task performance: the greater the participants’ insight into the correctness of their perceptual judgements (after controlling for first-order performance), the more likely they were to update their beliefs towards bad news (β = 0.144, CI = [0.06, 0.22], p < 0.001). The opposite was observed for participants’ confidence in their perceptual judgements (their metacognitive bias): the higher the mean confidence, the less likely they were to update to bad news (β = −0.091, CI = [−0.17, 0.001], p = 0.03). Among our other covariates of interest, political orientation and dogmatism also both significantly predicted belief updating: more politically conservative (β = −0.21, CI = [−0.29, −0.12], p < 0.00001) or more dogmatic (β = −0.10, CI = [−0.20, −0.001], p = 0.03) participants were less likely to update to bad news.

Study 2 replicated the effect of metacognitive sensitivity, controlling for perceptual task performance, in predicting greater updating to bad news (β = 0.14, CI = [0.059, 0.235], p = 0.001). In addition, we found a significant interaction of metacognitive sensitivity and valence in predicting belief updating (β = −0.13, CI = [−0.26, −0.02], p = 0.02): the greater participants’ insight into the correctness of their perceptual choices, the more likely they were to update their beliefs to bad news, whereas the opposite was true for good news.

Interestingly, and unlike in Study 1, participants with higher accuracy in the perceptual decision-making task were significantly more likely to update to bad news about climate change (β = 0.10, CI = [0.01, 0.20], p = 0.018) when evidence was conveyed with both textual and visual information. Again, dogmatism (β = −0.10, CI = [−0.20, −0.01], p = 0.02) and participants’ political orientation (β = −0.12, CI = [−0.24, −0.01], p = 0.03) significantly predicted belief updating: the more conservative and dogmatic participants were, the less likely they were to update to new evidence.

Taken together, these findings suggest that domain-general metacognitive sensitivity influences our belief updating to valenced news regarding climate change. Participants with more insight into the correctness of their perceptual decisions were more likely to update to bad news about climate change. Interestingly, when exposed to a combined visual and textual argument they were also significantly less likely to update their beliefs to positively valenced news conveying evidence of climate change.

We next examined how participants’ domain-specific metacognitive abilities, estimated through a climate-related knowledge questionnaire (Fischer et al., 2019), affected belief updating. We built a linear mixed model with belief updating as the dependent variable and news valence and its interactions with climate change knowledge-related metacognitive abilities (metacognitive sensitivity, performance, mean confidence; all variables scaled to the mean) as fixed effects; participants were treated as a random intercept. Study 1 did not show a significant effect of metacognitive sensitivity regarding one’s own climate change knowledge when controlling for climate change knowledge. However, mean confidence significantly predicted updating in response to bad news: the more confident participants were about the correctness of their climate change answers, the less likely they were to update to bad news (β = −0.10, CI = [−0.20, −0.01], p = 0.02). Furthermore, there was a significant interaction between valence and the number of correctly verified climate change statements (β = 0.14, CI = [0.04, 0.25], p = 0.009): the more accurate participants were on the climate change knowledge questions, the more likely they were to update their prior beliefs when exposed to good news.

Study 2 replicated the overconfidence effect (β = −0.11, CI = [−0.21, −0.01], p = 0.02). Furthermore, unlike in Study 1, domain-specific metacognitive sensitivity significantly predicted belief updating: participants with more insight into the correctness of their answers on the climate change knowledge questionnaire were more likely to update their beliefs when exposed to bad news (β = 0.10, CI = [0.01, 0.19], p = 0.037).

In sum, across two independent studies, we found that overconfidence in climate change knowledge decreased the likelihood of updating prior beliefs when confronted with negatively framed new evidence. This pattern of results is consistent with participants’ domain-specific confidence bias affecting their readiness to change their mind about climate-related matters. Interestingly, this held only when the new evidence communicating climate change was negatively framed, both when presented as text only and when text and visuals were combined.

To better understand the relationships between domain-general and domain-specific metacognitive abilities (metacognitive sensitivity, confidence bias and perceptual performance) and climate change scepticism in updating beliefs about climate change, we carried out path analyses using structural equation modelling (R, lavaan; see Fig. 3).

For Study 1, the model had a very good fit to the data [χ2(12) = 18.48, p = 0.102; CFI = 0.972; RMSEA = 0.039; SRMR = 0.031] and revealed that both domain-general (i.e., perceptual) and domain-specific metacognitive sensitivity, as well as perceptual and epistemic confidence, significantly predicted participants’ climate change attitudes (R2 = 0.386): lower domain-general (β = −0.14, p = 0.001) and domain-specific (β = −0.39, p < 0.001) metacognitive sensitivity and higher domain-general (β = 0.34, p = 0.001) and domain-specific (β = 0.15, p = 0.001) confidence were each predictive of greater climate change scepticism. In turn, climate change scepticism significantly predicted updating to both bad news (R2 = 0.121) and good news (R2 = 0.028). Overall, participants with higher climate change scepticism were less likely to update their beliefs towards new information regarding climate change (bad: β = −0.27, p < 0.001; good: β = −0.13, p < 0.001), independently of the valence of the news.

Furthermore, domain-general metacognitive sensitivity indirectly predicted participants’ updating to good (β = 0.04, p = 0.002) as well as bad (β = 0.01, p = 0.02) news via climate change scepticism: higher metacognitive sensitivity led to lower climate change scepticism and therefore to enhanced belief updating. The same was true for domain-specific metacognitive sensitivity (good: β = 0.10, p < 0.001; bad: β = 0.05, p = 0.003).

Confidence bias, however, negatively predicted belief updating via scepticism: the more confident participants were in their perceptual choices, the more pronounced their climate change scepticism and, in turn, the less likely they were to update to news, independently of its valence (good: β = −0.096, p < 0.001; bad: β = −0.045, p = 0.003). The same was true for confidence in the correctness of their answers to the climate change knowledge questionnaire (good: β = −0.044, p = 0.003; bad: β = −0.021, p = 0.0021). Perceptual performance itself predicted neither climate change scepticism nor updating behaviour.

These findings suggest that domain-specific and domain-general metacognitive abilities, across the spectrum of climate change scepticism, influence our updating in response to newly presented evidence regarding climate change. Indeed, participants showing higher insight into the (in)correctness of their decisions in the perceptual decision-making task and the climate change knowledge questionnaire showed lower climate change scepticism and were more likely to update towards new evidence. When they reported higher perceptual and epistemic confidence, however, they were less likely to update to new evidence, via an impact on climate change scepticism (see Fig. 3A).

Fig. 3: Path analysis for Studies 1 and 2.

Path analysis depicting how metacognitive abilities impact attitudes and the updating of climate change related beliefs in Study 1 (panel A) and Study 2 (panel B). For both panels, the significance level was p < 0.05. N.S., not significant; Sig. Neg., significant negative pathway; Sig. Pos., significant positive pathway.

The same key findings were replicated in Study 2 (Fig. 3B) with a very good model fit [χ2(12) = 13.93, p = 0.305; CFI = 0.981; RMSEA = 0.022; SRMR = 0.029]. Only the previously significant indirect effect of perceptual metacognitive sensitivity via climate change scepticism was reduced to a trend in predicting updating to bad news (β = 0.013, p = 0.062).

Lastly, we tested whether the mode in which news regarding climate change was communicated influenced belief updating. Data from both studies were combined, and a new variable (0 = textual information only; 1 = combined textual and visual information) was created to predict belief updating in the structural equation model, alongside climate change scepticism and domain-general and domain-specific metacognitive abilities. The model had a good fit to the data [χ2(13) = 24.31, p = 0.028; CFI = 0.972; RMSEA = 0.035; SRMR = 0.024] and confirmed the previous findings on the role of perceptual and epistemic metacognitive sensitivity and confidence. Neither the mode of presentation itself nor its interactions with other factors were significant, indicating that textual and combined visuo-textual presentations were similarly effective in conveying climate change related information.
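
As a hedged sketch of this pooling step, the code below shows how the two datasets could be combined, the mode variable coded, and a mode term (plus an interaction) added to the updating equations of a path model; the data simulation, variable names and model structure are assumptions, not the authors' exact specification.

```r
# Hedged sketch: pooling the two studies and coding mode of communication.
# study1_data / study2_data and all variable names are hypothetical placeholders.
set.seed(4)
simulate_study <- function(n) {
  data.frame(meta_sens_perc = rnorm(n), meta_sens_know = rnorm(n),
             conf_perc = rnorm(n), conf_know = rnorm(n), perf_perc = rnorm(n),
             scepticism = rnorm(n), update_bad = rnorm(n), update_good = rnorm(n))
}
study1_data <- simulate_study(364)
study2_data <- simulate_study(335)

combined <- rbind(transform(study1_data, mode = 0),   # Study 1: text only
                  transform(study2_data, mode = 1))   # Study 2: text + visuals
combined$scept_x_mode <- combined$scepticism * combined$mode  # product term for the interaction

model_mode <- '
  scepticism  ~ meta_sens_perc + meta_sens_know + conf_perc + conf_know + perf_perc
  update_bad  ~ scepticism + mode + scept_x_mode
  update_good ~ scepticism + mode + scept_x_mode
'
fit_mode <- lavaan::sem(model_mode, data = combined, missing = "fiml")
lavaan::fitMeasures(fit_mode, c("chisq", "df", "pvalue", "cfi", "rmsea", "srmr"))
```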

Discussion and outlook

Taken together, our data show that climate change scepticism, an attitude encompassing doubts and a lack of concern about climate change (Capstick and Pidgeon, 2014), is associated with differences in domain-general as well as domain-specific metacognitive abilities (Morales et al., 2017). These differences further influenced the asymmetrical updating of climate change beliefs, that is, the updating of information in a self-confirming fashion. Metacognitive capacity may therefore contribute more broadly to the formation of polarised beliefs within multiple realms of social, political and legal life (Sharot et al., 2012; Sunstein et al., 2017) via an impact on belief updating.

Our findings support past research emphasising the need to describe the cognitive styles underpinning specific socio-political attitudes (Rollwage et al., 2019) and to distinguish between task performance, confidence bias and metacognitive sensitivity. This is particularly important when studying attitudes held with unjustified certainty about the truth of one’s beliefs, as metacognition captures the capacity to realise that one’s beliefs may be wrong (Rollwage et al., 2018). Indeed, climate change sceptics showed poorer metacognitive sensitivity and accordingly updated their beliefs less when confronted with new evidence about climate change. Only when presented with self-confirming information (i.e., good news regarding climate change) did sceptics tend to update their beliefs.

Importantly, both domain-general and domain-specific metacognition showed significant relationships with climate change scepticism, in line with previous findings showing that metacognition has both domain-general and domain-specific aspects. While domain-general aspects of metacognition were significant predictors of climate change scepticism and belief updating, domain-specific metacognition (i.e., insight into the correctness of one’s climate change related knowledge) had stronger predictive power. This highlights that general cognitive characteristics can affect people’s beliefs and behaviour, with domain-specific metacognition exerting the more pronounced effects.

In a recent study on a German sample (Fischer and Said, 2021), metacognition about participants’ climate change knowledge predicted beliefs about climate change (its riskiness and anthropogenic nature), whereas metacognition about science knowledge from other domains did not, suggesting a greater role for domain-specific metacognition. Importantly, however, our domain-general measure of metacognitive sensitivity was derived from a perceptual decision-making task, which may quantify a broader, more general level of metacognitive function. Future work is also needed to account for potential national or cross-cultural differences in climate change related knowledge and metacognition, given that our focus here was exclusively on a US sample.

In addition to investigating the contributions of domain-general and domain-specific metacognition, our study enabled us to disentangle the contribution of a general confidence bias on climate change scepticism. This allowed us to disambiguate the effects of overall beliefs or confidence in one’s performance from the capacity for insight into performance fluctuations (metacognitive sensitivity). Here, we found that both metacognitive sensitivity and confidence bias were independent predictors of belief updating. Higher confidence in domain-general (perceptual) and domain-specific (climate change knowledge) task performance reduced the likelihood of updating to new evidence regarding climate change.

These results are particularly interesting in light of recent advances in understanding the role of confidence in the processing of new evidence. It has been shown that high confidence in a belief leads to a neural confirmation bias, making people less open to new information (Rollwage et al., 2020). This is in line with our findings showing that a confidence bias leads to reduced belief updating. However, such confidence-induced confirmation bias is a particular problem for belief updating when confidence is misaligned, i.e., when it is combined with low metacognitive ability (Rollwage and Fleming, 2021). Our empirical findings therefore align closely with theoretical work suggesting that both confidence bias and metacognitive sensitivity should influence a person’s propensity to update their beliefs in light of new evidence.

Finally, the observed effect sizes are noteworthy. In Study 1, 39% of the variance in climate change scepticism was explained by our measures, while in Study 2, 34% of the variance was explained. This strongly supports the argument that (meta)cognitive characteristics are critical building blocks for understanding socio-political beliefs, both in terms of understanding the underlying mechanisms driving such beliefs (Rollwage et al., 2019) and in providing additional predictive power over and above socio-demographic variables (Zmigrod et al., 2021).

In terms of the limitations of our research, while these two studies had relatively large samples, future research should aim for representative samples and also assess potential cross-cultural differences in the role of domain-general metacognition and its impact on people’s belief updating. In addition, although we found no significant impact of the mode of communication on belief updating, future studies should address the role of climate visuals in promoting awareness about climate change, given the more emotive nature of images compared to words.

In conclusion, our paper confirms previous findings and extends them in important new ways by demonstrating the role that domain-general metacognitive abilities play. Past studies have shown that climate change scepticism affects belief updating: for example, people are more likely to update prior beliefs with new evidence when the given information is desirable and self-confirming. Other studies have shown that domain-specific meta-knowledge predicts climate change beliefs, but its impact on belief updating had not been assessed. Here we combined insights from both lines of research to ask the more pertinent question of what it would take to change people’s beliefs and what role metacognition plays in that process, exploring whether domain-general metacognitive abilities underlie participants’ attitudes towards climate change and further influence the updating of climate change related beliefs. Across two studies, we provide compelling evidence for the role of domain-general metacognition.

Our findings highlight metacognition as a core capacity underpinning scepticism and modulating belief-updating behaviour about a topic as divisive and urgent as climate change. Promisingly, recent research suggests that it is possible to enhance domain-general metacognitive sensitivity through training, eventually enabling participants with entrenched beliefs to better reflect on them and improve their information-seeking behaviour in the outside world (Carpenter et al., 2019). Further research should establish whether learning metacognitive skills may foster a practice of mutual understanding and caring for the world we inhabit. Beyond basic research, policy makers could, in the near future, develop and test concrete interventions targeting metacognitive abilities in educational and professional settings as a means of counteracting dogmatic, inflexible beliefs held by the general population and decision-makers alike.