
Facing the pandemic with trust in science

Abstract

How essential is trust in science to prevent the spread of COVID-19? People who trust in science are reportedly more likely to comply with official guidelines, implying that higher levels of adherence could be achieved by improving trust in science. However, analysis of a global dataset (n = 4341) suggests otherwise. Trust in science had a small, indirect effect on adherence to the rules. Nonetheless, it predicted people’s approval of prevention measures such as social distancing, and bridged political ideology and approval of the measures (conservatives trusted science less and in turn approved of the measures less). These effects were stronger in the USA than in other countries. Even though any increase in trust in science is unlikely to yield strong behavioural changes, given its relationships with both ideology and individuals’ attitudes to the measures, trust in science may be leveraged to yield longer-term sustainable social benefits.

Introduction

The outbreak of the COVID-19 pandemic saw scientists recommending preventive measures such as physical distancing and mask wearing. From the start, these measures were controversial, with some sectors of the public questioning their necessity and efficacy (DeMora et al., 2021; Simonov et al., 2020). ‘COVID-19 has shown us in the starkest terms—life and death—what happens when we don’t trust science and defy the advice of experts’ (Oreskes, 2021, p. x).

Policy-makers and scientific institutions have made protecting or rebuilding trust in science a priority: the US President’s Chief Medical Adviser stated that ‘Biden’s real COVID-19 challenge is restoring a nation’s trust in science’ (Fauci, 2020) and the chief executive officer of the American Association for the Advancement of Science echoed similar priorities under the headline ‘Why we must rebuild trust in science’ (Parikh, 2021).

What, specifically, did trust in science achieve as people faced the pandemic? Trust in science correlated positively with people’s adherence to pandemic measures (Bicchieri et al., 2021; Dohle et al., 2020; Mohammed et al., 2020; Pagliaro et al., 2021; Petersen et al., 2021; Plohl and Musil, 2021; Rothmund et al., 2020; Sailer et al., 2021; Stosic et al., 2021). Trust thus seems to be a good way to protect society from major public health hazards by encouraging compliance with official guidelines. However, we argue that this conclusion is premature: reported associations between trust and compliance are not enough to identify the specific role that trust in science played in the pandemic, let alone justify that role.

How can trust in science be conceptualised?

Trust in science is a complex topic. It is thus worth first identifying what aspect is most relevant for understanding whether trust in science enhances adherence to pandemic prevention measures, and if not, what its role in the pandemic is.

In general terms, one individual trusts another (or an institution or a system) when the individual is vulnerable to or dependent on that other in some way and accepts the risks entailed in this dependency because the other shows features such as competence or benevolence, or because doing so reduces the complexity of the individual’s decision-making (Hendriks et al., 2016; Larson et al., 2018; Siegrist, 2021). In more specific terms, we are concerned with epistemic trust: trust in the knowledge produced by scientists (Hendriks et al., 2016; Irzik and Kurtulmus, 2019).

Two complementary aspects of epistemic trust are commonly studied: a normative aspect and a more pragmatic one. The normative question is: why should people trust in science? Answers to this question tend to spell out philosophical conditions under which trust is warranted (Irzik and Kurtulmus, 2019), and may focus on the reliability of science as a process, including the interplay between criticism and consensus among diverse scientists (Oreskes, 2021), or the hallmarks of trustworthiness that people use in judging who to trust (Hendriks et al., 2015).

The pragmatic question is: do people actually trust science? This focuses more on socio-psychological factors (Irzik and Kurtulmus, 2019), and this is what researchers are more concerned with when, for instance, they want to know if trust in science has been stable during the pandemic (Agley, 2020; Sibley et al., 2020). In probing whether trust in science encourages adherence to pandemic measures, we are mainly concerned with this pragmatic question: to what extent do people trust what scientists say, and is this trust associated with better adherence to science-based policies?

This is still a matter of trust because lay-people lack access to the data and expertise that support scientists’ claims, though we highlight that it is nonetheless closely related to a number of similar concepts such as credibility (Hartman et al., 2017) and confidence (Siegrist, 2021), and it is not always clear just how these are to be distinguished.

We note that in some work on trust in science, distrust is not merely the mirror image of trust because there may be multiple species of distrust (Lewicki et al., 1998; Tranter and Booth, 2015). However, in the claims we are scrutinising, the main concern derives from reported associations between higher (vs. lower) trust in science and better (vs. worse) adherence. Our immediate focus, then, is on degrees of trust rather than types of (dis)trust.

Can trust in science explain adherence to pandemic rules?

Multiple studies report that trust in science is associated with better adherence to prevention measures (Bicchieri et al., 2021; Dohle et al., 2020; Mohammed et al., 2020; Pagliaro et al., 2021; Petersen et al., 2021; Plohl and Musil, 2021; Rothmund et al., 2020; Sailer et al., 2021; Stosic et al., 2021). A common conclusion is that trust in science is important precisely because it promotes adherence (Bicchieri et al., 2021; Dohle et al., 2020; Mohammed et al., 2020; Pagliaro et al., 2021; Plohl and Musil, 2021; Sailer et al., 2021; Stosic et al., 2021). Correspondingly, as lower trust is associated with lower adherence to prevention measures, this feeds into calls for trust to be restored (Fauci, 2020; Parikh, 2021).

However, trust in science has been fairly stable in the pandemic in some countries (Agley, 2020; Sibley et al., 2020); in others it even increased early in the pandemic (Wissenschaft im Dialog, 2020), which is the period covered by our data. Perhaps, then, ‘trust in science is not the problem’ (Leshner, 2021). But in that case, why is the belief that it must be restored for better adherence so prevalent? An early view of the public’s understanding of science, the ‘deficit model’, explained negative attitudes towards science as being due to a deficit of knowledge. Although more recent work has highlighted the limitations of the deficit model (for reviews, see Ahteensuu, 2012; Brossard and Lewenstein, 2009; Gregory and Lock, 2008; Sturgis and Allum, 2004), scientists communicating with the public or with policy-makers may still rely heavily on the deficit view (Simis et al., 2016). Perhaps, then, calls by prominent scientists to rebuild trust in science merely reflect the persistence of a deficit model, one that has shifted the blame from a lack of knowledge to a lack of trust.

In moving beyond the limitations of deficit models, we should not immediately blame a lack of trust for low adherence to pandemic measures, but should rather consider what factors might contribute to low trust, or what factors apart from trust explain behaviour in the pandemic (Leshner, 2021). Situating trust in science in this wider context can help identify its specific role in the pandemic, and also in responding to future threats.

What other factors might explain adherence to pandemic rules?

A primary issue is whether trust in science influenced approval of prevention measures in addition to adherence to those measures. Research on social norm change has shown that approval (positive attitudes to norms) and adherence (behaviour in line with norms) are two distinct mechanisms (Bicchieri, 2016). A distinction between these mechanisms has already been observed in the pandemic (Betsch et al., 2020; Dohle et al., 2020). The worry is that people who do not approve of new norms may nonetheless adhere to them because of coercion, fear or propaganda, and in these cases adherence is often fragile or short-lived (Mercier, 2017). In contrast, we would hope that any effect of trust in science is robust and long-lived, in which case it should change minds, not just coerce behaviour. Indeed, it should affect behaviour precisely because it has changed minds.

A second issue is whether trust in science still matters for behaviour change once the effects of social conformity are accounted for. People often trust and conform to others around them (Cialdini and Goldstein, 2004). If people prefer to associate with like-minded others, their adherence may be misattributed to trust in science, while actually stemming from social conformity. Indeed, the influence of one’s social circle had a strong impact on people’s following of COVID-19 rules (Chevallier et al., 2021; Moehring et al., 2021).

Finally, the role that trust in science played in the public’s adherence to COVID-19 measures, even before the divisive issue of vaccination was at play, is unlikely to have been consistent from group to group. Worldview- or value-based factors such as political ideology vary across groups, and are important components of attitudes towards science (Brossard and Lewenstein, 2009; Gauchat, 2012; Hornsey and Fielding, 2017; Rutjens et al., 2018a; Sturgis and Allum, 2004). Conservatives typically trust science less (Gauchat, 2012), but they are more likely to follow COVID-19 rules when they trust science more (Koetke et al., 2021; Plohl and Musil, 2021). It is thus important to consider how ideology impacts the role of trust in science.

How does trust in science vary across countries?

Much work on trust in science has focused on the USA (Diehl et al., 2021; Engels et al., 2013). However, levels of trust in science vary across countries (Borgonovi and Pokropek, 2020; Sturgis et al., 2021), as do associations between trust and other factors, such as political ideology (Pechar et al., 2018; Pennycook et al., 2020). Thus, one crucial aspect of understanding the importance of trust in science for adherence behaviour includes testing the extent to which patterns are consistent internationally.

We aimed to recruit not only from well-studied populations such as the USA, UK or Germany, but also from understudied, non-Western countries, and consequently made our survey available in several languages: Arabic, Bangla, Chinese, English, Farsi, French, German, Hindi, Italian, Spanish, Swedish and Turkish.

Summary of the present study

The main hypothesis tested here is that trust in science predicts better adherence to pandemic measures (pre-registration at https://osf.io/ke5yn/). However, the issues raised above prompt us to go beyond just testing for an association between trust in science and reported adherence to pandemic social distancing guidelines.

Consequently, we also test whether this holds after accounting for approval and social conformity (Research Question 1). We also examine whether trust in science acts more on minds (approval of prevention measures) or on behaviour (adherence to the measures), and whether the same holds accounting for political ideology (Research Question 2).

Finally, we check whether the role of trust in science is consistent internationally, or whether some countries deviate from global patterns (Research Question 3).

Methods

Participants

This data was collected as part of a larger project on the normative and social aspects of COVID-19 (Tuncgenc et al., 2021). A convenience sample was recruited in April and May 2020 via social media, university mailing lists, press releases and blog posts. Participation was not compensated. Overall, 6675 participants completed the survey. However, participants were able to opt out of certain personal questions (e.g., on political ideology). Further, the operationalisation of “close social circle” (see below) meant that some participants responded that they had no close circle, in which case there is no data for whether they thought their close circle was adhering to COVID-19 measures (our social conformity measure). These two sources of missing data mean that there are 4341 complete responses for the variables reported here.

Participants’ countries of residence with samples larger than 100 were: UK (1612); Turkey (630); USA (459); Peru (216); Germany (189); France (188); and Australia (109). More country information is available in the supplementary materials.

The study received ethical approval through the University of Nottingham, and all participants provided informed consent. Data from incomplete surveys (those abandoned before the final debrief) were not analysed.

Procedure

The survey was delivered via a custom web app (desktop and mobile) written in jsPsych (De Leeuw, 2015).

Participants first selected which language they would like to do the survey in. After providing informed consent, participants indicated their close social circle using an established method (Dunbar and Spoors, 1995). First, participants listed the first names of all those people with whom they had had a conversation in the previous 7 days (ultimately, these names are not retained in the data). Second, those names were presented on the screen, and participants selected which names (if any) they would turn to for comfort or advice, using checkboxes. Their close social circle is operationalised as the subset of names that they selected at this second stage.
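The two-stage operationalisation can be sketched as follows; the names and checkbox selections below are hypothetical, for illustration only:

```python
# Sketch of the two-stage close-circle measure described above
# (Dunbar and Spoors, 1995). All names here are invented examples.

def close_circle(contacts, comfort_or_advice):
    """Stage 2: the close circle is the subset of recent contacts
    the participant would turn to for comfort or advice."""
    return [name for name in contacts if name in comfort_or_advice]

# Stage 1: everyone spoken to in the previous 7 days (hypothetical).
contacts = ["Alex", "Sam", "Jo", "Priya"]
# Stage 2: checkbox selections (hypothetical).
selected = {"Sam", "Priya"}

print(close_circle(contacts, selected))  # ['Sam', 'Priya']
```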

Participants were reminded of the general guidelines at the time (April–May 2020): to keep physical distance from others. They used sliders to report whether they were adhering to this advice (labels: 0 = ‘Not been following the advice at all’; 50 = ‘Been following the advice exactly’; 100 = ‘Been doing more than what is advised’) and to indicate their approval of the guideline (0 = ‘Not following the advice is completely ok’; 100 = ‘Not following the advice is completely wrong’). They were then reminded of the names of those in their close social circle, and reported whether they thought their close social circle was adhering to the same guidelines (using the same slider response format).

To measure trust in science, we selected three items from the six-item Credibility of Science scale (Hartman et al., 2017) for reasons of brevity, given the length and voluntary nature of our study. This scale measures ‘generalised perceptions about the credibility of science (PCoS)—that is, the extent to which one’s default tendency is to trust in the methods and findings of science, hold positive attitudes toward the scientific enterprise, view scientists as credible, and so forth’ (Hartman et al., 2017, p. 358, emphasis ours).

The items used here were:

  1. People trust scientists a lot more than they should

  2. A lot of scientific theories are dead wrong

  3. Our society places too much emphasis on science

Participants rated their agreement with these statements using a slider (0 = ‘completely disagree’; 100 = ‘completely agree’). The ‘trust in science’ score is the average of these three responses, reverse-scored for ease of interpretation such that a high score reflects high trust (reliability ωt = 0.75, α = 0.73, Revelle and Condon, 2019).
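As a concrete illustration, the scoring can be sketched as follows (the participant responses are hypothetical, and the reliability function computes Cronbach’s α, one of the two reliability coefficients reported above):

```python
# Sketch of the trust-in-science scoring described above. Responses are
# hypothetical 0-100 agreement ratings with the three (negatively
# phrased) items, so high raw agreement means LOW trust.

def trust_score(responses):
    """Reverse-score each item (100 - x) and average, so that a high
    score reflects high trust."""
    reversed_items = [100 - r for r in responses]
    return sum(reversed_items) / len(reversed_items)

def cronbach_alpha(items):
    """Cronbach's alpha for a participants-by-items table of scores."""
    k = len(items[0])
    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var([row[i] for row in items]) for i in range(k))
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# A participant who strongly disagrees with all three items scores high:
print(trust_score([10, 20, 15]))  # 85.0
```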

We considered whether these items may reflect broader reservations against scientific expertise rather than trust; whether our selection of these three items could bias results; and whether the negative phrasing of all three items may reflect distrust more than trust (Footnote 1). To allay these concerns, we conducted a follow-up validation study, described under ‘Supplementary analyses’ below.

Participants described their political ideology, again using a slider (0 = ‘very liberal’; 100 = ‘very conservative’). They could opt out in two ways, with one checkbox indicating that this continuum did not describe their beliefs, and another checkbox indicating that they did not wish to respond.

Finally, participants provided demographic information, including age, gender and education level (which are included as control variables in all models reported here). For other questions asked in the survey as part of the larger project on the normative and social aspects of COVID-19, see Tuncgenc et al. (2021).

Open practices statement

A full demonstration of the survey can be found at the Open Science Framework (OSF) repository for the broader project (https://osf.io/ke5yn/). The OSF repository for this specific study (https://osf.io/s5mdh/) contains the data and analyses.

The survey design was preregistered at the above project repository. The same registration included the hypothesis that adherence to official guidelines would be predicted by trust in science. For other hypotheses in the broader project (not relating to trust), see Tuncgenc et al. (2021).

The Bayesian models reported below were not pre-registered, but the full R analysis script is available at the above study repository. This includes full details of model priors, random effects structures, and control variables such as gender, age and education, as well as various supplementary analyses briefly described below.

Results

Overview of sample

Of the 6675 participants who finished the survey, 1577 opted out of the question on political ideology and 1199 indicated that they had no close circle (in the specific sense of ‘close circle’ as operationalised here: see the “Methods” section). This leaves 4341 completed responses, as 442 had missing data on both counts.

The final sample included 1293 men, 2985 women, 39 non-binary people, and 24 who chose not to answer the gender question. Table 1 summarises the main variables. The categories for education ranged from 0 = ‘No schooling completed’ to 4 = ‘Postgraduate degree’, so the point nearest the mean value corresponds to ‘3 = University undergraduate degree/professional equivalent’. The demographic variables (gender, age, education) were included as covariates in all analyses reported below, though the model coefficients for these covariates are reported only in the full analysis at https://osf.io/s5mdh/, which also gives details of how education was modelled as a monotonic (not continuous, linear) effect. For further details about recruitment and demographics, see Tuncgenc et al. (2021).

Table 1 Descriptive statistics.

We explore the effects of missing data in more detail at https://osf.io/s5mdh/, though as an initial check that these gaps do not bias our conclusions, there was no significant difference in the main outcome variable, adherence to physical distancing guidelines, between the 4341 participants who answered all questions (mean adherence 63.8%) and the 2334 participants who had some missing data (mean adherence 62.9%; a difference of less than one percentage point: 0.89 [0.17, 1.97]).

To gauge how well our convenience sample compares with more representative samples, we scaled our trust in science variable and regressed it on published national averages of trust in science (Borgonovi and Pokropek, 2020), which were derived from a global survey (Wellcome Global Monitor, 2018). Trust in science was moderately well predicted by these national average indexes (β = 0.4 [0.38, 0.43]). Indeed, this is a stronger relationship than any that trust in science has in our data (see for instance Fig. 2). We stress that these national norms reflect different survey items, different response scales and different survey delivery methods than our data, and that a comparison between national averages and individual responses will necessarily be noisy, so we consider this an encouraging result. Further, we check in a supplementary analysis that our conclusions still hold, controlling for these national norms (https://osf.io/s5mdh/).
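A minimal sketch of this check, using simulated stand-ins for the real individual scores and national indexes (for a single standardised predictor, the regression coefficient equals the Pearson correlation):

```python
import numpy as np

def standardised_beta(x, y):
    """Slope from regressing z-scored y on z-scored x; with a single
    predictor this equals the Pearson correlation."""
    zx = (x - x.mean()) / x.std(ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    return np.cov(zx, zy, ddof=1)[0, 1] / zx.var(ddof=1)

# Simulated stand-ins for individual trust scores and the national
# average index matched to each participant's country (assumed data,
# built so the true standardised slope is below the noise level).
rng = np.random.default_rng(0)
national = rng.normal(size=500)
individual = 0.4 * national + rng.normal(size=500)

print(round(standardised_beta(national, individual), 2))
```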

Does trust in science predict unique variance in adherence behaviour?

Figure 1 shows coefficients from four separate Bayesian linear models where adherence was regressed on trust in science, or on trust in science and various combinations of approval and social conformity. Standardised regression coefficients are reported with 95% credibility intervals (CIs), as well as Bayes factors (BFs) where we want to assess the evidence in favour of there being no relationship. These models included country as a random effect (see https://osf.io/s5mdh/ for random effects structures, model priors, calculation of Bayes Factors, and control variables age, gender and education).

Fig. 1: Standardised effects (linear regression betas) with 95% credible intervals (CIs).
figure 1

These show the effects of trust in science, individual approval, and social conformity on adherence behaviour, according to which predictors were included in each model.

The effect of trust in science on adherence behaviour varied depending on which covariates were included. When trust in science was the only predictor, it predicted adherence (β = 0.08[0.06, 0.11]). When social conformity was included, the effect of trust in science was reduced (β = 0.06[0.03, 0.09]). When approval of COVID-19 measures was included, the effect of trust in science dropped out completely (with just approval as covariate, trust in science β = 0.02[−0.01, 0.04], BF01 = 34; with approval and social conformity as covariates, trust in science β = 0[−0.03, 0.02], BF01 = 70.6).
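This attenuation pattern can be illustrated with simulated data in which trust affects adherence only through approval; the effect sizes below are assumptions for illustration, not the paper’s estimates:

```python
import numpy as np

def betas(X, y):
    """OLS coefficients (no intercept; simulated predictors are centred)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 4000
trust = rng.normal(size=n)
# Adherence depends on approval only; trust acts via approval.
# Effect sizes are illustrative assumptions, not the paper's estimates.
approval = 0.25 * trust + rng.normal(size=n)
adherence = 0.30 * approval + rng.normal(size=n)

b_trust_alone = betas(trust[:, None], adherence)[0]
b_trust_with_approval = betas(np.column_stack([trust, approval]), adherence)[0]

# Trust predicts adherence on its own, but its coefficient collapses
# towards zero once the mediator (approval) enters the model.
print(round(b_trust_alone, 3), round(b_trust_with_approval, 3))
```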

At best, trust in science had a small role in predicting adherence. At worst, it had no effect whatsoever. Considering direct predictors of adherence, then, it is inadvisable to place too much weight on people’s trust in science, independently of these other critical factors.

Does trust in science predict approval of the rules over adherence to the rules?

A second aim was to see whether trust in science predicts approval of the rules, adherence to the rules, or both. In particular, we argued that, if trust in science is to play a robust role in the pandemic, it should affect behaviour by first changing minds. This aim can be addressed with a path analysis, comprising simultaneous Bayesian linear regressions.

The model pathways are illustrated in Fig. 2a. In the Supplementary Material we justify the inclusion of each pathway, but briefly: in addition to the critical pathways connecting trust in science, approval and adherence, the model included a pathway from social conformity to adherence (Bicchieri et al., 2021; Moehring et al., 2021). Furthermore, as previous research has shown that political ideology predicts trust in science (Gauchat, 2012; Rutjens et al., 2018b), approval (Collins et al., 2021, their ‘support for restrictions’ variable), and adherence (Pennycook et al., 2020), and that trust in science may mediate the latter relationship (Plohl and Musil, 2021), pathways for these relationships were included. All pathways include random intercepts and slopes for country. See https://osf.io/s5mdh/ for further details, including demographic control variables (age, gender and education). Figure 2b plots standardised regression coefficients and CIs for the fixed effects from the simultaneous Bayesian regressions. The model R2 for adherence was 0.31 [0.29, 0.33].

Fig. 2: Pathways and posterior samples for path analysis.
figure 2

See Table S1 for justifications for each pathway. a Model pathway standardised coefficients, including 95% CIs for the direct and total effects of science and conservative ideology. b Posterior samples for model fixed effects, with whiskers showing 89% (thick) and 95% (thin) CIs.

In line with previous research, a more conservative ideology predicted lower trust in science (β = −0.23 [−0.29, −0.17]). There was no direct effect of trust in science on adherence (β = 0 [−0.06, 0.07], BF01 = 31.22). However, trust in science predicted approval (β = 0.25 [0.19, 0.32]), and had an indirect association with adherence via approval (β = 0.08 [0.06, 0.11]). Thus, trust in science had a moderate effect on whether people think they should adhere, but only a small, indirect effect on adherence behaviour.

Conservative ideology had no direct effect on approval (β = 0.01[−0.04, 0.06], BF01 = 38.48), though it had an indirect association with approval via trust in science (β = −0.06 [−0.08, −0.04]). Conservative ideology had no direct effect on adherence (β = −0.04[−0.09, 0.01], BF01 = 12.77), but had an indirect effect via the science—approval pathway (β = −0.02[−0.03, −0.01]), which contributed to a total effect (β = −0.05[−0.11, −0.01]).
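The indirect effects above are products of standardised path coefficients. A quick arithmetic check, using the reported point estimates (the approval → adherence path, ~0.33, is inferred here from the reported products rather than quoted directly):

```python
# Indirect effects as products of standardised path coefficients.
# The first two values are point estimates reported in the text; the
# approval -> adherence path is an assumption consistent with them.
ideology_to_trust = -0.23
trust_to_approval = 0.25
approval_to_adherence = 0.33  # assumed, inferred from the products

# Ideology -> trust -> approval
print(round(ideology_to_trust * trust_to_approval, 2))  # -0.06
# Trust -> approval -> adherence
print(round(trust_to_approval * approval_to_adherence, 2))  # 0.08
# Ideology -> trust -> approval -> adherence
print(round(ideology_to_trust * trust_to_approval * approval_to_adherence, 2))  # -0.02
```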

How do the key relationships vary across countries?

As the strength of the effects of ideology and trust vary across countries (Czarnek et al., 2021; Pennycook et al., 2020; Siegrist, 2021), the model represented in Fig. 2a included by-country random slopes. The variation in these relationships can be explored using the posterior samples for the random slopes (here, showing the top 10 participating countries by sample size). Figure 3 plots these posterior samples for the pathways leading to and from trust in science (for the other pathways, see https://osf.io/s5mdh/).

Fig. 3: Posterior samples for random slopes for the top 10 countries by sample size.
figure 3

a The negative effect of conservative ideology on trust in science and b the positive effect of trust in science on individual approval. Fixed effects shown with dashed blue lines and 0 shown with dotted red lines. AUS: Australia; BGD: Bangladesh; DEU: Germany; FRA: France; GBR: United Kingdom; ITA: Italy; PER: Peru; SWE: Sweden; TUR: Turkey; USA: United States of America.

Despite some between-country variation, the effects of conservative ideology on trust in science (Fig. 3a) and of science on approval (Fig. 3b) were consistently in the same direction (relative to 0, shown with a dotted red line).

However, compared to the population-level effects, in the USA conservative ideology was more negatively linked to trust in science (consistent with previous findings; Pennycook et al., 2020), and trust in science was more positively linked to people’s approval of COVID-19 measures. Italy showed a similar, though weaker, pattern to the USA, whereas other countries were less consistent. For instance, Turkey showed a fairly typical relationship between ideology and trust in science, but a weak relationship between trust in science and approval.

Supplementary analyses

We check that our findings do not depend on narrow assumptions with a range of alternative analyses. The full analyses are at https://osf.io/s5mdh/ and we also briefly summarise these analyses in the Supplementary Material. In particular, we discuss:

  1. the reasons for our model pathways based on current literature (Figs. S1 and S2; Table S1);

  2. alternative pathways (e.g., where social conformity is not just a covariate, independent of the other predictors; Figs. S3–S6);

  3. alternative regression families instead of Gaussian regression (e.g., generalised linear regression with a zero-one-inflated beta family; Fig. S7);

  4. imputed missing data (Fig. S8);

  5. controlling for published national norms, such as national levels of trust in science and the stringency of the prevention measures in each participant’s country of residence at the time of their participation (Fig. S9, using norms from Borgonovi and Pokropek, 2020; Hale et al., 2020);

  6. simulation of potential unmeasured confounds (Fig. S10);

  7. measurement error (Fig. S11).

Our claims about the role of trust in science are robust against all of these alternative analysis strategies. The only conclusion which changes slightly is that there is sometimes evidence for a direct effect of ideology on adherence, depending on such modelling decisions. However, as our focus here is on trust in science rather than ideology, we simply conclude that there might be a direct effect of ideology on adherence, and that future work should explore this possibility.

In the “Methods” section, we mentioned several caveats about our measure of trust in science: it could reflect broader attitudes to science rather than trust specifically; we used three items from a six-item scale; and all items were negatively valenced, potentially indexing distrust rather than trust (Footnote 2).

To assess whether these caveats affect our conclusions, we conducted a follow-up study where we recruited 1002 participants from Amazon’s Mechanical Turk and presented them with the above three items, as well as an item explicitly asking about trust in science (either positive “I trust science” or negative “I don’t trust science”, on the same response scale, with a virtual coin-flip determining which one of these two options each participant saw). At the same time, we re-analysed existing data sets (Sulik and McKay, 2021; Sulik et al., 2020) that include all six items along with variables known to correlate with trust in science (e.g., political ideology and science denial).

For details of the follow-up study and analysis, see https://osf.io/s5mdh/. Briefly, though: (1) We found that the above items correlated strongly with the explicit measure of trust in science (r = 0.76, p < 0.001). (2) We generated all possible combinations of three items from the six-item scale, and correlated each combination with variables known to be associated with trust in science (ideology and science denial). The correlations were very consistent across possible combinations, so our choice of these three items is unlikely to bias our results substantially. (3) We also generated alternative scores of trust in science. These were weighted averages (whereas the Results above report unweighted averages) where the weights were either factor loadings from an exploratory factor analysis, or regression coefficients from when the items in the follow-up study are used to predict either the positive or negative item about explicit trust. Our conclusions are robust against these different scoring methods. We found a difference between the positive and negative items, for which one interpretation is that our measure might more accurately be described as ‘distrust’ rather than ‘trust’. Crucially, though, we also found that this makes no difference to our conclusions above. Thus, the distinction between trust and distrust, though theoretically important, does not alter our conclusions.
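The unweighted and weighted scoring schemes can be sketched in a few lines; the weights below are hypothetical stand-ins for factor loadings or regression coefficients:

```python
def score(items, weights=None):
    """Trust score as an unweighted mean (as in the Results) or as a
    weighted mean, where weights could be factor loadings or
    regression coefficients from the follow-up study."""
    if weights is None:
        weights = [1.0] * len(items)
    total = sum(w * x for w, x in zip(weights, items))
    return total / sum(weights)

reversed_items = [90, 80, 85]  # reverse-scored responses (hypothetical)
print(score(reversed_items))                             # 85.0
print(round(score(reversed_items, [0.8, 0.6, 0.7]), 1))  # hypothetical loadings: 85.5
```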

Discussion

This study helps tackle the question of what difference trust in science could make when it comes to the adoption of new norms, such as those required by global threats. The results show that, in the face of the COVID-19 pandemic, trust in science had only a small and indirect effect on whether people reported following distancing guidelines. Greater trust in science is unlikely to have yielded a major increase in adherence. To illustrate, suppose that a wildly successful messaging campaign led to a 20% increase in trust in science. Multiplying this by the total effect in Fig. 2a would yield only around a 2% increase in adherence.
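This back-of-envelope arithmetic can be made explicit; the total effect of ~0.08 is the value reported in the Results, and the 20% campaign effect is the hypothetical from the text:

```python
# Back-of-envelope translation of a trust increase into adherence,
# treating the total standardised effect (~0.08, reported above) as a
# rough slope on the percentage scales used in the survey.
total_effect = 0.08   # trust -> adherence (total, i.e. via approval)
trust_increase = 20   # hypothetical campaign-driven increase in trust

adherence_increase = total_effect * trust_increase
print(round(adherence_increase, 1))  # 1.6, i.e. roughly a 2% increase
```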

Trust in science could nonetheless be credited for changing minds, if not directly affecting behaviour, in the sense that it was moderately associated with approval of new social distancing rules. One important implication is that the role of trust in science in the pandemic is unlike those of propaganda or threat, which focus on compelling behaviour (Mercier, 2017). This coheres with recent findings that trust in science is associated with support for pandemic measures (Algan et al., 2021; Dohle et al., 2020). However, it goes beyond such studies (which report an association between trust in science and adherence to pandemic measures) in showing that this association drops out once approval is accounted for, because approval is itself associated with adherence.

The role of approval here is consistent with meta-analyses showing that in many areas—from adopting climate-friendly behaviours to sunscreen use, to exercise, healthy eating or condom use—having a positive attitude towards the behaviour in question and intending to do it is significantly predictive of how people actually behave (Chevance et al., 2019; Cologna and Siegrist, 2020; McDermott et al., 2016; Webb and Sheeran, 2006).

Attitudes toward science are part of a complex belief system. Extending previous research on associations between science and political ideology (Gauchat, 2012; Rutjens et al., 2018a), our results show that trust in science is a linchpin linking political ideology to approval of science-based guidelines: once the role of trust in science was accounted for, ideology did not have a direct effect on approval of the rules, and its effect on adherence to the rules was small and fragile (e.g., depending on the modelling decisions discussed in the Supplementary material). Previous research on climate change denial has shown that pro-science recommendations are more effective when they appeal to people’s values and when they are consistent with their ideology (DeMora et al., 2021; Dixon et al., 2017; Hornsey and Fielding, 2017; Wolsko et al., 2016).

Based on these findings of a moderate and indirect effect of trust in science on behaviour, and of associations between trust and ideology, we propose a ‘Bridge Model’ of science for enacting behavioural change. According to this model, trust in science affects behaviours (e.g., adherence to COVID-19 rules) by improving people’s attitudes (here, approval) towards the behaviour in question. In turn, trust in science serves as a bridge between political ideology and these pandemic-relevant attitudes and behaviours. This model contrasts with the widespread assumption in the existing literature that trust in science is important because it has a direct effect on behaviour change.

Trust in science generates other epistemic benefits, too: it makes people less susceptible to misinformation (Roozenbeek et al., 2020) and influences the formation of opinion-networks (Maher et al., 2020). It is a relatively stable trait (Agley, 2020), and is resistant to erosion from ideological opponents (Kreps and Kriner, 2020). In that sense, these findings may be helpful for policy-based interventions as they suggest that trust in science could serve as a ‘boost’ for behavioural change. Unlike ‘nudges’ that focus on behaviour and are usually easily reversible, ‘boosts’ focus on people’s decision-making processes and can thereby achieve sustained behavioural change (Hertwig and Grüne-Yanoff, 2017).

Notes on generalisability

Our study only considered social distancing, which was the dominant concern at the time of data collection, but which also required an abrupt change of behaviour and social norms. Future research should examine whether the link between changes in approval and changes in behaviour will generalise to other COVID-19 measures such as mask wearing and vaccination uptake, or generalise beyond the pandemic context to other cases where behaviour change is necessary, such as climate change. Vaccination has been a major theme of more recent stages of the pandemic, and higher vaccination rates are associated with higher trust in science (Hromatko et al., 2021; Lindholt et al., 2021; Soveri et al., 2021; Sturgis et al., 2021). The relationships between trust and vaccination intentions reported in these studies (e.g., r = 0.58 in Soveri et al., 2021; r = 0.37 in Hromatko et al., 2021) are larger than the association between trust in science and adherence to social distancing measures reported here. This leaves open the possibility that trust in science may matter more for vaccines than it seems to matter for social distancing. Nonetheless, as we have shown that such pairwise relationships are not enough to identify how trust in science matters for behaviour, we recommend that future work apply a framework such as our proposed Bridge Model to better understand that role when it comes to vaccines.

Political ideology is an established correlate of trust in science (Gauchat, 2012; Rutjens et al., 2018a). Here we measured political ideology using a common liberal-to-conservative response scale. However, recent research has shown that other aspects or facets of political ideology might matter more than this general spectrum for science-related attitudes. These include populism (Jylhä and Hellmer, 2020; Mede and Schäfer, 2020), reactance (Hornsey et al., 2018a) and social dominance orientation (Häkkinen and Akrami, 2014; Jylhä et al., 2016; Kerr and Wilson, 2021). As several hundred participants chose to opt out of our liberal-to-conservative item, a question for future research is whether other, more nuanced conceptions of ideology might increase response rates, or alter our conception of how ideology, trust in science, attitudes to policy and adherence to prosocial measures are related.

One of our research questions aimed to examine how general the patterns in the data would be across countries. Our findings indicate that relationships between ideology, trust in science and approval of pandemic measures followed the same pattern in the top 10 countries in our dataset. Still, considerable variation was observed among countries, with the USA appearing to be an outlier in both relations. On the one hand, these findings support previous work showing that conservative ideology is linked to less trust in science in predominantly Western countries (Gauchat, 2012; Pennycook et al., 2020). On the other hand, the high variability of responses provides strong reason to examine the links of trust in science with individual behaviour in diverse populations. Another important question for future research is how cross-country differences in political culture and ideology (i.e., going beyond the liberal/conservative distinction as discussed above) might affect these findings.

A limitation of our study, though not unique to it, is that our social-media recruitment process did not produce a representative sample. Specifically, there was a high proportion of educated women (see ‘Descriptive overview’ in Results). However, all analyses included demographic variables (such as age, gender and education) as covariates, and included country as a random effect to account for the imbalances in our sample distributions. An important indication that our recruitment procedure has not seriously biased results is that the levels of the main phenomenon of interest—trust in science—are strikingly similar to levels reported in previous studies. The average level of trust in science reported here—measured on a percentage scale with three items—was 75.6% (SD = 20%). This compares with levels previously reported during the pandemic, such as 82% (4.12 on a 5-point scale, using 14 items, with a sample recruited via social media, Plohl and Musil, 2021), 77% (5.39 on a 7-point scale, using just two items drawn from the same instrument used here, with a representative sample of New Zealanders, Sibley et al., 2020), or 76% (3.81 on a 5-point scale, using 21 items, with a sample of US residents recruited via Amazon’s Mechanical Turk, Agley, 2020). As these studies varied in the number of items (ranging from 2 to 21), as well as in their recruitment strategy and representativeness, this suggests that measurement of trust in science is somewhat robust to such methodological differences. Further, our finding that these relationships are unusually strong in the USA is consistent with previous work (Allum et al., 2008; Hornsey et al., 2018b).
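A note on the scale conversions quoted in this paragraph: the comparison percentages are consistent with expressing each study's mean as a simple proportion of its scale maximum. That convention is an assumption here (the cited papers report raw means), and it ignores that such scales typically start at 1 rather than 0; it is used below only because it reproduces the quoted figures.

```python
# Converting Likert-scale means to the percentage metric used in the text,
# by expressing each mean as a proportion of its scale maximum (assumed
# convention; see the note above).
def to_percent(mean: float, scale_max: float) -> int:
    return round(100 * mean / scale_max)

# The three comparison studies quoted above:
print(to_percent(4.12, 5))  # Plohl and Musil (2021): 82
print(to_percent(5.39, 7))  # Sibley et al. (2020): 77
print(to_percent(3.81, 5))  # Agley (2020): 76
```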

Another limitation is that we measured people’s self-reported adherence rather than actual social distancing behaviour. However, the same patterns can be observed in social distancing whether measured via self-report or via mobile-phone movement tracking (Petherick et al., 2021), and a recent survey showed that responses regarding COVID-19 compliance do not suffer from social-desirability effects (Larsen et al., 2020). Given that our results cohere with numerous findings about how trust relates to ideology, approval and adherence, distortions due to self-report are unlikely to be wholly responsible for our findings.

The anonymous, online, cross-sectional nature of our survey, where participants self-selected into the sample, might also conceivably limit the generalisability of our findings. As we only saved responses at the end of the survey (and only used complete responses in our analysis), we do not know how the attitudes of those who chose to quit the survey before finishing might have differed from our reported findings. The same goes for people who clicked on the link to our survey, but decided not to take part. Future work might also consider any effects of motivation: not only whether any participants who complete the survey may nonetheless lack the motivation to provide honest, sincere responses, but also whether such a tendency is associated with any of the factors analysed here. The cross-sectional design also limits our ability to draw causal inferences.

Finally, as noted briefly in the Methods (and in more detail in the supplementary analyses), our measure of ‘trust in science’ might be called a measure of ‘distrust in science’, ‘credibility’ or ‘negative attitudes towards science’. However, the relationships between trust, trustworthiness and credibility are not yet agreed theoretically: some researchers view trust as one aspect of credibility (Hartman et al., 2017); others seem to treat credibility and trustworthiness as aspects of trust (Nadelson et al., 2014); and still others see trustworthiness and credibility as related but distinct (Hendriks et al., 2016). The precise nature of these interrelationships is a crucial avenue for future work. Based on our findings, we suggest that such work would benefit from studying trust, trustworthiness or credibility in the context of the ‘Bridge Model’ proposed here.

Conclusions

We probe the mechanisms and limits of trust in science for achieving behavioural change during the current crisis, with implications for the handling of future crises. Trust in science can promote approval of new rules, but has only a small, indirect effect on adherence to them. Science performs best not at changing behaviour, but at convincing minds. We also show that trust in science acts as a pivotal link between political ideology and attitudes to science-based measures. This bridging role makes it a vital component in depolarising political and public debates when social changes are required.

Data availability

All study data are available at https://osf.io/s5mdh/, along with the full analysis scripts.

Notes

  1. We thank an anonymous reviewer for these suggestions.

  2. We thank an anonymous reviewer for raising these points.

References

  • Agley J (2020) Assessing changes in US public trust in science amid the COVID-19 pandemic. Public Health 25:122–125

  • Ahteensuu M (2012) Assumptions of the deficit model type of thinking: Ignorance, attitudes, and science communication in the debate on genetic engineering in agriculture. J Agric Environ Ethics 25(3):295–313

  • Algan Y, Cohen D, Davoine E, Foucault M, Stantcheva S (2021) Trust in scientists in times of pandemic: panel evidence from 12 countries. Proc Natl Acad Sci USA 118(40):e2108576118

  • Allum N, Sturgis P, Tabourazi D, Brunton-Smith I (2008) Science knowledge and attitudes across cultures: a meta-analysis. Public Underst Sci 17(1):35–54

  • Betsch C, Korn L, Sprengholz P, Felgendreff L, Eitze S, Schmid P, Böhm R (2020) Social and behavioral consequences of mask policies during the COVID-19 pandemic. Proc Natl Acad Sci 117(36):21851–21853

  • Bicchieri C (2016) Norms in the wild: how to diagnose, measure, and change social norms. Oxford University Press.

  • Bicchieri C, Fatas E, Aldama A, Casas A, Deshpande I, Lauro M, Parilli C, Spohn M, Pereira P, Wen R (2021) In science we (should) trust: expectations and compliance across nine countries during the COVID-19 pandemic. PLoS ONE 16(6):e0252892

  • Borgonovi F, Pokropek A (2020) Can we rely on trust in science to beat the COVID-19 pandemic? https://doi.org/10.31234/osf.io/yq287

  • Brossard D, Lewenstein BV (2009) A critical appraisal of models of public understanding of science. In: Kahlor L, Stout P (eds) Understanding and communicating science: new agendas in communication. Routledge, New York, pp. 11–39

  • Chevallier C, Hacquin AS, Mercier H (2021) COVID-19 vaccine hesitancy: shortening the last mile. Trends Cogn Sci 25(5):331–333

  • Chevance G, Bernard P, Chamberland PE, Rebar A (2019) The association between implicit attitudes toward physical activity and physical activity behaviour: a systematic review and correlational meta-analysis. Health Psychol Rev 13(3):248–276

  • Cialdini RB, Goldstein NJ (2004) Social influence: compliance and conformity. Annu Rev Psychol 55:591–621

  • Collins RN, Mandel DR, Schywiola SS (2021) Political identity over personal impact: Early US reactions to the COVID-19 pandemic. Front Psychol 12:607639

  • Cologna V, Siegrist M (2020) The role of trust for climate change mitigation and adaptation behaviour: a meta-analysis. J Environ Psychol 69:101428

  • Czarnek G, Kossowskam M, Szwed P (2021) Right-wing ideology reduces the effects of education on climate change beliefs in more developed countries. Nat Clim Change 11(1):9–13

  • De Leeuw JR (2015) jsPsych: a JavaScript library for creating behavioral experiments in a Web browser. Behav Res Methods 47(1):1–12

  • DeMora SL, Merolla JL, Newman B, Zechmeister EJ (2021) Reducing mask resistance among White evangelical Christians with value-consistent messages. Proc Natl Acad Sci USA 118(21):e2101723118

  • Diehl T, Huber B, Gil de Zúñiga H, Liu J (2021) Social media and beliefs about climate change: a cross-national analysis of news use, political ideology, and trust in science. Int J Public Opin Res 33(2):197–213

  • Dixon G, Hmielowski J, Ma Y (2017) Improving climate change acceptance among US conservatives through value-based message targeting. Sci Commun 39(4):520–534

  • Dohle S, Wingen T, Schreiber M (2020) Acceptance and adoption of protective measures during the COVID-19 pandemic: the role of trust in politics and trust in science. Soc Psychol Bull 15(4):1–23

  • Dunbar RI, Spoors M (1995) Social networks, support cliques, and kinship. Hum Nat 6(3):273–290

  • Engels A, Hüther O, Schäfer M, Held H (2013) Public climate-change skepticism, energy preferences and political participation. Global Environ Change 23(5):1018–1027

  • Fauci A [@docfauci] (2020) Biden’s real COVID-19 challenge is restoring a nation’s trust in science. [Instagram photo]. https://www.instagram.com/p/CHWkV6YJs59/c/17958469465368675/.

  • Gauchat G (2012) Politicization of science in the public sphere: a study of public trust in the United States, 1974 to 2010. Am Sociol Rev 77(2):167–187

  • Gregory J, Lock SJ (2008) The evolution of ‘public understanding of science’: public engagement as a tool of science policy in the UK. Sociol Compass 2(4):1252–1265

  • Häkkinen K, Akrami N (2014) Ideology and climate change denial. Personal Individ Differ 70:62–65

  • Hale T, Webster S, Petherick A, Phillips T, Kira B (2020) Oxford COVID-19 government response tracker (OxCGRT). https://github.com/OxCGRT/covid-policy-tracker.

  • Hartman RO, Dieckmann NF, Sprenger AM, Stastny BJ, DeMarree KG (2017) Modeling attitudes toward science: development and validation of the credibility of science scale. Basic Appl Soc Psychol 39(6):358–371

  • Hendriks F, Kienhues D, Bromme R (2015) Measuring laypeople’s trust in experts in a digital age: The Muenster Epistemic Trustworthiness Inventory (METI). PLoS ONE 10(10):e0139309

  • Hendriks F, Kienhues D, Bromme R (2016) Trust in science and the science of trust. In: Blöbaum B (ed) Trust and communication in a digitized world. Springer, pp. 143–159

  • Hertwig R, Grüne-Yanoff T (2017) Nudging and boosting: Steering or empowering good decisions. Perspect Psychol Sci 12(6):973–986

  • Hornsey MJ, Fielding KS (2017) Attitude roots and Jiu Jitsu persuasion: understanding and overcoming the motivated rejection of science. Am Psychol 72(5):459

  • Hornsey MJ, Harris EA, Fielding KS (2018a) The psychological roots of anti-vaccination attitudes: a 24-nation investigation. Health Psychol 37(4):307

  • Hornsey MJ, Harris EA, Fielding KS (2018b) Relationships among conspiratorial beliefs, conservatism and climate scepticism across nations. Nat Climate Change 8(7):614–620

  • Hromatko I, Tonković M, Vranic A (2021) Trust in science, perceived vulnerability to disease, and adherence to pharmacological and non-pharmacological COVID-19 recommendations. Front Psychol 12:1425

  • Irzik G, Kurtulmus F (2019) What is epistemic public trust in science? The Br J Philos Sci 70(4):1145–1166

  • Jylhä KM, Cantal C, Akrami N, Milfont TL (2016) Denial of anthropogenic climate change: Social dominance orientation helps explain the conservative male effect in Brazil and Sweden. Personal Individ Differ 98:184–187

  • Jylhä KM, Hellmer K (2020) Right-wing populism and climate change denial: the roles of exclusionary and anti-egalitarian preferences, conservative ideology, and antiestablishment attitudes. Anal Soc Issues Public Policy 20(1):315–335

  • Kerr JR, Wilson MS (2021) Right-wing authoritarianism and social dominance orientation predict rejection of science and scientists. Group Process Intergroup Relat 24(4):550–567

  • Koetke J, Schumann K, Porter T (2021) Trust in science increases conservative support for social distancing. Group Process Intergr Relat 24(4):680–697

  • Kreps S, Kriner D (2020) Model uncertainty, political contestation, and public trust in science: evidence from the COVID-19 pandemic. Sci Adv 6(43):eabd4563

  • Larsen M, Nyrup J, Petersen MB (2020) Do survey estimates of the public’s compliance with COVID-19 regulations suffer from social desirability bias? J Behav Public Adm 3(2). https://doi.org/10.30636/jbpa.32.164

  • Larson HJ, Clarke RM, Jarrett C, Eckersberger E, Levine Z, Schulz WS, Paterson P (2018) Measuring trust in vaccination: a systematic review. Hum Vaccines Immunother 14(7):1599–1609

  • Leshner AI (2021) Trust in science is not the problem. Issues Sci Technol 37(3):16–18

  • Lewicki RJ, McAllister DJ, Bies RJ (1998) Trust and distrust: new relationships and realities. Acad Manag Rev 23(3):438–458

  • Lindholt MF, Jørgensen F, Bor A, Petersen MB (2021) Public acceptance of COVID-19 vaccines: cross-national evidence on levels and individual-level predictors using observational data. BMJ Open 11(6):e048172

  • Maher PJ, MacCarron P, Quayle M (2020) Mapping public health responses with attitude networks: the emergence of opinion-based groups in the UK’s early COVID-19 response phase. Br J Soc Psychol 59(3):641–652

  • McDermott MS, Oliver M, Iverson D, Sharma R (2016) Effective techniques for changing physical activity and healthy eating intentions and behaviour: a systematic review and meta-analysis. Br J Health Psychol 21(4):827–841

  • Mede NG, Schäfer MS (2020) Science-related populism: conceptualizing populist demands toward science. Public Underst Sci 29(5):473–491

  • Mercier H (2017) How gullible are we? A review of the evidence from psychology and social science. Rev General Psychol 21(2):103–122

  • Moehring A, Collis A, Garimella K, Rahimian MA, Aral S, Eckles D (2021) Surfacing norms to increase vaccine acceptance. Available at SSRN 3782082

  • Mohammed A, Johnston RM, van der Linden C (2020) Public responses to policy reversals: the case of mask usage in Canada during COVID-19. Can Public Policy 46(S2):S119–S126

  • Nadelson L, Jorcyk C, Yang D, Jarratt Smith M, Matson S, Cornell K, Husting V (2014) I just don’t trust them: the development and validation of an assessment instrument to measure trust in science and scientists. School Sci Math 114(2):76–86

  • Oreskes N (2021) Why trust science? Princeton University Press, Princeton.

  • Pagliaro S, Sacchi S, Pacilli MG, Brambilla M, Lionetti F, Bettache K, Bianchi M, Biella M, Bonnot V, Boza M, Butera F, Ceylan-Batur S, Chong K, Chopova T, Crimston CR, Álvarez B, Cuadrado I, Ellemers N, Formanowicz M, Graupmann V, Gkinopoulos T, Jeong EHK, Jasinskaja-Lahti I, Jetten J, Bin KM, Mao Y, McCoy C, Mehnaz F, Minescu A, Sirlopú D, Simić A, Travaglino G, Uskul AK, Zanetti C, Zinn A, Zubieta E (2021) Trust predicts COVID-19 prescribed and discretionary behavioral intentions in 23 countries. PLoS ONE 16(3):e0248334

  • Parikh S (2021) Why we must rebuild trust in science. Trend Mag Winter:8–12 https://www.pewtrusts.org/en/trend.

  • Pechar E, Bernauer T, Mayer F (2018) Beyond political ideology: the impact of attitudes towards government and corporations on trust in science. Sci Commun 40(3):291–313

  • Pennycook G, McPhetres J, Bago B, Rand D (2020) Predictors of attitudes and misperceptions about COVID-19 in Canada, the UK, and the USA. https://doi.org/10.31234/osf.io/zhjkp.

  • Petersen MB, Bor A, Jørgensen FJ, Lindholt MF (2021) Transparent communication about COVID-19 vaccines is not sufficient for acceptance but it is necessary for trust. Proc Natl Acad Sci USA 118(29):e2024597118 https://doi.org/10.1073/pnas.2024597118.

  • Petherick A, Goldszmidt R, Andrade EB, Furst R, Hale T, Pott A, Wood A (2021) A worldwide assessment of changes in adherence to COVID-19 protective behaviours and hypothesized pandemic fatigue. Nat Hum Behav 5(9):1145–1160

  • Plohl N, Musil B (2021) Modeling compliance with COVID-19 prevention guidelines: the critical role of trust in science. Psychol Health Med 26(1):1–12

  • Revelle W, Condon DM (2019) Reliability from alpha to omega: a tutorial. Psychol Assess 31(12):1395–1411

  • Roozenbeek J, Schneider CR, Dryhurst S, Kerr J, Freeman ALJ, Recchia G, van der Bles AM, van der Linden S (2020) Susceptibility to misinformation about COVID-19 around the world. R Soc Open Sci 7:201199

  • Rothmund T, Farkhari F, Azevedo F, Ziemer CT (2020) Scientific trust, risk assessment, and conspiracy beliefs about COVID-19—four patterns of consensus and disagreement between scientific experts and the German public. https://doi.org/10.31234/osf.io/4nzuy.

  • Rutjens BT, Heine SJ, Sutton RM, van Harreveld F (2018a) Attitudes towards science. Adv Exp Soc Psychol 57:125–165

  • Rutjens BT, Sutton RM, van der Lee R (2018b) Not all skepticism is equal: exploring the ideological antecedents of science acceptance and rejection. Personal Soc Psychol Bull 44(3):384–405

  • Sailer M, Stadler M, Botes E, Fischer F, Greiff S (2021) Science knowledge and trust in medicine affect individuals’ behavior in pandemic crises. Eur J Psychol Educ 1–14.

  • Sibley CG, Greaves LM, Satherley N, Wilson MS, Overall NC, Lee CH, Milojev P, Bulbulia J, Osborne D, Milfont TL et al. (2020) Effects of the COVID-19 pandemic and nationwide lockdown on trust, attitudes toward government, and well-being. Am Psychol 75(5):618–630

  • Siegrist M (2021) Trust and risk perception: a critical review of the literature. Risk Anal 41(3):480–490

  • Simis MJ, Madden H, Cacciatore MA, Yeo SK (2016) The lure of rationality: why does the deficit model persist in science communication? Public Underst Sci 25(4):400–414

  • Simonov A, Sacher SK, Dubé JPH, Biswas S (2020) The persuasive effect of fox news: non-compliance with social distancing during the COVID-19 pandemic. Working Paper 27237. National Bureau of Economic Research. http://www.nber.org/papers/w27237.

  • Soveri A, Karlsson LC, Antfolk J, Lindfelt M, Lewandowsky S (2021) Unwillingness to engage in behaviors that protect against COVID-19: the role of conspiracy beliefs, trust, and endorsement of complementary and alternative medicine. BMC Public Health 21(1):1–12

  • Stosic MD, Helwig S, Ruben MA (2021) Greater belief in science predicts mask-wearing behavior during COVID-19. Person Individ Differ 176:110769

  • Sturgis P, Allum N (2004) Science in society: re-evaluating the deficit model of public attitudes. Public Underst Sci 13(1):55–74

  • Sturgis P, Brunton-Smith I, Jackson J (2021) Trust in science, social consensus and vaccine confidence. Nat Hum Behav 5:1528–1534

  • Sulik J, McKay R (2021) Studying science denial with a complex problem-solving task. In: Proceedings of the 43rd annual conference of the Cognitive Science Society. Cognitive Science Society

  • Sulik J, Ross R, McKay R (2020) The contingency illusion bias as a potential driver of science denial. In: Denison S, Mack M, Xu Y, Armstrong BC (eds) Proceedings of the 42nd annual conference of the Cognitive Science Society. Cognitive Science Society, pp. 829–835

  • Tranter B, Booth K (2015) Scepticism in a changing climate: a cross-national study. Global Environ Change 33:154–164

  • Tuncgenc B, El Zein M, Sulik J, Newson M, Zhao Y, Dezecache G, Deroy O (2021) We distance most when we believe our social circle does. Br J Psychol 112(3):763–780

  • Webb TL, Sheeran P (2006) Does changing behavioral intentions engender behavior change? A meta-analysis of the experimental evidence. Psychol Bull 132(2):249

  • Wellcome Global Monitor (2018) How does the world feel about science and health. https://wellcome.org/reports/wellcome-global-monitor/2018

  • Wissenschaft im Dialog (2020) Science Barometer 2020. https://www.wissenschaft-im-dialog.de/en/our-projects/science-barometer/science-barometer-2020/

  • Wolsko C, Ariceaga H, Seiden J (2016) Red, white, and blue enough to be green: effects of moral framing on climate change attitudes and conservation behaviors. J Exp Soc Psychol 65:7–19

Acknowledgements

We thank the many people who helped us translate and disseminate the surveys.

Funding

JS and OD gratefully acknowledge support by the NOMIS Foundation through the project "Diversity in Social Environments". Participant payment for the follow-up study was funded by British Academy/Leverhulme grant SRG1819\190996 to JS. GD is funded by CAP2025 Challenge 4. MEZ is funded by the Wellcome Trust, grant number 204702. MN is funded by a UKRI Future Leaders Fellowship grant (MR/T041099/1). Open Access funding enabled and organized by Projekt DEAL.

Author information

Contributions

JS: conceptualization, methodology, investigation, data curation, analysis, visualisation, software, writing—original draft, writing—review & editing; OD: conceptualization, methodology, investigation, writing—review & editing; funding acquisition; GD: conceptualization, investigation, writing—review & editing; MN: investigation, writing—review & editing; YZ: data curation, analysis, writing—review & editing; MEZ: conceptualization, methodology, investigation, writing—original draft, writing—review & editing; BT: conceptualization, methodology, investigation, data curation, analysis, writing—review & editing, project administration.

Corresponding author

Correspondence to Justin Sulik.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

The study received ethical approval through the University of Nottingham School of Psychology ethics committee (ref: F1248R). The study was conducted in accordance with the principles of the Declaration of Helsinki.

Informed consent

All participants provided written informed consent prior to study commencement and were free to withdraw at any point without any consequences.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Sulik, J., Deroy, O., Dezecache, G. et al. Facing the pandemic with trust in science. Humanit Soc Sci Commun 8, 301 (2021). https://doi.org/10.1057/s41599-021-00982-9
