Introduction

Historically, the confluence of technological advancements and their implications for economic trajectories has remained at the forefront of academic and societal discourse. While many technological advancements have spurred economic growth, the rise of artificial intelligence has particularly driven academic interest in understanding its economic impact. The formal beginning of artificial intelligence as a discipline dates back to the mid-20th century, marked by the Dartmouth conference led by McCarthy in 1956. This conference marked the first use of the term ‘artificial intelligence,’ though its definition was not universally agreed upon. McCarthy framed artificial intelligence as the endeavor to build intelligent machines, chiefly through computational models of cognition. Today, the combination of vast data resources and advanced computing methods has propelled artificial intelligence to the forefront of technological innovation. This momentum is evident in the growing investments by major economies, which recognize artificial intelligence’s prospective role in driving economic expansion. Additionally, research by Ashraf (2022) and Allal-Chérif (2022) has shown how artificial intelligence is influencing various aspects of society, including religion.

The diverse role of religion in society, outlined by Gowan (2004) and further supported by Basedau et al. (2016), Demiroglu et al. (2017), and Mayoral and Esteban (2019), highlights its significant impact on social structures and socioeconomic patterns. Historically, there has been a deep connection between religious rituals and economic processes, dating back to ancient civilizations that linked spiritual activities with economic production. Contemporary religion, while evolving in its ideologies, continues to exert a subtle yet noticeable influence on economic trends. Modern religious organizations have expanded beyond their traditional roles and now function as significant economic entities. They not only steward sacred rites but also engage in secular commercial activities, exemplified by the merchandising of spiritual artifacts and the growth of faith-based tourism. According to Hodge et al. (2020), Janah et al. (2020), and Steiner et al. (2020), some religious institutions have become powerful economic forces, with key spiritual leaders playing central roles in economic initiatives. The economic imprint of religious entities further extends to creating niche production avenues, driven by their distinct consumer behaviors. Canonical teachings, in many instances, offer economic guidance that shapes communities’ fiscal practices. At times, religious doctrines align with and reinforce state economic strategies. Ultimately, the interplay of religious tenets and economic frameworks remains a testament to their co-evolution and mutual reinforcement in societal progress.

Based on comprehensive analyses, the significant interplay between artificial intelligence and religion in shaping economic trends becomes clear. This study explores these interactions across 26 countries from 2000 to 2021. Using panel vector auto-regression analysis, the study identifies a strong positive correlation between artificial intelligence, religious freedom, and factors of economic growth. Insights from forecast-error variance decomposition highlight the growing importance of these interconnected factors in shaping future economic trends. The research also emphasizes the continued importance of traditional economic factors, mainly labor and capital. The robustness of these relationships is further confirmed by Toda-Yamamoto Granger causality tests. Overall, this study reinforces basic economic principles and uncovers new synergies, providing valuable insights for scholars, policymakers, and economic analysts.

This study offers three key contributions to the existing literature on the intersection of artificial intelligence, religious freedom, and economic growth. First, our research explores the relationship between artificial intelligence, religious freedom, and economic growth. Drawing from an extensive dataset spanning 26 nations over two decades (2000–2021), we identify a salient positive interrelation among these elements. This finding fills a gap in the academic discourse, spotlighting the symbiotic manner in which technological strides and sociocultural paradigms combine to shape the economic trajectories of countries. Second, our methodological approach, combining panel vector auto-regression and forecast-error variance decomposition, adds depth to the economic forecasting literature. This framework not only identifies existing relationships but also predicts future economic trends, underscoring the increasingly pivotal role of artificial intelligence and religious liberties in shaping future economic prospects. Finally, amid a scholarly narrative on economic growth dominated by contemporary determinants, our study highlights the ongoing importance of traditional growth drivers, especially labor and capital. The incorporation of the Toda-Yamamoto Granger causality examination reinforces this standpoint, yielding a comprehensive understanding of both time-honored and nascent stimuli of economic growth. In summary, our findings provide new insights and methodological advancements, enriching academic discussions in the fields of economic science, technology evolution, and sociocultural dynamics.

The remainder of this article is structured as follows: section “Literature review” reviews the extant literature, synthesizing prior academic work in this domain. Section “Variable and PVAR model” describes the research design, detailing the selected variables and the underlying theoretical framework. The section “Results and Discussion” provides a comprehensive analysis of the empirical results. Finally, the section “Conclusions” distills the principal takeaways, offers policy recommendations, and identifies avenues for future research in this field.

Literature review

Scholars have long postulated the intricate interplay between artificial intelligence and economic growth. Early observations by Aghion et al. (2017) posited that artificial intelligence might merely be a new technological tool, akin to past innovations, with short-lived economic implications. However, as argued by Walton and Nayak (2021) and Borrellas and Unceta (2021), artificial intelligence’s adaptive learning capacity marked a departure from previous technologies, implying a more transformative role in economies. Contrarily, Liang et al. (2022) cautioned against over-optimism, suggesting artificial intelligence could displace jobs faster than it creates them. Building on this, Osoba and Welser (2017) and Vermeulen et al. (2017) empirically demonstrated a direct correlation between artificial intelligence advancements and short-term job displacement but also highlighted an eventual shift towards newer job roles. Expanding this perspective, Damioli et al. (2021) and Czarnitzki et al. (2023) underscored the significance of artificial intelligence in boosting productivity, particularly in sectors like manufacturing and finance. Cockburn et al. (2018) and Åström et al. (2022) asserted that artificial intelligence’s real potential lies in its ability to trigger innovation and create new market sectors. However, Makridakis (2017) and Lim (2022) contended that the economic benefits of artificial intelligence could be unevenly distributed, leading to increased economic disparities. A meta-analysis by Berdiyeva et al. (2021) synthesized these viewpoints, suggesting that while artificial intelligence can be a potent catalyst for economic growth, its effects are multifaceted and can vary based on regional, sectoral, and policy contexts. Collectively, the literature underscores the profound yet debatable potential of artificial intelligence in shaping future economic trajectories.

Recent decades have seen an increasing focus on the complex relationship between religious freedom and economic growth, with scholars exploring various aspects of this dynamic. Gill and Owen IV (2017) and Gill (2013) pointed out the potential economic implications of religious liberty, suggesting an intricate relationship. This notion was further expanded by Qayyum et al. (2020), who illustrated religious freedom’s role as a catalyst for prosperity. Cao et al. (2019) pivoted the discourse to trade dynamics, suggesting that religious tolerance could foster international commerce. This trade-centric viewpoint was contrasted by Squicciarini (2020), who took a more macroeconomic stance, emphasizing the broader financial implications of religious liberties. While the majority of the literature has focused on the positive aspects, Prasadh and Thenmozhi (2019) cautioned about the differential impacts in emerging economies. This heterogeneity was further examined by Mavelli (2020), who revealed an unexpected connection between religious freedom and financial markets. Kim et al. (2021), Shahzad et al. (2014), and Gutsche (2019) took a more microeconomic approach, investigating how these liberties influence investment patterns and entrepreneurship. On a continental scale, Oyekan (2023) highlighted Africa’s unique experience, showing both the upsides and downsides of religious diversity for economic growth. Conclusively, Gill’s (2021) comprehensive analysis quantified this relationship, cementing the understanding that religious liberty plays a pivotal role in economic expansion. In summary, while there is a consensus on the significant role of religious liberty in economic growth, the exact nature and extent of this relationship continue to be subjects of debate, highlighting the need for further research.

The contribution of labor and capital inputs to economic growth is a well-explored area in economic studies, with scholars examining various aspects of their impact. Shahbaz et al. (2019) posited that the dynamics of the labor force serve as a foundational driver of economic expansion. Abbas et al. (2020) echoed a parallel sentiment but shifted focus to capital formation, highlighting its indispensable role in bolstering growth. Azenui and Rada (2021) then emphasized the crucial role of labor productivity in shaping macroeconomic outcomes. However, Topcu et al. (2020) brought the discourse back to the significance of capital accumulation for long-term growth trajectories. Ahmed et al. (2020) added complexity by discussing potential constraints on growth due to interplays between labor and capital inputs. This sentiment was expanded by Han and Lee (2020), who underscored the challenge of labor skills mismatches in contemporary growth scenarios. Alshubiri (2022) then discussed the nuances of investment dynamics, emphasizing the non-linearities in growth patterns influenced by capital. Curtin et al. (2019) quantitatively assessed capital investment patterns and their correlation with growth, suggesting a robust relationship. Meanwhile, Liotti (2020) argued for labor flexibility, underscoring its pivotal implications for sustainable economic growth. Lastly, Nordhaus (2021) deliberated on the concept of capital deepening, proposing its central role in shaping future economic prospects. In conclusion, while there is broad agreement on the importance of labor and capital in economic growth, the specifics of their interplay and the constraints they face are still topics of ongoing discussion and research in the field of economics.

Reviewing the studies mentioned earlier reveals a clear story about how artificial intelligence and religious freedom jointly influence economic progress. These studies, known for their methodological rigor and reliance on empirical data, highlight how AI innovations and the extent of religious freedom jointly shape economic paths. The interconnections identified in this research point to a growing need for more in-depth studies. Such future work should aim to further clarify the complex relationships between artificial intelligence and religious freedom and their combined effects on economic development.

Variable and PVAR model

Variable

This study investigates how artificial intelligence and religious freedom interact with traditional economic drivers like labor and capital to influence economic growth. We examine data from 26 countries over the period 2000–2021, focusing on the roles of AI and religious freedom and their impact on economic development. AI progress is measured by the number of AI-related patent filings in each country. Religious freedom is evaluated on a scale from 0 to 1, where a higher score indicates greater religious liberties. We assess economic growth using each country’s gross domestic product. Capital input is analyzed through gross capital formation, and labor input is considered in terms of employment rates. Detailed information on these variables and the data sample is available in Tables 1 and 2.

Table 1 Sample description.
Table 2 Variable description.
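To make the data layout concrete, the panel described above can be sketched as a list of country-year records. This is a minimal illustration with entirely hypothetical country names and values; the actual variables and sample are documented in Tables 1 and 2.

```python
import math

# Hypothetical panel rows: (country, year, AI patents, religious-freedom index,
# GDP, gross capital formation, employment). Values are illustrative only.
panel = [
    ("CountryA", 2000, 120, 0.62, 1.5e12, 3.2e11, 2.1e7),
    ("CountryA", 2001, 145, 0.63, 1.6e12, 3.4e11, 2.2e7),
    ("CountryB", 2000,  40, 0.81, 4.0e11, 9.0e10, 8.5e6),
    ("CountryB", 2001,  55, 0.80, 4.2e11, 9.6e10, 8.6e6),
]

def log_transform(row):
    """Return the log-transformed variables (ai, rf, eg, ci, li) for one record."""
    country, year, ai, rf, eg, ci, li = row
    return (country, year, math.log(ai), math.log(rf), math.log(eg),
            math.log(ci), math.log(li))

logged = [log_transform(r) for r in panel]
```

The logarithmic transformation applied here anticipates the log-linear specification introduced in the model section below.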

PVAR model

Building on the foundational economic theory of the production function, which identifies labor and capital as key drivers of output, this study introduces new variables, specifically artificial intelligence and religious freedom, to extend the traditional framework. We ground our research in Solow’s (1956) neoclassical growth model, recognizing technological progress, akin to the role of artificial intelligence, as a vital component of economic development. Additionally, we draw on Acemoglu and Robinson’s (2012) work on the impact of institutions, such as religious freedom, on economic outcomes. This study aims to update and expand Romer’s (1990) endogenous growth theory by suggesting that both knowledge and societal constructs like religious freedom can internally drive economic growth. By combining these established theories, our research seeks to clarify how emerging factors, such as artificial intelligence and religious freedoms, integrate with and impact traditional economic models to influence economic growth. The adapted production function we propose is outlined as follows:

$$\mathrm{eg}_{i,t} = A\,\mathrm{li}_{i,t}^{a}\,\mathrm{ci}_{i,t}^{b}\,\mathrm{ai}_{i,t}^{c}\,\mathrm{rf}_{i,t}^{d}.$$
(1)

In Eq. (1), i denotes individual countries, while t represents specific years under consideration. A denotes total factor productivity, a measure that captures the efficiency of input-to-output transformation in production processes. The coefficients a, b, c, and d represent the proportionate shares of output attributed to labor (li), capital (ci), artificial intelligence (ai), and religious freedom (rf), respectively. Notably, the condition a + b + c + d = 1 is maintained, which reflects the premise of constant returns to scale in the production function: a proportional increase in all inputs results in an equivalent proportional increase in output, so productive efficiency is consistent across varying scales of operation. Following the methodology of He et al. (2022), we apply a logarithmic transformation to linearize the model and stabilize the variance-covariance matrix. This allows us to reformulate Eq. (1) in an alternative way, as shown below:

$$\log \mathrm{eg}_{i,t} = \log A + a\log \mathrm{li}_{i,t} + b\log \mathrm{ci}_{i,t} + c\log \mathrm{ai}_{i,t} + d\log \mathrm{rf}_{i,t} + \epsilon _{i,t}.$$
(2)
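A quick numeric check, using purely hypothetical factor shares and input levels, illustrates the two properties stated above: with a + b + c + d = 1, output scales one-for-one with the inputs (constant returns to scale), and taking logarithms turns the multiplicative Eq. (1) into the linear form of Eq. (2).

```python
import math

# Hypothetical total factor productivity and factor shares summing to one.
A = 2.0
a, b, c, d = 0.4, 0.3, 0.2, 0.1
li, ci, ai, rf = 5.0, 3.0, 2.0, 0.8   # illustrative input levels

def output(li, ci, ai, rf):
    """Cobb-Douglas form of Eq. (1): eg = A * li^a * ci^b * ai^c * rf^d."""
    return A * li**a * ci**b * ai**c * rf**d

eg = output(li, ci, ai, rf)

# Constant returns: scaling every input by k scales output by exactly k.
k = 1.7
assert math.isclose(output(k*li, k*ci, k*ai, k*rf), k * eg)

# Log-linearization reproduces Eq. (2) without the error term.
log_eg = math.log(A) + a*math.log(li) + b*math.log(ci) + c*math.log(ai) + d*math.log(rf)
assert math.isclose(log_eg, math.log(eg))
```

Both assertions pass only because the shares sum to one; with a + b + c + d ≠ 1 the first check would fail, which is the sense in which the restriction encodes constant returns to scale.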

In our analysis, as detailed in Eq. (2), \(\epsilon _{i,t}\) represents the white noise component. We adopt the panel vector auto-regression framework, as introduced by Love and Zicchino (2006), to explore the complex interactions between artificial intelligence, religious dynamics, and economic growth. The PVAR model is particularly effective for this study because it can handle individual differences over time through fixed effects. This approach significantly enhances the accuracy and reliability of our empirical results. The PVAR model’s strengths are numerous, making it highly effective at untangling complex macroeconomic relationships. Mügge (2016) notes that the model is flexible, focusing on macroeconomic fluctuations without being tied to any specific growth or developmental theory. Similarly, Charfeddine and Kahia (2019) emphasize the importance of an objective analytical approach, warning that biases can distort the interpretation of results.

The PVAR framework stands out for its approach to treating all variables as endogenous, moving beyond the traditional separation into exogenous and endogenous categories. This model considers the evolution of each variable as influenced by both its past behavior and the simultaneous activities of other variables, highlighting their mutual dependence and interactions. It also allows for the analysis of both internal and external influences, which are crucial in shaping macroeconomic trends in interconnected economies. Due to its efficient design and ability to provide insightful estimations, the PVAR model is highly effective for analyzing panel data, whether it spans multiple countries or focuses on a single entity. In our study, we leveraged the PVAR model’s strengths to explore the interplay among artificial intelligence, religious freedom, and economic growth across 26 countries from 2000 to 2021. Consequently, we specify the PVAR model in Eq. (3) as follows:

$$y_{i,t} = \mu _{i} + a(l)\,y_{i,t} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.$$
(3)

In Eq. (3), yi,t represents a vector of stationary variables, comprising ai, rf, eg, li, and ci. Concurrently, the term μi captures country-specific idiosyncratic effects. The matrix polynomial in the lag operator l is denoted by a(l), where \(a(l) = a_1 l + a_2 l^2 + \ldots + a_p l^p\), and αi is the vector of country-specific fixed effects included in the regression analysis. δt represents the year dummy indicators capturing time-specific effects common to all countries. The PVAR model uses this compact matrix notation in Eq. (3), offering a clear view of the complex interconnections among artificial intelligence, religious freedom, and economic growth in the analyzed countries and time frame.

$$\begin{array}{l}\Delta \log \mathrm{ai}_{i,t} = \mu _{i} + \sum\limits_{j = 1}^{p} a_{1j}\,\Delta \log \mathrm{ai}_{i,t-j} + \sum\limits_{j = 1}^{p} b_{1j}\,\Delta \log \mathrm{rf}_{i,t-j} + \sum\limits_{j = 1}^{p} c_{1j}\,\Delta \log \mathrm{eg}_{i,t-j}\\ +\, \sum\limits_{j = 1}^{p} d_{1j}\,\Delta \log \mathrm{li}_{i,t-j} + \sum\limits_{j = 1}^{p} e_{1j}\,\Delta \log \mathrm{ci}_{i,t-j} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.\end{array}$$
(4)
$$\begin{array}{l}\Delta \log \mathrm{rf}_{i,t} = \mu _{i} + \sum\limits_{j = 1}^{p} a_{1j}\,\Delta \log \mathrm{rf}_{i,t-j} + \sum\limits_{j = 1}^{p} b_{1j}\,\Delta \log \mathrm{ai}_{i,t-j} + \sum\limits_{j = 1}^{p} c_{1j}\,\Delta \log \mathrm{eg}_{i,t-j}\\ +\, \sum\limits_{j = 1}^{p} d_{1j}\,\Delta \log \mathrm{li}_{i,t-j} + \sum\limits_{j = 1}^{p} e_{1j}\,\Delta \log \mathrm{ci}_{i,t-j} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.\end{array}$$
(5)
$$\begin{array}{l}\Delta \log \mathrm{eg}_{i,t} = \mu _{i} + \sum\limits_{j = 1}^{p} a_{1j}\,\Delta \log \mathrm{eg}_{i,t-j} + \sum\limits_{j = 1}^{p} b_{1j}\,\Delta \log \mathrm{ai}_{i,t-j} + \sum\limits_{j = 1}^{p} c_{1j}\,\Delta \log \mathrm{rf}_{i,t-j}\\ +\, \sum\limits_{j = 1}^{p} d_{1j}\,\Delta \log \mathrm{li}_{i,t-j} + \sum\limits_{j = 1}^{p} e_{1j}\,\Delta \log \mathrm{ci}_{i,t-j} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.\end{array}$$
(6)
$$\begin{array}{l}\Delta \log \mathrm{li}_{i,t} = \mu _{i} + \sum\limits_{j = 1}^{p} a_{1j}\,\Delta \log \mathrm{li}_{i,t-j} + \sum\limits_{j = 1}^{p} b_{1j}\,\Delta \log \mathrm{ai}_{i,t-j} + \sum\limits_{j = 1}^{p} c_{1j}\,\Delta \log \mathrm{rf}_{i,t-j}\\ +\, \sum\limits_{j = 1}^{p} d_{1j}\,\Delta \log \mathrm{eg}_{i,t-j} + \sum\limits_{j = 1}^{p} e_{1j}\,\Delta \log \mathrm{ci}_{i,t-j} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.\end{array}$$
(7)
$$\begin{array}{l}\Delta \log \mathrm{ci}_{i,t} = \mu _{i} + \sum\limits_{j = 1}^{p} a_{1j}\,\Delta \log \mathrm{ci}_{i,t-j} + \sum\limits_{j = 1}^{p} b_{1j}\,\Delta \log \mathrm{ai}_{i,t-j} + \sum\limits_{j = 1}^{p} c_{1j}\,\Delta \log \mathrm{rf}_{i,t-j}\\ +\, \sum\limits_{j = 1}^{p} d_{1j}\,\Delta \log \mathrm{li}_{i,t-j} + \sum\limits_{j = 1}^{p} e_{1j}\,\Delta \log \mathrm{eg}_{i,t-j} + \alpha _{i} + \delta _{t} + \epsilon _{i,t}.\end{array}$$
(8)
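To make the structure of Eqs. (4)–(8) concrete: each equation regresses one differenced log series on lagged differences of all the series. The sketch below shows only the bare least-squares step, with hypothetical noiseless data and just two regressors (the study itself estimates the full system with fixed effects via the system-generalized method of moments, so this is illustrative rather than the authors' procedure).

```python
def solve(Amat, bvec):
    """Solve a small linear system by Gaussian elimination with partial pivoting."""
    n = len(bvec)
    M = [row[:] + [bvec[i]] for i, row in enumerate(Amat)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for cc in range(col, n + 1):
                M[r][cc] -= f * M[col][cc]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][cc] * x[cc] for cc in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares via the normal equations X'X beta = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Hypothetical regressors, e.g. [Δlog eg_{t-1}, Δlog ai_{t-1}] stacked over
# country-years; the outcome is built with known coefficients 0.5 and 0.2.
X = [[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [0.5, 1.5]]
y = [0.5 * x1 + 0.2 * x2 for x1, x2 in X]
beta = ols(X, y)   # recovers the true coefficients on noiseless data
```

Because the toy data contain no error term, the estimator recovers the generating coefficients exactly, which is a convenient sanity check on the regression machinery.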

To determine the optimal lag length p for our models (with individual lags indexed by j), we use the Schwarz Information Criterion. In our model, we incorporate fixed effects, denoted as αi, to account for unique variations in different empirical contexts. These fixed effects include country-specific influences, represented by μi, which capture persistent national factors not directly observable. The combination of country-specific fixed effects and lagged variables in our model presents challenges because the two are correlated. We address this by either applying the mean-difference technique or using the ‘Helmert transformation’ (forward orthogonal deviations), both of which ensure the independence of lagged regressors from the transformed predictors, enhancing the reliability of our estimations under the system-generalized method of moments.
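The Helmert transformation mentioned above can be sketched as follows: each observation is replaced by its scaled deviation from the mean of all remaining future observations, which removes the individual fixed effect while leaving lagged values valid as instruments. This is a minimal illustration, not the estimation code used in the study.

```python
import math

def helmert(series):
    """Forward orthogonal deviations (Helmert transformation).

    Observation t is replaced by its deviation from the mean of all *future*
    observations, scaled by sqrt((T-t)/(T-t+1)) so the transformed errors
    remain homoskedastic. Defined for t = 1..T-1.
    """
    T = len(series)
    out = []
    for t in range(T - 1):
        future = series[t + 1:]
        fmean = sum(future) / len(future)
        scale = math.sqrt(len(future) / (len(future) + 1))
        out.append(scale * (series[t] - fmean))
    return out

# A constant series (a pure fixed effect) is annihilated by the transform.
assert all(abs(v) < 1e-12 for v in helmert([3.0, 3.0, 3.0, 3.0]))
```

Because only future values enter each transformed observation, variables lagged one period or more stay uncorrelated with the transformed error, which is what makes them usable as GMM instruments.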

In the PVAR framework, as detailed in Eq. (3), annual effects are captured by δt, which effectively accounts for global macroeconomic changes impacting all countries. Before integrating these variables into our model, it is crucial to difference them to highlight yearly variations. In our study, these variables align with the system’s theoretical setup. A notable strength of the PVAR methodology is its ability to analyze the effects of independent disturbances, examining the influence of one variable on another while assuming other factors remain constant. This depth of analysis is achieved through the panel impulse response function, which traces the response of a specific variable to changes in another, assuming all other influences are neutralized. To accurately map these impulse responses, we calculate confidence intervals using Monte Carlo simulations with 10,000 bootstrap iterations. Additionally, we conduct variance decomposition to determine how much of the variance in one variable is due to changes in another. Our analysis primarily focuses on the impacts observed over a decade.
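Mechanically, a point impulse response of this kind can be traced by iterating the system's coefficient matrix on a one-time unit shock while all other disturbances are held at zero. The sketch below uses a hypothetical stable two-variable VAR(1); the bootstrap confidence bands described above are omitted.

```python
def impulse_response(A, shock, horizons):
    """Trace the response of a VAR(1) system y_t = A y_{t-1} + e_t to a
    one-time unit shock, holding all other disturbances at zero."""
    response = shock[:]
    path = [response[:]]
    for _ in range(horizons):
        response = [sum(A[i][j] * response[j] for j in range(len(response)))
                    for i in range(len(response))]
        path.append(response[:])
    return path

# Illustrative coefficient matrix (eigenvalues 0.6 and 0.3: a stable system).
A = [[0.5, 0.1],
     [0.2, 0.4]]

# Unit shock to the first variable; the second variable responds with a lag
# and both responses decay toward zero over the ten-period horizon.
path = impulse_response(A, [1.0, 0.0], horizons=10)
```

Because the eigenvalues lie inside the unit circle, the traced responses die out, mirroring the transient effects plotted in a standard impulse-response figure.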

Panel unit root tests

Before using the panel vector auto-regression framework for empirical analysis, it is crucial to confirm the stationarity of the included variables. Broadly speaking, panel unit root tests fall into two primary categories: those predicated on the premise of cross-sectional independence and those that accommodate cross-sectional dependence. Prototypical examples of the former include the methodologies of Maddala and Wu (1999), Levin et al. (2002), Im et al. (2003), and Hadri (2000), which are rooted in the assumption of cross-sectional independence. The latter category is illustrated by the procedures of Smith et al. (2004), Moon and Perron (2004), Carrion-i-Silvestre et al. (2005), and Pesaran (2007), which account for cross-sectional interdependence. However, O’Connell (1998) observed that neglecting pronounced positive cross-sectional correlation in the residuals can compromise the integrity of panel unit root tests founded on cross-sectional independence, resulting in size distortion and potentially misleading inference. This underscores the importance of carefully assessing cross-sectional dependence before selecting an estimator for panel analysis. To detect cross-sectional dependence, the present investigation leverages the diagnostic tests of Pesaran (2007), striving for analytical rigor and credibility in subsequent evaluations.

Pesaran diagnostic test

Recognizing the limitations of the Breusch and Pagan (1980) Lagrange multiplier test in assessing cross-sectional interdependence, especially regarding its effectiveness and finite-sample accuracy, Pesaran (2021) developed a more refined Lagrange multiplier test. This advanced approach provides a deeper and more precise examination of cross-sectional dependence. In this study, we apply Pesaran’s enhanced metrics to thoroughly analyze cross-sectional dependence, as elaborated in the following sections:

$$\mathrm{CD}_{\mathrm{LM}} = \left( \frac{1}{N(N-1)} \right)^{\frac{1}{2}}\, \sum\limits_{i=1}^{N-1} \sum\limits_{j=i+1}^{N} \left( T\hat{\rho}_{ij}^{2} - 1 \right).$$
(9)

In Eq. (9), \(\hat{\rho}_{ij}\) denotes the product-moment correlation coefficient of the disturbances. Asymptotically, the LM test statistic tends towards a normal distribution as T → ∞ and N → ∞, under the null hypothesis \(\rho _{ij} = \rho _{ji} = 0\) for i ≠ j, against the alternative \(\rho _{ij} = \rho _{ji} \ne 0\) for at least one pair with i ≠ j. Nevertheless, when N is large relative to T, distortions in the test’s size properties can be observed. Recognizing the limitations intrinsic to such configurations, Pesaran (2021) advanced a refined diagnostic for cross-sectional dependence, designed for panels marked by large N and limited T. This diagnostic is based on the pairwise correlations of the residuals obtained from the augmented Dickey-Fuller regression applied to each individual unit in the panel. It is important to underscore that this refined methodology avoids the squared terms integral to the conventional LM evaluation. With these modifications, the refined metric for assessing cross-sectional dependence is defined as follows:

$$\mathrm{CD} = \left( \frac{2T}{N(N-1)} \right)^{\frac{1}{2}}\, \sum\limits_{i=1}^{N-1} \sum\limits_{j=i+1}^{N} \hat{\rho}_{ij}.$$
(10)
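Both statistics can be computed directly from the pairwise residual correlations; following Pesaran's definition, the CD statistic sums the correlations themselves rather than their squares. Below is a minimal sketch with hypothetical, perfectly correlated residual series, chosen so the expected values are easy to verify by hand.

```python
import math

def corr(x, y):
    """Pearson product-moment correlation between two residual series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def cd_statistics(residuals):
    """CD_LM (scaled LM) and CD statistics from an N x T matrix of residuals."""
    N, T = len(residuals), len(residuals[0])
    pairs = [(i, j) for i in range(N - 1) for j in range(i + 1, N)]
    rhos = [corr(residuals[i], residuals[j]) for i, j in pairs]
    cd_lm = math.sqrt(1.0 / (N * (N - 1))) * sum(T * r * r - 1 for r in rhos)
    cd = math.sqrt(2.0 * T / (N * (N - 1))) * sum(rhos)
    return cd_lm, cd

# Three hypothetical residual series that are exact multiples of one another,
# so every pairwise correlation equals 1 (N = 3, T = 4).
residuals = [[1.0, 2.0, 3.0, 4.0],
             [2.0, 4.0, 6.0, 8.0],
             [1.5, 3.0, 4.5, 6.0]]
cd_lm, cd = cd_statistics(residuals)
```

With all correlations equal to one, the sums reduce to closed forms (CD_LM = 9/√6, CD = 2√3 here), which makes the implementation easy to cross-check against Eqs. (9) and (10).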

Under the null hypothesis of no cross-sectional dependence, the CD statistic is expected to follow a normal distribution irrespective of the order in which T → ∞ and N → ∞. Pesaran (2007) showed that the CD test retains good properties even in relatively small samples (small values of both N and T). Building on this, both Pesaran (2007) and Smith et al. (2004) emphasized that, when the empirical evidence rejects the null hypothesis, thereby indicating cross-sectional dependence, it becomes necessary to proceed with panel unit root tests that accommodate it. To verify the stationarity of the individual variables while accommodating the inherent cross-sectional dependence within the panel, the present inquiry employs the panel unit root tests of Smith et al. (2004) and Pesaran (2007).

Unit root test developed by Pesaran (2007)

Pesaran (2007) proposed a cross-sectionally augmented version of the IPS test of Im et al. (2003), which is fundamentally premised on the independence of individual units. The impetus behind this extension was the observation that many macroeconomic series display correlation across units when data are examined at regional or national granularities. Relaxing the assumption of cross-sectional independence, Pesaran (2007) specified a one-factor model in which the errors carry heterogeneous factor loadings. To augment the standard Dickey-Fuller regression, he added the cross-sectional means of the lagged levels and of the first differences of the individual series. The resulting cross-sectionally augmented Dickey-Fuller regression is given below.

$${\mathrm{{\Delta}}}y_{i,t} = \pi _{i} + \beta _{i}y_{i,t - 1} + \omega _{i}\check{y}_{t - 1} + \theta _{i}{\mathrm{{\Delta}}}\check{y}_{t} + \epsilon _{i,t}.$$
(11)

Within the framework of Eq. (11), \(\check{y}_{t} = \frac{1}{N}\mathop {\sum}\nolimits_{i = 1}^{N} y_{i,t}\) and \({\mathrm{{\Delta}}}\check{y}_{t} = \frac{1}{N}\mathop {\sum}\nolimits_{i = 1}^{N} {\mathrm{{\Delta}}}y_{i,t}\) denote the cross-sectional means of the levels and first differences, respectively. Pesaran (2007) constructed a panel unit root test based on the cross-sectional average of the individual cross-sectionally augmented Dickey-Fuller statistics, yielding the following refinement of the IPS test:

$${{{\mathrm{NIPS}}}} = \frac{{\mathop {\sum}\nolimits_{{{i}} = 1}^{{N}} {\left( {{{{\mathrm{NADF}}}}_{{{\mathrm{i}}}}} \right)} }}{{{{N}}}}.$$
(12)

Within the context of Eq. (12), NADFi denotes the cross-sectionally augmented Dickey-Fuller statistic for each individual unit (i), that is, the t-statistic associated with βi in Eq. (11). Moreover, Pesaran (2007) tabulated critical values for NIPS for a range of sample dimensions.
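As an illustration of Eqs. (11) and (12), the unit-level CADF t-statistics and their cross-sectional average can be sketched in a few lines of numpy. The function names are illustrative and this is a bare-bones version without the truncation rules and critical values of Pesaran (2007):

```python
import numpy as np

def cadf_tstat(y, ybar):
    """t-statistic of beta_i in the CADF regression
    dy_t = pi + beta*y_{t-1} + omega*ybar_{t-1} + theta*dybar_t + e_t."""
    dy, dybar = np.diff(y), np.diff(ybar)
    X = np.column_stack([np.ones(len(dy)), y[:-1], ybar[:-1], dybar])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def nips_stat(panel):
    """Cross-sectional average of the unit-level CADF t-statistics.
    panel: N x T array, one row per cross-sectional unit."""
    ybar = panel.mean(axis=0)            # cross-sectional mean series
    return np.mean([cadf_tstat(y, ybar) for y in panel])

# Stationary (white-noise) units yield a strongly negative statistic,
# while pure random walks yield values much closer to zero.
rng = np.random.default_rng(0)
stationary = rng.standard_normal((10, 200))
walks = rng.standard_normal((10, 200)).cumsum(axis=1)
```

In practice the averaged statistic is compared against Pesaran's simulated critical values rather than the standard normal distribution.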

Unit root test developed by Smith et al. (2004)

Smith et al. (2004) employed a bootstrap panel unit root test to assess the stationarity of a given variable, a method that accommodates both the cross-sectional and time-series attributes of the data. This assessment is a refined version of the panel unit root test proposed by Im et al. (2003). \(\overline {{{{\mathrm{LM}}}}} = {\textstyle{{\mathop {\sum}\nolimits_{{{{\mathrm{i}}}} = 1}^{{{\mathrm{N}}}} {{{{\mathrm{LM}}}}_{{{\mathrm{i}}}}} } \over {{{\mathrm{N}}}}}}\) designates the mean of the individual Lagrange multiplier (LMi) test statistics, a concept originally introduced by Solo (1984). The test fashioned by Leybourne (1995) is represented by \(\overline {{{{\mathrm{max}}}}}\), while \(\overline {{{{\mathrm{min}}}}} = {\textstyle{{\mathop {\sum}\nolimits_{{{{\mathrm{i}}}} = 1}^{{{\mathrm{N}}}} {{{{\mathrm{min}}}}_{{{\mathrm{i}}}}} } \over {{{\mathrm{N}}}}}}\) is a robust alternative to the individual Lagrange multiplier statistic, with \({{{\mathrm{min}}}}_{{{\mathrm{i}}}} = {{{\mathrm{min}}}}\left( {{{{\mathrm{LM}}}}_{{{{\mathrm{fi}}}}},{{{\mathrm{LM}}}}_{{{{\mathrm{ri}}}}}} \right)\). LMfi is predicated upon forward regressions, whereas LMri is anchored in reverse (backward) regressions. All four tests take a unit root as the null hypothesis, against the alternative of heterogeneous autoregressive roots. Rejection of the null hypothesis suggests that stationarity is evident in at least one of the countries under consideration.

Cointegration test for panel-dependent units developed by Westerlund (2006)

While the panel cointegration methodology of Pedroni (1999) is a popular choice among scholars, its assumption of cross-sectional independence has been criticized as overly restrictive, notably by Banerjee and Carrion-i-Silvestre (2006). When cross-sectional dependence exists within a panel, many turn to the cointegration test formulated by Westerlund (2006). Distinguished from residual-based cointegration evaluations, this test sidesteps the pitfalls associated with the common factor restriction. As elucidated by Kremers et al. (1992), this restriction arises when the long-run cointegration vector in the levels of the variables is forced to equal the short-run adjustment process in their first differences. Several studies, including Ho (2002), argue that this common factor restriction may explain failures to reject the null hypothesis of no cointegration, even in scenarios where cointegration has a strong theoretical foundation. Westerlund (2006) introduced a quartet of panel cointegration tests (Gt, Ga, Pt, and Pa) that foreground structural dynamics rather than residuals and therefore impose no common factor restriction. These tests examine the null hypothesis of no cointegration by testing whether the error correction term in the panel's conditional error correction model equals zero. Each test is asymptotically normally distributed and is flexible, allowing for unit-specific trends, varied slope parameters, distinct short-term dynamics for each unit, and cross-sectional dependence. Significantly, the tests yield robust p-values, adjusted for cross-sectional dependence via bootstrapping. Rejection of the null supports the alternative hypothesis that cointegration is present in at least one examined unit.

$${{G}}_{{{\mathrm{t}}}} = \frac{1}{N}\mathop{\sum}\limits_{i=1}^{N}\frac{\check{{\rm{a}}}_{\rm{i}}}{{\rm{se}}(\check{{\rm{a}}}_{\rm{i}})}.$$
(13)
$${{G}}_{{{\mathrm{a}}}} = \frac{1}{N}\mathop {\sum}\limits_{i = 1}^{N} \frac{{\mathrm{T}}\check{{\rm{a}}}_{{\mathrm{i}}}}{\check{{\mathrm{a}}}_{\mathrm{i}}(1)}.$$
(14)
$${P}_{{\mathrm{t}}} = \frac{\check{{\mathrm{a}}}}{{\mathrm{se}}\left(\check{{\mathrm{a}}}\right)}.$$
(15)
$${{P}}_{{{\mathrm{a}}}} = T\check{{\mathrm{a}}}.$$
(16)

Persyn and Westerlund (2008) refined the interpretation of these tests by distinguishing their alternative hypotheses: the group-mean statistics (Gt, Ga) test against cointegration in at least one panel unit, whereas the panel statistics (Pt, Pa) test against cointegration of the panel as a whole. This broader approach emphasizes the need to examine cointegration not only at the individual unit level but also across the entire panel, recognizing the interconnectedness of all units, and it underscores the importance of a comprehensive evaluation of cointegration dynamics in panel data studies.
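As a simplified illustration of the group-mean idea behind Gt, the sketch below estimates a single-regressor conditional error-correction equation for each unit and averages the t-ratios on the error-correction coefficient. This is only a stylized version: the full Westerlund procedure additionally handles deterministic terms, lag and lead augmentation, and bootstrapped p-values, and the function names here are illustrative:

```python
import numpy as np

def ec_tstat(y, x):
    """t-ratio of the error-correction coefficient a_i in
    dy_t = c + a*y_{t-1} + b*x_{t-1} + d*dx_t + e_t."""
    dy, dx = np.diff(y), np.diff(x)
    X = np.column_stack([np.ones(len(dy)), y[:-1], x[:-1], dx])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def group_mean_gt(Y, X):
    """Average of the unit-level error-correction t-ratios (Gt-style).
    Y, X: N x T arrays of the dependent and forcing variables."""
    return np.mean([ec_tstat(y, x) for y, x in zip(Y, X)])

# Cointegrated units (y tracks the random walk x) produce a strongly
# negative statistic, pointing towards rejection of no cointegration.
rng = np.random.default_rng(1)
N, T = 10, 200
X = rng.standard_normal((N, T)).cumsum(axis=1)   # random walks
Y = X + rng.standard_normal((N, T))              # cointegrated with X
```

A significantly negative error-correction coefficient means deviations from the long-run relation are corrected, which is precisely what the null of no cointegration rules out.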

Results and discussion

Cross-sectional dependence test

In our research, we employ second-generation panel unit root tests, developed and refined by scholars such as Smith et al. (2004) and Pesaran (2007), which are designed to handle cross-sectional dependence. As a first step, we test for cross-sectional dependence itself: under the null hypothesis of cross-sectional independence among units, the test statistics follow a standard normal distribution with zero mean and unit variance. Our goal is to rigorously establish the stationarity properties of each variable in our dataset, providing a solid statistical base for further analysis. The results of the cross-sectional dependence tests are reported in Table 3.

Table 3 Results of cross-sectional dependence test.

Table 3 reveals a consistent rejection of the null hypothesis for all variables at the 1% significance level, a finding that holds for lag lengths ranging from 1 to 6. This result indicates strong cross-sectional dependence among the countries in our study, likely reflecting their close economic interconnections. Given this, panel unit root tests that adequately capture such dependence, such as those developed by Smith et al. (2004) and Pesaran (2007), are required. Table 4 details the outcomes of applying these tests.

Table 4 Results of unit root test.

Table 4 presents results on the integration characteristics of the five key variables central to our study. Each variable is integrated of first order at the 1% significance level, that is, stationary in first differences. This confirmation is important for the advanced analytical techniques applied below, such as panel vector auto-regression. Taken together with the cross-sectional dependence documented in Table 3, the first-order integration of the five variables forms the basis for our subsequent exploration, which aims to uncover potential long-term relationships among them. To pursue this inquiry, we apply Westerlund's (2006) cointegration test, specifically designed for panels with interdependent units, under the null hypothesis of no cointegration against the alternative of cointegration. The results of this analysis are presented in Table 5.

Table 5 Results of cointegration test.

Table 5 reports the results of the cointegration analysis. All tests consistently fail to reject the null hypothesis, implying the absence of enduring long-run relationships among the key variables under study, namely artificial intelligence, religion, labor input, capital input, and economic development. In light of this finding, and to gain a more nuanced understanding of the interconnections among these variables, we work with their first-order difference forms. Our analytical framework then employs panel vector auto-regression to delve deeper into the empirical correlations and potential causal relationships.

PVAR estimation

In this section, we delve into the interrelationships among artificial intelligence, religion, labor input, capital input, and economic growth using the panel vector auto-regression tool. The empirical results obtained from the PVAR (1) estimation, along with the coefficients from the Generalized Method of Moments (GMM), are presented in Table 6, providing a comprehensive overview for detailed examination and further discussion.

Table 6 Results of PVAR (1) estimation.

Based on the empirical evidence presented in Table 6, both artificial intelligence and religious freedom display a noticeable positive association with economic expansion. The same estimates show that labor and capital inputs are likewise conducive to economic growth. These findings highlight the potentially synergistic roles of technological innovation, societal freedoms, and traditional economic drivers in fostering growth dynamics.
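For intuition about the PVAR(1) specification, a dynamic panel with unit fixed effects can be estimated by a simple within (demeaned OLS) regression. This is only a sketch of the general idea with illustrative names and simulated data; the estimates in Table 6 come from GMM, which additionally corrects the dynamic-panel (Nickell) bias that the within estimator carries:

```python
import numpy as np

def pvar1_within(panel):
    """Within (fixed-effects OLS) estimate of A in y_t = mu_i + A y_{t-1} + e_t.
    panel: N x T x K array (units x periods x variables).
    Note: the within estimator has Nickell bias of order 1/T, which GMM
    estimators avoid; treat this as an illustration only."""
    N, T, K = panel.shape
    Sxx = np.zeros((K, K))
    Sxy = np.zeros((K, K))
    for i in range(N):
        y = panel[i] - panel[i].mean(axis=0)   # remove the unit fixed effect
        x, ylead = y[:-1], y[1:]
        Sxx += x.T @ x
        Sxy += x.T @ ylead
    return np.linalg.solve(Sxx, Sxy).T         # y_t ~ A @ y_{t-1}

# Simulate a small dynamic panel and recover the lag matrix approximately.
rng = np.random.default_rng(7)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
N, T, K = 20, 200, 2
panel = np.zeros((N, T, K))
for i in range(N):
    mu = rng.standard_normal(K)                # unit fixed effect
    for t in range(1, T):
        panel[i, t] = mu + A @ panel[i, t - 1] + rng.standard_normal(K)
A_hat = pvar1_within(panel)
```

With a long time dimension the within estimate is close to the true lag matrix; with the short panels typical of country data, the 1/T bias is what motivates GMM.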

Causality test

To further explore the causal relationships among the mentioned variables, this study utilizes the Toda-Yamamoto conditional Granger causality test. Determining the direction of causality is crucial, as it informs strategic decision-making processes related to artificial intelligence, religious freedom, labor and capital inputs, and overall economic growth trajectories. The Toda-Yamamoto approach is based on a vector auto-regressive model of order p, augmented by the maximal order of integration, together with a modified Wald test statistic that identifies the direction of causality among the focal variables. Importantly, this methodology offers a significant advantage over the pairwise Granger causality technique, which requires all examined variables to be stationary: the Toda-Yamamoto framework remains valid whether the variables are I(0), I(1), or a mixture of the two, without pretesting for cointegration. Originating from the work of Toda and Yamamoto (1995), this technique is rooted in the vector auto-regressive distributed lag model, as delineated below:

$$\begin{array}{l}\left[\begin{array}{c} ai_{i,t} \\ rf_{i,t} \\ eg_{i,t} \\ li_{i,t} \\ ci_{i,t} \end{array}\right] = \left[\begin{array}{c} a \\ b \\ c \\ d \\ e \end{array}\right] + \mathop{\sum}\limits_{l=1}^{n} \left[\begin{array}{ccccc} \lambda_{11l} & \lambda_{12l} & \lambda_{13l} & \lambda_{14l} & \lambda_{15l} \\ \lambda_{21l} & \lambda_{22l} & \lambda_{23l} & \lambda_{24l} & \lambda_{25l} \\ \lambda_{31l} & \lambda_{32l} & \lambda_{33l} & \lambda_{34l} & \lambda_{35l} \\ \lambda_{41l} & \lambda_{42l} & \lambda_{43l} & \lambda_{44l} & \lambda_{45l} \\ \lambda_{51l} & \lambda_{52l} & \lambda_{53l} & \lambda_{54l} & \lambda_{55l} \end{array}\right] \times \left[\begin{array}{c} ai_{i,t-l} \\ rf_{i,t-l} \\ eg_{i,t-l} \\ li_{i,t-l} \\ ci_{i,t-l} \end{array}\right] \\ \qquad\qquad +\, \mathop{\sum}\limits_{j=n+1}^{n+m_{\max}} \left[\begin{array}{ccccc} \lambda_{11j} & \lambda_{12j} & \lambda_{13j} & \lambda_{14j} & \lambda_{15j} \\ \lambda_{21j} & \lambda_{22j} & \lambda_{23j} & \lambda_{24j} & \lambda_{25j} \\ \lambda_{31j} & \lambda_{32j} & \lambda_{33j} & \lambda_{34j} & \lambda_{35j} \\ \lambda_{41j} & \lambda_{42j} & \lambda_{43j} & \lambda_{44j} & \lambda_{45j} \\ \lambda_{51j} & \lambda_{52j} & \lambda_{53j} & \lambda_{54j} & \lambda_{55j} \end{array}\right] \times \left[\begin{array}{c} ai_{i,t-j} \\ rf_{i,t-j} \\ eg_{i,t-j} \\ li_{i,t-j} \\ ci_{i,t-j} \end{array}\right] + \left[\begin{array}{c} \epsilon_{1,i,t} \\ \epsilon_{2,i,t} \\ \epsilon_{3,i,t} \\ \epsilon_{4,i,t} \\ \epsilon_{5,i,t} \end{array}\right]\end{array}.$$
(17)

In Eq. (17), n represents the order of the vector auto-regressive process, which is augmented with mmax supplementary lags, where mmax denotes the highest order of integration among the variables in the system. The Wald test statistic retains its asymptotic χ2 distribution when computed from a vector auto-regressive model of order n + mmax, which is what validates inference under the Toda-Yamamoto Granger causality framework. The null hypothesis, λ15l = 0 for all l, posits the absence of a causal link between the pair of assessed variables, implying that the two evolve independently without influencing each other's trajectories. In contrast, the alternative hypothesis, λ15l ≠ 0 for at least one l, indicates that one variable possesses predictive power over the other. Drawing from the aforementioned methodological exposition, we proceeded to implement the Toda-Yamamoto causality test to further elucidate the causal relationships among artificial intelligence, religious freedom, labor and capital inputs, and economic growth. The empirical outcomes of this analysis are systematically tabulated and presented in Table 7 for evaluation.
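To sketch the mechanics of the modified Wald test, the two-variable numpy illustration below (with hypothetical function names, not the multi-country implementation used here) fits a deliberately over-parameterized VAR(n + mmax) by OLS and then tests only the coefficients on the first n lags of the candidate causal variable:

```python
import numpy as np

def ty_wald(data, n, m_max, cause, effect):
    """Toda-Yamamoto modified Wald statistic.
    Fits equation `effect` of a VAR(n + m_max) by OLS and tests that the
    coefficients on lags 1..n of variable `cause` are jointly zero; under
    the null the statistic is asymptotically chi-squared with n df.
    data: T x K matrix; cause/effect: column indices."""
    T, K = data.shape
    L = n + m_max
    X = np.ones((T - L, 1 + K * L))
    for lag in range(1, L + 1):                   # stack lagged regressors
        X[:, 1 + (lag - 1) * K: 1 + lag * K] = data[L - lag: T - lag]
    y = data[L:, effect]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    idx = [1 + (lag - 1) * K + cause for lag in range(1, n + 1)]
    b = beta[idx]
    return float(b @ np.linalg.solve(cov[np.ix_(idx, idx)], b))

# y1 Granger-causes y2 by construction, but not vice versa.
rng = np.random.default_rng(3)
T = 500
y1 = rng.standard_normal(T)
y2 = np.zeros(T)
for t in range(1, T):
    y2[t] = 0.8 * y1[t - 1] + rng.standard_normal()
data = np.column_stack([y1, y2])
w_12 = ty_wald(data, n=1, m_max=1, cause=0, effect=1)   # large statistic
w_21 = ty_wald(data, n=1, m_max=1, cause=1, effect=0)   # small statistic
```

The key point of the procedure is that the extra mmax lags are estimated but left untested, which preserves the χ2 limiting distribution even for integrated series.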

Table 7 Results of causality test.

Based on the results presented in Table 7, a clear causal relationship emerges among artificial intelligence, religious freedom, labor input, and capital input. This causal interplay aligns with existing literature (Dahlum et al., 2022; Mokyr et al., 2015; Grim 2015) that posits the intricate interconnection of technology, societal freedoms, and economic factors in driving growth and development. For example, previous scholarly investigations have highlighted the role of technological advancements, such as artificial intelligence, in shaping economic trajectories. Simultaneously, the freedom of religious expression has been associated with societal stability and, consequently, favorable economic environments. The observed causal dynamics among these variables in our findings further emphasize the multifaceted nature of their interrelation, underscoring the importance of a holistic approach when analyzing the determinants of economic growth.

Impulse response function

In this section, we conduct a comprehensive examination of the impulse response functions. A fundamental requirement for this analysis is the careful arrangement of the variables in a relevant sequence. To orthogonalize the shocks, this study applies the Cholesky decomposition to the variance-covariance matrix of the residuals of the panel vector auto-regression model, drawing on the pioneering work of Sims (1980). In this scheme, variables placed earlier in the ordering are treated as more exogenous: they affect the subsequent variables contemporaneously, whereas variables placed later in the ordering affect the earlier ones only with a lag. Guided by this framework, our research adopts the ordering artificial intelligence, religion, labor input, capital input, and economic growth. The results of the impulse response function analyses are visually presented in Fig. 1.
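The orthogonalization step can be illustrated in a few lines: given a hypothetical estimated VAR(1) coefficient matrix A and residual covariance Σ (illustrative values, not the estimates of this study), the Cholesky factor P maps orthogonal unit shocks into the correlated residuals, and the response at horizon h is A^h P. Because P is lower triangular, a shock to a later-ordered variable has no contemporaneous effect on earlier-ordered ones:

```python
import numpy as np

def orth_irf(A, Sigma, horizons):
    """Orthogonalized impulse responses of a VAR(1).
    Returns a list of K x K matrices; entry [i, j] of the h-th matrix is
    the response of variable i, h periods after a one-standard-deviation
    shock to the j-th orthogonalized innovation (Cholesky ordering)."""
    P = np.linalg.cholesky(Sigma)              # lower-triangular factor
    return [np.linalg.matrix_power(A, h) @ P for h in range(horizons + 1)]

# Two-variable example with correlated residuals.
A = np.array([[0.5, 0.0],
              [0.2, 0.4]])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 1.0]])
irfs = orth_irf(A, Sigma, horizons=10)
```

Because the impact matrix is triangular, the ordering of the variables matters: placing a variable first amounts to assuming its innovations are the most exogenous, which is exactly the reasoning attributed to Sims above.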

Fig. 1: Results of impulse response function analysis.
figure 1

Reaction of religion and economic growth to one-standard-deviation shocks in religious freedom, artificial intelligence, labor input, and capital input.

In the process of empirical examination, we adhered to the sequential arrangement of variables used within the panel vector auto-regression when conducting the impulse response function estimations. The results, as depicted in Fig. 1, illustrate the systemic evolution resulting from a one-standard-deviation shock over a decade. An analysis of the dataset reveals that a one-standard-deviation shock in artificial intelligence leads to a discernible and statistically significant increase in economic growth. This relationship finds support in various academic contributions and theoretical postulates. Acemoglu and Restrepo (2018) proposed that technological advancements driven by artificial intelligence could lead to increased productivity, attributed to improved resource allocation and enhanced operational capabilities. Brynjolfsson and McAfee’s (2014) work on the ‘Second Machine Age’ suggested that artificial intelligence, at the intersection of emerging digital paradigms, had the potential to drive economic expansion by fostering innovative business ecosystems and markets. Complementing this perspective, Bughin et al. (2017) highlighted the transformative potential of artificial intelligence as a new production catalyst, working synergistically with labor and capital to create unprecedented economic value.

From a theoretical standpoint, Aghion et al. (2017) also offered insights into the multifaceted growth implications of artificial intelligence, emphasizing its ability to stimulate R&D efforts, drive innovation diffusion, and reshape competitive market dynamics. Furthermore, from a theoretical perspective, endogenous growth theory highlights the fundamental role of technological advancements, with artificial intelligence playing a pivotal role, in shaping long-term growth trajectories. Schumpeter’s concept of creative destruction suggests that emerging technologies like artificial intelligence may render obsolete existing processes but also pave the way for new growth-oriented ventures. Additionally, the theory of human capital augmentation posits a harmonious symbiosis between artificial intelligence and human roles. In this view, artificial intelligence serves as an enhancer rather than a substitute for human functions, amplifying productivity and, consequently, promoting growth. Given this comprehensive amalgamation of academic discussions and theoretical frameworks, the strong positive correlation between artificial intelligence-induced perturbations and economic growth becomes evident.

The outcomes depicted in Fig. 1 shed light on an association suggesting that economic advancement experiences a notable and statistically significant upswing in response to perturbations originating from religious freedom. This symbiotic relationship between religious liberty and economic prosperity has sparked scholarly interest and merits comprehensive exploration. At the forefront of this discourse, McCleary and Barro (2003) presented empirical evidence suggesting that religious beliefs and practices could significantly shape economic behaviors and outcomes. They argued that specific religious tenets could foster a strong work ethic, strengthen societal trust, and reduce transactional impediments, thereby catalyzing economic vitality. Building upon this foundation, Guiso et al. (2003) systematically examined the interplay between religious values and socioeconomic indicators, concluding that religious doctrines emphasizing individual agency were positively correlated with economic growth.

Offering a historical perspective, Becker and Woessmann (2009) highlighted how Protestant regions in Europe exhibited higher literacy rates, which in turn influenced superior economic trajectories. From an institutional standpoint, Stulz and Williamson (2003) posited that religious convictions could shape and refine the framework of economic, judicial, and political institutions in ways that bolstered economic dynamism. Lastly, Iannaccone (1998) introduced a novel perspective by theorizing that religious freedom, by stimulating healthy competition among religious factions, could drive innovation and operational efficiencies, resembling the principles of competitive markets and indirectly promoting economic growth. Theoretically, the cultural economics framework outlined by Molle (2019) suggests that religious freedom, as an integral aspect of cultural diversity, can enhance economic growth by stimulating creativity, innovation, and adaptive agility. Furthermore, the principles of institutional economics, rooted in Mayer’s (2018) scholarship, underscore the crucial role of institutions, including religious ones, in shaping economic trajectories. By ensuring religious pluralism and preventing monolithic religious hegemonies, religious freedom potentially contributes to an institutionally rich environment that fosters economic vitality.

In the concluding observations presented in Fig. 1, economic expansion exhibits a pronounced and statistically significant increase following perturbations in both labor and capital inputs. This fundamental interrelationship, deeply rooted in the history of economic thought, warrants a comprehensive analytical examination supported by contemporary scholarly contributions. Solow (1956) emerged as a luminary in this discourse: his seminal exposition posited that labor and capital, as the principal factors of production, form the foundation for sustained economic growth, and his model emphasized that changes in these inputs inevitably translate into fluctuations in economic output. Building upon this foundational paradigm, Romer (1986) introduced an endogenous perspective, elucidating how capital accumulation, particularly human capital, could yield increasing returns, thereby accelerating economic growth. From a theoretical standpoint, the classical theory of production, with its roots in the writings of the classical economists, underscores the pivotal roles of labor and capital as the twin engines propelling economic development, arguing that variations in these inputs correspondingly affect aggregate economic outcomes. In a more contemporary conceptualization, endogenous growth theory asserts that internal factors such as labor productivity and capital deepening, especially in knowledge-intensive economies, can intrinsically guide the course of long-term economic growth. On this view, shocks to labor and capital not only influence the contemporary economic state but also lay the groundwork for future growth trajectories by impacting innovation and technological progress.

Variance decomposition

Impulse response function analyses, widely acclaimed in the economic literature, serve as pivotal tools for discerning the dynamic reactions of one variable in response to unanticipated shocks in another. However, an intrinsic shortcoming of these methodologies is their inability to quantify the exact magnitude and duration of such propagating influences. To bridge this epistemic gap, scholars often resort to variance decomposition techniques. This approach delineates the fraction of forecast-error variance of a given dependent series that can be attributed to its own innovations, in juxtaposition to those engendered by other exogenous determinants. Table 8 encapsulates the results derived from the variance decomposition analysis, leveraging the coefficient matrices of orthogonalized impulse responses. This examination provides an intricate comprehension of the multifaceted interactions and dependencies among the variables under consideration.
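For concreteness, the forecast-error variance shares can be computed by accumulating the squared orthogonalized impulse responses. The numpy sketch below assumes a stylized VAR(1) with coefficient matrix A and residual covariance Σ (illustrative values and function names, not the estimates reported in Table 8):

```python
import numpy as np

def fevd(A, Sigma, horizon):
    """Forecast-error variance decomposition of a VAR(1).
    Row i gives the share of variable i's forecast-error variance at the
    given horizon attributable to each orthogonalized (Cholesky) shock."""
    K = A.shape[0]
    P = np.linalg.cholesky(Sigma)
    acc = np.zeros((K, K))
    Ah = np.eye(K)
    for _ in range(horizon):                    # accumulate squared responses
        psi = Ah @ P
        acc += psi ** 2
        Ah = Ah @ A
    return acc / acc.sum(axis=1, keepdims=True)  # shares sum to one per row

# With diagonal dynamics and uncorrelated shocks, each variable's variance
# is explained entirely by its own shock; with spillovers, shares spread.
shares_diag = fevd(np.diag([0.5, 0.4]), np.eye(2), horizon=10)
shares_mixed = fevd(np.array([[0.5, 0.2], [0.1, 0.4]]),
                    np.array([[1.0, 0.3], [0.3, 1.0]]), horizon=10)
```

Reading a row of the resulting matrix answers exactly the question posed above: what fraction of a variable's forecast-error variance is due to its own innovations versus those of the other variables.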

Table 8 Results of forecast-error variance decomposition (%).

Table 8 presents the forecast-error variance decompositions stemming from the foundational PVAR model, delineated at intervals of 5, 10, 15, and 20 periods. Upon close examination, it is discernible that the variance in economic growth can be attributed, in part, to several determinants. Specifically, artificial intelligence accounts for an estimated 3.052% of the overall variance in economic growth, while religious freedom contributes to approximately 2.267%. Notably, labor and capital inputs emerge as the paramount factors, commanding a substantial share in elucidating the total variance in economic growth. Intriguingly, the findings also hint at a prospective trend: over extended periods, the influence of artificial intelligence and religious freedom on economic growth appears poised to gain greater prominence and significance.

Conclusions

In light of the endeavor undertaken in this article, we have delved deeply into the intricate dynamics of artificial intelligence and religious freedom as catalysts for economic growth, spanning 26 countries over the period 2000–2021. By applying the panel vector auto-regression model, the results reveal a definitive positive correlation between artificial intelligence, religious freedom, and the dynamics of economic growth. The analytical depth provided by the forecast-error variance decomposition further elucidates the escalating prominence of these variables, forecasting their augmented influence on future economic development trajectories. Concurrently, the enduring significance of traditional determinants, notably labor and capital inputs, in the economic growth narrative remains unassailable, as confirmed by this study. The validation brought forth by the Toda-Yamamoto Granger causality analysis further accentuates the robustness and credibility of the findings expounded in this article. Collectively, this study not only reaffirms established paradigms but also sheds light on emerging dynamics, offering pivotal insights for stakeholders in the economic and policy arenas.

Based on the insights gleaned from this research, several concrete policy implications emerge that could inform decision-making in the realms of artificial intelligence, religion, and economic growth. First, given the discernible positive correlation between artificial intelligence and economic growth, it is imperative for policymakers to prioritize the integration of artificial intelligence-driven methodologies across key sectors of the economy. This entails not only investing in artificial intelligence research and development but also fostering an environment conducive to artificial intelligence adoption through appropriate educational reforms, skill development programs, and regulatory frameworks. A nation’s strategic emphasis on artificial intelligence readiness can potentially position it at the forefront of the next wave of global economic leaders. Second, the study accentuates the often overlooked economic dividends of religious freedom. Policymakers should be cognizant of the economic benefits that can be reaped from ensuring and promoting religious freedom. This extends beyond mere tolerance and requires proactive measures to foster interfaith dialog, remove discriminatory regulations, and champion religious pluralism. Such an inclusive approach can catalyze social cohesion, reduce conflicts, and, in turn, provide a stable environment conducive to economic activities and foreign investments. Finally, while emergent factors like artificial intelligence and religious freedom are gaining traction, the unassailable significance of traditional determinants such as labor and capital cannot be sidelined. It remains vital for policymakers to continuously refine and bolster policies related to labor markets, workforce training, and capital investments. Ensuring the fluidity and efficiency of these foundational economic elements, in tandem with new growth catalysts, will ensure a holistic and sustainable economic development trajectory for nations.

Based on the findings of this study, several limitations and potential avenues for future research can be identified. Limitations: (1) the small sample size of 26 countries may limit the generalizability of the findings across different cultural, religious, and economic contexts; (2) the panel vector auto-regression model, while useful, may not fully capture the complex, non-linear relationships between artificial intelligence, religion, and economic growth; (3) the study's focus on aggregate measures of artificial intelligence, religion, and economic growth may obscure important nuances and variations within each of these domains; (4) the investigation does not consider potential moderating factors, such as political or social variables, that could influence these relationships. Future research directions: (1) expanding the sample size and diversity of countries to provide a more comprehensive and globally representative understanding of the relationships between artificial intelligence, religion, and economic growth; (2) conducting longitudinal studies that extend the time frame of the analysis to capture the ongoing evolution of artificial intelligence and its potential impacts on religion and economic growth; (3) employing alternative methodological approaches, such as machine learning or network analysis, to better capture the intricate and dynamic interplay among these variables; (4) examining sub-domains within artificial intelligence, religion, and economic growth, as well as potential interactions between these sub-domains, to gain deeper insights into their specific relationships and mechanisms of influence; (5) investigating the role of moderating factors, such as political, social, or cultural variables, in shaping these relationships, which could help identify context-specific policy implications. These limitations and future research directions serve to further advance the understanding of the complex relationships between artificial intelligence, religion, and economic growth, while encouraging the development of more nuanced, context-sensitive, and actionable insights for policymakers and stakeholders.