Introduction

Disinformation and conspiracy theories have always been used as tools of deception, and in the Digital Age poor public communication further catalyzes the spread of disinformation in public opinion. Social media have been the digital disruption of the 21st century, and their uninterrupted development continues to shape the media and communication landscape.

Nowadays, anyone can produce political content that reaches a global audience, with everyday citizens playing a crucial role in the information ecosystem (Tucker et al. 2018). Another pivotal aspect of modern information campaigns is that social media have become a fundamental instrument for political leaders to convey messages to the electorate (Stier et al. 2018; Martella and Bracciale, 2022). In recent years, the term “fake news” became mainstream, referring to falsified stories circulated on social media platforms to fuel propaganda or to profit from advertising by going viral (Wu et al. 2019). The rise of political polarization and hyper-partisan media has created fertile ground for spreading disinformation, which threatens society by influencing citizens’ opinions and shaping democratic processes.

On the 25th of September 2022, 63.91%Footnote 1 of Italians voted to elect the new Parliament, in general elections characterized by high abstentionism and contested by four main coalitions or movements.Footnote 2 The elections were also characterized by a short electoral campaign, made official by President of the Republic Sergio Mattarella only three months in advance, on the 27th of June. A key aspect of the campaign was the presence of the term “nuclear” in the political debate: Google Trends shows that the term was searched as much as “immigration” (an evergreen topic in Italian politics) during the campaign, and Google News reveals more than 1500 news items published over the 3-month period. Even allowing for the use of the term for clickbait purposes, it seems clear that the word penetrated the electoral debate, and it did so with two different meanings: the use of nuclear power for civilian purposes and the escalation of the geopolitical situation in Ukraine, with the fear of an atomic conflict.

The nuclear energy debate started in late 2021 with the discussion on including both nuclear power and gas in the European Taxonomy for Sustainable Activities. The topic was initially covered by the specialized press before gaining the attention of prominent national newspapers. On the 1st of January 2022, the European Commission started consultations with experts in the field, and on the 2nd of February 2022 it presented the “Taxonomy Complementary Climate Delegated Act,” which included gas and nuclear as sustainable sources for decarbonization. Italy has a tormented relationship with nuclear energy, with two referenda against nuclear plants in 1987 (after the Chernobyl disaster) and 2011 (after the Fukushima-Daiichi accident). The literature on Italy’s relationship with nuclear power mainly treats the topic from an energy and environmental point of view (Contu et al. 2016; Bersano et al. 2020) or focuses on the debate shortly before or after the Chernobyl (Santarossa, 1990; Belelli, 1988) and Fukushima (Standish, 2009; Butler et al. 2011) accidents, concentrating mainly on traditional media (Bernardi et al. 2018).

Recently, the term nuclear has also been associated with news coverage of a possible atomic conflict. The debate arose with the start of the Russo-Ukrainian War, which saw the direct and indirect participation of the countries with the largest atomic arsenals (Green, 2022). Other episodes that drew attention were the battles between the Ukrainian and Russian armies for control of two nuclear power plants, Chernobyl and Zaporizhzhia. There are multiple views regarding the possibility of atomic conflict among political actors (Barel, 2022; Sethi, 2022) and the broader public (Riad et al. 2023). On the one hand, some experts argue that the nuclear weapons possessed by both sides act as a deterrent; on the other hand, others point to the possibility of using tactical nuclear weapons with “limited” destructive power. An evident aspect remains the perception of nuclear power shaped by the geopolitical situation, which has been debated since the beginning of the Ukrainian conflict (Green, 2022).

Studies on the nuclear debate that involve social media platforms (Kim and Kim 2014; Tsubokura et al. 2018) refer to the Japanese debate after the Fukushima-Daiichi incident and rely on methodologies based on text analysis. Generally speaking, Twitter has been the subject of extensive research on how social media have shaped society, politics, and public opinion (Jungherr, 2016). Importantly, even if studies have demonstrated that users are not representative of the general population (McGregor, 2019) and are highly influenced by political players (Weeks et al. 2017), academics and journalists rely on Twitter as a source of public opinion, because it has recently been shown that the ideas and issues represented on Twitter still provide insight into what everyday citizens think (van Klingeren et al. 2021). By focusing on high-level attributes that transcend specific technologies when examining social media uses and outcomes, insights from studies of one platform can be understood in the context of other media that share the same affordances (Ellison and Boyd, 2013). Aware of this, we apply similar approaches to different platforms, using Twitter, Facebook, and Instagram as starting points to explore cross-platform dynamics. We aim to identify narratives and users that convey information by connecting different social media platforms through bridges. A bridge is a concept from transmedia theory that we adopt to describe the narrative connection between social media platforms. Every platform today can be used together with other communication channels and social media platforms (Boczkowski et al. 2018), and users inside this ecosystem can select the preferred tool to meet their communicative goals, creating a customized social media ecology (Bayer et al. 2020). We try to deal with the fluidity of social media platforms, which continuously evolve, adapt, and mimic other platforms’ affordances, making it hard to define a status quo.

Our goal in this research is to focus on the spread of disinformation between platforms during the Italian electoral campaign, with particular attention to nuclear-related themes. For simplicity, we use disinformation as an umbrella term for any information one could encounter online that could lead to a factually incorrect view of the world (Wu et al. 2019). The term includes conspiracy theories, fake news, unverified information, rumors, and hyper-partisan news. The questions we aim to answer are the following:

RQ1) Considering the nuclear debate on social media during the Italian electoral campaign, what kind of content has been conveyed by bridges, and for what purpose?

RQ2) Considering debates such as nuclear energy and the atomic conflict, what has been the role of bridges in disinformation dissemination dynamics within the social media ecosystem?

We found primary evidence that, on mainstream platforms, political actors and activists discuss nuclear energy for civilian use, while the debate about the contemporary geopolitical situation is fueled by everyday users who, in the name of citizen journalism, bring attention to hyper-partisan news channels, fake news, and damaging disinformation. A challenging aspect is represented by bridges to “non-mainstream” platforms with different affordances and dynamics, which should be carefully considered to assess whether and how they can isolate users’ views at the expense of serendipity. Information disseminated through bridges can be described as affective, defective, and infective. Affective means that actors support their arguments with emotional and/or fear-mongering content. Defective means that the information shared on communication channels is partisan, unverified, or false, potentially biasing the audience. Infective means that the content conveyed has the potential to go viral. All of these adjectives should prompt us to rethink the informational well-being of today’s society.

Methodology

We consider a model of social media discussion called the Trench Warfare Dynamic (Karlsen et al. 2017). In this model, user interaction depends on each social media platform’s characteristics, affordances, and elements: liking content, reacting, sharing, commenting, and linking. All these options represent different “weapons” that users can employ in the discussion. Nowadays, the variegated social media ecosystem, with many different platforms (social networking sites, video-sharing platforms, messaging apps, etc.) and their different affordances, is a crucial aspect of online dynamics. Each platform today can be used with other communication channels and social media platforms (Boczkowski et al. 2018). Users within this ecosystem can select the preferred tool to meet their communication goals, creating a personalized social media ecology (Bayer et al. 2020).

Hyperlinking is a feature present in all social networking platforms, where a common dynamic is the sharing of links or references that lead to content on another social media network. The use of this resource is an intentional act (Ryfe et al. 2016) that creates networks (Fu and Shumate, 2017) and an interconnected media environment (Singer and Franklin, 2019) that influences how users find, consume, and share news (Martin and Dwyer, 2019). The literature on cross-platform analysis has focused on standard hyperlinks in relation to news consumption (Park and Kaye, 2020; Del Vicario et al. 2017) or has analyzed cross-platform dynamics among social media sites from the perspective of the user account (Amara et al. 2023; Gruzd et al. 2018). Through their role in information dissemination and their centrality in the digital environment, hyperlinks can help make sense of social phenomena in their digital dimension (Hsu and Park, 2011). Our goal is to contribute to the study of hyperlinks as objects of multiplatform dynamics by applying a transmedia perspective. Defining hyperlinks as Bridges helps add nuance regarding the narrative construct that draws people to secondary platforms. Bridging is a concept from transmedia theory, where the term was first used by Gary Hayes (2006) to describe the movement of audiences across different media platforms. We use it to describe the connection between social media platforms, considering both the digital object and the narrative built around the object that drives people to other platforms.

Our research relies on Digital Methods (Rogers, 2019) to collect and process data from our main platforms (Twitter, Facebook, and Instagram). Digital methods are research strategies that analyze natively digital data from web platforms, repurposing the information to represent collective phenomena, social changes, and cultural expressions. The fundamental concept on which digital methods analysis is based is the natively digital object (NDO). NDOs are the objects that shape and characterize social media platforms, such as users, links, and comments; they can also be peculiar to a specific platform: hashtags, retweets, and mentions on Twitter; likes, reactions, and pages on Facebook. These digital objects should not be seen as empty; rather, they conceal a specific meaning. Likes and reactions can be read as sentiment indicators, retweets can be used as a yardstick for evaluating content, hashtags are not mere labels but carry a clear message, and the number of shares of a hyperlink estimates the reputation that certain content enjoys. When used as bridges, hyperlinks represent a connection with a particular goal, allowing users to shift their attention to a different platform. We will select and study bridges that lead users from primary to secondary platforms.

The methodological approach follows an explanatory sequential design (Creswell et al. 2011), in which a quantitative phase is followed by a qualitative phase that further explores something (here, the bridges) discovered during the quantitative analysis. The quantitative part consists of designing a robust query for our research, collecting and systematizing the data, and finally pre-processing the data to identify bridges between platforms. After the bridge identification phase, we move to a fundamental qualitative phase, manually exploring content and moving between platforms to gather new information and enrich the data. The flexibility of a qualitative approach makes it possible to gain insights, which is essential when exploring a highly dynamic environment such as social media. Finally, we visualize our results with dedicated software.

The multiplatform approach and the different possibilities offered by the analyzed social media networks force us to follow slightly different paths in the data collection and analysis phases, which will be explained in detail below.

Data collection

The first step of data collection is to define a query design that considers the topic’s nuances and details, determining the correct keywords and their combination so that we can interrogate the database and collect reliable and consistent data to analyze. In our case, we defined two queries, one referring to nuclear energy and the atomic conflict (Q1) and the second referring to the Italian general elections (Q2). The final query (Qf) is the intersection of the previous ones, which allows us to strictly select the topic “nuclear” as debated in the context of the “Italian elections” (Fig. 1).

Fig. 1: Query design intersection structure.
figure 1

Q1 contains nuclear references, and Q2 contains electoral references and hashtags.
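To make the intersection concrete, the sketch below shows how such a combined query could be expressed in the boolean syntax accepted by Twitter’s search endpoints. The keyword lists are illustrative placeholders only; the exact terms used in the study are not reproduced here.

```python
# Illustrative sketch of the Qf = Q1 AND Q2 query design; keyword lists are
# hypothetical examples, not the study's actual query terms.
nuclear_terms = ["nucleare", '"energia nucleare"', '"guerra atomica"']            # Q1 (placeholder)
election_terms = ['"elezioni 2022"', "#elezioni2022", "#ElezioniPolitiche2022"]   # Q2 (placeholder)

q1 = "(" + " OR ".join(nuclear_terms) + ")"
q2 = "(" + " OR ".join(election_terms) + ")"

# In Twitter's query syntax, juxtaposing two groups means AND, so Qf restricts
# "nuclear" posts to those that also reference the electoral context.
qf = f"{q1} {q2} lang:it"
print(qf)
```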

The second step consists of capturing the data. To collect data from Twitter, we relied upon 4CAT (Peeters and Hagen, 2022) using Twitter’s APIs for Academic access, while to collect data from Facebook and Instagram we used CrowdTangle (CrowdTangle Team, 2022). The first difference is that Twitter allows data collection from every user, whereas Meta only gives access to the public content of verified accounts, public pages, and public groups. This difference strongly influences the size of the datasets.

4CAT allows us to upload external datasets, so we uploaded the Facebook and Instagram datasets onto the platform to start our data pre-processing phase.
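For readers who want to approximate the Twitter collection step outside 4CAT, the following is a minimal sketch of a call to the full-archive search endpoint that Academic Research access exposed at the time of the study. The bearer token is a placeholder, and the query string and date window (the three-month campaign) are illustrative assumptions rather than the study’s exact parameters.

```python
# Minimal sketch of full-archive collection via Twitter's v2 API (Academic access).
# BEARER_TOKEN, the query, and the time window are placeholders/assumptions.
import requests

BEARER_TOKEN = "..."  # placeholder
SEARCH_URL = "https://api.twitter.com/2/tweets/search/all"

params = {
    "query": '(nucleare OR "energia nucleare") ("elezioni 2022" OR #elezioni2022) lang:it',
    "start_time": "2022-06-27T00:00:00Z",   # campaign made official
    "end_time": "2022-09-25T23:59:59Z",     # election day
    "max_results": 500,
    "tweet.fields": "created_at,entities,public_metrics",
}

tweets = []
while True:
    resp = requests.get(
        SEARCH_URL,
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        params=params,
    )
    resp.raise_for_status()
    payload = resp.json()
    tweets.extend(payload.get("data", []))
    next_token = payload.get("meta", {}).get("next_token")
    if not next_token:
        break
    params["next_token"] = next_token  # paginate through the full result set

print(f"Collected {len(tweets)} tweets")
```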

Pre-processing

In this phase, which we call bridge detection, we filtered every dataset (using a 4CAT function) by keeping only the posts containing URLs, which represent connections towards other platforms and/or sites. From 4CAT, we exported the remaining data to a spreadsheet that we searched carefully, applying a second-level filter by manually flagging only the connections towards other social media platforms, according to the categories below (Table 1):

Table 1 Platform classification.

As shown above, we also included landing pages, which can contain bridges directed to many different platforms.
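As an illustration of this step, the sketch below automates a first pass of the URL filter and the platform flagging that we performed manually; the column name and the domain-to-platform mapping are assumptions for the example, not the exact 4CAT export schema or the full classification of Table 1.

```python
# Illustrative bridge-detection sketch; "body" and the domain list are assumptions.
import re
from urllib.parse import urlparse

import pandas as pd

PLATFORM_DOMAINS = {
    "youtube.com": "Youtube", "youtu.be": "Youtube",
    "facebook.com": "Facebook", "instagram.com": "Instagram",
    "twitter.com": "Twitter", "t.me": "Telegram",
    "linkedin.com": "LinkedIn", "rumble.com": "Rumble",
    "bitchute.com": "Bitchute", "sociale.network": "Mastodon",
    "mastodon.uno": "Mastodon", "linktr.ee": "Landing page",
}

URL_RE = re.compile(r"https?://\S+")

def detect_bridges(posts: pd.DataFrame, text_col: str = "body") -> pd.DataFrame:
    """Keep only posts containing URLs and flag those pointing to other social platforms."""
    rows = []
    for _, post in posts.iterrows():
        for url in URL_RE.findall(str(post[text_col])):
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            platform = PLATFORM_DOMAINS.get(domain)
            if platform:  # second-level filter: social platforms and landing pages only
                rows.append({**post.to_dict(), "bridge_url": url, "destination": platform})
    return pd.DataFrame(rows)

# bridges = detect_bridges(pd.read_csv("twitter_export.csv"))  # hypothetical export file
```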

Data enrichment, analysis and visualization

Since the analysis is structured as an explanatory sequential design (Creswell et al. 2011), we moved from the quantitative step (which detected the bridges) to the qualitative step by formatting the spreadsheet to ease the exploration of the content and to label the data with additional information. Qualitative bridge analysis, which consists of empirically exploring content in search of new information to collect, helps us widen the net across different social media platforms, allowing us to describe the characteristics of the information conveyed and to consider the nuances of transmediality that quantitative analysis does not always capture.

The additional pieces of information for the phase of data enrichment are listed in Table 2.

Table 2 Elements for data enrichment manually collected exploring bridges.

The classification of categories, topics, and user positions was an iterative process, driven by newly discovered information and the continuous adaptation of old parameters to new ones. For the user categories, we initially considered those already included in the Facebook data exported by CrowdTangle, adding and modeling new categories based on what emerged during the analysis. We finally defined politic (both individuals and party pages), news (individuals and channels from mainstream information), alternative news (non-mainstream channels), activist (people proselytizing on the subject with no political role or involvement in mainstream information), generic (individuals debating the issue with mixed perspectives), comedian (satirical content), and music (music channels). Concerning topics and positions, a differentiation between the nuclear energy and the atomic conflict narratives is necessary. Because nuclear energy is described as a polarizing topic in the literature, we labeled it dichotomously with Pro-nuke and No-nuke positions; in our analysis, we indeed found substantially polarized content and clearly defined positions. Regarding the atomic conflict narratives, we started from dichotomous positions but quickly found nuances that led to distinctions between Anti-Nato, Pro-Russia, and Anti-Usa, each with distinctive peculiarities: the first describes a generic distance from NATO politics, the second also expresses an endorsement of Russia’s actions, while the third expresses resentment towards US politics. The Topic field also includes other topics that represent the nuances of the positions in the debate under analysis. Through this step of classifying and labeling content, we can hierarchically organize categories, topics, and user positions (Fig. 2) and qualitatively analyze content in a methodical manner.

Fig. 2: Circle packing diagram.
figure 2

It displays the values of User Category, Topic, and Position in a hierarchical structure using nested circle areas. Used to guide the content classification phase.
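The hierarchy in Fig. 2 can be summarized as a simple labeling schema. The sketch below is only a minimal formalization of the value sets described above; the field names are ours, and the validation logic is an assumption for illustration.

```python
# Minimal sketch of the enrichment labels; field names and validation are illustrative.
from dataclasses import dataclass

USER_CATEGORIES = {"politic", "news", "alternative news", "activist",
                   "generic", "comedian", "music"}

POSITIONS_BY_TOPIC = {
    "nuclear energy": {"Pro-nuke", "No-nuke"},
    "atomic conflict": {"Anti-Nato", "Pro-Russia", "Anti-Usa"},
}

@dataclass
class EnrichedBridge:
    user_category: str
    topic: str
    position: str

    def __post_init__(self):
        # guard against labels outside the agreed classification scheme
        assert self.user_category in USER_CATEGORIES, self.user_category
        allowed = POSITIONS_BY_TOPIC.get(self.topic)
        assert allowed is None or self.position in allowed, self.position

# Example label, as it might appear in the enriched spreadsheet:
example = EnrichedBridge(user_category="activist", topic="nuclear energy", position="No-nuke")
```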

We can now decide, for example, to focus on alternative news channels that are against NATO or to compare the positions of politicians in favor of or against nuclear energy, checking whether the content shared can be categorized in some way as disinformation. Disinformation is an umbrella term which refers to:

  • Unverified information or rumors: information circulated before being verified, which ultimately proves to be false or inaccurate;

  • Propaganda: information that may be factually correct but is packaged in such a way as to denigrate opposing viewpoints;

  • Hyper-partisan content: news that aims to portray a political party or its members in a positive light while mocking or blaming political opponents;

  • Fake news: stories that are deliberately false, crafted to raise revenue or feed propaganda by going viral;

  • Conspiracy theories: attempts to explain the causes of social and political events with claims of secret plots by two or more powerful actors.

This categorization, taken from the literature (Wu et al. 2019; Born and Edgington, 2017; Sanovich et al., 2018; Douglas et al., 2019), helped us recognize and qualitatively describe the content, highlighting the different nuances of disinformation and how the various actors at play reconciled them in cross-platform dynamics through the narrative constructs of bridges between platforms.

The classification phase allowed us to explore the content in an orderly manner, “crossing bridges” first-hand from the main platforms towards various secondary platforms and reading, visualizing, and classifying a wide sphere of content. This exploratory approach is meant to improve the understanding of the nuanced aspects of the topic by visiting the social environments where internet users’ different points of view are located.

To visualize the data, we imported the labeled spreadsheets into Cortext (Breucker et al. 2016), creating a data corpus and using the Network Mapping feature to map the users/channels of the primary platform together with the users/channels of the secondary platforms. We exported a graph file with a .gexf extension that we processed with Gephi (Bastian et al. 2009), creating the representation visible in the Results section.
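A rough equivalent of this mapping-and-export step can be sketched with networkx, assuming hypothetical column names for the enriched spreadsheet; the actual corpus was built in Cortext, so this is only an illustration of the underlying bipartite structure.

```python
# Illustrative network-mapping sketch; file and column names are assumptions.
import networkx as nx
import pandas as pd

bridges = pd.read_csv("enriched_bridges.csv")  # hypothetical enriched spreadsheet

G = nx.Graph()
for _, row in bridges.iterrows():
    src = row["source_user"]        # user/channel on the primary platform
    dst = row["destination_user"]   # user/channel on the secondary platform
    G.add_node(src, platform=row["source_platform"])
    G.add_node(dst, platform=row["destination_platform"])
    # each row is one bridge; repeated pairs accumulate as edge weight
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1
    else:
        G.add_edge(src, dst, weight=1)

nx.write_gexf(G, "bridges.gexf")  # open in Gephi for layout, coloring, and export
```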

Results

We analyzed and explored 660 bridges (Fig. 3) from our primary platforms (Twitter, Facebook, and Instagram) oriented to secondary platforms (Youtube, LinkedIn, Mastodon, Telegram, etc.). A point that must be clarified is that a primary platform such as Facebook can become secondary when analyzing bridges detected on Twitter.

Fig. 3: Bridges graph.
figure 3

Realized by the author, it separately represents the bridges identified in the three main platforms. The nodes represent users and/or channels/pages on primary and secondary platforms, while the edges represent the bridges. Node colors distinguish the platforms. On the left are the bridges detected on Twitter, with light-blue nodes connected to the users and channels of the secondary platforms; on the right are the bridges detected on Facebook, with dark-blue nodes connected to the users and channels of the secondary platforms; and in the center are the bridges detected on Instagram, with pink nodes connected to the users and channels of the secondary platforms.

Regarding Twitter bridges, we found that the leading destination platforms are Facebook (56.88%) and Youtube (26.67%), with third place taken by different Mastodon instances called sociale.network (n.d.) and mastodon.uno (n.d.) (8.86%). The remaining bridges connect to mainstream platforms such as LinkedIn and Instagram and to below-the-radar platforms such as Friendica. These bridges address issues regarding nuclear energy in 81% of cases and the atomic conflict in 13.1%, with the remaining 5.9% mixing different topics. On Twitter, we found different kinds of disinformation: conspiracy theories were not as present as we expected, leaving the floor mainly to hyper-partisan content and unverified information in both narratives we investigated, nuclear energy and the atomic conflict.

Bridges from Facebook are directed toward a larger number of platforms, with Youtube taking the largest share (59.78%), followed by Instagram, Telegram, and Twitter, each at around 11.75%. The minor platforms linked from Facebook are the mainstream TikTok and Twitch and below-the-radar platforms such as Mastodon (sociale.network), Rumble, and Sfero. Facebook’s disinformation included more content related to conspiracy theories than Twitter’s, alongside hyper-partisan content and unverified information. The ecosystem of this kind of content was more varied and oriented to different below-the-radar platforms.

Instagram bridge detection was carried out differently from the other platforms because Instagram does not allow direct links in descriptions. Most bridges were identified through the text “link in Bio.” The Bio usually contains hyperlinks or landing pages where different bridges can be found simultaneously, configuring a star-topology network. These bridges were directed to the same user’s accounts on other platforms, pointing to the generic profile rather than to specific content. In this case, we crossed platforms without being exposed to different, more extreme disinformation content, because the pages encountered usually followed the same “editorial line” on all platforms.

Nuclear energy propaganda and hyper-partisan politics

The bridges from Twitter to Facebook mostly lead to politicians strongly polarized on nuclear power, in both directions (Fig. 4). We find pro-nuclear energy profiles (Lega - Salvini Premier and Noi con Salvini) of the League party, which has promised to restart the Italian nuclear program. The League presented the introduction of the European taxonomy as a political victory against opposing parties and used a strongly affective lexicon, comparing environmentalists to the Taliban. The no-Nuke counterpart (5-Star Movement, Alternative, Democratic Party, and Angelo Bonelli) responded forcefully against Matteo Salvini, secretary of the League, accusing pro-Nuke politicians of misleading voters with misinformation about the fourth-generation nuclear power narrative and calling for a public debate on the energy issue. Lined up in both factions, we find activists who have advocated positions for (Umberto Minopoli, n.d.) or against (Andrea Corrino, n.d.) civil nuclear power. Facebook’s hyper-partisan content is usually posted as text with a single image, together with video content (especially by politicians sharing interviews during campaigning). The bridges from Facebook to Twitter account for most of the political content we identified: material shared within public groups with different political orientations. They mainly bridge to content from the pro-Nuke politicians Matteo Salvini, Carlo Calenda, and Nicola Danti (ItaliaViva), with only a few posts referring to the no-Nuke positions of Angelo Bonelli (Europa Verde) and the 5-Star Movement. Other public groups with no-Nuke positions are mainly oriented on Instagram to political profiles such as the 5-Star Movement and Giuseppe Conte.

Fig. 4: Political debate.
figure 4

Examples of posts from politicians and activists discussing nuclear energy for civil use contain bridges characterized by affective and defective narratives.

The political content is polarized, with pro-Nuke parties using different approaches. On the one hand, Azione & ItaliaViva took a pragmatic approach, stressing the importance of the energy mix to achieve the 2050 zero-emissions goal. La Lega, conversely, chose affective language and narratives close to people’s concerns. For example, Matteo Salvini used hyperbole to advocate for nuclear energy, claiming that it would be no problem for him to have a power plant near his home in Milan or that investing in fourth-generation nuclear power is the only way to lower citizens’ bills. The latter statement is factually false, since the technology is not ready and is not a short-term solution. The opposition used the same affective approach, with Giuseppe Conte (5-Star Movement) responding to Matteo Salvini’s hyperbole by organizing a public debate in Milan to express his opposition to “nuclear waste and the contamination of our children’s future.” Other objections to pro-Nuke statements by no-Nuke parties used similar rhetorical fallacies, with Angelo Bonelli (Europa Verde) sharing a map of potential sites for nuclear power plants supposedly chosen by his political opponents. The map was false and inaccurate and was debunked by the well-known fact-checking site PagellaPolitica (n.d.).

In addition to the bridges to mainstream platforms, we also found bridges to below-the-radar platforms such as Mastodon (Fig. 5), which, although not quantitatively relevant, deserve further analysis. Mastodon is open-source software that enables self-hosted social networking services; it has microblogging functions and consists of a large number of independent instances. The links we encountered are directed mainly to a particular instance called sociale.network, created by the user gubi (Carlo Gubitosa, n.d.), a writer and no-Nuke activist, and by the Peacelink association (n.d.).

Fig. 5: Twitter-Mastodon.
figure 5

The image shows the Twitter feed (on the left) and the Mastodon feed on the sociale.network instance (center). The Mastodon post recalls the tweet and shows highly polarized comments (on the right, blurred for privacy).

One strategy observed with these Mastodon-oriented bridges is the cross-posting of content published on Twitter to open a parallel discussion on the Mastodon instance. In the case above, the debate involves Carlogubi (Carlo Gubitosa, writer and no-Nuke activist), who responded to content from L’avvocato Atomico (n.d.), a pro-Nuke activist. Regardless of the debate on the source platform, a highly polarized debate opens up on the bridge-targeted instance to which Twitter users can be directed. The bridge is a rabbit hole to a polarized social space away from open debate. Browsing the feed of this instance, it is clear that all the news shared is against nuclear energy for civilian use, creating a defective environment in what can be described as an echo chamber by design, meaning a space far from serendipity.

Video is the main type of content conveyed by connections to Facebook (Marco Fumagalli m5s) and, of course, Youtube, mostly directed to the channels of no-Nuke politicians (Giuseppe Conte of the 5-Star Movement) and pro-Nuke activists (Parresia, n.d.). Along with the political content, we detected other affective approaches, such as the satirical content of the user Grilletto Facile (n.d.), who jokes about pro-Nuke positions, and the ironic viral content of CartoniMorti (n.d.) on Youtube, shared in support of pro-Nuke positions.

We found a narrative that mixes the elections and Masonic conspiracy theories among the Youtube-oriented content. A post on the G**** Facebook page urges users to stop giving electoral consent to “political freemasonry,” suggesting two alternatives: either avoid the ballot box, or go to the polls and write “G****” on the ballot. The motive for these actions is justified by “becoming an active participant in the process of opposing the plan of global destruction, with nuclear weapons, starting in 2024, as explained through indisputable legal acts in this video.” The video links to a 2-hour 40-minute monologue by Carlo Palermo, a lawyer, former magistrate, and politician, who expounds several points from his book “The Beast. From Italian Mysteries to the Masonic Powers Directing the New World Order,” published in 2018. The video (Salvatore, 2022, Aug. 20) reinterprets historical episodes by identifying the action of freemasonry behind the scenes of several events, such as the assassination attempt on Karol Wojtyla in 1981, the Yalta conference, the invasion of Iraq in 1991 described as the beginning of the NWO project, and the attack on the Twin Towers on 11 September 2001. These contents are presented as a reason to join the G****, which is “[…] an Extraterritorial Monetary Organization with philanthropic aims for the promotion of Human Rights and Fundamental Freedoms […] having become aware of the fact that many illnesses from which a large part of humanity suffers are generated by stressors related to the constant economic difficulties resulting from the mismanagement of world finance wanted by the Integrated Criminal System that governs the world that has created a state of slavery that, at all costs, it wants to maintain in order to force all human beings on this planet to submit to it perpetually.” The construction of the conspiracy seems structured to lure people into a monetary scam, and the bridge to Youtube is part of this mechanism to tease people into believing the alternative narrative.

Atomic conflict rumors and conspiracy theories

Most of the content passing through Youtube is related to news and information channels that focus mainly on the atomic conflict narrative, with several channels sharing geopolitical news/rumors and commentary to express their positions (Stefano Cetica, n.d.). Among Youtube content, it was possible to identify conspiracy and hyper-partisan channels (ByobluFootnote 3 Play, Pangea, n.d., Djuki San, n.d., and Aljaliya 24, n.d.) that advocate pro-Russia and anti-NATO positions (Fig. 6).

Fig. 6: Youtube-oriented videos.
figure 6

Content is highly polarized on pro-Russia and anti-NATO positions and disseminated using defective and affective narratives.

Analyzing this content, we encountered different narratives and rhetorical approaches. Some bridges lead to content opposed to NATO and to “the Ukrainian narrative,” providing a hyper-partisan version of the battle for control of the Zaporizhzhia power plant. Other content seeks to position itself from a more neutral perspective, declaring itself opposed to the war and claiming to want Italy out of the conflict. This “pacifist” narrative is also exploited from an electoral standpoint to politically attack parties that expressed support for Ukraine during the election campaign. In some cases, attempts have been made to convey messages with affective and emotional content, for example by accompanying them with music, as in the case of R.E.M.’s song entitled “Final Straw, No More War.” Finally, there is the more extreme content of full support for the Russian cause, which uses hyper-partisan news and alarmist tones, trumpeting the imminent outbreak of an atomic conflict and suggesting that the public build their own atomic bunkers. Browsing and exploring these highly polarized circles, we found defective positions that support Vladimir Putin’s actions (TV Radio). Among these extreme positions, we can also find conspiracy content on several channels (LeoZagami, n.d.Footnote 4, TelePandemiaChannel, 2022, RedRonnie, VisionTV, and luogocomune2, 2022).

These channels deal with the narrative of the Russian-Ukrainian war but also shift the focus to other topics such as the Covid-19 pandemic (Vision TV, 2022, min 0:30), aliens and religion (RedRonnie, 2022, min 5:45), and Freemasonry and the New World Order (TelePandemiaChannel, 2022, min 21:20). They also convey messages against mainstream media (luogocomune2, 2022, min 1:35) and against national and international political institutions (RadioRadioTV, 2022, min 1:20). What they have in common is the imitation of a journalistic style, with fake news and disinformation masquerading as authoritative content (Fig. 7). In addition, the names of the different channels refer to television (TV and Tele) to equate themselves with traditional media in an attempt to gain the same credibility with the public.

Fig. 7: Youtube-television.
figure 7

Conspiracy content, which highlights the mimicking of mainstream information.

Notably, the alt-news profiles usually had a more complex social media ecosystem, with the example below (Fig. 8) showing eight different social media platforms systematically referenced and linked to each other in what appears to be an attempt to generate infectivity and virality. To attract people to the defective content, platforms such as Bitchute and Rumble are described as free from censorship and used as a backup for video content banned by Youtube. Other conspiracy content was present in other below-the-radar platforms such as Gab (known to have a far-right community), Mewe (whose tagline reads “imagine a place where you trust your social network”), and, finally, messaging apps such as Telegram.

Fig. 8: Platform ecosystem.
figure 8

The graph shows the mix of mainstream and below-the-radar platforms.

Conspiracy content is quickly delivered by Telegram (Nuovagalassia, n.d., WeSovereign, Lantidiplomatico, n.d., LaBestiaNeradPolitica, n.d., In_telegram_veritas, n.d.), Rumble (Pangea, n.d.), and Sfero (Byoblu TV, n.d.). Telegram, characterized by the immediacy of communication, becomes fertile ground for news presented as established fact, and most of its content can be recognized as unverified information and rumors. Bridges on Telegram are rarely directed at a particular post but usually refer to the entire channel. In the image below (Fig. 9), we can appreciate the mood of some channels regarding the Russian-Ukrainian war. The WeSovereign channel uses alarmist tones announcing the approach of World War III. In contrast, the infective Lantidiplomatico channel (with nearly 90,000 subscribers) posts news of a new military aid package for Ukraine from the United States, registering polarized reactions (visible in the post below) that are 93% [vomit] and [anger]. Finally, the In_telegram_veritas channel refers to Zelensky as “the Ukrainian comedian,” accusing him of profiting from the episode of Russian missiles allegedly flying over Romanian (and therefore NATO) territory (Kudrytski & Vilcu, 2023); Romania denied the violation of its airspace, contradicting the Ukrainian claim. The same defective channel then invented a correlation between the June 6 earthquake in Romania and “mafia-like warnings” from NATO allegedly delivered through the HAARP project.

Fig. 9: Telegram content.
figure 9

Channels with infective disinformation and partisan content are able to reach more than 35,000 users in total.

Bridges to below-the-radar platforms have often shifted the focus to channels rather than specific content, as in the case of the Pangea channel on Rumble and the Byoblu channel on Sfero. Both channels are a conglomeration of disinformation, hyper-partisan and fake news, anti-scientific narratives, and conspiracies. Rumble is a video-sharing platform popular among American right-wing and far-right users and has been described as part of the “alt-tech” by several observers (Munn, 2023). Sfero is an Italian social media created in opposition to censorship on mainstream platforms that, with its blog-like structure, allows users to publish content without fact-checking or debunking.

Finally, a peculiar case is the conspiracy-minded Instagram profile Byoblu, which, along with bridges to other social media platforms, also invites the public to follow it on mainstream television by mentioning its TG Byoblu24 newscast on various satellite TV channels.

Discussion

The debate on nuclear power revealed the predominance of political actors particularly active in propaganda about the pros and cons of power plants. In addition to them, activists reinforce politicians’ statements in both directions, with only a few instances of non-partisan approaches, such as Pagella_Politica on Instagram, which debunks content from all political sides, and informational approaches, such as CartoniMorti and Parresia on Youtube.

The narratives of ordinary users described a fragmented situation, especially concerning the narrative of the atomic conflict. On the one hand, people supported the Ukrainian cause, especially regarding humanitarian aid. On the other hand, ordinary people expressed doubts about the role of the United States and NATO in the conflict, showing divergent ideas about the role Italy should maintain in the geopolitical scenario. Moving to the extremes, some users expressed an even more drastic position, supporting Putin’s actions and condemning Western countries.

In response to our RQ1, we can describe the content conveyed by bridges in the electoral debate regarding nuclear narratives with three adjectives: affective, defective, and (potentially) infective.

First, affective means that political actors (politicians, journalists, and political influencers on social media) tend to support their arguments with emotional and/or fear-mongering content. This rhetoric is easily pursued with the atomic conflict narrative, but it also infuses discourses on nuclear energy. This is the case with the debate we saw above between Matteo Salvini and Giuseppe Conte on the Milan nuclear power plant or with the persistent reference to the danger of a third world war by disinformation channels. Another peculiar aspect we found is the presence of messages conveyed with Youtube music content, which seems to be an attempt to exploit the mood and words of music to deliver a message.

Second, by defective, we mean that the information shared on some communication channels is partisan, unverified, or completely false, potentially leading the public to biases that can confuse and shift opinion. This is the case with politician Angelo Bonelli, who shared inaccurate information about potential sites for power plants, or Matteo Salvini, who falsely presented fourth-generation nuclear power as a ready-made technology. More defective content related to the geopolitical scenario can be found on the hyper-partisan and conspiracy channels we identified, such as Sfero’s Byoblu channel, YouTube’s LeoZagami channel, and the public Telegram groups In_telegram_veritas, Wesovereign, and Lantidiplomatico, among others. In the latter case, a distinctive aspect of bridging is that the destination platform is usually presented as a trusted source exposing news hidden from mainstream news outlets.

Finally, infective means that the content conveyed has the potential to go viral due to the multiplatform dynamic and effectiveness of the content.

Speaking of infectivity, in our analysis we found that misleading information is usually channeled through bridges oriented toward under-the-radar social networks (Sfero), video-sharing platforms (Youtube, Rumble), or messaging applications (Telegram), which make up about 50 percent of the target platforms. This could be due to their particular policies: these environments exert less control over content, as opposed to mainstream social media such as Facebook and Twitter, which, especially since the Covid-19 pandemic, have structured stricter content quality controls to hinder fake news (Hartley and Vu, 2020). We found misinformation and unverified content on mainstream video-sharing platforms such as Youtube, noticing that when content was removed due to a policy violation, it was systematically re-uploaded to below-the-radar video-sharing platforms such as Rumble and Bitchute, described as “free from censorship” since they provide a lower level of content control.

Disinformative and propagandistic content can be found in messaging apps such as Telegram, where public channels are the central feature that fosters the development of exclusive, echo chamber-like environments. Public channels grant anonymity to content creators, who can hide by showing only the channel name (Urman and Katz, 2022). This possibility seems to be appreciated by users and has prompted competitors (Whatsapp) to develop similar solutions by implementing public groups. Controversial content, even when not directly related to national elections, spreads misinformation and insidiously promotes the message that governments, institutions, and national media are hiding news from the public. By conveying and promoting a conspiratorial approach to information, this behavior potentially affects society by undermining the credibility of national and international institutions, driving people away from a balanced political life, and pushing them toward extremism (Sutton and Douglas, 2020). This is particularly relevant in a country like Italy, which has a growing trend of abstentionism in national elections and recorded the highest rate during the last elections (Damilano, 2022).

Among the platforms we discovered in the social media ecosystem of the nuclear debate during the Italian elections, some are relatively recent and were initially created to counter the spread of disinformation and highly polarized content. This is the case of Mastodon, a platform based on the open and decentralized social networking protocol ActivityPub. A peculiarity of Mastodon is that, to create a profile, users must join a particular instance (which represents the actual microblogging communication space). This feature implies that users necessarily enter the infrastructure through what may be a rabbit hole leading into potentially polarized environments, to the apparent detriment of serendipity. On Mastodon, each instance has its own code of conduct, terms of service, privacy policy, privacy options, and moderation policies, a feature driven by the idea that small communities can moderate their content better than large companies. Social media evolve with society, and their features adapt to the needs of the people who use them or are shaped by their use. In light of this, its original vocation did not prevent Mastodon from being forked in 2019 to serve as infrastructure for Gab (a platform with a known far-right community) and TruthSocial, a Republican-oriented platform founded by Donald Trump after his suspension from Twitter (Kan, 2021). This aspect suggests that platforms, with their affordances and policies, are not the only actors shaping social media environments.

The presence of different platforms in the social media ecosystem alters the dynamics of information dissemination. In response to our RQ2, the role of bridges between platforms can be seen as a communicative “weapon” with multiple uses. On the one hand, bridges are used to disseminate information in order to achieve a specific communicative purpose with different strategies:

  • presenting what the author considers a better or new perspective on a specific topic (with the possibility of running into propaganda news, hyper-partisan content, and conspiracies);

  • conveying the message with auxiliary content that prompts the mood (e.g., with music);

  • imitating the characteristics of other media (alternative news channels mirroring TV news), realizing the remediation of traditional media conceptualized by Bolter and Grusin (2000).

On the other hand, they pursue a secondary goal related to media ecology, seeking to bring audiences into environments favorable to the speakers. A favorable environment is an online space in which the affordances of the platform (likes, reactions, comments) return positive feedback on the content being conveyed, as, for example, Mastodon’s feed on the sociale.network instance, in which all the voices and comments were polarized on the no-Nuke positions.

Conclusions

In this multiplatform research, we explored different social media environments and their cross-platform dynamics to navigate the Italian electoral debate, focusing on disinformative content related to the nuclear narrative. Crossing the bridges between platforms revealed an ecosystem made up of different actors and narratives bound together.

The main limitation of this study is that Facebook and Instagram content comes only from public pages, public groups, and verified profiles, due to Meta’s privacy policies. However, it should be emphasized that content shared in Facebook public groups and pages usually comes from profiles belonging to everyday users, allowing us to have a representation of this type of user in our datasets. A second limitation relates to the textual analysis of Instagram content, which is usually more audiovisual-centered. In addition, due to a change in Instagram policies, since the 31st of July 2022 all videos under 15 min are classified as reels, which are not supported by the CrowdTangle APIs and were therefore not available in our research results.

This paper aims to highlight the complexity of today’s social media ecosystem in the dynamics of disinformation dissemination, showing the interlacing of mainstream and below-the-radar platforms, with important roles played by video-sharing platforms and messaging apps. Following the suggestions of Bruns, Harrington, and Hurcombe (2021), the elastic research design is oriented to investigate “the key inflection points where the spread of such disinformation can be reduced” (Bruns et al. 2021). The research provides an original hybrid technique for cross-platform analysis that brings together the robust approach of thematic research with approaches that rely on tracing one or more digital objects shared across platforms, such as hyperlinks (Rogers, 2023). The step that goes beyond the existing body of knowledge is the application of a transmedia perspective through the definition of the concept of Bridge. The aim is to add a further key to cross-platform analysis by emphasizing the narrative device that drives audiences to secondary platforms.

The paper contributes to the analysis of the political debate on nuclear-related themes, which are not frequently investigated in the literature on Italian electoral campaigns, and responds to the call to broaden the analysis of political communication in the field of multimodality. The article complements discussions on the affective nature of communication in participatory social media ecosystems (KhosraviNik, 2018; Sakki and Martikainen, 2021; Laaksonen et al. 2022), suggesting that while the particular discourse studied is not generalizable, many of the discursive strategies observed are more general in nature and have been found in different contexts. Furthermore, the focus on defective content underlined dynamics peculiar to the broader social media ecosystem, involving below-the-radar or fringe platforms (Rogers, 2021). The content analysis shows environments with a clear overexposure to disinformation content: while Telegram channels facilitate and speed up the transmission of unverified rumors and information, the observation of particular environments (Rumble, Bitchute, and Sfero) reveals the presence and circulation of countless alternative information channels.

Finally, I would like to emphasize the importance of shedding light on ‘below the radar’ platforms when adopting policies to counter disinformation. The recent European Digital Services Act (Leiser, 2023) is an important step that does not, however, take into account ‘below the radar’ environments (even Telegram is excluded from the list of Very Large Online Platforms). The risk, in this way, is that of chasing technology, trying to remedy already corrupted environments. A suggestion is that it might be more productive to investigate and regulate these fringe communication spaces from their genesis, before they become mainstream and it is already too late.