Abstract
Online communities have become a central part of the internet. Understanding what motivates users to join these communities, and how the communities affect their members and others, spans various psychological domains, including organizational psychology, political and social psychology, and clinical and health psychology. We focus on online communities that are exemplary of three domains: work, hate, and addictions. We review the risks that emerge from these online communities but also recognize the opportunities that work and behavioral addiction communities present for groups and individuals. With the continued evolution of online spheres, online communities are likely to play an increasingly significant role in all spheres of life, ranging from the personal to the professional and from the individual to the societal. Psychological research provides critical insights into the formation of online communities and their implications for individuals and society. To counteract risks, such research needs to identify opportunities for prevention and support.
Introduction
Online communities are social networks on the internet that utilize technology for interaction. They began to gain popularity in the 1990s with the development of the internet and information and communications technologies1,2,3. The emergence of online communities was further accelerated by Web 2.0 and social media starting in the mid-2000s4. Social media platforms provide users fast access to likeminded others, and they speed up communication and offer new ways for interaction5,6,7.
The progress of these interactive technologies has been remarkably fast, and they carry both opportunities and risks. The work context is a good example of the complexity of online communities. On one hand, online communication is flexible, fast, and effective regardless of location, and it provides workers with new ways to collaborate and socialize with each other8. On the other hand, online communication bears risks, such as workplace cyberbullying9,10 and misinterpretation of messages and feedback, endangering the mental well-being of employees by inducing technostress, psychological distress, and work exhaustion11,12. We recognize that online communities can be supportive and enhance well-being in many ways, but there are also online communities that carry risks for participants and wider society such as hate and addiction communities.
In this perspective article, we review the characteristics of online communities along with the opportunities and risks they present to their users. We cover the role of online communities in the contexts of 1) work, 2) hate and harassment, and 3) addiction, as three domains of outstanding relevance for society that showcase the multifaceted nature of online communities. These diverse domains provide an excellent starting point for a theoretical overview of online communities. The first topic, work, recognizes that online communication and online communities have a growing importance in today’s work life. The second topic concerns research evidence on online hate communities that are based on harmful ideas and actions against other people; this topic has had massive implications for political and societal discussions since the 2010s. The third topic addresses online communities in the context of addictions. The online dissemination and proliferation of various views and behaviors have further led to people discovering new, potentially harmful activities or becoming excessively engaged in the digital world. This has prompted significant research into online addictions13,14,15.
How and why online communities form
Human beings are inherently social and seek companionship and social engagement whenever possible16. Online communication responds to this social need of belonging. Online communities emerge and thrive in digital spaces, comprised of members who engage in active communication on a shared topic or interest area17,18. Online communities can form in a variety of online contexts, including but not limited to social media platforms, discussion forums, and chatrooms19. Online communities are significant for finding companionship, fostering connection, accessing information, and receiving support20,21. Like any group or community, online communities vary in size, cohesion, network structure, and focus area. The degree of members’ anonymity may also vary considerably4,22. A distinct feature of the global online sphere is that no matter how unusual or rare one’s interests are, one is likely to find others with similar interests23. Members of online communities are often heterogeneous in their social characteristics, including socioeconomic status, life stage, ethnicity, and gender, but tend to be likeminded and homogeneous in terms of their shared interests and attitudes24,25.
The online sphere provides an ideal environment for the building of networks that hold significance for the social and personal identity of their participants5. Social identity theory (SIT), as initially proposed by Tajfel and Turner26, describes a process in which an individual’s identity is partially shaped by their sense of belonging to preferred social groups. This concept is often measured by evaluating an individual’s subjective feeling of being a part of the desired groups27. Theories and models that use the social identity approach28,29 are very relevant for understanding online communities and online group behavior.
One of the most important models proposed over the years has been the social identity model of deindividuation effects (SIDE)30,31. The model was originally motivated by the topic of online communication’s anonymity, an issue that had already drawn the attention of social psychologists in the early 1980s32. According to the SIDE model, the deindividuation effect of social identification is especially prevalent in online interactions that are characterized by at least a certain level of anonymity, as it promotes a shift from the individual to the group self and therefore facilitates behaviors benefiting the group, as well as stereotyping outgroup members and viewing them as representatives of their group rather than as individuals33,34,35,36,37. Relatedly, the lack of social cues, such as eye contact, in online interactions has been found to lead to behavioral disinhibition through a so-called online sense of unidentifiability38.
Context collapse occurs very commonly in social media. This means that boundaries between different social spheres blur together (e.g., interacting on the same platform with people from different life spheres such as work, family, and friends). This can influence and challenge users’ self-presentation and their navigation within different online discussions and audiences39. In these diverse social contexts, which can vary greatly in terms of values and norms, users may need to balance their personal authenticity against their audience’s expectations. Additionally, they can face situations that compromise their privacy or necessitate self-censorship40,41. However, social media simultaneously provides multiple features or opportunities (i.e., affordances) for users to control and maintain their public identities and social networks. In other words, users can manage how they present themselves to the public online and how they interact with their social networks through the tools and functions provided by social media platforms. Typically, these tools include customizing one’s profile, maintaining visibility, allowing access, editing content, and providing links and connections to other platforms39,42,43,44. Affordances essentially link to the question of how users maintain and create smaller groups or wider communities within certain platforms. This depends on the features and design of each social media platform or internet site.
Characteristics of both internet platforms and applications strongly influence online human behavior. In particular, commercial platforms are designed to attract people’s interest in content. Moral and emotional content spreads faster on social media, as it captures users’ attention more effectively than neutral content45. The process is also facilitated by the algorithms of social media platforms5,46 and by social influence, which leads people to engage with content that is already popular47. This has a profound impact on online communities and the way they communicate, especially as these effects have been found to be much stronger within networks comprised of individuals who share similar views than between such networks48. Thus, likeminded individuals who support certain opinions and behaviors can be drawn together, strengthening their existing attitudes and providing stronger identification with the ingroup49,50. In this context, even exposure to differing opinions may primarily serve as a tool to further distinguish between “us” and “them” and hinder productive dialogue4,51.
These mechanisms are further captured by the Identity Bubble Reinforcement Model (IBRM), which focuses on explaining how characteristics of online communication facilitate the formation of tight-knit networks, namely social media identity bubbles4,5. Such bubbles or echo chambers are characterized by three mutually reinforcing features: high identification with the other members (ingroup), homophily or the strong tendency to interact with likeminded others, and information bias, namely heavy reliance on information obtained from the community4,5. Involvement in such communities has been found to be associated with compulsive internet use5, cyberaggression52 and problem gambling53. At the same time, involvement in online identity bubbles facilitates social support and may buffer mental well-being in some situations54.
In summary, there are three core aspects to consider in online communities. First, technological design is a critical component that shapes what people can do within online communities. The phenomenon of online communities has existed as long as the internet2, but current social media platforms use different interfaces and AI algorithms than their earlier versions, being essentially engineered to provide content for users. This technological side significantly impacts how people behave and react online. Second, contextual issues are highly important in online communities. Generally, contexts in the online sphere may collapse easily. At the same time, the internet and social media platforms facilitate the development of very closed online communities that are based on shared interests. These interests may sometimes be very specific. Third, core social psychological group theories and their updates provide good tools for understanding evolving online group behavior. SIDE and IBRM are examples of theories that have proven very useful in empirical research.
Social media communities at work
The accelerated development of information technology in recent decades has significantly reshaped the workplace. The foundation for this technological advancement was laid in the 1960s with the development of ARPANET, a forerunner to the internet4. The same era also witnessed the birth of the Open Diary – an internet-based diary community allowing user participation through messages, thereby serving as a precursor to social media55. During the emergence of the internet in the 1990s, known now as the Web 1.0 era, personal web pages, content creation, and numerous work communication tools, such as online telecommunications and email, became prevalent55,56. The term Web 2.0 was first coined in 2004, concurrently when Facebook became popular, symbolizing the internet’s transition to a more socially diverse and interactive era55. The change in user behavior from passive web content consumers to active, bidirectional information creators and editors was an evident part of the transition from Web 1.0 to Web 2.04.
In recent years, there has been a substantial increase in the use of digital communication technologies in the workplace, primarily driven by advancements of the internet and social media services12,57. Numerous expert organizations are now leveraging corporate social media platforms, such as Microsoft Teams and Workplace from Meta, for their communication needs8,58. Networking in enterprise social media platforms facilitates real-time messaging, task organisation, and formal and informal team collaboration, synchronously and asynchronously across organisational groups and in different geographic locations8,59,60. Work communication also unfolds through instant-messaging applications, such as WhatsApp, and general social media platforms, such as Facebook, X (formerly Twitter), and LinkedIn, which are being utilised for professional purposes. These social platforms have the potential to encourage professionals to engage in collaboration, share information and ideas, and expand their expertise on a global scale, extending beyond their specific job responsibilities and organisational boundaries8,61. Notably, social communication tools have expanded to encompass traditional white-collar environments and now provide value for blue-collar workers as well, for example, as a medium for communication and task organisation11.
Social media messaging and networking for professional purposes not only enhance knowledge transfer and flow but also nurture the human need for social belonging62,63. Given the growing prevalence of remote and hybrid forms of work, social media has the potential to maintain and foster social interactions regardless of location57,64. Remote and hybrid work arrangements can, however, reduce the chances of establishing and nurturing high-quality work relationships65,66. Recent studies have also indicated a link between resistance to remote work and having quality workplace relationships67. Indeed, working far from the physical work community can exacerbate the growing phenomenon of loneliness at work65,68,69. At the same time, in some circumstances, online networking among colleagues nurtures social connections and alleviates feelings of loneliness54. Social connections and feelings of belongingness in the work community and one’s professional circles are vital to support employees’ mental well-being and combat loneliness at work54,70. Feelings of loneliness at work can, for example, lower professionals’ work engagement, increase their dissatisfaction at work71, and contribute to burnout72.
The use of social media platforms for professional objectives can enrich communication and foster meaningful connections8,73. Professional online relationships can be formed and maintained individually, person to person, or as part of bigger professional online communities. In the professional sphere, online communities are commonly referred to as communities of practice due to their origins within the cultural framework of either virtual or traditional organisations74. Communication visibility in these online communities of practice can foster knowledge sharing and social learning, trust, and innovation75,76,77. The sense of belonging and togetherness with colleagues can also be enhanced in these online communities8,78. Online communities of practice can be a source of affective social support that promotes group identification and meaningfulness, which in turn can foster employees’ engagement in their work79. Employees’ social media collaboration is also associated with increased team and employee performance78 and with employees’ perceived social media–enabled productivity80. Both formal and informal online communities are known to accelerate professional development8,81,82.
However, online communities at work can have downsides. These include tensions within the organisation arising from employees sharing nonwork-related information, which can tighten bonds and build trust but, interestingly, can also hinder work-related information sharing76. Furthermore, stress arising from technology use (i.e., technostress), psychological distress, and burnout are pervasive challenges of professional online collaboration in technologised work environments11,12. Concentration problems can emerge, and the boundaries between work and private life can be blurred, stimulating conflicts83,84. In addition, social relationships at work can be challenged85 by discrimination, ostracism, and face-to-face bullying. These issues are also present in online communication, where they take on new forms and meanings. Work-related cyberbullying9,10,86 and hate and harassment, which may also come from fellow work community members, can be detrimental to the targets and lead to lowered well-being87.
Hate communities
The ease of online communication facilitates the dissemination and proliferation of negative and dangerous views and behaviors. Subsequently, online hate (i.e., cyberhate) and online hate crime have emerged as a prominent area of research in the context of online communication, with the same ease of access contributing to their prevalence88,89. Online hate covers a wide range of intensive and hostile actions that target individuals or groups based on their beliefs and demographic factors, such as ideology, sexual orientation, ethnic background, or appearance90. The rise of hostile online communication has been considered a growing societal concern over the past decade4,87,91,92,93.
The history of hate in online communication goes back to the first internet networks. Organised hate groups have always been interested in the latest technologies to recruit new members and disseminate information. For instance, White supremacists in the US were pioneers in adopting electronic communication networks during the 1980s. Notably, in 1983, neo-Nazi publisher George P. Dietz established the first dial-up bulletin board system (BBS), marking an early utilisation of online communication methods94. Shortly after the inception of the World Wide Web, hate groups marked their online presence. Stormfront.org, launched in 1995, was one of the first and most important hate sites during the Web 1.0. era95. Since then, over the past 30 years, continuous technological advancements have significantly enhanced their communication capabilities4.
The rise of social media since the mid-2000s was a particularly important game changer in the dissemination and development of online hate. Foxman and Wolf96(p. 11) summarized this change concerning the Web 2.0 era of social media: “In the interactive community environment of Web 2.0, social networking connects hundreds of millions of people around the globe; it takes just one ‘friend of a friend’ to infect a circle of hundreds or thousands of individuals with weird, hateful lies that may go unchallenged, twisting mind in unpredictable ways.” The last 10 years of the internet have, however, been striking, as online hate has emerged from the margins and become a tool of political populists in the Western world1,97. The uncertainty of the times, with various crises related to terrorism, the economy, and the global COVID-19 pandemic, has also accelerated the phenomenon.
Research on online hate associated with the COVID-19 pandemic has suggested that, in crisis situations, hate communities can organise quickly and rapidly develop new narratives98,99, reactively focusing on recent and highly debated issues100. Such hateful messages spread most effectively in smaller, hierarchical, and isolated online communities99, highlighting the dangers of online echo chambers or identity bubbles4,5,101. Even if hateful narratives are not endorsed by most users on the platform, the flow of such information tends to be sustained over time, as members of echo chambers encourage each other and amplify their shared worldview102. This is often done by referring to and contesting opposing views in a marginalising and undermining way, making counter-messaging ineffective or even counter-effective103. Various options for demonstrating (dis)agreement and promoting content on social media are used for creating echo chambers and disseminating hateful content104. However, it is worth noting that even on social media sites devoid of the content-promoting algorithms and vanity metrics present on many of the major platforms, users can quickly learn to recognise and promote extremist content as important and worthy of attention105.
The example of COVID-19-related hateful activity showed how hate communities effectively spread malicious content across various social media sites, overwhelming the moderation attempts of any single platform98. Gaming sites are another type of environment where hate and extremist communities organise, recruit, and communicate. It has been argued that the development and characteristics of the gaming industry and the games themselves make online gaming platforms a suitable place for spreading hateful ideologies106. Hate communities also commonly use less moderated online spaces as an alternative to mainstream social media platforms, moving toward the creation of parallel ecosystems107,108,109. The need to leave mainstream spaces due to the risk of moderation and censorship is often used for community building by leveraging a sense of online persecution and victimisation109,110.
Hate communities, especially their influential members, use various other techniques and activities for community building. These activities include, for example, the development and promotion of jargon and coded language that underline the “us vs. them” dichotomy, often using derogatory and offensive phrasing100,104,105,106,110,111,112,113, as well as the use of various audiovisual and interactive materials to capture the recipients’ attention106,107,109. These strategies can be used differently in different contexts and adapted to groups’ needs114. The incel (i.e., “involuntarily celibate males”) online community is an interesting example of how these strategies are used in practice. According to research, active participants in online incel discussions commonly use derogatory terms to refer to women115, and they create powerful dichotomies between themselves and outgroups, both women and society at large, using memes, reels, and other forms of online content to carry their message115,116.
Another commonly utilised method is ironic and humorous messaging in the form of memes and jokes that further allows for the spread of radical ideologies using seemingly unserious content104,106,117,118,119. Such jokes and memes are often part of conspiracy talk, which is a type of everyday discourse common among hate communities, referring to conspiracy theories through implicit references and anecdotal evidence from community members’ own experiences, often in reaction to news coverage from mainstream sources119,120. Research has suggested a strong community-building potential of this type of online discourse, as it allows users to share their concerns and worries and make sense of their experiences119. These uncertainties are used by extremist groups to create new anxieties and introduce new problems, as well as to strengthen the community, as evoking feelings of threat can boost the sense of belonging and reinforce the ingroup’s worldview100. This is especially concerning considering evidence on the associations of supporting far-right ideologies with distrust toward traditional media outlets121. Individuals distrustful toward established broadcasters may be motivated to search for alternative sources of information and, as a result, get involved in online hate communities, where they may become further radicalised through community-building practices such as those described above107.
Research has suggested that, over time, as online communities develop, both positive and negative sentiments in their content increase, and this effect may be stronger in hate communities than in comparable non-hateful groups111. This is attributed to the group-formation processes as shared outgroups are established, leading to more negative emotions being expressed. Simultaneously, involvement in a likeminded community results in more positive affect111. Interestingly, influential users in online hate communities commonly use seemingly neutral and value-free language, often referring to news from mainstream sources. This is, however, done in a way that is meant to evoke emotion and provoke hateful discussion122. This helps to avoid content deletion or user suspension and may further endanger new users looking for alternative sources of information by exposing them to hateful discussions and possibly fostering their radicalisation and involvement in the community123.
Hateful online content is likely to increase as a result of offline hateful acts124,125 and local socio-political events that are significant to the group and their worldviews. Together, these can have long-term effects on online hate communities, resulting in increased activity and group cohesion126. Although online communities might avoid encouraging offline violence for fear of the discussion being moderated or even completely banned by site administrators104, they nevertheless contribute to the creation of an environment where hate – both online and offline – is seen as more acceptable and justified119,127,128. Indeed, perpetrators of violent extremist acts offline have been previously found to be involved in extremist online communities prior to the act129, and the spread of hateful content in social media has been tied to subsequent offline hate crimes93,130.
Addiction and online communities
There is a complex relationship between addiction and online communities, which can be explained through three core factors. First, fast internet connections and mobile devices have enabled unlimited, easy, and continuous online access. Studies have reported that heavy online use symptoms are comparable to substance-related addiction, including mood modification, withdrawal symptoms, conflict, and relapses15,131. Second, major social media sites use algorithms to attract and engage their users4. Connectedness to others and positive emotions arising from actions and vanity metrics, such as “likes” and supportive comments, reinforce usage and can lead people to become addicted131,132. Third, participation in online communities has addictive power. For instance, Naranjo-Zolotov and colleagues133, who investigated Latin American individuals, found that the sense of a virtual community was the primary factor fueling addiction to social media usage. There is a symbiotic relationship between online communities and technology: technology provides the means for a wide range of activities, and it is those activities, rather than the devices themselves, that users typically become addicted to. These online activities often concern the most recognised behavioral addictions, such as sex, shopping, gaming, or gambling.
When discussing addiction related to online use, it should be acknowledged that, in current terminology, there is a wide variety of terms expressing the excessive use of the internet or social media. For example, compulsive internet use and problematic internet use are commonly used134,135,136. Technological devices and social media sites are designed to be as engaging as possible. Features such as notifications, personalised content, and interactive elements are strategically implemented to capture users’ attention and encourage prolonged usage137. These devices and the features within have greatly transformed social interactions, especially in technologically advanced countries and particularly among younger generations who have grown up with smart technology. Current reviews underline a need to build a more complex understanding of different ways of social media use138. This involves investigating the geographical, sociocultural, and digital environments within which problematic behaviors arise and unfold15.
In this Perspective, our focus is on exploring the role of online communities in reinforcing certain problem behaviors. Our examples come from gambling and digital gaming online communities. Online communities centered around gambling and digital gaming are growing in popularity, drawing users to engage and exchange ideas and experiences with others who share similar interests in these activities21,139. Online gambling communities usually manifest independently from the actual games, often taking the form of discussion forums dedicated to all aspects of gambling. These forums serve as platforms for participants to engage in dialogues, typically including the exchange of tips, strategies, and personal experiences related to gambling139. A review of research on online gambling communities indicates that content on these types of online platforms commonly presents gambling in a predominantly positive light21. This positive portrayal also seems to resonate with individuals who have a preexisting affinity for gambling, drawing them to participate in the communities online. Joining gambling communities online also appears to be a socially transmitted behavior, as existing members frequently invite their friends or online contacts to join these communities, often through social media, where gambling operators also administer and promote communities for their followers21,140. The existence of online communities dedicated to gambling provides a convenient platform for gamblers to express interests and emotions they might otherwise hesitate to share in face-to-face interactions. The risk associated with online communities, like those that unite individuals based on a common interest, goals, and norms, is that they might normalise gambling activities and encourage the development of new gambling habits and behaviors. Notably, research has linked active participation in online gambling communities to an increased risk of problem gambling21,139,141,142.
Online gaming communities are distinct from gambling communities as they inherently exist within the games they are tied to139. Virtual social groups that form within games tend to be persistent, and players utilise them to collaborate with each other and enhance their in-game success143. Within these communities, members freely exchange skills, knowledge, and virtual assets, including in-game currency. Players can take on different roles and responsibilities within gaming communities, including sharing duties and communal resources such as in-game items and money139. Gaming communities can significantly contribute to the construction of gamers’ online identities, which could explain the remarkable success of these communities. This process acts as a validating influence, enabling players to reintegrate themselves through features like avatars and virtual belongings within their communities144. Social engagement with fellow players serves as a primary motivator for gaming and can lead to positive social capital gains145,146, but it can also immerse players in the games, which can lead to excessive time spent on gaming and even to online gaming addiction147,148. Further, some in-game activities, such as forms of microtransactions that bear resemblance to gambling, seem to gain support within the gaming community, posing challenges to prevention149.
Although involvement in various online communities can potentially lead to harmful behaviors and even the initiation or maintenance of addiction, it is crucial to recognise that these communities also serve as a valuable resource for their users. For instance, gamers who harness social bonds within video games often report favorable social outcomes, including support from in-game friends150. Online discussion forums have proven to be a valuable source of support for gamblers, especially those experiencing gambling-related problems or harms21. Engaging in conversations online with peers who share similar experiences provides a natural and easily accessible safe space where they can narrate experiences without the fear of judgment. Participants can openly discuss how behaviors like gambling have impacted their lives and share their current self-perceptions. Members of communities focusing on recovery actively exchange information about available resources and offer insights into how to effectively utilise online forums that aid and encourage the recovery process21,139.
Growing relevance of online communities
Online communities have growing importance in people’s lives today. We are in the middle of remarkable technological change with increasingly ubiquitous computing, which includes major leaps in the development of artificial intelligence technologies and extended realities151,152. In some visions, the metaverse is the future of the internet: its three-dimensional model. The term was hyped during the early 2020s, partly because one of the biggest technology companies, Facebook, renamed itself Meta and envisioned a metaverse-integrated, immersive ecosystem152. Part of the development of the metaverse is tied to specific technologies and gadgets, but it is hardware independent and functions globally, including on the mobile devices we already use in 2024152,153. At this point, it is too early to say how important the metaverse will be in the forthcoming years154, but it is certain that online communities will play a role in any future development of the internet.
Online communities are fundamentally enabled by the human need for social relatedness16,155. Social psychological evidence has shown that group formation takes place easily in any context, including online2,26,156,157. This has been demonstrated with both the social identity model of deindividuation effects (SIDE) and the identity bubble reinforcement model (IBRM)4,37. Characteristics of online communication are tied to its mediated nature, but, with the help of advanced technologies, the line between on- and offline has become increasingly blurred. Today’s research evidence emphasizes the increasing significance of online communities in shaping social connections within both work and everyday life. However, the full extent of this impact is challenging to predict due to the rapid development of internet and social media platforms. Going forward, social psychological theory stands as a cornerstone in understanding the intricate mechanisms of online communities. However, it is crucial to maximise its value by integrating methodologies and findings from other disciplines of psychology.
In this Perspective, we focused on online communities at work, online hate communities, and online communities based on addiction: how they contribute to both the benefits and risks of human interaction, behavior, and well-being, and what implications such communities hold for society at large. In the context of work, online communities can facilitate efficient collaboration, knowledge transfer, and social belonging. However, virtual workplace environments may also lead to exclusion, cyberbullying, psychological distress, and technostress. Online hate communities pose a worrisome phenomenon, spreading extremist ideas, false information, and conspiracy theories. These activities can have real-world consequences, including increased distrust in institutions and offline deviant behavior. Additionally, online communities related to addiction impact users’ time, sleep, relationships, and finances. Despite these challenges, online communities offer potential for intervention and support. Research in this multidisciplinary field is urgently needed and must consider technological, societal, and psychological aspects alike.
References
Krämer, B. Populist online practices: The function of the Internet in right-wing populism. Inf., Commun. Soc. 20, 1293–1309 (2017).
Rheingold, H. The virtual community: Finding connection in a computerized world. Addison-Wesley Longman Publishing Co., Inc (1993).
Turkle, S. Ghosts in the machine. Sciences 35, 36–39 (1995).
Keipi, T., Näsi, M., Oksanen, A. & Räsänen, P. Online hate and harmful content: Cross-national perspectives. Routledge. https://doi.org/10.4324/9781315628370 (2017).
Kaakinen, M., Sirola, A., Savolainen, I. & Oksanen, A. Shared identity and shared information in social media: development and validation of the identity bubble reinforcement scale. Media Psychol. 23, 25–51 (2020).
Leonardi, P. M., Huysman, M. & Steinfield, C. Enterprise Social Media: Definition, History and Prospects for the Study of Social Technologies in Organizations. J. Computer-Mediated Commun. 19, 1–19 (2013).
Raj, M., Fast, N. J. & Fisher, O. Identity and Professional Networking. Personal. Soc. Psychol. Bull. 43, 772–784 (2017).
Oksa, R., Kaakinen, M., Savela, N., Ellonen, N. & Oksanen, A. Social media use in professional organizations: Boosting and draining the workforce. Behav. Inf. Technol. 42, 1740–1757 (2023).
Farley, S. et al. (eds) Concepts, Approaches and Methods. Handbooks of Workplace Bullying, Emotional Abuse and Harassment, vol. 1 (pp. 236–243). Springer.
Oksanen, A., Oksa, R., Savela, N., Kaakinen, M. & Ellonen, N. Cyberbullying at work: Social media identity bubble perspective. Computers in Human Behavior, 109. https://doi.org/10.1016/j.chb.2020.106363 (2020).
Oksa, R., Kaakinen, M., Saari, T., Savela, N. & Oksanen, A. The Motivations for and Well-Being Implications of Social Media Use at Work Among Millennials and Members of Former Generations. Int. J. Environ. Res. Public Health 18, 803 (2021).
Oksanen, A. et al. COVID-19 Crisis and Digital Stressors at Work: A Nationwide Longitudinal Study. Computers Hum. Behav. 122, 106853 (2021).
Andreassen, C. S. Online social network site addiction: A comprehensive review. Curr. Addiction Rep. 2, 175–184 (2015).
Jouhki, H. & Oksanen, A. To get high or to get out? Examining the link between addictive behaviors and escapism. Subst. Use Misuse 57, 202–211 (2022).
Kuss, D. J., Kristensen, A. M. & Lopez-Fernandez, O. Internet addictions outside of Europe: A systematic literature review. Computers Hum. Behav. 115, 106621 (2021).
Baumeister, R. F. & Leary, M. R. The need to belong: desire for interpersonal attachments as a fundamental human motivation. Psychological Bull. 117, 497–529 (1995).
Ridings, C. M., Gefen, D. & Arinze, B. Some antecedents and effects of trust in virtual communities. J. Strategic Inf. Syst. 11, 271–295 (2002).
Ridings, C. M. & Gefen, D. Virtual community attraction: Why people hang out online. J. Computer-Mediated Commun. 10, JCMC10110 (2004).
Pendry, L. F. & Salvatore, J. Individual and social benefits of online discussion forums. Computers Hum. Behav. 50, 211–220 (2015).
Brandtzæg, P. B. Social networking sites: Their users and social implications—A longitudinal study. J. Computer-Mediated Commun. 17, 467–488 (2012).
Savolainen, I., Sirola, A., Vuorinen, I., Mantere, E. & Oksanen, A. Online communities and gambling behaviors—A systematic review. Curr. Addiction Rep. 9, 400–409 (2022).
Keipi, T. & Oksanen, A. Self-exploration, anonymity and risks in the online setting: analysis of narratives by 14–18-year olds. J. Youth Stud. 17, 1097–1113 (2014).
Oksanen, A., Hawdon, J. & Räsänen, P. Glamorizing Rampage Online: School Shooting Fan Communities on YouTube. Technol. Soc. 39, 55–67 (2014b).
Hiltz, S. R. & Wellman, B. Asynchronous learning networks as a virtual classroom. Commun. ACM 40, 44–49 (1997).
Malinen, S. Understanding user participation in online communities: A systematic literature review of empirical studies. Computers Hum. Behav. 46, 228–238 (2015).
Tajfel, H., & Turner, J. C. An integrative theory of intergroup conflict. In W. G. Austin, & S. Worchel (Eds.), The social psychology of intergroup relations (p. 33‒47). Brooks Cole (1979).
Cruwys, T. et al. Social Identity Mapping: A procedure for visual representation and assessment of subjective multiple group memberships. Br. J. Soc. Psychol. 55, 613–642 (2016).
Hornsey, M. J. Social identity theory and self‐categorization theory: A historical review. Soc. Personal. Psychol. compass 2, 204–222 (2008).
Postmes, T., & Branscombe, N. R. (Eds.). Rediscovering social identity. Hove: Psychology (2010).
Reicher, S. D., Spears, R. & Postmes, T. A social identity model of deindividuation phenomena. Eur. Rev. Soc. Psychol. 6, 161–198 (1995).
Postmes, T., Spears, R. & Lea, M. Breaching or building social boundaries? SIDE-effects of computer-mediated communication. Commun. Res. 25, 689–715 (1998).
Kiesler, S., Siegel, J. & McGuire, T. W. Social psychological aspects of computer-mediated communication. Am. psychologist 39, 1123–1134 (1984).
Felmlee, D. H. Interaction in social networks. In J. Delamater (Ed.), Handbook of social psychology. Handbooks of sociology and social research. Springer. https://doi.org/10.1007/0-387-36921-X_16 (2006).
Joinson, A. N., McKenna, K. Y. A., Postmes, T., & Reips, U. D. (Eds.). Oxford handbook of internet psychology. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199561803.001.0001 (2012).
Postmes, T., Spears, R., Sakhel, K. & de Groot, D. Social Influence in Computer-Mediated Communication: The Effects of Anonymity on Group Behavior. Personal. Soc. Psychol. Bull. 27, 1243–1254 (2001).
Spears, R., Lea, M., Corneliussen, R. A., Postmes, T. & Haar, W. T. Computer-mediated communication as a channel for social resistance: The strategic side of SIDE. Small group Res. 33, 555–574 (2002).
Spears, R., & Postmes, T. Group identity, social influence, and collective action online: Extensions and applications of the SIDE model. The handbook of the psychology of communication technology, 23–46. https://doi.org/10.1002/9781118426456.ch2 (2015).
Lapidot-Lefler, N. & Barak, A. Effects of anonymity, invisibility, and lack of eye-contact on toxic online disinhibition. Computers Hum. Behav. 28, 434–443 (2012).
Boyd, D. Taken Out of Context: American Teen Sociality in Networked Publics. Berkeley, CA: University of California (2008).
Marwick, A. E. & Boyd, D. I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience. N. Media Soc. 13, 114–133 (2011).
Marwick, A. E. & Boyd, D. Networked privacy: How teenagers negotiate context in social media. N. Media Soc. 16, 1051–1067 (2014).
Ellison, N. B., Gibbs, J. L. & Weber, M. S. The use of enterprise social network sites for knowledge sharing in distributed organizations: The role of organizational affordances. Am. Behav. Scientist 59, 103–123 (2015).
Ellison, N. B., & Vitak, J. Social network site affordances and their relationship to social capital processes. In S. Shundar (Ed). The handbook of the psychology of communication technology, 203–227. Wiley-Blackwell. (2015).
Leonardi, P. M. & Treem, J. W. Knowledge management technology as a stage for strategic self-presentation: Implications for knowledge sharing in organizations. Inf. Organ. 22, 37–59 (2012).
Brady, W. J., Gantman, A. P. & Van Bavel, J. J. Attentional capture helps explain why moral and emotional content go viral. J. Exp. Psychol.: Gen. 149, 746–756 (2020).
Pariser, E. The filter bubble: What the Internet is hiding from you. Penguin UK. (2011).
Salganik, M. J., Dodds, P. S. & Watts, D. J. Experimental study of inequality and unpredictability in an artificial cultural market. Science 311, 854–856 (2006).
Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A. & Van Bavel, J. J. Emotion shapes the diffusion of moralized content in social networks. Proc. Natl Acad. Sci. 114, 7313–7318 (2017).
Iandoli, L., Primario, S. & Zollo, G. The impact of group polarization on the quality of online debate in social media: A systematic literature review. Technological Forecasting & Social Change, 170, https://doi.org/10.1016/j.techfore.2021.120924 (2021).
Sunstein, C. R. Republic.com 2.0. Princeton University Press (2009).
Bail, C. A. et al. Exposure to opposing views on social media can increase political polarization. Proc. Natl Acad. Sci. 115, 9216–9221 (2018).
Zych, I. et al. The role of impulsivity, social relations online and offline, and compulsive Internet use in cyberaggression: A four-country study. N. Media Soc. 25, 181–198 (2023).
Savolainen, I. et al. Online Relationships and Social Media Interaction in Youth Problem Gambling: A Four-Country Study. Int. J. Environ. Res. Public Health 17, 8133 (2020).
Latikka, R., Koivula, A., Oksa, R., Savela, N. & Oksanen, A. Loneliness and psychological distress before and during the COVID-19 pandemic: Relationships with social media identity bubbles. Soc. Sci. Med. 293, 114674 (2022).
Kaplan, A. M. & Haenlein, M. Users of the world, unite! The challenges and opportunities of social media. Bus. Horiz. 53, 59–68 (2010).
Castells, M. The rise of the network society: Second edition with a new preface. Wiley Blackwell. (2010).
Leonardi, P. M. COVID‐19 and the new technologies of organizing: digital exhaust, digital footprints, and artificial intelligence in the wake of remote work. J. Manag. Stud. 58, 249 (2021).
Oksa, R., Kaakinen, M., Savela, N., Hakanen, J., & Oksanen, A. Professional social usage and work engagement: A four-wave follow-up study of Finnish professionals before and during the COVID-19 outbreak. J. Med. Int. Res. 23. https://doi.org/10.2196/29036 (2021).
Choudhury, P., Foroughi, C. & Larson, B. Work-from-anywhere: The productivity effects of geographic flexibility. Strategic Manag. J. 42, 655–683 (2021).
Leonardi, P. M. & Vaast, E. Social media and their affordances for organizing: A review and agenda for Research. Acad. Manag. Ann. 11, 150–188 (2017).
Davis, J., Wolff, H. G., Forret, M. L. & Sullivan, S. E. Networking via LinkedIn: An examination of usage and career benefits. J. Vocational Behav. 118, 103396 (2020).
Keppler, S. M., & Leonardi, P. M. Building relational confidence in remote and hybrid work arrangements: novel ways to use digital technologies to foster knowledge sharing. J. Comput-Med. Commun. 28, https://doi.org/10.1093/jcmc/zmad020 (2023).
Randall, P. M., Lartey, F. M. & Tate, T. D. Enterprise social media (ESM) use and employee belongingness in US corporations. J. Hum. Resour. Manag. 8, 115–124 (2020).
Sun, Y., Zhou, X., Jeyaraj, A., Shang, R.-A. & Hu, F. The impact of enterprise social media platforms on knowledge sharing: an affordance lens perspective. J. Enterp. Inf. Manag. 32, 233–250 (2019).
Becker, W. J., Belkin, L. Y., Tuskey, S. E. & Conroy, S. A. Surviving remotely: How job control and loneliness during a forced shift to remote work impacted employee work behaviors and well-being. Hum. Resour. Manag. 61, 449–464 (2022).
Yang, X., Ye, H. J. & Wang, X. Social media use and work efficiency: Insights from the theory of communication visibility. Inf. Manag. 58, 103462 (2021).
Mergener, A. & Trübner, M. Social relations and employees’ rejection of working from home: A social exchange perspective. N. Technol., Work Employ. 37, 469–487 (2022).
Ozcelik, H. & Barsade, S. G. No employee an island: Workplace loneliness and job performance. Acad. Manag. J. 61, 2343–2366 (2018).
Wright, S. & Silard, A. Unravelling the antecedents of loneliness in the workplace. Hum. Relat. 74, 1060–1081 (2021).
Oyanedel, J. C. & Paez, D. Editorial: social belongingness and well-being: international perspectives. Front. Psychol. 12, 735507 (2021).
Basit, A. A. & Nauman, S. How workplace loneliness harms employee well-being: A moderated mediational model. Front. Psychol. 13, 1086346 (2023).
Anand, P. & Mishra, S. K. Linking core self-evaluation and emotional exhaustion with workplace loneliness: does high LMX make the consequence worse? Int. J. Hum. Resource Manag. 32, 2124–2149 (2021).
Donelan, H. Social media for professional development and networking opportunities in academia. J. Furth. High. Educ. 40, 706–729 (2016).
Johnson, C. M. A survey of current research on online communities of practice. Internet High. Educ. 4, 45–60 (2001).
Dong, J. Q. & Wu, W. Business value of social media technologies: Evidence from online user innovation communities. J. Strategic Inf. Syst. 24, 113–127 (2015).
Neeley, T. B. & Leonardi, P. M. Enacting knowledge strategy through social media: Passable trust and the paradox of nonwork interactions. Strategic Manag. J. 39, 922–946 (2018).
Pan, Y. et al. Integrating social networking support for dyadic knowledge exchange: A study in a virtual community of practice. Inf. Manag. 52, 61–70 (2015).
Song, Q., Wang, Y., Chen, Y., Benitez, J. & Hu, J. Impact of the usage of social media in the workplace on team and employee performance. Inf. Manag. 56, 103160 (2019).
Ihl, A., Strunk, K. S. & Fiedler, M. The mediated effects of social support in professional online communities on crowdworker engagement in micro-task crowdworking. Computers Hum. Behav. 113, 106482 (2020).
Oksa, R., Pirkkalainen, H., Salo, M., Savela, N. & Oksanen, A. Professional social media-enabled productivity: a five-wave longitudinal study on the role of professional social media invasion, work engagement and work exhaustion. Inf. Technol. People 35, 349–368 (2022).
Dille, K. B. & Røkenes, F. M. Teachers’ professional development in formal online communities: A scoping review. Teach. Teach. Educ. 105, 103431 (2021).
Macià, M. & García, I. Informal online communities and networks as a source of teacher professional development: A review. Teach. Teach. Educ. 55, 291–307 (2016).
Salo, M., Pirkkalainen, H. & Koskelainen, T. Technostress and social networking services: Explaining users’ concentration, sleep, identity, and social relation problems. Inf. Syst. J. 29, 408–435 (2019).
Sun, Y. et al. Dark side of enterprise social media usage: A literature review from the conflict-based perspective. Int. J. Inf. Manag. 61, 102393 (2021).
Fattori, A. et al. Estimating the impact of workplace bullying: Humanistic and economic burden among workers with chronic medical conditions. BioMed Res. Int. 1–12. https://doi.org/10.1155/2015/708908 (2015).
Celuch M., Oksa, R., Savela, N., & Oksanen, A. Longitudinal effects of cyberbullying at work on well-being and strain: A five-wave survey study. New Media Soc. https://doi.org/10.1177/14614448221100782 (2022).
Oksanen, A., Celuch, M., Latikka, R., Oksa, R. & Savela, N. Hate and harassment in academia: The rising concern of the online environment. High. Educ. 84, 541–567 (2022).
Castaño-Pulgarín, S. A., Suárez-Betancur, N., Vega, L. M. T. & López, H. M. H. Internet, social media and online hate speech. Systematic review. Aggression Violent Behav. 58, 101608 (2021).
Chetty, N. & Alathur, S. Hate speech review in the context of online social networks. Aggression Violent Behav. 40, 108–118 (2018).
Hawdon, J., Oksanen, A. & Räsänen, P. Exposure to online hate in four nations: A Cross-National Consideration. Deviant Behav. 38, 254–266 (2017).
Oksanen, A., Hawdon, J., Holkeri, E., Näsi, M. & Räsänen, P. Exposure to Online Hate Among Young Social Media Users. Sociol. Stud. Child. Youth 18, 253–273 (2014a).
Reichelmann, A. et al. Hate knows no boundaries: Content and exposure to online hate in six nations. Deviant Behav. 42, 1100–1111 (2021).
Williams, M. L., Burnap, P., Javed, A., Liu, H. & Ozalp, S. Hate in the machine: Anti-Black and Anti-Muslim social media posts as predictors of offline racially and religiously aggravated crime. Br. J. Criminol. 60, 93–117 (2020).
Levin, B. Cyberhate: A legal and historical analysis of extremists’ use of computer networks in America. Am. Behav. Scientist 45, 958–986 (2002).
Bowman-Grieve, L. Exploring “Stormfront”: A virtual community of the radical right. Stud. Confl. Terrorism 32, 989–1007 (2009).
Foxman, A., & Wolf, C. Viral hate: Containing its spread on the Internet. Palgrave MacMillan. (2013).
Schaub, M. & Morisi, D. Voter mobilisation in the echo chamber: Broadband internet and the rise of populism in Europe. Eur. J. Political Res. 59, 752–773 (2020).
Velasquez, N. et al. Online hate network spreads malicious COVID-19 content outside the control of individual social media platforms. Sci. Rep. 11, 11549 (2021).
Uyheng, J. & Carley, K. M. Characterizing network dynamics of online hate communities around the COVID-19 pandemic. Appl. Netw. Sci. 6, 1–21 (2021).
Collins, J. Mobilising Extremism in Times of Change: Analysing the UK’s Far-Right Online Content During the Pandemic. Eur. J. Criminal Policy Res. 1–23. https://doi.org/10.1007/s10610-023-09547-9 (2023).
Törnberg, P., & Törnberg, A. Inside a White Power echo chamber: Why fringe digital spaces are polarizing politics. New Media Soc. 14614448221122915. https://doi.org/10.1177/14614448221122915 (2022).
Ozalp, S., Williams, M. L., Burnap, P., Liu, H. & Mostafa, M. Antisemitism on Twitter: Collective efficacy and the role of community organisations in challenging online hate speech. Soc. Media Soc. 6, 2056305120916850 (2020).
Bright, J., Marchal, N., Ganesh, B. & Rudinac, S. How do individuals in a radical echo chamber react to opposing views? Evidence from a content analysis of Stormfront. Hum. Commun. Res. 48, 116–145 (2022).
Gaudette, T., Scrivens, R., Davies, G. & Frank, R. Upvoting extremism: Collective identity formation and the extreme right on Reddit. N. Media Soc. 23, 3491–3508 (2021).
Åkerlund, M. Influence without metrics: Analyzing the impact of far-right users in an online discussion forum. Soc. Media Soc. 7, 20563051211008831 (2021).
Wells, G., et al Right-Wing Extremism in Mainstream Games: A Review of the Literature. Games and Culture, 15554120231167214. https://doi.org/10.1177/15554120231167214 (2023).
Dehghan, E. & Nagappa, A. Politicization and radicalization of discourses in the alt-tech ecosystem: A case study on Gab Social. Soc. Media Soc. 8, 20563051221113075 (2022).
Rogers, R. Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media. Eur. J. Commun. 35, 213–229 (2020).
Zeng, J. & Schäfer, M. S. Conceptualizing “dark platforms”. Covid-19-related conspiracy theories on 8kun and Gab. Digital Journalism 9, 1321–1343 (2021).
Jasser, G., McSwiney, J., Pertwee, E. & Zannettou, S. ‘Welcome to# GabFam’: Far-right virtual community on Gab. N. Media Soc. 25, 1728–1745 (2023).
Dilkes, J. Rule 1: Remember the human. A socio-cognitive discourse study of a Reddit forum banned for promoting hate based on identity. Discourse Soc. 09579265231190344. https://doi.org/10.1177/09579265231190344 (2023).
Scrivens, R. Exploring radical right-wing posting behaviors online. Deviant Behav. 42, 1470–1484 (2021).
Scrivens, R., Davies, G. & Frank, R. Measuring the evolution of radical right-wing posting behaviors online. Deviant Behav. 41, 216–232 (2020).
Hutchinson, J., Amarasingam, A., Scrivens, R. & Ballsun-Stanton, B. Mobilizing extremism online: comparing Australian and Canadian right-wing extremist groups on Facebook. Behav. Sci. Terrorism Political Aggression 15, 215–245 (2023).
Halpin, M., Richard, N., Preston, K., Gosse, M., & Maguire, F. Men who hate women: The misogyny of involuntarily celibate men. New Media Soc. 14614448231176777. https://doi.org/10.1177/14614448231176777 (2023).
Daly, S. E. & Reed, S. M. “I think most of society hates us”: A qualitative thematic analysis of interviews with incels. Sex Roles 86, 14–33 (2022).
Askanius, T. On frogs, monkeys, and execution memes: Exploring the humor-hate nexus at the intersection of neo-Nazi and alt-right movements in Sweden. Television N. Media 22, 147–165 (2021).
Colley, T. & Moore, M. The challenges of studying 4chan and the Alt-Right:‘Come on in the water’s fine’. N. Media Soc. 24, 5–30 (2022).
Døving, C. A. & Emberland, T. Bringing the enemy closer to home:‘conspiracy talk’and the Norwegian far right. Patterns Prejudice 55, 375–390 (2021).
Peucker, M. & Fisher, T. J. Mainstream media use for far-right mobilisation on the alt-tech online platform Gab. Media, Cult. Soc. 45, 354–372 (2023).
Figenschou, T. U. & Ihlebæk, K. A. Challenging journalistic authority: Media criticism in far-right alternative media. Journalism Stud. 20, 1221–1237 (2019).
Åkerlund, M. The importance of influential users in (re) producing Swedish far-right discourse on Twitter. Eur. J. Commun. 35, 613–628 (2020).
Jolley, D., Meleady, R. & Douglas, K. M. Exposure to intergroup conspiracy theories promotes prejudice which spreads across groups. Br. J. Psychol. 111, 17–35 (2020).
Bliuc, A. M., Betts, J., Vergani, M., Iqbal, M. & Dunn, K. Collective identity changes in far-right online communities: The role of offline intergroup conflict. N. Media Soc. 21, 1770–1786 (2019).
Lupu, Y. et al. Offline events and online hate. PLoS one 18, e0278511 (2023).
Bliuc, A. M. et al. The effects of local socio-political events on group cohesion in online far-right communities. PloS one 15, e0230302 (2020).
Kasimov, A. Decentralized hate: sustained connective action in online far-right community. Social Movement Stud. 1–19. https://doi.org/10.1080/14742837.2023.2204427 (2023).
Wiedlitzka, S., Prati, G., Brown, R., Smith, J. & Walters, M. A. Hate in word and deed: the temporal association between online and offline islamophobia. J. Quant. Criminol. 39, 75–96 (2023).
Regehr, K. How technology facilitated misogyny moves violence off screens and onto streets. N. Media Soc. 24, 138–155 (2020).
Müller, K. & Schwarz, C. Fanning the flames of hate: Social media and hate crime. J. Eur. Economic Assoc. 19, 2131–2167 (2021).
Kuss, D. & Griffiths, M. Social Networking Sites and Addiction: Ten Lessons Learned. Int. J. Environ. Res. Public Health 14, 311 (2017).
Salehan, M. & Negahban, A. Social networking on smartphones: When mobile phones become addictive. Computers Hum. Behav. 29, 2632–2639 (2013).
Naranjo-Zolotov, M., Turel, O., Oliveira, T. & Lascano, J. E. Drivers of online social media addiction in the context of public unrest: A sense of virtual community perspective. Computers Hum. Behav. 121, 106784 (2021).
Kuss, D. J. & Lopez-Fernandez, O. Internet addiction and problematic Internet use: A systematic review of clinical research. World J. Psychiatry 6, 143–176 (2016).
Montag, C., Wegmann, E., Sariyska, R., Demetrovics, Z. & Brand, M. How to overcome taxonomical problems in the study of Internet use disorders and what to do with “smartphone addiction”? J. Behav. Addictions 9, 908–914 (2021).
Sanchez-Fernandez, M., Borda-Mas, M. & Mora-Merchan, J. Problematic internet use by university students and associated predictive factors: A systematic review. Computers Hum. Behav. 139, 107532 (2023).
Montag, C., Lachmann, B., Herrlich, M. & Zweig, K. Addictive Features of Social Media/Messenger Platforms and Freemium Games against the Background of Psychological and Economic Theories. Int. J. Environ. Res. Public Health 16, 2612 (2019).
Valkenburg, P. M. Social media use and well-being: What we know and what we need to know. Curr. Opin. Psychol. 45, 101294 (2022).
Sirola, A., Savela, N., Savolainen, I., Kaakinen, M. & Oksanen, A. The role of virtual communities in gambling and gaming behaviors: A systematic review. J. Gambl. Stud. 37, 165–187 (2021).
Gainsbury, S. M., Delfabbro, P., King, D. L. & Hing, N. An exploratory study of gambling operators’ use of social media and the latent messages conveyed. J. Gambl. Stud. 32, 125–141 (2016).
James, R. J. & Bradley, A. The use of social media in research on gambling: A systematic review. Curr. Addiction Rep. 8, 235–245 (2021).
Sirola, A. et al. Online identities and social influence in social media gambling exposure: A four-country study on young people. Telemat. Inform. 60, 101582 (2021).
Hau, Y. S. & Kim, Y. G. Why would online gamers share their innovation-conducive knowledge in the online game user community? Integrating individual motivations and social capital perspectives. Computers Hum. Behav. 27, 956–970 (2011).
Costa Pinto, D., Reale, G., Segabinazzi, R. & Vargas Rossi, C. A. Online identity construction: How gamers redefine their identity in experiential communities. J. Consum. Behav. 14, 399–409 (2015).
Eklund, L. Bridging the online/offline divide: The example of digital gaming. Computers Hum. Behav. 53, 527–535 (2015).
O’Connor, E. L., Longman, H., White, K. M. & Obst, P. L. Sense of community, social identity and social support among players of massively multiplayer online games (MMOGs): a qualitative analysis. J. Community Appl. Soc. Psychol. 25, 459–473 (2015).
Kuss, D. J. Internet gaming addiction: current perspectives. Psychol. Res. Behav. Manag. 6, 125–137 (2013).
Kuss, D. J. & Griffiths, M. D. Internet gaming addiction: A systematic review of empirical research. Int. J. Ment. Health Addiction 10, 278–296 (2012).
Stevens, M. W., Delfabbro, P. H. & King, D. L. Prevention approaches to problem gaming: A large-scale qualitative investigation. Computers Hum. Behav. 115, 106611 (2021).
Mäyrä, F. Exploring gaming communities. In Kowert, R. & Thorsten, Q. (Eds.) The Video Game Debate (pp. 153–175). Routledge. (2015).
Dwivedi, Y. K. et al. Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 57, 101994 (2021).
Dwivedi, Y. K. et al. Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 66, 102542 (2022).
Park, S. M. & Kim, Y. G. A metaverse: taxonomy, components, applications, and open challenges. IEEE Access 10, 4209–4251 (2022).
Kshetri, N., Voas, J., Dwivedi, Y. K., Torres, D. R. & O’Keefe, G. Metaverse’s Rise and Decline. Computer (IEEE Computer Soc.) 56, 77–82 (2023).
Deci, E. L., & Ryan, R. M. Self-determination theory. In Van Lange, P. A. M., Kruglanski, A. W., & Higgins, E. T. (Eds.) Handbook of theories of social psychology, 1 416–436. (2012).
Kwon, K. H., Stefanone, M. A. & Barnett, G. A. Social network influence on online behavioral choices: Exploring group formation on social network sites. Am. Behav. Scientist 58, 1345–1360 (2014).
Turner, J. C., Sachdev, I. & Hogg, M. A. Social categorization, interpersonal attraction and group formation. Br. J. Soc. Psychol. 22, 227–239 (1983).
Author information
Contributions
Atte Oksanen: conceptualization, project administration, supervision, writing (original draft), writing (review and editing); Magdalena Celuch: conceptualization, writing (original draft), writing (review and editing); Reetta Oksa: conceptualization, writing (original draft), writing (review and editing); Iina Savolainen: conceptualization, writing (original draft), writing (review and editing).
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Communications Psychology thanks Felice Addeo and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Primary Handling Editor: Marike Schiffer. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Oksanen, A., Celuch, M., Oksa, R. et al. Online communities come with real-world consequences for individuals and societies. Commun Psychol 2, 71 (2024). https://doi.org/10.1038/s44271-024-00112-6