Recent years have seen an increase in calls for ethnography as a method to study Artificial Intelligence (AI). Scholars from diverse backgrounds have been encouraged to move beyond quantitative methods and embrace qualitative methods, particularly ethnography. As anthropologists of data and AI, we appreciate the growing recognition of qualitative methods. However, we emphasize the importance of grounding ethnography in specific ways of engaging with one’s field site for this method to be valuable. Without this grounding, research outcomes on AI may become distorted. In this commentary, we highlight three key aspects of the ethnographic method that require special attention to conduct robust ethnographic studies of AI: committed fieldwork (even if the fieldwork period is short), trusting relationships between researchers and participants, and, importantly, attentiveness to subtle, ambiguous, or absent-present data. This last aspect is often overlooked but is crucial in ethnography. By sharing examples from our own and other researchers’ ethnographic fieldwork, we showcase the significance of conducting ethnography with careful attention to such data and shed light on the challenges one might encounter in AI research.
Quantitative methods are commonly used to study various phenomena, including different types of Artificial Intelligence (AI). However, scholars from various disciplines are recognizing the limitations of this approach (Rahwan et al., 2019; Adadi and Berrada, 2018; Afnan et al., 2021; Bathaee, 2017; Carabantes, 2020). Importantly, Marda and Narayan (2021) argue that the uncritical and positivist use of quantitative methods fails to consider contextual factors. Consequently, these methods often fail to uncover the underlying causes and power asymmetries that contribute to the outcomes produced by automated systems (Dourish and Bell, 2011; O’Neil, 2016; Köchling and Wehner, 2020). This lack of understanding contributes to the blackboxing of AI, leading to inaccurate perceptions of these technologies. As a result, people may place either excessive or insufficient trust in AI. Moreover, organizations and institutions can exploit this blackboxing to evade legal responsibility (Sartori and Theodorou, 2022).
Increasingly, ethnography is recognized as a method that could help scientists investigate AI in a more holistic and critical manner. In fact, the call for ethnography as a method to study AI has grown louder over the past decade (Suchman, 1987; Forsythe, 2002; Seaver, 2017; Marda and Narayan, 2021; Sartori and Theodorou, 2022). We agree with these and other authors that ethnography, a qualitative method par excellence, can offer insights into technology as a sociotechnical phenomenon (Hess, 2001; van Voorst, 2024; Ahlin, 2023). Additionally, rather than defining variables in advance and testing hypotheses, ethnographic fieldwork allows for surprising and unexpected discoveries. Researchers can conduct open-ended, in-depth interviews and engage in (participant) observation of how individuals and institutions design, create, and use technologies (Pols, 2012).
As the enthusiasm for ethnographically studying the dynamic and urgent field of algorithmic society grows, it becomes even more important to do justice to the depth of this methodology. This is an ongoing point of consideration as technologies also influence and reshape the norms of ethnographic research (Ahlin and Li, 2019).
In this commentary, we first highlight three key aspects of an ethnography of AI that will contribute to the success of ethnographic studies: committed fieldwork, establishing trusting relationships in the field, and developing attentiveness to subtle and nuanced data. We argue that by taking these requirements seriously, AI researchers will be able to grasp the crucial contexts in which different types of AI are produced, deployed, and sometimes exploited.
By sharing examples from our own ethnographic experiences and those of other anthropologists, we also offer insights into some realistic challenges that researchers must consider when studying AI ethnographically, while promoting an encouraging approach to these endeavors.
Three methodological focus points
During committed fieldwork, researchers engage in participant observation, which involves actively participating in the everyday lives of the individuals they are studying. As Vidushi Marda and Shivangi Narayan put it, ethnography is about “making connections between what people say and what they do” (2021). This involves cross-checking data obtained through interviews with observations of people’s practices, along with the ethnographer’s personal reflections on their co-lived experiences. Tanja Luhrmann adds that ethnography is “the most fine-grained practice of observation and listening in the world” (2020). To do this effectively, participant observation typically requires a significant amount of time: anthropologists often spend several months or even years with the people they study. The same applies to anthropological studies that focus on (digital) materiality, such as computers, software systems, and algorithms (Seaver, 2017), as well as ethnographic studies that explore how people interact with objects (Latour, 2007; Lynteris and Poleykett, 2018) or, most relevant to this piece, how humans interact with computers (Pink et al., 2016; Christin, 2017; Hoeyer, 2023).
For example, Nick Seaver’s (2021) ethnographic fieldwork on the development of recommendation algorithms in the music industry spanned several years. This extended timeframe allowed him to observe significant shifts within the industry, including the emergence and subsequent decline of startups. In addition, Seaver immersed himself within the companies being studied, enabling him to gather data from individuals at all levels of these organizations, from entry-level interns to top-tier executives. Such deep involvement provided Seaver with insights into the specific strategies employed by engineers to balance both care and scale in the creation of AI systems. This insight differs from the prevalent understanding of care and scale as two values that essentially contradict each other.
Despite the importance of long-term fieldwork, shorter research periods are not necessarily less valuable. As Pink and Morgan (2013) explain, short fieldwork can be characterized by intense moments that result in deep and valid ways of understanding (see also Marcus and Okely, 2007; Vad Karsten, 2019). Nevertheless, as Pigg notes, spending attentive time with people through “a practice of patient ethnographic ‘sitting’ as a means of understanding” is crucial for gaining a deeper understanding of individuals’ beliefs and actions (Pigg, 2013, 127). Similarly, Luhrmann argues that the strength of ethnography lies in our ability to “sit and watch and listen, at length” (Luhrmann, 2020).
By investing time and energy, ethnographers can establish trusting relationships with their research participants. Spending extended periods of time in a particular field allows ethnographers to move beyond the initial awkwardness, social norms, and self-awareness that often characterize first meetings and interviews. In cases where long-term fieldwork is not possible, working closely with local research assistants who are respected and trusted within their community is essential. In such circumstances (for example, when conducting research in unsafe environments), these local collaborators not only act as gatekeepers to the field but also assist in managing access and may even co-design or manage the research. This is significant work for which they should receive credit in subsequent publications, as long as it does not endanger their safety or well-being (van Voorst and Hilhorst, 2018).
Through intensive immersion and the establishment of trusting relationships, ethnographers can obtain valuable data that might otherwise be easily missed. This data includes nuanced information that can be recognized in subtle hints and silences. In the following sections, we offer several concrete examples from our own fieldwork experiences, highlighting the types of nuanced and crucial information they have led us to uncover.
Sensitive or hidden data
In contrast to quantitative methods, ethnographic research sheds light on relationships, practices, and beliefs that are difficult to articulate, such as moments of silence during interviews or the inside jokes understood only by insiders (Wikan, 1991; Marabello and Parisi, 2020). A case study conducted by the first author in a flood-prone slum in Jakarta, Indonesia, serves as an illustrative example. Only after living in the community for over a year did she discover why residents refused to use the government-provided flood protection technology, despite it being free of charge and highly praised by scholars and non-governmental organizations. It transpired that, more than being afraid of floods, the slum dwellers were afraid of being identified and traced by authorities through the personal data collected by these technologies. These fears were valid, as many residents were undocumented or engaged in criminalized work, such as sex work or gang membership. If their personal data became known, they risked losing their homes, livelihoods, or even their freedom through imprisonment. In the first few months of her fieldwork, the researcher struggled to comprehend these fears. Outsiders were generally mistrusted or seen as potential spies for the authorities, and this skepticism extended to outsider researchers. While slum dwellers politely responded to her questions, many of their answers later revealed themselves to be merely socially correct; she recognized this only once she had established more trusting relationships, or even friendships, with the interviewees. For example, it took the researcher six months to realize that she was living with an undocumented sex worker who carried out her work in utmost secrecy.
In an entirely different research context, Featherstone and Northcott (2021) conducted a 5-year ethnographic study within acute wards in the United Kingdom, focusing on individuals with dementia who required immediate and unplanned hospital care. The researchers followed these individuals throughout the admission process and closely observed hospital staff as they interacted with them across various shifts. The extensive duration of their fieldwork facilitated the identification of instances where individuals with dementia were unable to access the necessary assistance due to their inability to communicate in ways expected by ambulance and hospital personnel. For instance, people with dementia were unable to verbally indicate their pain level on a standardized “pain ladder from 1 to 10.” However, the ethnographers were attentive to their cries of agony, even during the late hours of the night when they were present on the ward. Their research provides a meticulous and unwavering ethnographic analysis of life within contemporary hospitals, shedding light on the institutional and ward cultures that shape the provision and organization of everyday care.
Gaps and silences
Another type of subtle knowledge that becomes available through well-conducted ethnography is shaped by the gaps and silences in conversations. This is similar to what Law (2002) and others have called the “absent present”: elements or factors that are invisible in the making of an object, such as a technological device, but that are essential to its functioning. This notion can also be extended to the practice of fieldwork (e.g., Gergen, 2002). For example, through fieldwork, an ethnographer may discover that a certain political discourse can be understood as a warning: although it is not translated into law, it nevertheless impacts people, who adjust their behavior by censoring themselves (Yonucu, 2018).
Or, consider this example from our own fieldwork. When visiting nursing homes in the Netherlands, where she studies healthcare interventions for older people, the second author was surprised to discover that some care home residents enjoyed interacting with robot cats and dogs, while others were similarly content with plush animals. The researcher came across a female resident who was particularly attached to a plush cat. Observation and a chat with her and her carers revealed that the cat had a very concrete therapeutic effect: it decreased the resident’s anxiety so effectively that she no longer needed any medication for restlessness, which had been severe beforehand. However, the plush animal and its impact did not feature in the woman’s healthcare records. On paper, the cat did not exist. The plush animal was therefore an effective healthcare intervention, yet it was invisible as such in healthcare policy and funding schemes. Ethnographic fieldwork can uncover such unexpected relations, which in turn can lead to new insights about effective interventions, technological or otherwise.
Complexity of human–nonhuman interaction
A final example highlights the crucial yet easily overlooked data that can be recognized through robust ethnographic fieldwork: the complexity of human–nonhuman interaction. As Seaver (2018) demonstrates, most studies on AI, even anthropological ones, tend to reinforce a binary narrative of humans versus algorithms, culture versus technology, or humans versus computers. However, this view oversimplifies the relationship between humans and AI, as they are intricately interconnected. Therefore, it is impossible to study one without considering the other. Merely filling the “analog slot” with anthropological insights on the social while centering the digital is inadequate and only perpetuates these dichotomies. Recognizing the complexity of human–AI relations requires moving beyond the stereotypical narratives that currently dominate discussions around technology.
Another case study illustrates this complexity. In the first author’s fieldwork on a health application in the United Kingdom, it was found that measurable data only told half the story. The application was implemented in a medium-sized organization to motivate employees to increase their physical activity, drink more water, and reduce their calorie intake. In exchange for “fit points” earned through following these health recommendations, employees received gift vouchers.
Over time, the ethnographer developed trusting relationships with the algorithm coders who supported the application by spending time with them at work and on informal occasions, joining them for lunches, afternoon drinks, events, and hikes, where conversations ranged from daily job matters to family lives and the weather. During one such informal meeting, the researcher discovered that while the application users believed that the algorithm determined who received gift vouchers, these decisions were actually made by people behind the application. The programmers explained that they had initially attempted to build the algorithm according to their managers’ requests, but it made numerous mistakes during the testing phase, making it more practical for them to decide manually which users would receive vouchers. Problematically, management was so pleased with the application that they considered scaling it up, which caused considerable anxiety among the programmers. Within a few weeks, they managed to develop a functioning algorithm.
This finding exemplifies the recent assertions made by Barchetta and Raffaetà (Forthcoming) regarding the benefits of ethnography in examining AI and other technological phenomena. By actively listening to discussions that occur in everyday conversations in the field, ethnographic research can expose the inherent tensions that arise during the development and implementation of technology in scientific knowledge creation. We further concur with Raffaetà et al. (2023) that in-depth ethnography, which examines technology alongside human behaviors and social realities, facilitates interdisciplinary research on data, computation, algorithms, and AI.
Conclusion
Agreeing with the increasing calls for more ethnographic research on AI, we argue that ethnography can make a significant contribution to the study of data and AI technologies. Specifically, we believe that ethnography is well suited to exploring the underlying causes and power asymmetries that are often difficult or impossible to uncover with other methods. The data produced through ethnographic research are not mere anecdotal stories but crucial insights that can elucidate why automated systems produce the outcomes they do, as ethnography offers a deeper understanding of technology as a sociotechnical phenomenon.
At the same time, we maintain that robust ethnography requires specific approaches. In this commentary, we have outlined three key requirements that we believe offer the best chance of making a valuable contribution: (1) committed fieldwork with adequate time investment, or if time is limited, close attention and appropriate local support; (2) trustworthy relationships between researchers and their interlocutors; and (3) attentiveness to subtle, ambiguous, or absent-present data. It is crucial for all scientists interested in incorporating ethnography into their research design to keep these key points in mind. Failing to do so runs the risk of distorting the outcomes of their ethnographic research.
Finally, in line with the work of ethnographers mentioned in this piece, we advocate for research that examines not only human or technological elements in isolation, but rather explores the interactions, collaborations, frictions, and failures in human–machine relations.
Data availability
Data sharing is not applicable to this research as no individual data were generated or analyzed.
References
Adadi A, Berrada M (2018) Peeking inside the black-box: a survey on explainable artificial intelligence (XAI). IEEE Access 6:52138–52160
Afnan RH, Gomaa ME, Shaheen MM (2021) An overview of explainable artificial intelligence (XAI). Mach Learn Intell Commun 2:96–114
Ahlin T (2023) Calling family: digital technologies and the making of transnational care collectives. Rutgers University Press
Ahlin T, Li F (2019) From field sites to field events: creating the field with information and communication technologies (ICTs). Med Anthropol Theory 6:1–24
Barchetta L, Raffaetà R (Forthcoming) Data as environment, environment as data. One Health in collaborative data-intensive science. Big Data Soc
Bathaee Y (2017) The artificial intelligence black box and the failure of intent and causation. Harv JL & Tech 31:889
Carabantes M (2020) Black-box artificial intelligence: an epistemological and critical analysis. AI & Society 35(2):309–317
Christin A (2017) Algorithms in practice: comparing web journalism and criminal justice. Big Data Soc 4:1–14
Dourish P, Bell G (2011) Divining a digital future. Mess and mythology in ubiquitous computing. MIT Press
Featherstone K, Northcott A (2021) Wandering the wards. An ethnography of hospital care and its consequences for people living with dementia. Routledge Studies in Health and Medical Anthropology
Forsythe D (2002) Studying those who study us: an anthropologist in the world of artificial intelligence. Stanford University Press
Gergen KJ (2002) The challenge of absent presence. In: Katz JE, Aakhus MA (eds) Perpetual contact: mobile communication, private talk, public performance. Cambridge University Press, p 227–241
Hess D (2001) Ethnography and the Development of Science and Technology Studies. In: Atkinson P, Coffey A, Delamont S, Lofland J, Lofland L (eds) Sage handbook of ethnography. Sage Publications, p 234–245
Hoeyer K (2023) Data paradoxes. The politics of intensified data sourcing in contemporary healthcare. MIT Press
Köchling A, Wehner MC (2020) Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Bus Res 13(3):795–848
Latour B (2007) Reassembling the social: an introduction to actor-network theory. OUP, Oxford
Law J (2002) On hidden heterogeneities: complexity, formalism and aircraft design. In: Law J, Mol A (eds) Complexities: social studies of knowledge practices. Duke University Press, Durham, NC, p 116–141
Luhrmann T (2020) Mind and spirit: a comparative theory. J R Anthropol Inst 26:1–166
Lynteris C, Poleykett B (2018) The anthropology of epidemic control: technologies and materialities. Med Anthropol 37:433–441
Marabello S, Parisi MP, Told I (2020) “You the invisible can kill you”: engaging anthropology as a response in the COVID-19 outbreak in Italy. Hum Organ 79:250–258
Marcus GE, Okely J (2007) How short can a fieldwork be? Soc Anthropol 15:353–367
Marda V, Narayan S (2021) On the importance of ethnographic methods in AI research. Nat Mach Intell 3:187–189
O'Neil C (2016) Weapons of math destruction: how big data increases inequality and threatens democracy. Crown Publishing Group
Pigg SL (2013) On sitting and doing: ethnography as action in global health. Soc Sci Med 99:127–134
Pink S, Morgan J (2013) Short-term ethnography: intense routes to knowing. Symb Interact 36:351–362
Pink S, Ardèvol E, Lanzeni D (2016) Digital materialities. Routledge
Pols J (2012) Care at a distance: on the closeness of technology. Amsterdam University Press
Raffaetà R, Santanera G, Esposito F (2023) Entangling data while entangling disciplines: discussing the future of anthropological collaborations with data scientists. Anthropol Action 30(3):1–8
Rahwan I et al. (2019) Machine behaviour. Nature 568:477–486
Sartori L, Theodorou A (2022) A sociotechnical perspective for the future of AI: narratives, inequalities, and human control. Ethics Inf Technol 24:4
Seaver N (2017) Algorithms as culture: some tactics for the ethnography of algorithmic systems. Big Data Soc 4:1–12
Seaver N (2018) What should an anthropology of algorithms do? Cult Anthropol 33:375–385
Seaver N (2021) Care and scale: decorrelative ethics in algorithmic recommendation. Cult Anthropol 36(3):509–537
Suchman LA (1987) Plans and situated actions: the problem of human–machine communication. Cambridge University Press
Vad Karsten MM (2019) Short-term anthropology: thoughts from a fieldwork among plumbers, digitalisation, cultural assumptions and marketing strategies. J Bus Anthropol 8:108–125
van Voorst R, Hilhorst D (2018) Key points of interactive research: an ethnographic approach to risk. In: Olofsson A, Zinn JO (eds) Researching risk and uncertainty. methodologies, methods and research strategies (critical studies in risk and uncertainty). Palgrave Macmillan, Cham, p 53–77
van Voorst R (2016) Natural hazards, risk and vulnerability: floods and slum life in Indonesia. Routledge
van Voorst R (2024) Six in a bed: the future of love – from sex dolls and avatars to polyamory. Polity Press, New York, London
Wikan U (1991) Toward an experience-near anthropology. Cult Anthropol 6:285–305
Yonucu D (2018) The absent present law: an ethnographic study of legal violence in Turkey. Soc Leg Stud 27:716–733
Author information
Contributions
The first author produced the idea for this article and wrote the first and later drafts, which include several cases from her fieldwork. The second author contributed with writing, revision, and adding a case study from her fieldwork.
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethical approval
This article is a commentary, entailing the ideas/opinions of the authors based on their long-term career in anthropology and ethnography. It does not share any direct data of human participants. The three examples of research that we use to highlight key points in ethnography of AI come from research conducted throughout our careers. The first example, on a poor neighborhood in Jakarta, comes from PhD research that was conducted between 2008 and 2014, when the author was affiliated to the University of Amsterdam and ethical approval was provided by the AISSR Ethics Advisory Board. The second and third case examples did not require ethical approval as the research was preliminary and did not involve the collection of direct quotes or personal data. This is a standard practice of conducting ethnographic research in the Netherlands.
Informed consent
Oral informed consent was obtained from all participants.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
van Voorst, R., Ahlin, T. Key points for an ethnography of AI: an approach towards crucial data. Humanit Soc Sci Commun 11, 337 (2024). https://doi.org/10.1057/s41599-024-02854-4