Abstract
Innovative approaches to environmental communication are needed to transcend existing scientific knowledge, challenge individual value-action gaps, and engage more people in science. Within a co-created community science project, a case-control study was conducted to determine whether data visualization type could impact participant scientific learning, emotional response, behavioral outcomes, and environmental action. Two novel data sharing types were designed to communicate roof-harvested rainwater data to environmental justice communities: (1) a static booklet and (2) an interactive environmental art installation called Ripple Effect paired with a booklet. Our results indicate that environmental art can not only communicate complex scientific data effectively, but can also overcome barriers associated with traditional science communication by affecting people's emotion and memory, which increases the likelihood of changing their behavior or taking new action in their environment. These results are consistent with the environmental psychology literature; however, we have also captured the role of memory and the long-lasting impacts of environmental art on pro-environmental health behavior. This research further paves the way for others to create innovative formats for communicating environmental health.
Introduction
Pollution is the leading global cause of premature death and disease (Landrigan et al. 2018). Environmental health risks are disproportionately placed on low-income and/or communities of color, and these environmental justice (EJ) communities are exposed to environmental burdens while experiencing additional socioeconomic challenges, such as political isolation, linguistic isolation, and/or information disparities (Bullard 1990; Ramirez-Andreotta et al. 2016, 2023a; Palawat et al. 2023a; Wilson et al. 2012). Effective communication, public outreach, and education are necessary to address these disparities, and communication about the environment via data sharing shapes public perceptions, behavioral changes, and policy surrounding the natural world (Allen 2017; Geiger et al. 2017; Swim et al. 2018). The field employs many methods to engage communities with scientific data and promote risk-averse behavior, such as reducing one's exposure to contamination, or pro-environmental behavior, like reducing one's use of water or energy. However, informational campaigns as a communication method have not been successful in generating behavioral changes (Keller et al. 2020). Referred to as the value-action gap in environmental psychology, increased environmental knowledge does not always translate into increased pro-environmental behavior (Klöckner 2015; Steg and Vlek 2009). Explanatory graphics, online web platforms, and static forms of presentation (e.g., printed reports, pamphlets, infographics) do not reach or engage all audiences. Alternative approaches to environmental communication are needed to determine whether they can have a more expansive impact on human behavior.
Visual art, which uses more emotive and personalized techniques, may help bridge the divide between scientific information and personal action (Roosen et al. 2018a, 2018b). Artists play an important role in environmental communication, particularly in the form of socially-engaged interactive art installations (Simoniti, 2018). Studies have shown that art helps disseminate scientific information while facilitating engagement and eliciting an emotional response, which can aid in communication between researchers, practitioners, and community members (Arce-Nazario 2016; Baldwin and Chandler 2010; Curtis et al. 2012; Marks et al. 2017). For example, Jacobs et al. (2013) recorded enhanced emotional responses when environmental data was enriched with an aesthetic and sensory experience. Sommer et al. (2019) observed that intentions to take pro-environmental action increased after audience members were given an immersive experience with polluted air. After conducting several studies on different artworks and art events, Curtis et al. (2014) concluded that environmental art presents information in an engaging way, which creates a sense of connection with natural spaces and encourages pro-environmental behavior. Roosen et al. (2018a, 2018b) further explained how contemporary art that addresses climate change can help audiences overcome psychological barriers via disrupting daily life routines, offering a space of reflection, and/or strengthening a sense of group identity among the visitors, thereby facilitating motivation for change.
Measuring emotional response to environmental art
The emotional response that art triggers may yield future behavioral change (Vessel et al. 2012; Curtis 2009, 2010, 2011; Curtis et al. 2014; Roosen et al. 2018a, 2018b; Simoniti 2018). However, measuring a person's emotional state is challenging (Mauss and Robinson 2009). Efforts to measure emotional responses can include brain-measuring devices such as EEG or MRI scans; however, these forms of monitoring may not be practical or feasible in community-based projects.
With an understanding that emotion is built into every dimension of language, corpus linguists use sentiment analyses as a proxy to understand emotional responses (Wilce 2009; Ochs 1990). For example, corpus linguists count "attitude adverbs" (e.g., amazingly, regrettably) and "evaluative adjectives" (e.g., cool, terrible) as markers of affective stance (Barbieri 2008). In this work, a sentiment analysis and a qualitative analysis of participant discussions are used to determine participants' emotional response to the data based on data visualization (vis) type.
Project design and research questions
This study is embedded within the University of Arizona's Project Harvest (PH), a co-created community science research program developed in partnership with the Sonora Environmental Research Institute, Inc. (SERI). PH worked with three communities near legacy/active mining and one urban city (Dewey-Humboldt, Globe-Miami, Hayden/Winkelman, and Tucson, AZ, USA, respectively) to address environmental injustices and answer community-driven research questions, i.e., "What is the quality of my harvested rainwater?" PH environmental sampling ran from 2017–2020, and the primary goals were to evaluate potential pollutants in harvested rainwater, as well as in irrigated soil and homegrown plants; increase community engagement in environmental decision-making; and increase environmental health literacy (Ramírez-Andreotta et al. 2019a, 2023; Moses et al. 2022; Davis et al. 2018, 2020). PH intentionally works with EJ communities, with the understanding that these communities are experiencing harm due to layered environmental injustices, such as disproportionate industrial pollution, socioeconomic burdens, and lack of access to political power (Wilson 2009).
Participants were trained by community health workers, or promotoras (Davis et al. 2020), to collect harvested rainwater, soil, and plant samples on their own properties over a 2.5-year period for contaminant analyses (Palawat et al. 2023a, b; Villagomez-Marquez et al. 2023; Moses et al. 2023a, b). At the end of environmental sampling years 2018 and 2019, the PH team reported the data back in two ways: (1) a static booklet, referred to as "booklet-only," and (2) a booklet with an art experience, referred to as "Ripple Effect." Due to COVID-19, the data sharing events for 2020 were hosted virtually in May 2021. In addition to the above, a bilingual, interactive website was also prepared to further support data report back. Within the framework of environmental health literacy, our team conducted a case-control study by employing two distinct data visualizations during data sharing events, booklet-only and Ripple Effect, and then measured participant data interpretation, emotional response, behavioral outcomes, and memory recall after the first year (2018) of data sharing.
Ripple Effect was created to ground the data-driven environmental discussion in a sensorial, empathetic art experience (Kaufmann 2019; Kaufmann et al. 2021). The harvested rainwater data is transformed from rows and columns of numeric values into a moving, multi-sensory, 3-D installation. The primary medium in this artwork, water, was selected with the intention of creating a direct connection between the participant and the subject matter as it is found in nature, revealing the close interdependency between humans and water. Other distinct features of this environmental artwork are that it is interactive (participation is fundamental to the experience), it is 3-D (participants must navigate throughout different sound stations), and it is multisensory (the exhibition employs sight, touch, and sound).
To determine whether the type of data vis (booklet-only or Ripple Effect) had an impact on participants' data interpretation, emotion, behavior, and memory, we measured participants' sentiment (how they described their emotion after interpreting the results), intention to change their everyday behavior, memory recall (how well they could recall the event or data after five-six months), and action (whether they modified their behavior after five-six months) (Ramírez-Andreotta et al. 2019b). Here, we incorporate theory and findings from environmental psychology and corpus linguistics to empirically evaluate art as an innovative approach to science, environmental, and risk communication. Specifically, our research questions were:
1. Which group (case or control) will have stronger intentions to change their behavior or take action in their environment? Will that group be more likely to follow through with their intentions following the data sharing event?
2. How will participants interpret or find meaning in their results? Will this differ between case and control groups?
3. What will be the different emotions that participants experience while viewing their data? Will this differ between case and control groups?
4. How does participant emotional response impact participants' intention to act?
5. Which group will be able to remember the data and recall it in greater detail five-six months after the event?
6. How does memory recall impact participant action?
Methods
Project description
Data sharing events were hosted at the end of each sampling year in each of the four communities, in their preferred language (English or Spanish). The in-person data sharing events gave people the opportunity to meet other participants in the project, analyze data together, ask questions directly to the research team, and exchange ideas for action. In both types of data sharing events, participants were provided food, introduced to PH, given a walk-through of the data sharing materials (booklet-only or Ripple Effect), and then given time to interact with the data sharing materials. Data sharing events ended with focus groups and surveys.
For Year 1 of the project, participants were randomly assigned to a data vis type and event. The following year, the groups switched data vis type (from booklet-only to Ripple Effect or vice versa). In both groups, participants' results were shown in relation to other community members' results as well as federal/state regulatory standards, recommendations, guidelines, and/or advisories, when available.
Over the course of 2.5 years (2017–2020), over 150 community members participated in sample collection and learning research engagement. PH participants are socioeconomically diverse, with 25% speaking Spanish as their primary language and just over 50% self-identifying as: low-income or below based on the U.S. Department of Housing and Urban Development guidelines; non-white (predominantly Latina/o/x/Hispanic); and not having a college degree (Davis et al. 2020). For this study, see Supplementary Table A for the sociodemographic data of participants who attended Year 1 data sharing in 2018, by data vis type. Data sharing events continued in each community; however, they are not the focus of this study. Here, we primarily show the results from Year 1 data sharing events, as this is when participants saw the data vis types for the first time. This includes participants from the focus groups (N = 53) and follow-up interviews with only individuals who attended a Year 1 data sharing event (N = 26). The only section where Year 1 and Year 2 results are combined is the data vis preference section.
Booklet-only data sharing description
In the control or booklet-only group, participants received their household results through a booklet that contained data visualizations of contamination levels in the form of strip plots (also called individual value plots) and accompanying text. The figures, content and overall booklet layout went through a series of design-based research steps, i.e., formative evaluation and the principles of Equity-Centered Community Design (unpublished results). Each booklet (Fig. 1) contained descriptions of standards, recommendations, guidelines and/or advisories at the beginning of each contaminant section (Inorganic, Microbial, and Organic). Following the standards, recommendations, guidelines and/or advisories descriptions, each page contained one contaminant strip plot graph with accompanying descriptions of the contaminant.
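For readers unfamiliar with this plot type, the sketch below shows how a single-contaminant strip plot with a reference line could be drawn in R with ggplot2 (the plotting package cited in the Methods). The household values, column names, and the reference line are illustrative stand-ins, not Project Harvest data or the booklet's actual figure code.

```r
# Illustrative strip plot (individual value plot): one contaminant, community
# values plus one highlighted household, with a dashed reference line.
library(ggplot2)

rain <- data.frame(
  household    = c("Your household", rep("Other households", 9)),
  arsenic_ug_L = c(3.2, 0.5, 1.1, 2.0, 2.4, 4.8, 6.1, 7.5, 9.9, 12.3)  # hypothetical
)

ggplot(rain, aes(x = "Arsenic", y = arsenic_ug_L, colour = household)) +
  geom_jitter(width = 0.05, height = 0) +
  geom_hline(yintercept = 10, linetype = "dashed") +  # e.g., a 10 ug/L guideline
  labs(x = NULL, y = "Concentration (ug/L)", colour = NULL)
```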
Ripple Effect data sharing description
In the case group, or Ripple Effect group, participants received their results through the same results booklet, plus an interactive art installation called Ripple Effect (Fig. 2). Ripple Effect is a socially-engaged environmental artwork that communicates rainwater quality data through soundwaves. The installation consists of sound stations (Fig. 3) that each have a speaker with a tray of water placed on top. Each row of stations represents a different contaminant, and each station corresponds to a standard, recommendation, guideline, and/or advisory (for example, United States Environmental Protection Agency Drinking Water Standard), which is labeled on a placard. The speaker plays each participant’s personal “data soundtrack,” which is provided to them via a flash drive. Participants hear and see the water vibrate based on the contaminant concentrations in their rainwater samples. The more active the water, the higher the contaminant concentration. Lining each speaker is an LED light strip, which lights up each time a data point exceeds a standard, recommendation, guideline, and/or advisory.
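As a conceptual illustration only (not the artists' actual audio pipeline), the sketch below shows the kind of mapping the installation embodies: a contaminant series scaled to a relative vibration amplitude, with an exceedance flag driving the LED strip. The contaminant values and the reference standard are hypothetical placeholders.

```r
# Conceptual sketch of Ripple Effect's data-to-sound mapping: stronger vibration
# for higher concentrations, LED flag when a standard is exceeded.
arsenic_ug_L <- c(0.8, 2.5, 14.0, 31.2)        # four sampling windows (hypothetical)
standard     <- 10                              # hypothetical reference value (ug/L)

amplitude <- arsenic_ug_L / max(arsenic_ug_L)   # 0-1 scaling that would drive the speaker
led_on    <- arsenic_ug_L > standard            # LED strip lights on exceedance

data.frame(window = 1:4, arsenic_ug_L, amplitude, led_on)
```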
The Ripple Effect experience began with participants finding their flash drives in an acrylic cabinet and reading the provided didactic, which explained the meanings of the sound, ripples, and light. Participants were also given a pamphlet (Supplementary B) that explained the data-to-sound process and the general set-up of the exhibition. After taking the flash drives with their data, participants were prompted to interact with the exhibition at their own pace, plugging their flash drives into the speakers to see their data in the water. PH team members were available to answer any questions or facilitate a participant's experience through the exhibition.
Data collection and analyses
The study protocol described below was approved by the University of Arizona Institutional Review Board. Participants were consented under the University of Arizona IRB as an approved project.
Focus groups
Immediately after the Ripple Effect or booklet-only data sharing, focus groups were conducted with participants to discuss their results in further depth (see Supplementary C for the focus group script). Focus group participants were encouraged to contribute when they had a response to share. Moderators facilitated the focus groups to ensure a balanced conversation among participants, ensuring everyone had a chance to talk during the allotted time. The focus group questions were developed collaboratively by the learning research team to observe each participant's: (1) data interpretation (sense-making strategies participants used to find meaning in their data) and reaction to their data (how they described their emotional state after receiving their data), (2) intention to change their behavior, and (3) general feedback on data sharing methods. Focus groups were recorded and transcribed for the research team to code during data analyses.
Data analysis was performed using NVivo (QSR International Pty Ltd., Version 12, 2018) by a sub-team of three researchers. Team members used an inductive process to create the codebook, identifying themes and subthemes as they emerged within focus group conversations (Creswell and Poth 2017; Davidson 2018; Scammell 2010). Additionally, deductive methods were used when team members were seeking out specific themes or answers to research questions, such as, "Does data vis type affect a participant's intention to change or modify their behavior in their environment?" Two team members coded each focus group independently and a Coding Comparison Query test was run. The test yielded a 0.52 kappa coefficient, suggesting "fair to good agreement" (NVivo 12 for Mac, QSR International Pty Ltd.).
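For readers unfamiliar with the statistic, the sketch below illustrates Cohen's kappa for two coders' binary presence/absence codes; it is a minimal stand-in, not the NVivo Coding Comparison Query itself, and the coder vectors are hypothetical.

```r
# Cohen's kappa for two coders' presence/absence codes on the same excerpts
# (hypothetical vectors; 1 = theme coded as present, 0 = absent).
coder_a <- c(1, 1, 0, 1, 0, 0, 1, 1, 0, 1)
coder_b <- c(1, 0, 0, 1, 0, 1, 1, 1, 0, 1)

observed <- mean(coder_a == coder_b)                      # observed agreement
expected <- mean(coder_a) * mean(coder_b) +               # chance agreement on "present"
            (1 - mean(coder_a)) * (1 - mean(coder_b))     # chance agreement on "absent"
kappa    <- (observed - expected) / (1 - expected)
kappa  # roughly 0.40-0.75 is conventionally read as "fair to good" agreement
```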
The focus group transcriptions were evaluated via a text sentiment analysis software, called Docuscope Global (version 1.8.4, Carnegie Mellon University, PA), that analyzed what words participants used to describe their data, their emotional state, and their decisions. Docuscope divides the strings of text it recognizes into a three-level taxonomy. The version we used contained 36 categories at the highest dictionary tier (which Docuscope terms “Clusters”), 3474 categories at the middle tier (called “Dimensions”) and 56,016 categories at the lowest tier level (called “Language Action Types” or LAT). These categories are derived from dictionaries, which can be loaded into the software for analysis. In this study, we loaded the default dictionary, which consists of over 40 million linguistic patterns of English, accounting for about 70% of the English language (Ishizaki and Kaufer 2012). Docuscope annotated all the transcribed focus group text and then outputted a spreadsheet containing the percentages of the text coded in a specific cluster, dimension, or LAT.
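A minimal sketch of how such an export might be summarized by data vis type is shown below; the file name and column names are hypothetical stand-ins for the Docuscope spreadsheet.

```r
# Summarize hypothetical Docuscope output: mean percentage of text coded to
# selected clusters, grouped by data vis type ("booklet" vs. "ripple_effect").
docuscope <- read.csv("docuscope_cluster_percentages.csv")  # hypothetical export

aggregate(cbind(FirstPerson, InformationStates, MetadiscourseCohesive) ~ vis_type,
          data = docuscope, FUN = mean)
```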
Follow up interviews
To determine the sustained impact of the data sharing event, semi-structured interviews were conducted in each community between May and June 2019, about five-six months after the Year 1 data sharing event. Fourteen in-person interviews were conducted at open house educational events in May 2019 and twenty-six interviews were conducted over the phone in June 2019 (see Supplementary D for the semi-structured interview script). Over half of the participants interviewed attended the data sharing event and at least two participants were interviewed from each community. If an interviewee did not attend a data sharing event, their data were not included in the analysis. The participant interview prompts were developed collaboratively by the learning research team to solicit feedback on:
1. Memory recall/knowledge retention: What did participants remember most about the event or the meaning of their data?
2. Motivation: What did participants value most about the data sharing events, why did they attend, and did they share their data or information with others in their community following the event?
3. Environmental action: What are participants now doing with the data after receiving their results?
Interviews were recorded and transcribed for analysis. To standardize coding practice, three team members independently coded two different interviews and a Coding Comparison Query test was run. The test yielded a 0.64 kappa coefficient, suggesting a fair to good agreement. Coded responses under the grandparent nodes of “Participant Action,” “Sharing with others,” “Value,” and “Memory recall” were then inputted into SPSS (version 27, IBM) for statistical analysis.
Statistical analysis and visualization
A chi-squared test was conducted in SPSS on coded responses under various grandparent nodes to assess the relationship between the coded responses and data vis type. To ensure that participants' harvested rainwater contaminant concentrations did not confound the analysis, a two-tailed Wilcoxon rank sum test was performed comparing participant contaminant concentrations by data vis type. To compare participant sentiment by data vis type, a two-tailed Wilcoxon rank sum test was performed between data vis type and the coded specific cluster or dimension percentage. Statistical analyses for the contaminant comparisons and sentiment analyses were conducted in R (version 3.6.2; R Core Team 2020) using RStudio. The packages "ggplot2" (Wickham 2009) and "networkD3" (Allaire et al. 2017) were used for data visualization.
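A minimal sketch of these comparisons, reusing the hypothetical `docuscope` data frame from the sketch above and a hypothetical `coded` data frame of focus group codes (one row per participant response):

```r
# Two-tailed Wilcoxon rank sum test: cluster percentage compared across the
# two data vis types (formula interface requires a two-level grouping factor).
wilcox.test(FirstPerson ~ vis_type, data = docuscope, alternative = "two.sided")

# Chi-squared test of association: coded intention to act vs. data vis type.
chisq.test(table(coded$vis_type, coded$intention_to_act))
```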
Results
Focus group data
Data-vis type feedback and general discussion themes
In both types of events, participants formed meaning from their results through peer-to-peer negotiation, asking the research team questions, and studying their data. This led to comparing data, hypothesizing sources of contamination, sharing environmental histories, and determining the safe use of their harvested rainwater. In both data vis types, participants were informed and asked to discuss the implications of their results. Many participants had existing knowledge about environmental contamination and mining waste through lived experience that they shared with the group, yet they reported that PH added new scientific data that reshaped and deepened their exposure experience. As one participant described,
“And when I think about this study, we all come in with perceptions we have about the environment… and being able to participate in something that’s at my home, and I can see what happens there, and then get these type of laboratory results that show exactly what’s going on, is really valuable, … and if I have any perceptions that’s different than what this is, I have to reconsider, and say, these are the facts. And also, knowing that we’re going to do this over three years, four times a year, we’re going to see …if there’s a trend or anything. So as an individual participating, it’s very valuable information.”
Forty-one percent of participants in the case group (Ripple Effect group) mentioned that Ripple Effect taps into visual/spatial understanding, which benefited their learning experience by affecting their senses. One participant stated “I get it two ways, because I’m a visual learner …there is sound, it has light and it also has the vibrations to feel. In my case, it affected all of my senses. So, I loved that.” Another participant shared “I think it’s a great platform to take a 2D object, graph, and make it a 3D, more visible representation of your data. It just gives you one more way to look at it.” One participant emphasized “It’s good because you have the visual and you have kind of like a multisensory sort of touch, hearing, vision as well.” One participant with poor eyesight preferred listening to the data through Ripple Effect, explaining “for me it’s easier to hear it than to see.”
Twenty-four percent of case group participants mentioned that Ripple Effect had a greater impact than viewing the same information graphically. As one participant described “Well I was quite impressed to be able to see the sound samples. In the book, you just see a little dot there and here, you can see it … 3D or 4D… and so, it’s more exciting.” Another participant added “Maybe if we saw everything on paper, we would be falling asleep by now - boring! So this is more for us to be able to understand… and also more interesting.” One participant also voiced “I think the visual concept really helped, because you can understand the numbers, but when you actually see it and hear it, it makes a bigger impact than just looking at the numbers.”
When describing Ripple Effect, the participants used the words "suspenseful", "fascinating", "creative", "educational", "awesome", "multisensory", and "impressive." When asked what helped their understanding of the data, out of 44 total case group responses, 30% said the sonic qualities of the exhibition were helpful, 23% said that because Ripple Effect was a new and engaging way to present the data, it helped them pay more attention to the data, 18% said the LED lights assisted their learning, and 14% said the ripples/vibration in the water were helpful. One participant shared "It's very interesting to see a big difference between barely vibrating to overflowing [water]." Others found the colored labels and flash drive/cabinet set-up aided their understanding. One participant described "Going to the cabinet, looking at the different types of contaminants… made it more suspenseful. Finding your key, and then you plug it in to each one, and from the colored cards you realize the different standards for the different uses." Additionally, many participants talked about how the data vis types complemented one another (n = 8). Five participants believed the booklet was essential because they could take it home with them to study it further.
“I like the book. This is my reference, and I learned to read graphs like this a long time ago, so it works very well for me. … And then, I can take this home, it’s my reference. I’ve got this permanent. The thing with the Ripple Effect, we’ve all got to be here. It’s an event, whereas you’re able to share a book… So I mean, it’s a lot of energy to participate and learn the Ripple Effect and see it and visualize it, but as a book, you know, we all have reference books.”
Other participants believed that they would not be able to interpret their data without the booklet (n = 3). When describing the booklet in the control group (booklet-only), participants used the words "helpful," "easy to read," "neat," "a good resource," and "consistent." Three participants mentioned that the information contained in the booklet was valuable. Five participants mentioned that the color coding in the booklet made the information more understandable. One participant affirmed "It is easy to read. There's not too much data on a page. The color brings clarity. It's consistent from start to finish… and a lot of time and attention was given to that so when you were dispersing to citizen scientists, that …they can repeat a process with some level of confidence."
When asked what helped the control group participants understand their data, out of 25 total responses, 40% said the graphs were helpful, 24% said that color-use was essential to their understanding, and 20% said the accompanying text helped them. "Going over the first pages with all the symbols and guides and standards helped. As long as we know what … each symbol means, then we know how safe [our samples] are." Others appreciated the layout of the booklet and the legend for interpretation. Participants valued the booklet as a resource they could take home with them and show their family and neighbors.
How did the participants interpret the data and form meaning from their results?
Compared to booklet-only, a greater number of Ripple Effect participants compared their results by season, looking at how the data changed temporally throughout the four sampling windows (Fig. 4). More booklet-only participants reported comparing their results to other households in their community and the field blank samples. The same number of participants in both case and control groups compared their results to the standards, recommendations, guidelines and/or advisories (8 out of 36, 22.2% each). This was most likely because both data sharing events were designed to direct discussion toward where participants' data fell in relation to the standards, recommendations, guidelines and/or advisories.
How did participants emotionally respond to their data?
In Ripple Effect, compared to booklet-only, more participants were surprised by their data (Fig. 5). Pictures taken during the events show that participants had animated reactions to the vibrating water, including open mouths, wide eyes, pointing, leaning down to see the water at eye level, and holding their hands above the water vibrations to catch some of the water droplets (Fig. 6).
In the booklet-only event, more people were pleased or relieved by their data compared to Ripple Effect. In Ripple Effect, participants generally tended to be concerned or surprised by the chemical concentrations. It is important to note that no significant differences were observed in participant contaminant concentrations by data vis type.
Did participants intend to change their rainwater harvesting or gardening behavior following the data sharing event?
The participants who wanted to change their behavior following the data sharing event reported that they planned to either increase or modify their rainwater use in a way that would benefit their environmental health. For example, a participant said they would conserve their tap water by using rainwater for their hot tub since they learned that their rainwater's contamination levels fell below the Arizona Department of Environmental Quality Surface Water standard for full body contact. Out of the 19 participants who gave responses, all of the Ripple Effect participants (n = 12) intended to take action based on their data (either increasing or modifying rainwater use). In contrast, only 3 of the 7 booklet-only participants reported wanting to change their behavior (3 out of 19 total responses, 15.8%); the remaining 4 intended not to do anything differently, reporting either that they would maintain a previous gardening behavior or not take any action (4 out of 19 total responses, 21.0%) (Fig. 7). Based on a chi-squared test, there is a significant relationship between intention to act and data vis type (p < 0.05).
How did participant’s emotional response impact participant’s intention to take action in their environment?
Participants who were surprised by their data were more likely to change their future rainwater use, by either increasing or modifying it. Notably, all participants who voiced concern about their data intended to change their future behavior, with the majority of those participants intending to modify their rainwater use in a way that benefited their environmental health (e.g., upon learning that their water sample's contaminant concentration(s) were above a standard or recommendation, a participant voiced that they would find a new water source for their garden to protect their health). All the participants who were pleased or relieved by their data received their results through booklet-only and did not intend to take any new environmental action, even though, on average, the booklet-only group had higher contaminant concentrations than the Ripple Effect group (Supplementary Table E). Participants in both Ripple Effect and booklet-only groups reacted to their data in a variety of ways no matter what they compared their results to (Fig. 8).
To highlight participant exposure experiences (Adams et al. 2011), the Sankey diagram tracks participants from each data sharing type to understand their progression through the data sharing event—from comparing their results, to reacting to their data, and finally, their willingness to change their behavior. In Supplementary F and G, participant responses that were representative of a typical participant experience were highlighted by data vis type. One participant in the Ripple Effect event compared their results to standards, recommendations, guidelines and/or advisories (specifically, the United States Department of Agriculture’s Irrigation Water Recommended Maximum Concentration), was surprised that they had low chemical concentrations, and decided that in the future, they would increase their rainwater use by using the water on a nearby fruit tree orchard (Supplementary F). One participant in the booklet-only event compared their results to other households in the area, was pleased that the community’s contaminant data was “less than five” micrograms per liter and decided that they did not need to change their rainwater use or gardening behavior (Supplementary G).
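As an illustration of how such a diagram can be built with the networkD3 package cited in the Methods, the sketch below assembles a small Sankey from hypothetical nodes, flows, and participant counts; it does not reproduce Fig. 8.

```r
# Hypothetical Sankey: data vis type -> emotional reaction -> intention.
library(networkD3)

nodes <- data.frame(name = c("Ripple Effect", "Booklet-only",      # vis type
                             "Surprised", "Pleased/relieved",      # reaction
                             "Modify/increase use", "No change"))  # intention
links <- data.frame(
  source = c(0, 1, 2, 3),   # zero-indexed positions in `nodes`
  target = c(2, 3, 4, 5),
  value  = c(8, 6, 7, 6)    # illustrative participant counts
)

sankeyNetwork(Links = links, Nodes = nodes,
              Source = "source", Target = "target",
              Value = "value", NodeID = "name")
```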
Docuscope sentiment analysis
When conducting the sentiment analysis with the focus group texts, a total of 54 out of 63 clusters and dimensions were used, excluding 9 clusters/dimensions that yielded 0% and "orphan" words that were not coded to any cluster or dimension. Due to the very low percentages of LATs found in the focus group texts, we decided to exclude that metric from analysis. A heat map (Supplementary H) shows all the Docuscope Global clusters and dimensions detected in the focus group texts, by percentage yielded, for Ripple Effect participants and booklet-only participants. For the clusters and dimensions with significant differences by data vis type, we visualized the spread and distribution of the data with violin and box plots (Figs. 9–11). Booklet-only participants had significantly higher percentages of the cluster "Information states" (p < 0.05), which is defined as passive, stative, and auxiliary verbs (e.g., forms of "be" such as "am," "is," "are," "was," and "were"; forms of "do"; and forms of "have") that indicate being in a state of reporting information.
Ripple Effect participants had significantly higher percentages of the cluster "FirstPerson" (p < 0.05), which captures when the speaker refers to themself or a group that includes the speaker (e.g., "I," "me," "we," and "us"). Ripple Effect participants also had significantly higher percentages of the cluster "Metadiscourse Cohesive" (p < 0.05), which is defined as the use of words that build cohesive markers to help the reader (or listener) navigate the text (or what is being said). Metadiscourse reveals the speaker's awareness of the listener and their need for elaboration, clarification, guidance, or interaction. Words such as "frankly," "after all," "on the other hand," "to my surprise," "I believe," "perhaps," "must," "finally," "therefore," and "however" fall under metadiscourse.
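For readers who want to reproduce this style of figure, the sketch below draws one violin/box plot with ggplot2, reusing the hypothetical `docuscope` data frame from the Methods sketch; it is illustrative rather than a reproduction of Figs. 9–11.

```r
# Violin plot with an overlaid narrow box plot: cluster percentage by vis type.
library(ggplot2)

ggplot(docuscope, aes(x = vis_type, y = InformationStates)) +
  geom_violin() +
  geom_boxplot(width = 0.1, outlier.shape = NA) +
  labs(x = "Data vis type",
       y = "% of transcript coded to 'Information states'")
```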
Follow-up interviews
Which group will be able to remember the data and recall it in greater detail five-six months after the event?
When asking participants what they remembered most about the data sharing event five-six months later, ten participants recalled the event or data specifically, sharing detailed accounts of their experience with their data, whereas six participants recalled the data or event in a more general sense. Eighty percent of the participants who recalled the data or event specifically were in the Ripple Effect group. One participant recounted "I remember how the water began to move when it had some contaminants… when we put in the key and the water began to dance. That is what impressed me because it is a way to teach us the differences of whether there is something in my water or soil, or if there is not." Another participant described (Fig. 12):
“Well certainly the visual display of how the water would react to the amount of contaminant was a [memorable part] because that’s something to see. I appreciated the discussions we had …I was really pleased by how low it appeared most of our contaminants are. I was expecting Coliforms to be high in August because I saw the color of the water that came out of my tanks. And that just happens with heat and that’s just how it is. And everything - the only one that really caught my attention was Beryllium in that it had a bigger scatter. Everything else was pretty evenly distributed, pretty tightly clustered with a few outliers and just demonstrated that our rainwater here appears to be pretty safe.”
Eighty-three percent of the participants who recalled the data sharing event generally were in the booklet-only group. One participant recalled (Fig. 12) “What do I remember most…I think was that I didn’t have all the problems in my water and soil that I thought might be possible, that there were not some contaminants in there and I don’t remember which ones I was concerned about at that time.”
How does memory recall impact participant action?
A chi-squared test showed that the relationship between memory recall and data vis type was statistically significant (p < 0.05). A greater number of Ripple Effect participants (n = 7) took action following the data sharing events, compared to two booklet-only participants. Participants (n = 9) reported that they took action because of what they observed in their data. Six participants increased their rainwater use due to safe results and three participants modified their rainwater use due to unsafe results. Based on their individual data, researchers determined that the modified actions taken by participants reflected a correct interpretation of the data (Table 5 in Ramírez-Andreotta et al. 2023). To understand the effect that memory recall has on participant action, Fig. 12 tracks participants who recalled the data sharing event to see what kind of action or inaction they reported. Notably, all the participants who recalled the data or event specifically took environmental action, whereas those who had a more general memory recall reported no new action.
All the participants who intended to act and then took action five-six months after the data sharing were in the Ripple Effect group. Those who intended to act and did not were all in the booklet-only group. However, it is important to note that for the interview cohort (different from the data sharing event cohort), the Ripple Effect participants had higher mean contaminant concentrations than the booklet-only group, which may have influenced follow-up action.
Participant values
In the follow up interviews, when reflecting on what was most valuable about the data sharing event, more Ripple Effect participants stated that they valued the visual representations (n = 8) and the interactive and hands-on elements (n = 9). One participant said “I thought it was really cool. As an electrical engineer I definitely appreciated the whole sound waves shown in the water and the lights lighting up. I thought was an effective way to show relative values. It seemed that when you had one sample in there and you were watching what that one did, it kind of blossomed over a period of time.” There were also more Ripple Effect participants who valued two or more aspects of data sharing (Fig. 13), pairing the visual and interactive aspects with getting results (n = 2) and meeting people (n = 2).
Year 1 and Year 2 data combined—Do participants have a data sharing preference?
After the second year of data sharing, where participants switched data vis type, the research team asked if participants had a preference. Of participants who attended both data sharing methods, 67% said they preferred receiving their results through Ripple Effect (case group), whereas 33% preferred receiving booklet-only results (control group). One participant explained "To me, I would rather have both because you need one to test the other or compare, you know, in case some make mistakes or didn't read right or if I missed something."
Limitations
Study design
In Project Harvest, two different environmental monitoring types were employed: (1) Lab, where participants sent their samples to the lab to be analyzed, and (2) DIY, where participants conducted the experiment at home and reported results to our team. Monitoring type was a potential confounding variable because participants who submitted lab samples received data on 23 contaminants, whereas the DIY method solely measured and reported estimated concentrations of arsenic and sulfur-reducing bacteria. Receiving fewer or more results could have influenced participant data interpretation and action.
We anticipated that participants' emotional reaction and intention to change behavior would depend on their individualized data and whether they had contaminant concentrations of concern. For the data sharing event cohort, booklet-only participants' mean contaminant concentrations were overall higher than Ripple Effect participants', though the difference was statistically significant for only two of the 23 pollutants, lead and copper (Supplementary Table E).
Docuscope’s default dictionary is currently only available in English and does not account for linguistic nuances in our study cohorts, primarily due to cultural words/phrases and English spoken as a second language. This was a bilingual (Spanish and English) study, meaning that some of the data sharing events were conducted solely in Spanish (including booklets, presentations, and focus groups). Our team translated focus group audio to English text for the Docuscope analysis knowing that the default dictionary has not been validated for Spanish to English translation. As the Docuscope software continues to develop, dictionaries including multiple languages and scientific terminology would resolve this limitation. Our team took steps to mitigate this limitation by having multiple Spanish-speaking members of the team, including community health workers local to the area, present during focus groups. This helped prevent mistranslations, as those team members reviewed the translations for inconsistencies.
Finally, in this study, we evaluate one environmental art approach, Ripple Effect, not a variety of different environmental artworks. The generalizability of these results is limited by the single artwork under study. However, more evidence and discussion from studies on single exhibitions are needed to corroborate results (Sommer et al. 2019).
Challenges encountered during data sharing
Some participant responses were critical of certain aspects of Ripple Effect that hindered comprehension (n = 9). Of Ripple Effect participants, 44% said there were technical difficulties (e.g., amplifier volume had to be adjusted) and 33% said that sometimes the sound from their station overlapped the sound from another participant's station, causing some confusion. Other participants noted that the messaging of the exhibition could be counterintuitive for some people, because participants received stimuli (water vibrating and light turning on) when contaminant concentrations were greater. Other feedback was that while Ripple Effect was a novel experience, without staff assistance it would be difficult for them to understand how the exhibition worked (n = 4). An exhibition that is portable, not technology-based, and gives positive stimuli for lower contaminant concentrations may be more effective. Seven participant responses said that aspects of the booklet hindered their comprehension. Of booklet-only participants, 57% stated that overlapping standards, recommendations, guidelines and/or advisories lines hindered their data comprehension, 29% were confused by these overlapping lines, and 14% stated that some colors were too light to see on the page.
Discussion
Environmental art presents data in a novel way, which causes participants to pay more attention
Participants acknowledged that the booklet was a familiar format for reading and viewing the data, which translated to a general sense of relief or pleasantness, even when the chemical concentrations were, in some cases, relatively elevated. Alternatively, the novelty of the Ripple Effect with its tactile and interactive features facilitated a sensorial response by participants, which caused them to pay more attention to the data and experience feelings of alarm and surprise. In summary, participants largely described their engagement with the art as an active experience requiring them to pay close attention, whereas the booklet allowed for a more passive reading and easy interpretation.
Environmental art engaged various senses, which gave participants spatial, temporal, and embodied understanding of their data
Another major difference between the data vis types identified by the participants was the distinct 3-D nature of Ripple Effect (as opposed to the 2-D booklet), where participants physically moved through the exhibition. Many participants felt a sense of bodily, physical connection to the data through the motion or "dance" of the water. This was articulated by participants when they gave life-like, human, and/or corporeal descriptions of their data. As one participant stated, "Scientists tell you all the time how you can be part of your data or something like that… this is like a whole different consciousness and understanding of my environment." This connection can further be described as embodied knowledge, or learning that emerges through a subject's body exploring and interacting with tangible environmental media (Davidson 2004). John Dewey, referring to this as body-mind, asserted that this knowledge is not simply an acknowledgement of the sensory input that goes to the brain, but is based upon the interaction of the subject within a complex and challenging environment (Boisvert 1999). This concept emerged in post-object minimalism and conceptual art of the 1960s (e.g., Robert Smithson, Hans Haacke), which demonstrated that environmental artists could use elements of landscape or location as an artistic medium itself, framed in relation to the participant's body (Simoniti 2018). Ripple Effect is unique in that participants witness their data within the water, a direct link to the harvested rainwater they interact with in their everyday life. As one participant stated, "Well, to me, it's just like having results that I didn't receive before, where I first saw rainwater - rainwater was rainwater. […] Our rainwater, you know, the sound and everything is telling me differently that I had never seen before."
The sensorial aspects of Ripple Effect directly related to how participants emotionally responded to their data and what comparisons were made. Since Ripple Effect had strong temporal and spatial components, participants compared their results to the four sampling windows. In contrast, the booklet was static and therefore encouraged a greater number of comparisons to other household data points on the graph and other symbols on the legend.
Existing research has explored the knowledge gained from bodily interaction, also known as phenomenology (Merleau-Ponty 2010), and how physically engaging with an artwork affects human cognition (Duby and Barker 2017). Moreover, there is substantial research on the learning benefits of interacting with nature outside of a traditional classroom environment (Malone 2008, Braund and Reiss 2004, Larsen et al. 2017; Ryan and Deci 2017). The question that researchers have pivoted towards is what can be gained or lost from sharing data via interactive art vs. traditional static graphs?
The booklet is necessary for participants so they can share their results with others
Along with recognizing what can be gained through art, this study reveals what the art exhibition did not accomplish and where the printed booklet was essential. As pointed out by participants, the booklet is portable and easy to distribute. Participants appreciated that they could take the booklet home and show it to their family, neighbors, and even their doctor. In its familiar format, the booklet is interpretable by a wide audience. Additionally, participants reported that the booklet grounded the environmental data in Ripple Effect, mentioning that they kept returning to the booklet during the art exhibition, as the vibrating water directed their attention to data points shown in the booklet. In this sense, many participants agreed that the art exhibition complemented the data booklet, as they referred to both data vis types to cross-validate the information. In a way, the booklet served as a boundary object or a means of translation (Star and Griesemer 1989). It was malleable enough to adapt to the participants' needs and constraints, but structured enough to maintain its function and integrity across usages.
It is important to emphasize that the booklet was designed through formative evaluation with end-users in the project (unpublished results), meaning the booklet-only data vis type cannot be fully considered a traditional form of science communication. Had the booklet materials been prepared without following an equity-centered community design framework (Creative Reaction Lab 2018) and without involving residents from environmental justice communities, the differences in learning outcomes, emotion, and action would likely have been more drastic.
Ripple Effect participants have stronger intentions to act and follow through
Studies have repeatedly identified the fleeting nature of people’s motivation to take environmental action, citing that motivation is lost after engaging with an environmental data sharing event (Sommer et al. 2019). A well-known finding in environmental psychology is that people’s attitudes and intentions often do not correspond to their behavior and actions (Klöckner 2015, Blake 1999). In contrast, our study observes that participants who attended Ripple Effect not only had stronger intentions to act, but were also more likely to follow through with their intentions five-six months after the data sharing event.
Following the data sharing event, participants who voiced their intention to act demonstrated that they were oriented to risk-averse behavior (e.g., stopped watering edible plants with harvested rainwater) and/or pro-environmental behavior (e.g., conserving their tap water by using more rainwater throughout their property where it was deemed safe by their results). The environmental actions taken by participants demonstrated a correct interpretation of the data (Ramírez-Andreotta et al. 2023), and in general, their decisions to act reflected a newfound awareness towards protecting their health and/or the health of their environment.
In terms of demographics, regardless of data vis type, 90% of women ended up changing their rainwater use, as compared to 60% of men. Additionally, 90% of adults (36–64 years) changed their behavior, as compared to 50% of seniors (65 years+). These observations support the need to further understand the connections between environmental quality, socio-demographic identities, and action.
Data visualization type impacts participant emotional experience
Captured through language, the sentiment analysis reveals that participants are emotionally responding as they process their data. In Ripple Effect, participants were more likely to place themselves and other people in the center of the discussion, using first person pronouns to relate the information back to themselves and their own lives. They more frequently discussed changes over time when compared to the booklet-only group. They looked to the past to evaluate environmental consequences of human actions. Participants in Ripple Effect addressed each other directly, guiding others through their thought processes.
In the booklet-only event, participants were more likely to place the data in the center of the discussion. The booklet-only event was a more formal learning environment where participants spoke with greater confidence about their data. In general, they looked towards the future to predict what could happen based on trends in the data as opposed to individual action.
The negative cluster, which includes the LATs of sadness, anger, and fear, was higher in the Ripple Effect group. The participants who felt surprised or concerned indicated they would take action, whereas reports of feeling pleased or relieved by the data translated to no action (Fig. 8). These findings are consistent with the environmental psychology and emotion literature that describes the role negative emotions can play in motivating action and/or individual behavioral changes (Rees et al. 2014).
Ripple Effect participants have a greater level of memory recall
Unsurprisingly, our research indicates that Ripple Effect was a memorable way to present data, and led to a more specific recall of results five-six months after the data sharing event. What this study further clarifies is that specific memory recall is positively associated with taking action. When participants remembered the data or event in detail, they changed their harvested rainwater use based on their results, whereas general memory recall typically led to no new participant action.
Conclusion
The four environmental justice PH communities had unique exposure experiences (Ramírez-Andreotta 2023; Adams et al. 2011), making it critical to report back environmental monitoring data and provide collaborative, novel science communication strategies that expanded participants' literacy around environmental health, monitoring, analysis, and policy (Ramírez-Andreotta et al. 2023). Our results indicate that environmental art can both communicate complex scientific data effectively and affect people's emotion, memory, and everyday behavior, thus overcoming barriers associated with traditional science communication. We also observed that static, co-designed scientific materials are significant, as these data visualizations and booklets can serve as boundary objects and positively complement environmental art. Heightened attention and engagement in environmental artworks can have a long-lasting impact, meaning that people are more likely to remember their data, which can translate into action. When compared to traditional science communication, this study observed that environmental art can change individuals' environmental health mental map, which in turn influences their actions. Our results are consistent with the findings of Curtis et al. (2012), Roosen et al. (2018a, 2018b), and Baldwin and Chandler (2010); however, in this study, and as highlighted as a need in Sommer et al. (2019), we also successfully address and capture the role environmental art plays in recall and memory, demonstrating the long-lasting impacts of this form of communication and art. By addressing social, cultural, and political issues, intuitively linking people to the natural environment, and providing multi-dimensional spaces for people to experience a range of sensory experiences, environmental art can close the value-action gap and play a major role in raising environmental health literacy.
Data availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
References
Adams C, Brown P, Morello-Frosch R, Brody JG, Rudel R, Zota A, Dunagan S, Tovar J, Patton S (2011) Disentangling the exposure experience: The roles of community context and report-back of environmental exposure data. J Health Soc Behav 52(2):180–196. https://doi.org/10.1177/0022146510395593
Allaire JJ, Ellis P, Gandrud C, Kuo K, Lewis BW, Owen J, Russell K, Rogers J, Sese C, Yetman CJ (2017) networkD3: D3 JavaScript Network Graphs from R (0.4) [Computer software]. https://CRAN.R-project.org/package=networkD3
Allen M (2017) The SAGE Encyclopedia of Communication Research Methods. SAGE Publications, Inc. https://doi.org/10.4135/9781483381411
Arce-Nazario JA (2016) Translating land-use science to a museum exhibit. J Land Use Sci 11(4):417–428. https://doi.org/10.1080/1747423X.2016.1172129
Baldwin C, Chandler L (2010) "At the water's edge": Community voices on climate change. Local Environ 15(7):637–649. https://doi.org/10.1080/13549839.2010.498810
Barbieri F (2008) Patterns of age-based linguistic variation in American English. J Sociolinguistics 12(1):58–88. https://doi.org/10.1111/j.1467-9841.2008.00353.x
Blake J (1999) Overcoming the ‘value‐action gap’ in environmental policy: Tensions between national policy and local experience. Local Environ 4(3):257–278. https://doi.org/10.1080/13549839908725599
Boisvert RD (1999) John Dewey: Rethinking our time. Philos Q 49(195):270–272
Braund M, Reiss MJ (2004) Learning Science Outside the Classroom. Psychology Press
Bullard RD (1990) Dumping in Dixie: Race, class, and environmental quality. Westview
Creative Reaction Lab (2018) Equity Centered Community Design Field Guide. https://crxlab.org/our-approach
Creswell JW, Poth CN (2017) Qualitative Inquiry and Research Design: Choosing Among Five Approaches. SAGE Publications
Curtis DJ (2009) Creating inspiration: The role of the arts in creating empathy for ecological restoration. Ecol Manag Restor 10(3):174–184. https://doi.org/10.1111/j.1442-8903.2009.00487.x
Curtis DJ (2010) Plague and the Moonflower: A Regional Community Celebrates the Environment. Music Arts Action 3(1):65–85
Curtis DJ (2011) Using the Arts to Raise Awareness and Communicate Environmental Information in the Extension Context. J Agric Educ Ext 17(2):181–194. https://doi.org/10.1080/1389224X.2011.544458
Curtis DJ, Reid N, Ballard G (2012) Communicating Ecology Through Art: What Scientists Think. Ecol Soc 17(2). https://www.jstor.org/stable/26269030
Curtis DJ, Reid N, Reeve I (2014) Towards ecological sustainability: Observations on the role of the arts. SAPIENS Surv Perspect Integrating Environ Soc 7.1. http://journals.openedition.org/sapiens/1655
Davidson J (2004) Embodied Knowledge: Possibilities and Constraints in Arts Education and Curriculum. In: Bresler L (ed) Knowing Bodies, Moving Minds: Towards Embodied Teaching and Learning. Springer, Netherlands, pp 197–212. https://doi.org/10.1007/978-1-4020-2023-0_13
Davidson J (2018) Qualitative Research and Complex Teams. Understanding Qualitative Research. Oxford University Press, Oxford, New York, https://doi.org/10.1093/oso/9780190648138.001.0001
Davis LF, Ramirez-Andreotta MD, McLain JET, Kilungo A, Abrell L, Buxner S (2018) Increasing Environmental Health Literacy through Contextual Learning in Communities at Risk. Int J Environ Res Public Health 15(10):2203. https://doi.org/10.3390/ijerph15102203
Davis LF, Ramírez-Andreotta MD, Buxner S (2020) Engaging Diverse Citizen Scientists for Environmental Health: Recommendations from Participants and Promotoras. Citiz Sci: Theory Pract 5(1):1–27. https://doi.org/10.5334/cstp.253
Duby M, Barker PA (2017) Deterritorialising the Research Space: Artistic Research, Embodied Knowledge, and the Academy. SAGE Open 7(4):2158244017737130. https://doi.org/10.1177/2158244017737130
Geiger N, Swim JK, Fraser J, Flinner K (2017) Catalyzing Public Engagement with Climate Change Through Informal Science Learning Centers. Sci Commun 39(2):221–249. https://doi.org/10.1177/1075547017697980
Ishizaki S, Kaufer D (2012) Computer-Aided Rhetorical Analysis [Chapter]. Applied Natural Language Processing: Identification, Investigation and Resolution; IGI Global. https://doi.org/10.4018/978-1-60960-741-8.ch016
Jacobs R, Benford S, Selby M, Golembewski M, Price D, Giannachi G (2013). A conversation between trees: what data feels like in the forest. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘13). Association for Computing Machinery, New York, NY, USA, 129–138. https://doi.org/10.1145/2470654.2470673
Kaufmann DB (2019) Ripple Effect. Available at: https://www.dorseykaufmann.com/rippleeffect
Kaufmann DB, Hamidi N, Palawat K, Ramírez-Andreotta MD (2021) Ripple Effect: Communicating Water Quality Data through Sonic Vibrations. In: Creativity and Cognition (C&C ‘21). Association for Computing Machinery, New York, NY, USA, Article 64, 1–7. https://doi.org/10.1145/3450741.3464947
Keller A, Sommer L, Klöckner CA, Hanss D (2020) Contextualizing information enhances the experience of environmental art. Psychol Aesthet, Creativity, Arts 14(3):264–275. https://doi.org/10.1037/aca0000213
Klöckner CA (2015) The Psychology of Pro-Environmental Communication: Beyond Standard Information Strategies. Palgrave Macmillan, London. https://doi.org/10.1007/978-1-137-34832-6
Landrigan PJ, Fuller R, Acosta NJR, Adeyi O, Arnold R, Basu N, Baldé AB, Bertollini R, Bose-O’Reilly S, Boufford JI, Breysse PN, Chiles T, Mahidol C, Coll-Seck AM, Cropper ML, Fobil J, Fuster V, Greenstone M, Haines A, … Zhong M (2018) The Lancet Commission on pollution and health. Lancet 391(10119):462–512. https://doi.org/10.1016/S0140-6736(17)32345-0
Larsen C, Walsh C, Almond N, Myers C (2017) The “real value” of field trips in the early weeks of higher education: The student perspective. Educ Stud 43(1):110–121. https://doi.org/10.1080/03055698.2016.1245604
Malone K (2008) Every Experience Matters: An evidence based research report on the role of learning outside the classroom for children’s whole development from birth to eighteen years. Report commissioned by Farming and Countryside Education for UK Department Children, School and Families, Wollongong, Australia
Marks M, Chandler L, Baldwin C (2017) Environmental art as an innovative medium for environmental education in Biosphere Reserves. Environ Educ Res 23(9):1307–1321. https://doi.org/10.1080/13504622.2016.1214864
Mauss IB, Robinson MD (2009) Measures of emotion: A review. Cognition Emot 23(2):209–237. https://doi.org/10.1080/02699930802204677
Merleau-Ponty M. Phenomenology of Perception. Routledge Classics, London, England
Moses A, McLain JET, Kilungo A, Root R, Abrell L, Buxner S, Sandoval F, Foley T, Jones M, Ramírez-Andreotta MD (2022) Minding the gap: Socio-demographic factors linked to the perception of environmental contaminants, water harvesting infrastructure, and gardening characteristics. J Environ Stud Sci. https://doi.org/10.1007/s13412-022-00769-7
Moses A, Ramírez-Andreotta MD, McLain JET, Cortez LI, Kilungo A (2023a) Assessing the impact of rainwater harvesting infrastructure and gardening trends on microbial indicator organism presence in harvested rainwater and garden soils. J Appl Microbiol 134(6). https://academic.oup.com/jambio/article/134/6/lxad110/7180971
Moses A, Ramírez-Andreotta MD, McLain JET, Obergh V, Rutin E, Cortez I, Sandhaus S, Kilungo A (2023b) The Efficacy of Hydrogen sulfide (H2S) Tests for Detecting Microbial Contamination in Rooftop Harvested Rainwater. Environ Monit Assess 195:1398. https://link.springer.com/article/10.1007/s10661-023-11942-y
Palawat K, Root R, Cortez LI, Foley T, Carella V, Beck C, Ramírez-Andreotta MD (2023a) Patterns of contamination and burden of lead and arsenic in rooftop harvested rainwater collected in Arizona environmental justice communities. J Environ Manag 337:117747. https://doi.org/10.1016/j.jenvman.2023.117747
Palawat K, Root R, Cortez LI, Foley T, Carella V, Beck C, Ramírez-Andreotta MD (2023b) Dissolved arsenic and lead concentrations in rooftop harvested rainwater: community generated dataset. Data Brief 48:109255
Qualitative Data Analysis Software | NVivo (n.d.) Retrieved February 23, 2021, from https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home
R Core Team (2020) R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/
Ramírez-Andreotta MD, Buxner S, Sandhaus S (2023) Co-created environmental health science: Identifying community questions and co-generating knowledge to support science learning. J Res Sci Teach. https://doi.org/10.1002/tea.21882
Ramirez-Andreotta MD, Brody JG, Lothrop N, Loh M, Beamer PI, Brown P (2016) Improving environmental health literacy and justice through environmental exposure results communication. Int J Environ Res Public Health 13(7):690–717
Ramírez-Andreotta MD, Abrell L, Kilungo A, McLain JET, Root R (2019a) Partnering for action: Community monitoring of harvested rainwater in underserved, rural, and urban Arizona communities. Water Resour IMPACT 21(2):12–15
Ramírez-Andreotta MD, Buxner S, Davis LF, Kaufmann D, Morales AA, Sandhaus SA (2019b) Characterizing the Role Art Can Play in Knowledge Retention and Environmental Self- and Community Efficacy: Placed-Based Data Sharing Efforts for and With Communities. American Geophysical Union, Fall Meeting 2019, abstract #ED51D-0830. Available at: https://ui.adsabs.harvard.edu/abs/2019AGUFMED51D0830R/abstract
Rees J, Klug S, Bamberg S (2014) Guilty conscience: motivating pro-environmental behavior by inducing negative moral emotions. Clim Change 130:439–452. https://doi.org/10.1007/s10584-014-1278-x
Roosen LJ, Klöckner CA, Swim JK (2018a) Visual art as a way to communicate climate change: A psychological perspective on climate change–related art. World Art 8(1):85–110. https://doi.org/10.1080/21500894.2017.1375002
Roosen LJ, Klöckner CA, Swim JK (2018b) Visual art as a way to communicate climate change: A psychological perspective on climate change–related art. World Art 8(1):85–110. https://doi.org/10.1080/21500894.2017.1375002
Ryan RM, Deci EL (2017) Self-Determination Theory: Basic Psychological Needs in Motivation, Development, and Wellness. Guilford Press, New York, NY
Scammell MK (2010) Qualitative environmental health research: An analysis of the literature, 1991–2008. Environ Health Perspect 118(8):1146–1154. https://doi.org/10.1289/ehp.0901762
Simoniti V (2018) Assessing socially engaged art. J Aesthet Art Criticism 76(1):71–82. https://doi.org/10.1111/jaac.12414
Sommer LK, Swim JK, Keller A, Klöckner CA (2019) “Pollution Pods”: The merging of art and psychology to engage the public in climate change. Glob Environ Change 59:101992. https://doi.org/10.1016/j.gloenvcha.2019.101992
Star S, Griesemer J (1989) Institutional Ecology, ‘Translations’ and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907–39. Soc Stud Sci 19(3):387–420. https://doi.org/10.1177/030631289019003001
Steg L, Vlek C (2009) Encouraging pro-environmental behaviour: An integrative review and research agenda. J Environ Psychol 29(3):309–317. https://doi.org/10.1016/j.jenvp.2008.10.004
Swim JK, Vescio TK, Dahl JL, Zawadzki SJ (2018) Gendered discourse about climate change policies. Glob Environ Change 48:216–225. https://doi.org/10.1016/j.gloenvcha.2017.12.005
Vessel EA, Starr GG, Rubin N (2012) The brain on art: Intense aesthetic experience activates the default mode network. Front Human Neurosci 6. https://doi.org/10.3389/fnhum.2012.00066
Villagomez-Marquez N, Abrell L, Foley T, Ramírez-Andreotta MD (2023) Organic micropollutants measured in roof-harvested rainwater from rural and urban environmental justice communities in Arizona. Sci Total Environ 876:162662. https://doi.org/10.1016/j.scitotenv.2023.162662
Wickham H (2009) ggplot2: Elegant Graphics for Data Analysis. Springer-Verlag. https://doi.org/10.1007/978-0-387-98141-3
Wilce JM (2009) Language and Emotion. Cambridge University Press. https://doi.org/10.1017/CBO9780511626692
Wilson SM (2009) An ecologic framework to study and address environmental justice and community health issues. Environ Justice 2(1):15–24. https://doi.org/10.1089/env.2008.0515
Wilson SM, Fraser-Rahim H, Williams E, Zhang H, Rice LS, Svendsen E, Abara W (2012) Assessment of the distribution of toxic release inventory facilities in metropolitan Charleston: An environmental justice case study. Am J Public Health 102(10):1974–1980
Acknowledgements
This research was funded by the U.S. National Science Foundation’s Division of Research on Learning, Advancing Informal STEM Learning Program, grant number 1612554. We would like to thank all PH participants who contributed to this study, as well as the PH promotoras, Armida Boneo, Imelda Cortez, Margaret Dewey, Theresa Foley, Palmira Henriquez, Miriam Jones, Lisa Ochoa, and Aviva O’Neil, for their feedback on the data visualization designs and their overall commitment to PH and community scientists. We would like to thank the PH learning research assistants, Leona Davis, Ariane Mohr-Felsen, Alma Anides Morales, Iliana Manjon, Norma Villagomez Marquez, and Nikki Skelton, for conducting the focus groups and interviews and/or for their assistance in qualitative coding. Thank you to the other team members, Leif Abrell, Aminata Kilungo, Jean McLain, Arthur Moses, Victoria Obergh, Flor Sandoval, Jesus Solis Leon, and Rob Root, for informing and conducting sample processing and analysis. We would like to give special thanks to the individuals who were part of the creation of Ripple Effect: Nima Hamidi, for building the MAX/MSP code that translated the data to sound; Nevan Madrid, for the immense time, support, and help realizing the installation by using the Arduino electronics platform to coordinate the audio and electrical components with the LED lights; and Addison Kaufmann, for considering the user experience and extending the life of the exhibition by building the online Ripple Effect interface. Thank you to the Alliance for the Arts in Research Universities and the University of Arizona’s Biosphere 2 for providing funding for materials at the beginning of the project. Lastly, thank you to David Kaufer and Suguru Ishizaki for the open communication and support, and for sending the DocuScope Global software and default dictionary.
Author information
Authors and Affiliations
Contributions
DBK: Conceptualization, Methodology, Software, Validation, Investigation, Resources, Data curation, Writing-Original Draft, Writing-Review and Editing, Visualization. KP: Methodology, Software, Validation, Formal Analysis, Investigation, Data curation, Writing-Review and Editing, Visualization. SS: Investigation. Sanlyn Buxner: Methodology. EM: Writing-Review and Editing, Supervision. MDR-A: Conceptualization, Methodology, Investigation, Resources, Writing-Review and Editing, Supervision, Project Administration, Funding Acquisition.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethics approval
The methodology and questionnaires for this study were approved by the University of Arizona Institutional Review Board (IRB), protocol number: 1507953512A001.
Informed consent
Informed consent was obtained from all participants and/or their legal guardians. Participants were consented under the University of Arizona IRB as an approved project.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Kaufmann, D.B., Palawat, K., Sandhaus, S. et al. Communicating environmental data through art: the role of emotion and memory in evoking environmental action. Humanit Soc Sci Commun 10, 940 (2023). https://doi.org/10.1057/s41599-023-02459-3