
The effect of water immersion on vection in virtual reality

Abstract

Research about vection (illusory self-motion) has investigated a wide range of sensory cues and employed various methods and equipment, including use of virtual reality (VR). However, there is currently no research in the field of vection on the impact of floating in water while experiencing VR. Aquatic immersion presents a new and interesting method to potentially enhance vection by reducing conflicting sensory information that is usually experienced when standing or sitting on a stable surface. This study compares vection, visually induced motion sickness, and presence among participants experiencing VR while standing on the ground or floating in water. Results show that vection was significantly enhanced for the participants in the Water condition, whose judgments of self-displacement were larger than those of participants in the Ground condition. No differences in visually induced motion sickness or presence were found between conditions. We discuss the implication of this new type of VR experience for the fields of VR and vection while also discussing future research questions that emerge from our findings.

Introduction

Multiple sensory modalities such as vision, audition, the inner ear vestibular system, the skin's somatosensory system and the proprioceptive system of muscles and joints constantly help us make sense of our movements in the world1. The conscious and subjective experience of self-motion, called vection, is induced by the integration of cues from these sensory modalities2. This project adds to the literature on multisensory contributions to vection by comparing vection experienced through virtual reality when floating in water and when standing on the ground.

Sensory inputs to vection

Although multiple sensory modalities are involved in vection, vision plays a predominant role3,4. A popular example of vection induced by visual input is the train illusion, which occurs when a person sitting on a stationary train observes the neighboring train leaving the station, inducing an erroneous sense of self-motion. The study of vection started almost 150 years ago with Mach's study5, in which he placed stationary seated participants in a large drum, patterned on the inside, rotating around a vertical axis, triggering an illusion of self-rotation. Since then, scholars have investigated the influence of different aspects of optic flow on vection, such as speed6,7, pattern size and density7,8, location and size of the retinal areas stimulated6, and the addition of jitter and oscillation to the visual field9,10,11. Besides the importance of optic flow in inducing vection, researchers have also highlighted the role played by perceived surface properties12,13. For example, researchers invited participants to watch a display presenting motion through 3D tunnels constructed from various materials. The results indicated that bark, fabric, stone, wood and leather induced stronger vection than ceramic, glass, fur and metal. The authors argued that the surface properties of the materials modulate vection, showing that perceived depth, smoothness and rigidity were related to vection strength12. Non-visual contributions to vection have also been investigated in a wide range of settings. To explore the role of the vestibular system, participants have adopted different positions such as standing, sitting or lying supine, along with different body tilts14,15,16, and have flown on parabolic flights, which simulate zero gravity17,18,19. Researchers have also looked at the role that the vestibular system could play in decreasing the latency of vection. In the natural environment, body movements directly trigger perceived self-motion20, while one to ten seconds are needed to trigger vection after the onset of visual motion21. This latency seems to be associated with the sensory conflict between visual and non-visual cues, with vestibular information being crucial22. Vibration and vestibular stimulation have been shown to reduce vection latency, on the argument that "adding noise to the vestibular system reduces the reliance on vestibular cues for self-motion perception"21, p. 82.

Scholars have exposed different surfaces of participants' skin to wind in order to understand the role played by the somatosensory system23,24. The proprioceptive system has been studied primarily in relation to walking movements23,25,26. Attention has also been given to arm movements (e.g., standing and making breaststroke motions27). Finally, auditory stimuli, often in the form of virtual spatialized sound presented through headphones, produce relatively weak vection on their own but contribute significantly to vection when presented with visual motion28,29,30.

Many measures of vection have been applied across the field, such as the number of instances of reported vection per stimulus, vection onset time7,31, vection intensity7,32 and convincingness of vection33. Besides self-report measures of vection per se, other researchers have used measures that emphasize the downstream effects of vection, such as perceived displacement32,34, perceived body tilt, or perceived velocity35. Moreover, visual motion induces actual head and body displacement that can be measured3,36.

In order to present the visual input to their participants, studies on vection have used various displays such as rotating cylinders with the participants seated in the center and looking at the cylinder rotating horizontally around them8, flat screens37, or curved screens29. Recently, virtual reality (VR) head-mounted displays have been used to study vection32.

Previous scholars38 (p. 47) reminded us that vection "can be affected by a wide range of parameters including attention, viewing patterns, the perceived depth structure of the stimulus, perceived foreground/background distinction (even if there is no physical separation), cognitive-perceptual frameworks, ecological validity, as well as spatial presence and involvement." They encouraged more basic and applied research in order to "come closer to fulfilling the promise of VR as an alternate reality, that enables us to perceive, behave, and more specifically locomote and orient as easily and effectively in virtual worlds as we do in our real environment" (ibid.). Building on this, it would be interesting to explore how floating in water could transform vection in VR compared to standing on the ground. Participants floating in water might experience greater vection because proprioceptive inputs are ambiguous when floating, whereas in a standing position the proprioceptive system (of muscles, joints and tendons) provides biomechanical information about the stationary position.

Relationship between vection, VIMS, and presence

Visually induced motion sickness (VIMS) is an overarching term for motion sickness symptoms driven by visual stimulation in the absence of physical motion. Cybersickness and simulator sickness are subcategories of VIMS that depend on the equipment used, in this case VR and flight simulators, respectively. According to the sensory conflict theory, VIMS is mainly caused by a conflict between the sensory inputs from the visual, vestibular, proprioceptive, and somatosensory systems39,40. This theory implies a relationship between vection and VIMS, whereby reducing sensory conflict should facilitate vection and reduce VIMS41. As the vestibular system seems to play a key role in vection and in its latency due to sensory conflict between the visual and the vestibular systems, it has been argued that the vestibular system might be important in the occurrence of VIMS. Researchers observed lower simulator sickness scores for participants receiving vestibular stimulation coupled with the visual motion, suggesting that noisy vestibular stimulation can reduce simulator sickness42.

Other researchers have highlighted a complex and weak relationship between vection and VIMS43. The Postural Instability Theory also describes the relationship between vection and VIMS, but emphasizes the motor control system rather than sensory conflict44. This theory suggests that changes in one's postural stability are linked to VIMS, such that sickness is often preceded by increased head and body sway. According to this theory, people with weak postural control are at greater risk of suffering from VIMS. Along the same lines, environments that lead to poor postural control will trigger VIMS more easily. Vection creates situations in which maintaining postural control becomes increasingly difficult, thus potentially triggering VIMS.

While many VR experiences aim to produce a compelling sense of vection among users, this comes with the risk of inducing VIMS. To reduce the likelihood of VIMS, VR researchers and developers have shown a great deal of imagination in decreasing the conflict between the different sensory systems. To do so, they have investigated ways for users to be physically active while exploring a large virtual environment without producing large displacements in the physical world. These strategies include walking in place45,46, stepping in place in a human-size hamster ball47, walking on a directional treadmill48,49, and walking on a multidirectional surface made of tiles moving in the direction opposite to the user50. Another strategy, called redirected walking, consists of curving the walking trajectory through visual rotation of the virtual environment without the users noticing51,52, allowing them to explore a large virtual area while walking only in a limited physical space.

The feeling of being immersed and spatially present in an environment, a phenomenon called presence53, has been proposed by Riecke and colleagues54 to be influenced by vection, as they demonstrated that increased spatial presence correlated with enhanced convincingness of vection. Other scholars55 supported this connection by also demonstrating an association between vection and presence. While some researchers54,55 focused on visually induced vection, others56 demonstrated a similar connection between presence and auditorily induced vection.

The current body of research on the association between presence and VIMS tends to indicate a negative relationship between presence and cybersickness57. The mechanisms responsible for this inverse association could be attributed to the sensory conflict theory since when experiencing great presence, the participant’s attention might be directed away from the sensory conflict58. Moreover, this negative relationship seems to be mediated by vection, among other factors.

Latency, "the delay between a user's action/motion and when that action is visible in the display"59, p. 142, is yet another important factor to consider in the relationship between vection, VIMS and presence, as increased latency can decrease presence60. Moreover, Kim et al.61 exposed participants in VR to various levels of latency in pitch in order to study the effect of sensory conflict on presence, cybersickness and spatial perception. Their results indicated that cybersickness increased while presence decreased as latency increased. The existing body of research indicates a complex relationship between vection, VIMS and presence, leading Weech and colleagues57 to encourage scholars to measure these three variables together to further understand their relationship.

Research goals

Research into understanding vection, with and without VR, has taken various forms, employed expensive equipment and sophisticated methods (such as parabolic flights), and experimented with a wide range of sensory cues. This study utilizes waterproof VR head mounted displays to understand the effects of aquatic immersion on vection, VIMS and presence.

The study has three objectives. The first objective is to present and demonstrate the feasibility of a new kind of VR experience and to share a detailed research method. Despite the numerous ingenious strategies designed to better align the physical activity of participants with the visual motion experienced in VR, such experiences have, to the best of our knowledge, never been tested with participants immersed in water. This paper aims to demonstrate the feasibility of studying the impact of VR with participants immersed in water and offers a recipe to help future researchers design aquatic VR studies.

The second objective is to present to the vection research community a new strategy for testing vection while keeping participants stationary and minimizing contact with the stable environment. Vection researchers have long devised advanced methods to dissect the role of the different sensory systems in vection. We aim to demonstrate that water can be a new frontier for vection research, as it offers new possibilities to test the role played by the different sensory systems in ways that are difficult to achieve on the ground. In particular, floating in water removes contact between the viewer and the stable real environment, which would otherwise conflict with visually presented self-motion.

The third objective is to contribute to the vection literature by providing the first empirical results comparing vection when immersed in water and when standing on the ground. These first results offer a unique opportunity to raise new research questions and reflect on the potential ways forward for this new field of research.

Results

Vection

All analyses were carried out in R version 3.5.2. A Shapiro–Wilk normality test62 indicated significant deviations from a normal distribution (p < 0.001), which can be observed in the data distribution shown in Fig. 1. Consequently, a non-parametric Kruskal–Wallis one-way analysis of variance was conducted. The analysis revealed a significant difference in egocentric distance estimations (χ2 (14, n = 38) = 29.71, p = 0.008) between the Water (M = 3.10, SD = 1.61, Mdn = 2.74, IQR = 2.70) and Ground (M = 0.75, SD = 1.05, Mdn = 0.46, IQR = 0.73) conditions. Figure 1 presents the boxplot of the egocentric distance estimation per condition, which reveals two outliers. Another Kruskal–Wallis test was run after removing the outliers, and this test was also significant (χ2(13, n = 36) = 30.33, p = 0.004).
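For readers who wish to reproduce this style of analysis, the following is a minimal R sketch (not the authors' script) of the reported pipeline: a Shapiro–Wilk check of normality followed by a Kruskal–Wallis comparison of distance estimates between conditions. The data frame and column names are illustrative placeholders, not the study data.

```r
# Minimal sketch with placeholder data, not the study data set.
set.seed(1)
dat <- data.frame(
  distance  = c(rlnorm(19, meanlog = -0.5, sdlog = 0.8),   # "Ground" estimates (m)
                rlnorm(19, meanlog =  1.0, sdlog = 0.5)),   # "Water" estimates (m)
  condition = rep(c("Ground", "Water"), each = 19)
)

shapiro.test(dat$distance)                       # test for deviation from normality
kruskal.test(distance ~ condition, data = dat)   # non-parametric group comparison
```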

Figure 1

Comparison of the distance estimation in meters between the Ground and Water conditions. The boxplot presents medians, interquartile ranges, minimum and maximum values, and potential outliers. It also identifies the means (red squares) for each condition and two outliers (black dots) in the Ground condition.

Visually induced motion sickness

The scale reliability was good (Cronbach's alpha = 0.73) and allowed us to compute an overall score based on the sum of the 16 items of the SSQ (SSQ score). A Shapiro–Wilk normality test62 indicated significant deviations from a normal distribution (p < 0.001), which can be observed in Fig. 2 presenting the distribution of the data points. Consequently, a non-parametric Kruskal–Wallis test was conducted and no significant difference was found for the SSQ score (χ2 (1, n = 38) = 0.863, p = 0.353) between the Water (M = 2.94, SD = 2.69, Mdn = 2, IQR = 4.50) and Ground (M = 4, SD = 3.73, Mdn = 3, IQR = 3.25) conditions (Fig. 2).
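A comparable sketch for the SSQ analysis, again with placeholder data and illustrative column names: the 16 item responses are summed into an overall SSQ score, checked for normality, and compared between conditions with a Kruskal–Wallis test.

```r
# Minimal sketch with placeholder data (16 SSQ items coded 0-3).
set.seed(2)
n <- 38
ssq <- as.data.frame(matrix(sample(0:3, n * 16, replace = TRUE), nrow = n))
names(ssq) <- paste0("ssq_", sprintf("%02d", 1:16))
ssq$condition <- rep(c("Ground", "Water"), length.out = n)

ssq$ssq_score <- rowSums(ssq[, 1:16])            # overall SSQ score (sum of 16 items)
shapiro.test(ssq$ssq_score)
kruskal.test(ssq_score ~ condition, data = ssq)
```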

Figure 2

Comparison of the SSQ scores (sum of the 16 SSQ items) between the Ground and Water conditions. The boxplot presents medians, interquartile ranges, minimum and maximum values, and potential outliers. It also shows the means (red squares) for each condition and an outlier (black dot) in the Ground condition.

Exploratory analyses were conducted on the individual questionnaire items. Setting alpha to 0.05 for each comparison, "eye strain" and "stomach awareness" were significantly greater in the Ground condition, while "difficulty concentrating" and "increased salivation" were significantly greater in the Water condition (Table 1). However, these differences were not significant after applying a Bonferroni correction. A second exploratory approach was adopted by computing Bayes factors for each SSQ item. The Bayes factors associated with the "eye strain" and "stomach awareness" items stand out, indicating that these responses are especially unlikely to come from the same distribution in both conditions. This supports the differences highlighted by the Kruskal–Wallis tests.
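The exploratory per-item analysis could be sketched as follows, assuming the same placeholder layout: one Kruskal–Wallis test per item, Bonferroni-adjusted p-values, and a Bayes factor per item computed with the BayesFactor package. The specific Bayes factor model used by the authors is not stated, so a default two-sample ttestBF is shown here as an assumption.

```r
# Minimal sketch with placeholder data; requires the BayesFactor package.
library(BayesFactor)

set.seed(3)
n <- 38
items <- paste0("ssq_", sprintf("%02d", 1:16))
dat <- as.data.frame(matrix(sample(0:3, n * 16, replace = TRUE), nrow = n))
names(dat) <- items
dat$condition <- factor(rep(c("Ground", "Water"), length.out = n))

# One Kruskal-Wallis test per item, then Bonferroni correction across the 16 tests.
p_raw <- sapply(items, function(i) kruskal.test(dat[[i]], dat$condition)$p.value)
p_adj <- p.adjust(p_raw, method = "bonferroni")

# Default Bayes factor for a two-group comparison on each item (assumed model choice).
bf <- sapply(items, function(i) {
  extractBF(ttestBF(formula = as.formula(paste(i, "~ condition")), data = dat))$bf
})

round(data.frame(p_raw, p_adj, bf), 3)
```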

Table 1 Mean, standard deviation, median, interquartile range, Kruskal–Wallis’ χ2 (1, n = 38) and p-value, and Bayes factors for the 16 SSQ items for each condition.

Relationship between presence, VIMS and vection

Non-parametric Spearman's rank correlation analyses were performed on the variables in order to evaluate their relationships in both conditions. As presented in Table 2, while most of the relationships between the variables are weak and non-significant, it is interesting to note the moderate but non-significant negative correlation between VIMS and presence.
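A sketch of the correlation analysis, with placeholder data for a single condition: Spearman's rank correlations between the three measures. Ties in the placeholder data may trigger a warning about approximate p-values, which does not affect the rho estimates.

```r
# Minimal sketch with placeholder data for one condition (n = 19).
set.seed(4)
d <- data.frame(
  vection  = rlnorm(19),                        # distance estimate in meters
  vims     = sample(0:12, 19, replace = TRUE),  # SSQ sum score
  presence = runif(19, 1, 5)                    # mean presence rating
)

cor.test(d$vims,     d$presence, method = "spearman")  # VIMS vs. presence
cor.test(d$vims,     d$vection,  method = "spearman")  # VIMS vs. vection
cor.test(d$presence, d$vection,  method = "spearman")  # presence vs. vection
```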

Table 2 Correlation coefficients (rho) and associated p values for comparisons between VIMS, presence and vection.

Presence

A Shapiro–Wilk Normality test confirmed normal distribution of the presence scores (p = 0.592). There was no significant difference in presence (t = 0.986, p = 0.331, n = 38) between the Ground (M = 3.29, SD = 0.75) and the Water conditions (M = 3.5, SD = 0.55).
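Because the presence scores did not deviate from normality, a standard two-sample t-test is appropriate; a minimal sketch with placeholder data follows.

```r
# Minimal sketch with placeholder data (mean presence ratings on a 1-5 scale).
set.seed(5)
d <- data.frame(
  presence  = rnorm(38, mean = 3.4, sd = 0.65),
  condition = rep(c("Ground", "Water"), each = 19)
)

shapiro.test(d$presence)               # check normality before the parametric test
t.test(presence ~ condition, data = d) # Welch two-sample t-test by default
```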

Discussion

The contribution of this study is three-fold. First, this paper provides the first empirical results comparing vection, VIMS and presence felt by participants in VR either standing on the ground or floating in water. Second, this paper presents a new method to explore vection and offers new research questions in relation to water immersion. Third, as the field of VR is quickly expanding, this study presents and demonstrates the feasibility of a new kind of VR experience in water.

Vection

In this study, we compared the impact of a VR experience on vection, VIMS and presence as the participants were either standing on the ground or floating in water. The results indicated enhanced vection in water as the distance estimation was significantly greater in the Water condition. Previous research on vection has described the relative contributions of different sensory modalities. In the case of this first study on vection while floating in water, it is important to discuss the different sensory inputs experienced by the participants in the two conditions.

The neutral buoyancy experienced when immersed in water has been widely used to study physiological and psychological human reactions to weightlessness by mimicking a reduced gravity field63. In this study, participants were placed in conditions mimicking different levels of gravity, from Earth gravity in the Ground condition to reduced gravity in the Water condition. The role of the visual-vestibular interaction has been studied in the microgravity induced during parabolic flights, when the vestibular system no longer provides the "down" reference18,19. In this case, the body relies more heavily on visual input to evaluate self-motion, even during very brief exposure to microgravity18. Since weightlessness prevents appropriate vestibular input, other sensory systems such as vision are relied on more heavily19. Researchers18 demonstrated that vection magnitude induced by visual stimuli is significantly greater during microgravity. Besides the greater reliance on the visual system when vestibular input is diminished during microgravity, these researchers18 also suggested that microgravity decreases the conflict between visual and vestibular information. In this way, our findings align with previous studies on the positive impact of microgravity on vection, as the increased reliance on vision that occurs during parabolic flight also seems to occur during water immersion.

To the best of our knowledge, the prone position adopted by participants in the Water condition has not been implemented in previous studies, making it difficult to infer the underlying mechanisms explaining how the different positions in the Ground (standing) and Water (prone) conditions affected vection. Although the prone position has not been previously tested, related research indicates that participants in a supine position (lying on the back) demonstrate faster vection onset than seated participants, particularly for visual motion parallel to the ground14. Other research has shown that tilting the body away from vertical increases perceived vection parallel to the body axis16.

Second, the somatosensory input differed between conditions. The role of the somatosensory system has been explored by scholars23 investigating the impact of directional wind on participants' faces, suggesting forward self-motion, combined with visual input. They found that participants experiencing both visual and somatosensory motion cues experienced greater vection than participants experiencing visual cues only. Murata and colleagues24 combined somatosensory and proprioceptive cues while blindfolding the participants to remove visual cues. The participants were seated on a horse-riding machine and swayed back and forth while directional wind was blown over their whole body. Consistent with Seno and colleagues'23 findings, participants in the wind condition experienced greater vection than those in the condition lacking somatosensory cues of motion. In our study, the participants in the Water condition received somatosensory input from the water moving around their body. In this way, the water could have played a role similar to that of the wind used in previous studies23,24 and induced greater vection than in the Ground condition. It should be noted, however, that in our case the water movement was not directional.

Third, the proprioceptive system of muscles, joints, and tendons that provides biomechanical information about motion also contributes to vection and has been investigated mainly in relation to walking, but also in relation to breaststroke movements. Researchers28 investigated how breaststroke movements with the arms and hands of standing participants would influence vection compared to passive viewing of an optic flow. They also investigated the impact of the congruence between the breaststroke movements and the optic flow. They showed that breaststroke movements (regardless of congruence) increased vection compared to the passive condition, and suggested that congruent breaststroke movements result in greater vection than incongruent ones. In our case, the participants in the Water condition were able to engage in swimming movements that could have had a positive impact on vection. Participants' behavior in the current study ranged from floating motionless to actively engaging in swimming motions, although these behaviors were not explicitly coded and analyzed. Another important proprioceptive difference between conditions is that participants in the Water condition had no contact with stable features of the environment. In contrast, participants in the Ground condition had constant contact between their feet and the stable ground, which provided biomechanical information about their stationary position.

The inputs from the different sensory systems involved in this study constitute a range of factors that could explain the significant difference in vection between the two conditions. As this is the first study of vection experienced in VR while floating in water, this result opens up a wide range of new research questions that can now be investigated in an aquatic environment.

Visually induced motion sickness (VIMS)

VIMS was measured with the commonly used Simulator Sickness Questionnaire (SSQ64), which contains 16 items that can be separated into three subscales indicative of disorientation, oculomotor symptoms, and nausea. As the reliability of two of the three subscales was too low, we looked primarily at the overall SSQ score, for which there was no significant difference between conditions. As this is the first time this scale has been used in a floating situation, we decided to explore the 16 items individually. We found that two items scored higher in the Water condition ("increased salivation" and "difficulty concentrating") and two others scored higher in the Ground condition ("eyestrain" and "stomach awareness"), although these differences did not withstand correction for multiple comparisons.

The higher score on the "increased salivation" item for participants in the Water condition could potentially be attributed to the different buccal situations in the two conditions. In the Water condition, participants wore a mouthpiece and breathed through their mouth. The unfamiliarity of the equipment might have contributed to less swallowing of saliva compared to the more natural situation in the Ground condition. In this way, this item might not contribute to VIMS when participants are equipped with a mouthpiece.

The higher score on the "difficulty concentrating" item for participants in the Water condition could be attributed to the unusual situation in which the participants found themselves. As participants were floating in the pool with unfamiliar equipment, the "difficulty concentrating" item might be an indication of something other than VIMS and might thus not be a reliable item in this specific context.

The higher score on the "eyestrain" item for participants in the Ground condition might be due to the fact that the headset, although using the same phone, differed between conditions, potentially creating a discrepancy in ocular comfort. In addition, in the Water condition the water itself acts as a lens, which is not present in the Ground condition. This difference in equipment between conditions could explain the difference observed on this SSQ item.

The difference in "stomach awareness" scores between the Ground and Water conditions could potentially be an indication of greater VIMS in the Ground condition. Standing on the ground while experiencing visual motion might trigger the sensory conflict that has been argued to cause VIMS39,40. In this way, this could be the very first indication that experiencing VR while immersed in water might lessen VIMS. As described earlier, latency can also contribute to VIMS22. In this study, as the same device was used in both conditions, all participants were exposed to the same potential effect of latency. Moreover, the latency of the Samsung S8 remains below approximately 20 ms, a level that seems unlikely to cause VIMS, although latency perception varies among people59,65.

Presence

Presence, the psychological state of feeling present in a virtual environment66, was measured with six questions. As previously demonstrated67, presence tends to be positively correlated with immersion. As an example, in studies where participants experience similar virtual activities with different levels of immersion, such as a VR headset versus a laptop computer, participants in the more immersive conditions tend to experience greater presence68. It has also been argued that a system is more likely to be immersive if it offers high-fidelity simulation through multiple sensory modalities66. In this study, the participants in the Water condition were indeed subjected to a more immersive VR experience, as the water immersion increased the alignment between visual and non-visual sensory modalities. Surprisingly, the presence reported by the participants did not differ between conditions. Hypothetically, the participants in the Water condition might have been distracted by the equipment, or the lack of spatial grounding might also have affected presence. It would be interesting to investigate whether giving them more time to acclimate to the procedure would mitigate this potential distraction. Further studies exploring the feeling of presence in aquatic VR are required to confirm our results.

Relationship between vection, VIMS and presence

While researchers are still trying to clarify the relationship between vection, VIMS and presence, the sensory conflict theory seems to play a key role in making sense of these relationships. A negative relationship seems to exist between VIMS and vection41 and between VIMS and presence57, the latter of which seems to be mediated by vection. This study aimed to contribute to the knowledge concerning these relationships by exploring potential correlations between the three variables in the two conditions. While the results did not reveal any significant correlation, preventing us from drawing any strong conclusions, VIMS and presence showed a moderate negative correlation consistent with past research57.

Limitations and future studies

This very first study of the impact of aquatic VR on vection, VIMS and presence has some limitations that need to be addressed and taken into account in future studies. First, the study was ended prematurely due to the COVID-19 pandemic. This resulted in fewer data points than we had hoped to collect, although the final data set is still larger than those of many published studies on the psychology of VR. For example, in Table 1 of the meta-analysis on immersion and presence by Cummings and Bailenson67, almost half of the studies (38 out of 83) had samples of 38 participants or fewer. Nonetheless, we recognize that we might be missing effects due to the low number of data points (type II error) and look forward to replicating and extending this work once the pandemic ends.

Another limitation might reside in the measurement tool used to assess VIMS in this new environment. The SSQ might not be the most appropriate tool for investigating VIMS in an aquatic environment, and we therefore encourage future researchers to investigate other VIMS scales. A shorter scale focusing on nausea, such as the misery scale69, which has been adapted to study VIMS and cybersickness70,71, might be a better choice as it focuses mainly on the stomach-awareness aspect of motion sickness, and it should be investigated in future studies. The fast motion sickness scale, in which participants verbally rate their potential sickness from 0 (no sickness at all) to 20 (frank sickness), could also be used72. Besides being potentially better adapted to the reality of an experiment in the water, the misery scale and the fast motion sickness scale have the advantage of being shorter than the SSQ.

In this study, vection was measured solely through distance estimation. For future studies, an important addition to strengthen the measurement of vection would be vection intensity, a commonly measured variable in vection research7,32. To do so, after the treatment, participants would be asked to rate the intensity of vection on a given scale.

Another factor to pay attention to in future studies resides in the fact that in the Water condition two layers of transparent material sit between the screen of the phone and the eyes of the participants, namely the transparent plastic of the diving mask and a layer of water between the mask and the phone. These two layers are not present in the Ground condition and their impact should be investigated while tapping into research on vision through transparent layers73 and refractive structure74.

Scholars in the field of vection have shown how complex the relationship between vection, VIMS and presence is. While this study measured these three variables and examined their relationships, no significant associations were found. In future studies, we will continue to investigate how these three variables affect each other.

In this study, the visual and auditory inputs were the same across conditions while the proprioceptive cues differed between conditions. It would be valuable in future studies to independently examine the auditory and visual inputs in order to better understand how each of them contributes to vection, VIMS and presence.

Conclusion

For almost 150 years, perception scholars have been creative in finding ways to investigate the role of different sensory inputs on vection. This paper opens new possibilities in how to study vection in a new environment. The findings discussed above present new research questions that may prove interesting to scholars in this field. The methodology reported here will be valuable in making the aquatic environment a new component of the vection research field. Developing, testing, and implementing these new tools raises practical challenges, and we believe this work will serve as a blueprint for future research.

Vection has been a central question in the field of VR, and VIMS has been a constant struggle in the effort to make vection more seamless in VR. Moreover, the field of VR has been extremely creative in developing immersive experiences. Aquatic VR presents a new frontier in the field of virtual immersion that can be explored thanks to the equipment used in this study. While more research is needed, aquatic VR might trigger vection without the associated VIMS and thus make VR activities comfortable enough to keep pushing the boundaries of what can be experienced in VR.

This first empirical work investigating the impact of a VR experience for participants floating in water found that vection is enhanced by the aquatic environment. This study also suggests that presence and VIMS are not affected by the aquatic environment. The encouraging results concerning vection in water call for further exploration of the underlying sensory mechanisms involved in this enhanced vection. The findings also challenge the use of well-established tools to measure VIMS in a novel situation. In sum, this study raises a wide range of new empirical and methodological questions concerning the perceptual experiences associated with water immersion. This constitutes an encouraging invitation to the fields of vection and VR to explore further the affordances of being immersed in water while simultaneously immersed in a virtual environment.

Method

Participants

The study aimed at collecting data from 30 participants in each condition. Flyers and emails recruiting participants comfortable with a snorkel were distributed in the community surrounding the pool where the study would take place. The study was planned to run from February until the end of March 2020, and the time slots filled quickly. After data had been collected from 39 participants, the pandemic hit and the study had to be stopped, leaving fewer data points than expected. One participant decided to discontinue their participation and was excluded from the analysis. The final sample (n = 38) consisted of 23 females (n = 12 in the Ground condition, n = 11 in the Water condition) and 15 males (n = 8 in the Ground condition, n = 7 in the Water condition). Their ages ranged from 18 to 45 years (M = 24.8, SD = 7.5). Twenty identified as White Caucasian (52.6%), 6 as Chinese (15.8%), 3 as multiracial (7.9%), 2 as African American (5.3%), 2 as Indian (5.3%), 2 as Japanese (5.3%), 1 as Latino (2.6%), 1 as Middle Eastern (2.6%) and 1 declined to answer (2.6%). All participants volunteered for the experiment and were paid an honorarium for their participation. All protocols were carried out in accordance with current guidelines and regulations. The methods were approved by Stanford's Institutional Review Board and we obtained informed consent from all participants.

Equipment

In both conditions, the virtual environment was displayed on a Samsung Galaxy S8 SM-G950F mobile phone running a custom operating system designed by Ballast Technologies, Inc. The operating system allows for seamless activation of the VR content. The VR experience is started and stopped by touching a near-field communication (NFC) card to the back of the phone (encased in different gear depending on the condition, see below), allowing for operation without menu navigation or user interfaces. The researchers used two NFC cards, each assigned a special code to start or stop the VR experience entitled OceanDIVR, a 5-min long VR activity developed by Ballast Technologies, Inc. In this VR activity, the participant embarks on an approximately 285-m long drift dive (distance along the x-axis: 270.5 m; distance along the y-axis: −90.5 m) in the ocean, visiting underwater wrecks, caves and submarines while encountering manta rays, sharks, and a pod of singing humpback whales. The virtual dive simulates a self-motion of 1.77 m per second.
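As a quick arithmetic check on the dive geometry reported above (straight-line geometry assumed), the stated x- and y-axis distances imply a net displacement that matches the approximately 285 m mentioned in the text.

```r
# Arithmetic check using the axis distances reported in the text.
dx <- 270.5            # m, distance along the x-axis
dy <- -90.5            # m, distance along the y-axis
sqrt(dx^2 + dy^2)      # ~285.2 m, matching the ~285-m drift dive described above
```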

Ground condition

In the Ground condition, the Samsung S8 mobile phone displaying the VR activity was encased in a Samsung Gear VR headset compatible with the Samsung S8. Headphones were connected to the phone for audio.

Water condition

The Samsung S8 mobile phone displaying the VR activity, encased in a waterproof enclosure, was placed in the DIVR headset, designed to incorporate water between the lenses and the display to minimize buoyancy of the headset and maximize visual clarity while submerged in water (Fig. 3). The sound traveled from the phone to the participants through the water.

Figure 3

DIVR headset. Image courtesy of Ballast Technologies, Inc.

The participants were equipped with a flotation belt around their hips and a snorkel so they could comfortably float on their stomach. Each participant was assigned an interchangeable mouthpiece that was sanitized and recycled after use. In order to prevent the participants from swimming around the pool or drifting away, the flotation belt clipped into an elastic tether attached to an anchor placed on the pool floor. The elastic tether had medium resistance, so that users would not feel any abrupt tension on the tether. With their vision obscured by the headset, the participants were not aware of being held in place by the tether and anchor, as the researcher hooked the tether to the flotation belt after the participant put the headset on. Figure 4 illustrates a participant with the equipment.

Figure 4

Equipment for the Water condition. The participant was equipped with the DIVR headset (A) encasing a Samsung S8 mobile phone displaying the VR activity. During the activity, the participant with their head underwater breathed through a snorkel (B). In order to keep the participant positively buoyant, a flotation belt (C) was placed around their hips. The flotation belt clipped into an elastic tether (D) attached to a 40-lb anchor placed on the pool floor (F). A pair of orange armbands (E) were attached to the tether mid-water and were used as an orange mark to draw the participant's attention to where they would start the VR activity before entering the water.

Procedure

The study was run in February 2020. Participants who signed up for the study were invited to meet the researcher on the deck of a local swimming pool to which they had free access (Fig. 5). Upon arrival, participants were instructed to read and sign the consent form along with a form for paid participation. Since international participants were expected to join the study, the researcher asked them which unit of length they were most comfortable with: meters or yards. By letting the participants choose, we tried to limit the inaccuracy due to a lack of familiarity with an imposed unit of length. Depending on the participant's choice, the researcher presented a ruler of 1 m or 1 yard and instructed the participant to look at it for 5 s. The participants were then asked to sit down and, wearing headphones to avoid background noise distraction, took a pre-survey including demographic questions. After completing the pre-survey, the participants were assigned to their condition: Ground or Water.

Figure 5

Bird’s eye view of the pool where the study was conducted. The participants reached the deck of the pool through the entrance and walked toward the bench (A) where they completed the pre-survey. Participants assigned to the Ground condition were invited to walk to the orange mark on the ground (D) where they started their VR experience. Participants assigned to the Water condition were first invited to go to the edge of the pool (B) where the researcher in charge of the Water condition briefed them and presented the equipment. The participants then entered the water through the stairs (C) and swam to the orange mark (E) where they started their VR experience.

Ground condition

The researcher pointed at an orange duct tape cross on the floor situated approximately 15 m away and asked the participant: "Do you see the orange mark on the floor?" After receiving an affirmative answer from the participant, the researcher said: "This is where you will start your VR experience" and invited the participant to follow them to the orange mark. The participant was informed that they would watch a 5-min video with a VR headset. The researcher explained that the participant could look all around during the experience. The researcher also made clear that the participant could stop the activity at any point if anything was uncomfortable. Finally, the researcher instructed the participant to indicate when the VR activity was over but not to remove the headset on their own. The researcher helped the participant adjust the headset, placed the unplugged headphones on the participant's head, and then placed the activation card labeled "Ocean DIVR" in front of the headset to start the experience. When the researcher heard the sound indicating that Ocean DIVR had started, they plugged the headphones into the headset. The researcher stayed next to the participant for the duration of the VR experience for safety purposes.

When the participant informed the researcher that the activity was over, the researcher removed the headphones and reminded the participant to keep the headset on. Then the researcher read the following instructions to the participant:

Before removing the headset, I would like you to think about your position in the real world. Can you tell me which unit of length you selected earlier?

The participant then reminded the researcher whether they selected meters or yards.

Remember that I showed you the length of a [meter or yard]. How far away in the physical world do you think you are from the orange mark where you started the VR activity?

The participant gave their answer orally, and the researcher recorded it. Then, the researcher removed the headset and invited the participant to walk back to the bench to take a second survey while wearing headphones to avoid background noise distraction. Finally, the participant was thanked for their participation and informed that they would receive their compensation by the end of the study.

Water condition

The participant was invited to come to the edge of the pool to be briefed about the experience. The researcher informed them that they would watch a 5-min video with a VR headset while floating horizontally in the pool. The researcher told the participant that they could look all around during the experience. The researcher made clear that the participant could stop the activity at any point if anything was uncomfortable. The headset, the snorkel, and the flotation belt were shown to the participant along with a picture of a fully equipped individual in the water so that the participant could picture themselves in the situation. The participant was also told that the researcher would hook them up to a safety line that would help the researcher follow the participant around and keep them safe. The participant was also informed that the procedure would include minimal physical contact, such as holding their hand in order to help them maintain their balance when moving between the floating and standing positions in the water. The researcher asked if they would be okay with this minimal physical contact, and all participants agreed. The participant was then given the opportunity to ask questions. Finally, the researcher invited the participant to get ready to go in the water by removing the clothes they were wearing over their swimsuit.

Instead of the orange duct-tape cross used in the Ground condition, the researcher pointed at two orange armbands hooked to the anchor and situated approximately 15 m away and asked the participant: "Do you see the orange mark in the water?" After receiving an affirmative answer from the participant, the researcher said: "This is where you will start your VR experience" and invited the participant to follow them to the orange armbands in the pool.

Before starting the VR activity, the researcher assisted the participant in putting on and adjusting the headset. The participant was then instructed to place the snorkel in their mouth and to float horizontally with their head underwater to test the mask and the snorkel. If the equipment did not feel right, the researcher would assist in adjusting the seal of the mask, tightening the mask, or making sure the snorkel was correctly positioned. Once the participant felt that the equipment was positioned correctly, the researcher started the Ocean DIVR experience by placing the activation card in front of the headset and invited the participant to return to the horizontal floating position. At that moment, the researcher hooked the flotation belt to the anchor and stayed next to the participant for the duration of the VR experience for safety purposes. When the participant tried to stand up at the end of the activity, the researcher grabbed their hand and helped them land on their feet without touching the anchor.

When the participant was securely standing, the researcher read the same script as in the Ground condition so that the participant could estimate how far away they thought they were standing from their starting point. Then, the headset was removed and the participant was invited to exit the pool and to use the provided towels and robes to dry off and get warm before taking a second survey, while wearing headphones to avoid background noise distraction. Finally, the participant was thanked for their participation and informed that they would receive their compensation by the end of the study.

Measure

Vection

We measured vection through a perceived body displacement technique known as egocentric distance estimation. This technique quantifies how far away from their initial position participants thought they were when they completed the VR experience. Similar to the reference point method used by previous researchers28, the participants started the VR activity at a reference point (a large orange "X" on the floor for the Ground condition and two orange armbands floating mid-water for the Water condition). At the end of the VR activity, before removing their headset, the participants were asked to estimate, in their chosen unit of length, how far away they were from their reference point75.
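If some participants answered in yards, those estimates presumably had to be converted to meters before analysis (the paper does not state this step explicitly); a one-line conversion, for illustration only:

```r
# Convert a distance estimate given in yards to meters (1 yard = 0.9144 m).
yards_to_meters <- function(x) x * 0.9144
yards_to_meters(3)   # e.g., an estimate of 3 yards is ~2.74 m
```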

Visually induced motion sickness (VIMS)

The Simulator Sickness Questionnaire (SSQ64) was used to measure VIMS. This questionnaire has been frequently used in the fields of VR and vection. The SSQ includes 16 items surveying the occurrence of motion sickness symptoms, such as nausea, eyestrain, stomach awareness and general discomfort, rated on a 4-point Likert scale ranging from "not at all" to "severe". The participants filled out the SSQ right after the experimental treatment. Kennedy et al.64 suggested clustering the items into three factors: Nausea, Oculomotor and Disorientation. This method was not satisfactory, as the reliability of two of the three subscales was low (Nausea: Cronbach's alpha = 0.26 [M = 0.25, SD = 0.22], Oculomotor: Cronbach's alpha = 0.57 [M = 0.28, SD = 0.27], Disorientation: Cronbach's alpha = 0.72 [M = 0.23, SD = 0.3]). Instead of using the three factors from Kennedy et al.64, and since this is an exploratory study, we took two approaches. First, as the overall reliability of the 16 items together was good (Cronbach's alpha = 0.73 [M = 0.23, SD = 0.22]), we looked at the SSQ score (a method previously used76). Second, we analyzed each item separately (as previously done77).
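The reliability figures reported above could be obtained with the psych package as sketched below; the data are placeholders and the item-to-subscale assignment shown is purely illustrative (see Kennedy et al.64 for the actual scoring keys).

```r
# Minimal sketch with placeholder data; requires the psych package.
library(psych)

set.seed(6)
n <- 38
ssq <- as.data.frame(matrix(sample(0:3, n * 16, replace = TRUE), nrow = n))
names(ssq) <- paste0("ssq_", sprintf("%02d", 1:16))

# Illustrative subscale groupings only; not the published SSQ keys.
nausea_items         <- paste0("ssq_", sprintf("%02d", 1:7))
oculomotor_items     <- paste0("ssq_", sprintf("%02d", 6:12))
disorientation_items <- paste0("ssq_", sprintf("%02d", 10:16))

alpha(ssq[, nausea_items])$total$raw_alpha          # subscale reliability
alpha(ssq[, oculomotor_items])$total$raw_alpha
alpha(ssq[, disorientation_items])$total$raw_alpha
alpha(ssq)$total$raw_alpha                          # reliability of all 16 items
```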

Presence

The presence scale included six items adapted from Nowak and Biocca78, rated on a 5-point Likert scale from "Not at all" to "Very strongly". The scale included items such as "How much did it feel as if you visited another place?" and "How much was the virtual world like the real world?" Since the experiment took place outdoors, one item asked to what extent the weather conditions distracted the participants; this item was reverse coded. The reliability of the scale was good, with a Cronbach's alpha of 0.81 (M = 3.66, SD = 0.59).
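A minimal sketch of the presence scoring, assuming the items were coded 1-5 and that the distraction item was reverse coded as 6 minus the response (both assumptions, since the exact coding is not specified); data and column names are illustrative.

```r
# Minimal sketch with placeholder data (six presence items coded 1-5).
library(psych)

set.seed(7)
n <- 38
pres <- as.data.frame(matrix(sample(1:5, n * 6, replace = TRUE), nrow = n))
names(pres) <- paste0("presence_", 1:6)

pres$presence_6 <- 6 - pres$presence_6   # reverse-code the distraction item (assumed coding)
pres$presence_score <- rowMeans(pres[, paste0("presence_", 1:6)])

alpha(pres[, paste0("presence_", 1:6)])$total$raw_alpha  # scale reliability
mean(pres$presence_score); sd(pres$presence_score)
```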

Approval for human experiments

All the participants volunteered for the experiment and were paid an honorarium for their participation. All protocols were carried out in accordance with current guidelines and regulations. The methods were approved by Stanford’s Institutional Review Board and we obtained informed consent from all the participants.

References

  1. 1.

    Gibson, J. J. The Senses Considered as Perceptual Systems (Houghton Mifflin, Boston, 1966).

    Google Scholar 

  2. 2.

    Rieser, J. J., Pick, H. L., Ashmead, D. H. & Garing, A. E. Calibration of human locomotion and models of perceptual-motor organization. J. Exp. Psychol. Human 21, 480–497 (1995).

    CAS  Article  Google Scholar 

  3. 3.

    Dichgans, J. & Brandt, T. Visual-vestibular interaction: Effects on self-motion perception and postural control in Handbook of Sensory Physiology Vol. VIII: Perception (eds. Held R., Leibowitz H. and Teuber H. L.) 755–804 (Springer, Heidelberg, 1978).

  4. 4.

    Howard, I. P. Human Visual Orientation (Wiley, New York, 1982).

    Google Scholar 

  5. 5.

    Mach, E. Grundlinien Der Lehre Von Der Bewegungsempfindung (Engelmann, Berlin, 1875).

    Google Scholar 

  6. 6.

    Brandt, T., Dichgans, J. & Koenig, E. Differential effects of central versus peripheral vision on egocentric and exocentric motion perception. Exp. Brain Res. 16, 476–491 (1973).

    CAS  PubMed  Article  PubMed Central  Google Scholar 

  7. 7.

    Keshavarz, B., Phillip-Muller, A. E., Hemmerich, W., Riecke, B. E. & Campos, J. L. The effect of visual motion stimulus characteristics on vection and visually induced motion sickness. Displays 58, 71–81 (2019).

    Article  Google Scholar 

  8. 8.

    Brandt, T., Wist, E. R. & Dichgans, J. Foreground and background in dynamic spatial orientation. Percept. Psychophys. 17, 497–503 (1975).

    Article  Google Scholar 

  9. 9.

    Palmisano, S., Gillam, B. J. & Blackburn, S. G. Global-perspective jitter improves vection in central vision. Perception 29, 57–67 (2000).

    CAS  PubMed  Article  PubMed Central  Google Scholar 

  10. 10.

    Palmisano, S., Allison, R. S., Kim, J. & Bonato, F. Simulated viewpoint jitter shakes sensory conflict accounts of vection. Seeing Perceiv. 24, 173–200 (2011).

    Article  Google Scholar 

  11. 11.

    Kim, J. & Palmisano, S. Effects of active and passive viewpoint jitter on vection in depth. Brain Res. Bull. 77, 335–342 (2008).

    PubMed  Article  PubMed Central  Google Scholar 

  12. 12.

    Morimoto, Y., Sato, H., Hiramatsu, C. & Seno, T. Material surface properties modulate vection strength. Exp. Brain Res. 237(10), 2675–2690 (2019).

    PubMed  Article  PubMed Central  Google Scholar 

  13. 13.

    Kim, J., Khuu, S. & Palmisano, S. Vection depends on perceived surface properties. Atten. Percept. Psychophy 78(4), 1163–1173 (2016).

    Article  Google Scholar 

  14. 14.

    Kano, C. The perception of self-motion induced by peripheral visual information in sitting and supine postures. Ecol. Psychol. 3, 241–252 (1991).

    Article  Google Scholar 

  15. 15.

    Young, L. R., Shelhamer, M. & Modestino, S. M.I.T./Canadian vestibular experiments on the Spacelab-1 mission: 2. Visual vestibular tilt interaction in weightlessness. Exp. Brain Res. 64, 299–307 (1986).

    CAS  PubMed  PubMed Central  Google Scholar 

  16. 16.

    Nakamura, S. & Shimojo, S. Orientation of selective effects of body tilt on visually induced perception of self-motion. Percept. Mot. Skills 87, 667–672 (1998).

    CAS  PubMed  Article  PubMed Central  Google Scholar 

  17. 17.

    Allison, R. S., Zacher, J. E., Kirollos, R., Guterman, P. S. & Palmisano, S. Perception of smooth and perturbed vection in short-duration microgravity. Exp. Brain Res. 223, 479–487 (2012).

    PubMed  Article  PubMed Central  Google Scholar 

  18. 18.

    Cheung, B. S. K., Howard, I. P. & Money, K. E. Visually-induced tilt during parabolic flights. Exp. Brain Res. 81, 391–397 (1990).

    CAS  PubMed  Article  PubMed Central  Google Scholar 

  19. 19.

    Young, L. R. & Shelhamer, M. Microgravity enhances the relative contribution of visually-induced motion sensation. Aviat. Space Envir. Md. 61, 525–530 (1990).

    CAS  Google Scholar 

  20. 20.

    Dichgans, J. & Brandt, T. Visual–vestibular interaction: effects on self-motion perception and postural control: In Perception (eds. R. Held, G. W. Leibowitz and H.-I. Teuber), Handbook of Sensory Physiology, Vol. 8, 755–804. Springer, Berlin 1978).

  21. 21.

    Weech, S. & Troje, N. F. Vection latency is reduced by bone-conducted vibration and noisy galvanic vestibular stimulation. Multisens. Res. 30(1), 65–90 (2017).

    Article  Google Scholar 

  22. 22.

    Israël, I. & Warren, W. H. (2005). Vestibular, proprioceptive, and visual influences on the perception of orientation and self-motion in humans. In Head Direction Cells and the Neural Mechanisms of Spatial Orientation (eds. S. I. Wiener and J. S. Taube) 347–381. (MIT Press, Cambridge, 2005).

  23. 23.

    Seno, T., Ogawa, M., Ito, H. & Sunaga, S. Consistent air flow to the face facilitates vection. Perception 40, 1237–1240 (2011).

    PubMed  Article  PubMed Central  Google Scholar 

  24. 24.

    Murata, K., Seno, T., Ozawa, Y. & Ichihara, S. Self-motion perception induced by cutaneous sensation caused by constant wind. Psychology 5, 1777–1782 (2014).

    Article  Google Scholar 

  25. 25.

    Ash, A., Palmisano, S., Ahorp, D. & Allison, R. S. Vection in depth during treadmill walking. Perception 42, 562–576 (2013).

    PubMed  Article  PubMed Central  Google Scholar 

  26. 26.

    Riecke, B. E., Freiberg, J. B. & Grechkin, T. Y. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?. J. Vis. 15, 1–15 (2015).

    Article  Google Scholar 

  27. 27.

    Seno, T., Funatsu, F. & Palmisano, S. Virtual swimming - Breaststroke body movements facilitate vection. Multisens. Res. 26, 267–275 (2013).

    PubMed  Article  PubMed Central  Google Scholar 

  28. 28.

    Kapralos, B., Zikovitz, D., Jenkin, M. R. & Harris, L. R. Auditory cues in the perception of self-motion. In Proceedings of the 116th AES convention 1–14 (AES, 2004).

  29. 29.

    Keshavarz, B., Hettinger, L. J., Vena, D. & Campos, J. L. Combined effects of auditory and visual cues on the perception of vection. Exp. Brain Res. 232, 827–836 (2014).

    PubMed  Article  PubMed Central  Google Scholar 

  30. 30.

    Shayman, C. S. et al. Frequency-dependent integration of auditory and vestibular cues for self-motion perception. J. Neurophysiol. 123, 936–944 (2020).

    PubMed  Article  PubMed Central  Google Scholar 

  31. 31.

    Palmisano, S. & Riecke, B. E. The search for instantaneous vection: An oscillating visual prime reduces vection onset latency. PLoS ONE 13, e0195886 (2018).

    PubMed  PubMed Central  Article  CAS  Google Scholar 

  32. 32.

    Riecke, B. E., Feuereissen, D., Rieser, J. J. & McNamara, T. P. Self-motion illusions (vection) in VR—Are they good for anything? In Proceedings of IEEE Virtual Reality 2005. 35–38 (IEEE, 2012).

  33. 33.

    Riecke, B. E., Väljamäe, A. & Schulte-Pelkum, J. Moving sounds enhance the visually-induced self-motion illusion (circular vection) in virtual reality. ACM T. Appl. Percept. 6, 7:1-7:27 (2009).

    Google Scholar 

  34. 34.

    Jürgens, R. & Becker, W. Perception of angular displacement without landmarks: evidence for Bayesian fusion of vestibular, optokinetic, podokinesthetic, and cognitive information. Exp. Brain Res. 174, 528–543 (2006).

    PubMed  Article  PubMed Central  Google Scholar 

  35. Hettinger, L. J., Schmidt-Daly, T. N., Jones, D. L. & Keshavarz, B. Illusory self-motion in virtual environments. In Handbook of Virtual Environments: Design, Implementation, and Applications (eds Hale, K. S. & Stanney, K. M.) 435–465 (Taylor & Francis Group, 2014).

  36. Lee, D. N. & Lishman, J. R. Visual proprioceptive control of stance. J. Hum. Mov. Stud. 1, 87–95 (1975).

  37. Klein, E., Swan, J. E., Schmidt, G. S., Livingston, M. A. & Staadt, O. G. Measurement protocols for medium-field distance perception in large-screen immersive displays. Proc. IEEE Virt. Real. 2009, 107–113 (2009).

  38. Riecke, B. E. & Schulte-Pelkum, J. Perceptual and cognitive factors for self-motion simulation in virtual environments: How can self-motion illusions (“vection”) be utilized? In Human Walking in Virtual Environments: Perception, Technology, and Applications (eds Steinicke, F., Visell, Y., Campos, J. & Lécuyer, A.) 27–54 (Springer, New York, 2013).

  39. Reason, J. T. Motion sickness adaptation: A neural mismatch model. J. R. Soc. Med. 71, 819–829 (1978).

  40. Bos, J. E., Bles, W. & Groen, E. L. A theory on visually induced motion sickness. Displays 29, 47–57 (2008).

  41. Keshavarz, B., Riecke, B. E., Hettinger, L. J. & Campos, J. L. Vection and visually induced motion sickness: How are they related? Front. Psychol. 6, 1–11 (2015).

  42. Weech, S., Moon, J. & Troje, N. F. Influence of bone-conducted vibration on simulator sickness in virtual reality. PLoS ONE 13(3), 1–21 (2018).

  43. Palmisano, S., Mursic, R. & Kim, J. Vection and cybersickness generated by head-and-display motion in the Oculus Rift. Displays 46, 1–8 (2017).

  44. Riccio, G. & Stoffregen, T. An ecological theory of motion sickness and postural instability. Ecol. Psychol. 3, 195–240 (1991).

  45. Templeman, J. N., Denbrook, P. S. & Sibert, L. E. Virtual locomotion: Walking in place through virtual environments. Presence 8, 598–617 (1999).

  46. Lee, J., Ahn, S. C. & Hwang, J. I. A walking-in-place method for virtual reality using position and orientation tracking. Sensors 18, 2832 (2018).

  47. Medina, E., Fruland, R. & Weghorst, S. VIRTUSPHERE: Walking in a human-size VR “hamster ball”. In Proceedings of the Human Factors and Ergonomics Society 52nd Annual Meeting 2102–2106 (2008).

  48. Bouguila, L. & Sato, M. Virtual locomotion system for large-scale virtual environment. Proc. IEEE Virt. Real. Conf. 2002, 291–292 (2002).

  49. Souman, J. L. et al. CyberWalk: Enabling unconstrained omnidirectional walking through virtual environments. ACM Trans. Appl. Percept. 8, 4:1–4:22 (2011).

  50. Iwata, H., Yano, H., Fukushima, H. & Noma, H. CirculaFloor. IEEE Comput. Gr. 25, 64–67 (2005).

  51. Steinicke, F. et al. Real walking through virtual environments by redirection techniques. J. Virt. Real. Broadcast. 6, 999–1004 (2009).

  52. Rothacher, Y., Nguyen, A., Lenggenhager, B. & Kunz, A. Visual capture of gait during redirected walking. Sci. Rep. 8, 1–13 (2018).

  53. Heeter, C. Being there: The subjective experience of presence. Presence Teleoper. Virt. Environ. 1, 262–271 (1992).

  54. Riecke, B. E., Schulte-Pelkum, J., Avraamides, M. N., von der Heyde, M. & Bülthoff, H. Cognitive factors can influence self-motion perception (vection) in virtual reality. ACM Trans. Appl. Percept. 3, 194–216 (2006).

  55. Chertoff, D. B. & Schatz, S. L. Beyond presence: How holistic experience drives training and education. In Handbook of Virtual Environments: Design, Implementation, and Applications (eds Hale, K. S. & Stanney, K. M.) 857–872 (Taylor & Francis Group, 2014).

  56. Larsson, P., Västfjäll, D. & Kleiner, M. Perception of self-motion and presence in auditory virtual environments. In Proceedings of the 7th Annual Workshop on Presence 252–258 (2004).

  57. Weech, S., Kenny, S. & Barnett-Cowan, M. Presence and cybersickness in virtual reality are negatively related: A review. Front. Psychol. 10, 1–19 (2019).

  58. Cooper, N. et al. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment. Perception 45, 332–333 (2016).

  59. Meehan, M., Razzaque, S., Whitton, M. C. & Brooks, F. P. Effect of latency on presence in stressful virtual environments. Proc. IEEE Virt. Real. 2003, 141–148 (2003).

  60. Welch, R. B., Blackmon, T. T., Liu, A., Mellers, B. A. & Stark, L. W. The effects of pictorial realism, delay of visual feedback, and observer interactivity on the subjective sense of presence. Presence Teleoper. Virt. Environ. 5, 263–273 (1996).

  61. Kim, J., Luu, W. & Palmisano, S. Multisensory integration and the experience of scene instability, presence and cybersickness in virtual environments. Comput. Hum. Behav. 113, 106484 (2020).

  62. Shapiro, S. S. & Wilk, M. B. An analysis of variance test for normality (complete samples). Biometrika 52, 591–611 (1965).

  63. Duddy, J. H. Weightless Simulation Using Water Immersion Techniques: An Annotated Bibliography (Lockheed Missiles & Space Company, 1967).

  64. Kennedy, R. S., Lane, N. E., Berbaum, K. S. & Lilienthal, M. G. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 3, 203–220 (1993).

  65. Elbamby, M. S., Perfecto, C., Bennis, M. & Doppler, K. Toward low-latency and ultra-reliable virtual reality. IEEE Netw. 32(2), 78–84 (2018).

  66. Slater, M. & Wilbur, S. A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence Teleoper. Virt. Environ. 6, 603–616 (1997).

  67. Cummings, J. J. & Bailenson, J. N. How immersive is enough? A meta-analysis of the effect of immersive technology on user presence. Media Psychol. 19, 272–309 (2016).

  68. Oh, C., Herrera, F. & Bailenson, J. The effects of immersion and real-world distractions on virtual social interactions. Cyberpsychol. Behav. Soc. Netw. 22, 365–372 (2019).

  69. Bos, J. E., MacKinnon, S. N. & Patterson, A. Motion sickness symptoms in a ship motion simulator: Effects of inside, outside, and no view. Aviat. Space Environ. Med. 76(12), 1111–1118 (2005).

  70. Bos, J. E., de Vries, S. C., van Emmerik, M. L. & Groen, E. L. The effect of internal and external fields of view on visually induced motion sickness. Appl. Ergon. 41, 516–521 (2010).

  71. van Emmerik, M. L., de Vries, S. C. & Bos, J. E. Internal and external fields of view affect cybersickness. Displays 32, 169–174 (2011).

  72. Keshavarz, B. & Hecht, H. Validating an efficient method to quantify motion sickness. Hum. Factors 53(4), 415–426 (2011).

  73. Dövencioğlu, D. N., van Doorn, A., Koenderink, J. & Doerschner, K. Seeing through transparent layers. J. Vis. 18(9), 1–19 (2018).

  74. Agarwal, S., Mallick, S. P., Kriegman, D. & Belongie, S. On refractive optical flow. Lect. Notes Comput. Sci. 3022, 483–494 (2004).

  75. Loomis, J. M. & Knapp, J. M. Visual perception of egocentric distance in real and virtual environments. In Virtual and Adaptive Environments (eds Hettinger, L. J. & Haas, M. W.) 21–46 (2003).

  76. Rosenberg, R. S., Baughman, S. L. & Bailenson, J. N. Virtual superheroes: Using superpowers in virtual reality to encourage prosocial behavior. PLoS ONE 8, e55003 (2013).

  77. Bruck, S. & Watters, P. A. Estimating cybersickness of simulated motion using the Simulator Sickness Questionnaire (SSQ): A controlled study. In Proceedings of the 2009 6th International Conference on Computer Graphics, Imaging and Visualization: New Advances and Trends (CGIV 2009) 486–488 (2009).

  78. Nowak, K. L. & Biocca, F. The effect of the agency and anthropomorphism on users’ sense of telepresence, copresence, and social presence in virtual environments. Presence Teleoper. Virt. Environ. 12, 481–494 (2003).

Acknowledgements

This research was partially supported by two National Science Foundation grants (AISL #1800922 and #1906728) and by the Knut och Alice Wallenbergs Stiftelse #20170440. Thanks to Ballast Technologies, Inc. for developing and loaning us the aquatic equipment, and for brainstorming and preparing this study with us. Thanks to Allan Evans for providing early insights into experimental design. Thanks to the research assistants, Manya Bansal, Lily Bigley Barnett, Chloe Huang, Sydney J. Maly, and Michael Ray Williams III, for helping run this study.

Author information

Contributions

G.F., A.Q., E.W. and J.B. conceived the project; all authors designed the experiments; the experiment was carried out by G.F. with the help of A.Q. and E.W.; G.F. analyzed the data with the help of A.Q.; G.F. wrote the paper with the help of A.Q., J.B. and J.K.; all authors checked the paper.

Corresponding author

Correspondence to Géraldine Fauville.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Fauville, G., Queiroz, A.C.M., Woolsey, E.S. et al. The effect of water immersion on vection in virtual reality. Sci Rep 11, 1022 (2021). https://doi.org/10.1038/s41598-020-80100-y

