In theory, navigation toward a goal could be guided by several visual cues. The direction of a goal could be calculated with respect to the body (the 'egocentric direction' hypothesis). Alternatively, people might move in the direction that minimizes the error between the goal and the heading specified by the expanding radial pattern on the retina that is produced by self-movement ('optic flow'). In practice, these two strategies for visual control are difficult to distinguish because they predict the same behavior. However, Warren and colleagues from Brown University (page 213) created a virtual reality environment that allows their subjects to walk through a world in which the laws of optics are under experimental control. The authors can create conditions that never occur in the natural world, such as displacing the optic flow field from the actual direction of walking. In such a virtual world, the egocentric direction and optic flow hypotheses make different predictions about the shape of a subject's path toward a goal. The authors found that when little optic flow information was available, subjects' behavior was consistent with the egocentric direction hypothesis. However, when the environment was made more complex, for example by adding textured floors and ceilings, optic flow information increasingly dominated behavior. These results demonstrate that the visual system can control locomotion robustly under a variety of environmental conditions, and that optic flow cues are used to control human walking when they are available.
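To make the contrast between the two hypotheses concrete, the sketch below simulates a walker in two dimensions under each steering rule when the flow field is displaced from the true walking direction. This is an illustrative approximation, not the authors' model: the goal position, walking speed, flow displacement, and turn gain are arbitrary assumed values. Under the egocentric rule the walker simply turns toward the seen direction of the goal, so the displacement leaves the path essentially straight; under the flow rule the walker nulls the error between the flow-specified heading and the goal, so the same displacement predicts a curved path.

```python
import math

# Illustrative 2D simulation (not the authors' model) comparing two steering
# laws when the optic flow field is displaced from the true walking direction.
# All parameter values below are arbitrary choices for the sketch.

GOAL = (0.0, 10.0)                 # goal position, metres
SPEED = 1.0                        # walking speed, m/s
DT = 0.05                          # simulation time step, s
FLOW_OFFSET = math.radians(10.0)   # virtual displacement of the flow field
TURN_GAIN = 2.0                    # rate of heading correction, 1/s


def simulate(strategy):
    """Walk toward GOAL using either the 'egocentric' or the 'flow' rule."""
    x, y = 0.0, 0.0
    heading = math.pi / 2          # initially facing the goal (straight ahead)
    path = [(x, y)]
    while math.hypot(GOAL[0] - x, GOAL[1] - y) > 0.2 and len(path) < 2000:
        goal_dir = math.atan2(GOAL[1] - y, GOAL[0] - x)
        if strategy == "egocentric":
            # Steer toward the seen (egocentric) direction of the goal.
            error = goal_dir - heading
        else:
            # Steer so the perceived heading (true heading plus the flow
            # displacement) lines up with the goal, nulling the flow/goal error.
            perceived_heading = heading + FLOW_OFFSET
            error = goal_dir - perceived_heading
        # Wrap the error to [-pi, pi] and turn in proportion to it.
        error = math.atan2(math.sin(error), math.cos(error))
        heading += TURN_GAIN * error * DT
        x += SPEED * math.cos(heading) * DT
        y += SPEED * math.sin(heading) * DT
        path.append((x, y))
    return path


for strategy in ("egocentric", "flow"):
    path = simulate(strategy)
    # The maximum sideways deviation from the straight line to the goal
    # separates the predictions: near zero for the egocentric rule, a
    # visibly curved path for the flow rule.
    max_dev = max(abs(px) for px, _ in path)
    print(f"{strategy:11s}: steps={len(path):4d}, max lateral deviation={max_dev:.2f} m")
```

Running the sketch shows the egocentric walker heading straight to the goal while the flow-driven walker veers off and curves back, which is the qualitative difference in path shape that the virtual reality manipulation was designed to expose.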