Figure 1: The EyeMusic.

From: Cross-sensory transfer of sensory-motor information: visuomotor learning affects performance on an audiomotor task, using sensory-substitution

(a) An illustration of the EyeMusic SSD. The user wears a mobile camera mounted on a pair of glasses, which captures the colorful image in front of him. The EyeMusic algorithm translates the image into a combination of musical notes, conveyed via scalp headphones. These sounds enable the user to reconstruct the original image and, based on this information, perform the appropriate motor action. Inset: a close-up of the EyeMusic headgear. (b) An illustration of the experimental setup. The participant is seated at a table, controlling the joystick with his right hand. His hand and forearm are occluded from view; he sees the target and cursor locations on the computer screen (on VIS trials) and hears, via headphones, the soundscapes marking the target location (on SSD trials).