a, Schematic of the optical pathway. The standard upright microscope illuminates the specimen (S) from behind and captures the image rays with a conventional objective. These rays propagate upward, in a collimated state, towards the oculars. A teaching module (Nikon Y-IDP) with a beam splitter (BS1) was inserted into the optical pathway in the collimated light space. This module was modified to accept a microscope camera (C), so that the specimen image relayed from BS1 was in focus at the camera sensor whenever the specimen was in focus for the microscope user. A second customized teaching module (Nikon T-THM) was inserted between the oculars and the first teaching module. The beam splitter in this module (BS2) was rotated 90° to combine light from the specimen image (SI) with light from the projected image (PI) of the microdisplay (P). The augmented reality display comprises a microdisplay and collimating optics, chosen so that the projected image matches the ocular field size (22 mm). In this prototype, we tested two microdisplays: one that supports arbitrary colors (RGB) and another, brighter display that supports only the green channel. The position of the collimator was adjusted to place the microdisplay in the virtual focal plane of the specimen. This collocation of SI and PI in the same plane minimizes their relative motion (parallax) when the observer's eye moves. Note that BS1 must precede BS2 in the optical pathway from objective to ocular, so that camera C sees a view of the specimen without the projection PI. An observer looking through the eyepiece (EP) sees PI superimposed onto SI. b, Photograph of the actual implementation, labeled with the corresponding modules.
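The choice of collimating optics described above can be illustrated with a simple thin-lens sketch (not from the paper; the tube-lens focal length of 200 mm is an assumption typical of Nikon infinity-corrected systems, and the 15 mm display width in the example is hypothetical). Placing the microdisplay at the collimator's focal plane sends its light into the collimated space; matching its angular subtense to that of the 22 mm ocular field stop sets the required focal length.

```python
# Hedged sketch: sizing the collimator so the projected image (PI) fills the
# same apparent field as the specimen image (SI) at the ocular.
#
# In the infinity-corrected space, the tube lens (focal length f_tube, assumed
# 200 mm here) forms the intermediate image at the ocular field stop (22 mm,
# per the text). A display of width w_display placed at the focal plane of a
# collimator of focal length f_coll subtends w_display / f_coll; equating this
# to the field stop's subtense 22 / f_tube gives:
#
#   f_coll = f_tube * w_display / 22

def collimator_focal_length(w_display_mm: float,
                            f_tube_mm: float = 200.0,
                            field_stop_mm: float = 22.0) -> float:
    """Collimator focal length (mm) mapping the display onto the full field."""
    return f_tube_mm * w_display_mm / field_stop_mm

# Hypothetical 15 mm-wide microdisplay:
print(round(collimator_focal_length(15.0), 1))  # 136.4
```

A display exactly as wide as the field stop (22 mm) would need a collimator with the same focal length as the tube lens, since the two then form a unit-magnification relay.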