

Accurate maps of visual circuitry

Such is the brain's complexity that even small neural circuits contain hundreds of neurons making thousands of connections. Connectivity and optical analyses provide close-up views of two such circuits. See Articles p.168, p.175 & Letter p.212

Understanding the biological machinery from which perception, action and thinking are built is not an easy undertaking. A big difficulty is that neuroscientists must deal with a problem of spatial scale — one in which components range from nanometre-sized synaptic junctions between neurons to centimetre-long connections between brain regions — and study these scales simultaneously. Three papers1,2,3 in this issue attack these problems of scale. Two of them (by Helmstaedter et al.1 and Takemura et al.2) use computational techniques to expand the field of neurons that can be encompassed in a high-resolution view. The third study (by Maisak et al.3) combines genetic and optical methods to record the activity of neurons that until now have been impossible to monitor owing to their small size. All three take as a model the retina, the first image-processing element in the chain that leads to visual perception.

The mammalian retina contains more than 60 different types of neuron, each of which has a distinct morphology and carries out a different function4. Within the retina, photoreceptor cells sense light, and their output is processed by amacrine, horizontal and bipolar cells. Downstream, roughly 20 different types of retinal ganglion cell transmit the final coded signal — 20 different representations of the visual input — to the brain. Unsurprisingly, therefore, sorting out neuronal connectivity in the retina has been a daunting task. Helmstaedter et al. (page 168) now report a connectome (a list of all synaptic connections) for an inner layer of the mouse retina. They achieve this by serial tissue sectioning and electron microscopy, followed by digital reconstruction of cells within the virtual three-dimensional solid that results.

The analysis reveals patterns of connections that could account for the stimulus selectivity of two types of ganglion cell. More fundamentally, the reconstruction, which contains 950 neurons (Fig. 1a), allows definitive classification of the types of bipolar cell. With only a slight refinement, the new classification matches extremely well with the existing understanding of these cells5, which was based largely on the identification of molecular markers using light microscopy. Helmstaedter and colleagues' work, however, adds value by providing enormously more precise descriptions of bipolar-cell structure and by effectively acting as a positive control, increasing confidence that analysis of the amacrine and ganglion cell types, which have resisted classification by previous techniques, will be equally definitive. And this is just the beginning: once these cell types are classified, the same basic methods should allow the synaptic connections among them to be deciphered.

Figure 1: The mechanism of motion discrimination in the visual system1,2,3.


a, Reconstruction of 24 out of 950 mouse neurons between two retinal layers, based on an electron microscopy data set. b, Photoreceptor cells slightly separated in space mediate inputs to Mi1 and Tm3 cells, through intermediary L1 and L2 cells. Their outputs converge on the T4 cells, which, because of the spatial separation of the inputs, can discriminate movements occurring in different directions. A similar mechanism is thought to occur in T5 cells, although the intermediary cells are unknown. T4 and T5 cells are selectively responsive to light edges (ON) or dark edges (OFF), respectively. Thus, at the level of these cells, the visual input is broken down into eight separate components, each representing ON or OFF activity and one of four directions of motion (only forward motion is shown; purple arrows). This information is then recombined by the tangential cells, each of which is sensitive to one of four cardinal directions, and to both ON and OFF edges. For anatomically correct details of the diagram, see Figure 4 of ref. 2.

Takemura et al. (page 175) and Maisak and co-workers (page 212) report progress on a classic problem of neural computation — the detection of visual motion. Their test system is the eye of the fruitfly, an animal that must rapidly navigate during flight and that is especially effective at dodging predators (doubters are invited to swat one). It is easy to make simple models of motion detection6,7, but pinning the mechanism to precise neural events has been much harder. Whereas photoreceptor cells cannot detect direction, downstream neurons called tangential cells are robustly tuned to the direction of movement. Somewhere in between lies the neural mechanism that creates the directional discrimination, but the crucial neurons, called T4 and T5, are too small for ordinary electrical recording. Maisak et al. got around this difficulty by recording activity optically, using an indicator protein introduced into the cells by genetic techniques.
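The "simple models" alluded to here (refs 6, 7) pair two nearby points in visual space and compare a delayed signal from one point with the current signal from the other. As an illustration only, here is a minimal sketch of a Hassenstein–Reichardt-style correlator in Python; the signals and the one-step delay are hypothetical, not taken from either paper:

```python
def reichardt(a, b, delay=1):
    """Minimal direction-selective correlator.

    a, b: luminance samples from two adjacent points in visual space.
    A delayed copy of each signal is multiplied with the undelayed
    signal from the other point; the difference of the two products,
    summed over time, is positive for motion from A towards B and
    negative for the opposite direction.
    """
    out = 0
    for t in range(delay, len(a)):
        out += a[t - delay] * b[t] - b[t - delay] * a[t]
    return out

# A bright pulse that appears at point A and reaches point B one
# time step later, i.e. motion in the A-to-B direction.
a = [0, 1, 0, 0]
b = [0, 0, 1, 0]
print(reichardt(a, b))  # positive: preferred direction
print(reichardt(b, a))  # negative: opposite (null) direction
```

The point of the sketch is only that spatial separation plus a temporal delay is sufficient for direction discrimination, which is why the converging, spatially offset Mi1 and Tm3 inputs to T4 described by Takemura et al. are such a suggestive candidate mechanism.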

The authors demonstrate that T4 and T5 detect visual movement, with subsets of each being selective for one of four cardinal directions: upward, downward, front to back, and back to front. Furthermore, these cells are sensitive to opposite visual contrasts — T4 cells respond to light ON and so are sensitive to light edges, whereas T5 cells respond to light OFF and are sensitive to dark edges. The authors' genetic-knockout experiments not only confirm this optical observation but also show that T4 and T5 are the sole pathways mediating these functions, with no other cells being able to step in and carry the message. Thus, a fly initially breaks down moving visual inputs into a total of eight components: bright edges moving up, down, forward or backward, and dark edges moving along the same four axes.

But how do T4 and T5 actually detect the direction of movement? Takemura and collaborators' fly connectome suggests a place to look for the answer. They show that just upstream of T4 lies a pair of neurons, termed Mi1 and Tm3, which report on narrowly separated points in visual space. Because of that separation, the pair could provide the inputs that T4 needs to discriminate direction (Fig. 1b). Using Maisak and colleagues' recording technology, it may be possible to optically record from Mi1 and Tm3. If all goes well, this could bring the 50-year search for the mechanism of direction selectivity to an end.

The connectomic approach has now proved its importance for studying fly eyes and mouse retinas, but sceptics will still doubt that we can make the jump from these miniature neuronal circuits to 'real brains'; the intrinsic circuits of the cerebral cortex are some ten times larger than those of the retina, and this spatial scale is dwarfed again by the distances that connect different brain regions.

One obstacle is the need for very large tissue sections. Another difficulty is that of image segmentation, which is required for tracing thin neuronal processes through the thicket of neighbours in serial sections. Because fully digital solutions have so far failed, the task is currently assigned to large teams of human observers, but this will be impractical on larger spatial scales. Improvements in fixation and staining might make the processes more easily discriminable, and digital technology may yet save the day, because in principle any task that can be done by human observers can be done by a computer. Tracing processes is essentially a problem of pattern recognition, for which the technology is evolving rapidly.

A final question concerns the cost-effectiveness of the connectomic approach. Is it to be the exclusive province of a few deep-pocketed laboratories? Here, the answer is clear: the researchers involved have stressed that the connectomic reconstructions will be public resources, which can be used for different purposes by anyone. To be useful, an archive may need to be accompanied by a user-oriented interface, because computer code this complex will be hard for workers other than its creators to use. The effort involved in creating and curating such a public resource should be worth it, because the archives could be the biggest contribution of this work to neuroscience. Very many structural problems could be attacked using the same original material.


References

1. Helmstaedter, M. et al. Nature 500, 168–174 (2013).
2. Takemura, S. et al. Nature 500, 175–181 (2013).
3. Maisak, M. S. et al. Nature 500, 212–216 (2013).
4. Masland, R. H. Neuron 76, 266–280 (2012).
5. Wässle, H., Puller, C., Müller, F. & Haverkamp, S. J. Neurosci. 29, 106–117 (2009).
6. Reichardt, W. in Sensory Communication (ed. Rosenblith, W. A.) 303–317 (MIT Press, 1961).
7. Barlow, H. B. & Levick, W. R. J. Physiol. (Lond.) 178, 477–504 (1965).

Author information

Correspondence to Richard H. Masland.

Cite this article
Masland, R. Accurate maps of visual circuitry. Nature 500, 154–155 (2013).
