Main

Nonscientists can help with sophisticated neurobiology challenges. Researchers willing to follow in the tracks of labs pioneering this approach need to know that working with crowds means more than sharing neuroscience and computing knowledge. Initiating and managing people-powered projects takes stamina, management expertise, unconventional recruiting methods and slightly more-than-healthy amounts of pizza and caffeinated soft drinks.

In 2009, when neuroscience crowdsourcing ventures began, skepticism reigned about the prospect of mapping the neurons of entire circuits; crowdsourcing this challenge to interested volunteers aroused even more doubt. “That is a good sign. If you're doing something new, there should be measured skepticism—otherwise it is not new enough”, says Moritz Helmstaedter, a neuroscientist at the Max Planck Institute (MPI) of Neurobiology. He and Sebastian Seung, a computational neuroscientist at Massachusetts Institute of Technology (MIT), and a few other neuroscientists are crowdsourcing in connectomics, a field in which researchers seek to generate and analyze detailed anatomical maps of entire nervous systems.

As projects have drawn crowds and generated publications, researchers now face challenges in scaling up the ways crowds follow neurons through a stack of electron microscopy (EM) images. The ability to perform this tracing is “a key bottleneck right now in connectomics,” says Helmstaedter. Crowdsourcing is the only way to move ahead for now. “Unusual means may get us to some really cool goals,” he says.

Neuroscientist Moritz Helmstaedter (middle) and Brainflight programmers want crowdsourced neuroscience to be fun. Credit: MPI for Neurobiology

Being 'bold' and 'ambitious' is now fashionable in neuroscience, as illustrated by large-scale efforts to map the human brain and accelerate technology development—such as the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative launched by US President Barack Obama. Anatomical mapping of the brain's connections is one of the initiative's top priorities, as described in the first report recently issued to the director of the US National Institutes of Health.

Separately, scientists have found that crowdsourcing is a promising way to analyze increasingly large data sets generated in labs that perform neuroanatomical mapping.

EyeWire

Crowds in neuroscience are already helping researchers to analyze high-resolution EM images of brain slices and to turn the two-dimensional (2D) image information into connectivity maps of how neurons run, branch and connect throughout a section of the brain. The father of crowdsourcing in this field is EyeWire, a gaming project initiated by Seung and the first to “crowdsource the connectome,” he says.

The online project invites volunteers to compete against one another to map the neurons in a region of the mouse retina. The goal is to better understand how visual information is processed in the brain and to help train computers to get better at this kind of tracing.

Around 82,000 participants, almost all of them citizen scientists rather than formally trained scientists, have played the game so far. As EyeWire's creative director Amy Robinson says, surveys show that 'EyeWirers' range from high school students to retirees and hail from basically all walks of life. Some players are researchers, but most are simply people interested in science, says Seung. The top players, he says, are a mix of men and women, with even a slight skew toward female players among EyeWirers. “It's pretty hard to predict who will like this game.”

The project takes two long-standing neuroscience traditions of manual labor into the realm of computer games. One of these is skeletonization, in which researchers trace the likely paths of neurons through stacks of EM images, he says. The other is one in which scientists trace and render the neurons and their paths in three dimensions, precisely outlining every neuron's contour and curve.

Sebastian Seung's EyeWire is the father of crowdsourcing in connectomics. Credit: K. Krug/PopTech

Performed with “human intelligence,” both tasks are painstakingly slow, says Seung. His idea has been to accelerate the process by having many volunteers compete with one another and by harnessing artificial intelligence. In the lab, his team uses software called Omni to render neurons in three dimensions. EyeWire uses a slightly simplified version of Omni that lets players see one 3D and one 2D view.

There is also a trained convolutional neural network at the heart of EyeWire, the code for which is part of the paper Seung coauthored with Helmstaedter and others, in which the team presented a connectomic reconstruction of a layer of the mouse retina1. EyeWire taps into the crowd to follow the path of neurons and perform volume reconstruction of neurons and the circuits to which they belong. “It's like coloring in a neuron with the help of the computer,” Seung says.
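The network described above rests on convolution, the basic operation of such a model: a small filter slides over the image and responds strongly wherever the pixels match its pattern. The sketch below, in plain Python with NumPy, applies a single hand-picked edge filter to a toy image patch to produce a boundary map; EyeWire's actual network is trained and far deeper, so the filter, threshold and patch here are purely illustrative.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of one CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hand-picked vertical-edge filter stands in for learned weights.
edge_kernel = np.array([[-1.0, 0.0, 1.0],
                        [-2.0, 0.0, 2.0],
                        [-1.0, 0.0, 1.0]])

# Toy "EM patch": dark neuron interior on the left, bright on the right.
patch = np.zeros((6, 6))
patch[:, 3:] = 1.0

# True wherever a membrane-like edge sits between the two regions.
boundary_map = np.abs(conv2d(patch, edge_kernel)) > 0.5
```

In a real pipeline, many learned filters are stacked in layers, and the network's per-pixel output tells the tracing software where one neuron ends and the next begins.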

Separately, he and his team also run algorithm development competitions. “We put a data set online, and we let people try to submit their algorithms to reconstruct from it, and we score them and we see who the winner is,” Seung says. The plan for the future is to feed this algorithmic knowledge gained back into EyeWire.

Seung is happy about the success of the project thus far in garnering so many participants for the game and the competitions. He acknowledges that his project's players are far fewer than the millions of players reached with games unrelated to science. Laughing, he points out that the game Angry Birds has 2 billion downloads across all platforms. “If you compare EyeWire with a commercial game, it looks kind of pathetic,” he says. “It needs to be much more fun, and that's a challenge.”

It is hard to recruit game developers to work on neuroscience. Helmstaedter—who is also trying to use gaming to grow the scale of neural circuit tracing in his lab—takes unusual approaches to recruiting potential programmers.

Finding the crowd

In 2009, Helmstaedter and some students positioned themselves at the bottom of the staircase leading to the cafeteria of the University of Heidelberg. A large projection screen showed what one student was doing on a computer: tracing neurons in EM images.

“It got so much attention,” Helmstaedter says of the recruiting session for his neuron-tracing project. People stopped, asked questions, picked up flyers. Excitement mounted, and his crowd of volunteers eventually grew to the 224 names that accompany his latest Nature paper, completed with Seung, Winfried Denk from the Max Planck Institute for Medical Research and researchers in the UK, Germany and the USA1.

The recruited crowd was made up of paid undergraduate and graduate students. Using a software tool called KNOSSOS2, developed at the MPI, they helped to reconstruct the circuit of 950 neurons in a layer of the mouse retina called the inner plexiform layer, which had been imaged with serial block-face EM. “KNOSSOS is made to 'fly along' neurites” to reconstruct skeletons of neurons, Helmstaedter says. “EyeWire 'paints' neurons,” generating volume reconstructions.

Around 82,000 participants have played EyeWire thus far. Credit: EyeWire, MIT/Seung lab

He distributed hard disks of EM data sets and the KNOSSOS software to all the team members. After some training sessions, the crowds set out to establish connectivity between neurons and annotate the microscopy data. With the skeletons, Helmstaedter was then able to apply the machine learning algorithm used in EyeWire to “blow up the skeletons into full volumes,” says Seung.

Helmstaedter believes that anyone can run a crowdsourcing project, but the project leaders will need to fully embrace it. After finishing his master's degree in physics, Helmstaedter spent a few months working for a consulting company, which offered “great experience in how to manage very large teams, how to motivate people,” he says.

At the consulting company, frequent meetings and benchmarking were part of every project, and he learned to foster an environment in which a group is “driven by the goal, not by hierarchies,” he says. Helmstaedter then switched back to science and completed an MD-PhD; he has since brought his management experience to neuroscience crowdsourcing.

Volunteers compete to map and reconstruct neurons in EyeWire. Credit: EyeWire, MIT/Seung lab

For the volunteers to begin their neural reconstruction tasks, scientists preselected points in the EM images where the volunteers were to start tracing. Then, Helmstaedter says, the volunteers followed a neurite for as long as they could, and the results were returned to the research team for evaluation. Scientists could cross-check the delivered results because multiple students were assigned the same neuron, he says. More than 20,000 annotator hours led to 2.6 meters of skeleton.

Human brains are better than machines at seeing the less distinct boundaries between neurons. “We're very good at figuring out those difficult occasions; however, we're very bad at being always and constantly attentive,” he says. As he followed the crowd's progress, he saw that the attention span and exhaustion limit for neural tracing was around 4 hours, including breaks.

Attention-related errors occur when, for example, volunteers miss a neurite branch. “It's rather related to your personal caffeine intake on that day or level of distraction you have related to the data,” he says.

Because the errors are attention related and not data related, a second person tracing the same neurite is less likely to make the same mistake. This pattern allowed the scientists to average out the mistakes by comparing multiple students' tracings. A software tool called the redundant-skeleton consensus procedure (RESCOP), also developed in the lab, was used to reconcile the students' tracings.
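Conceptually, reconciling redundant tracings can be as simple as a vote over the connections each annotator reported. The sketch below shows such a majority vote; it is a simplified stand-in for the idea behind RESCOP, not a reimplementation of the published procedure, and all names in it are hypothetical.

```python
from collections import Counter

def consensus_skeleton(tracings, min_votes):
    """Keep an edge only if enough independent annotators reported it.

    `tracings` is a list of edge sets, one per annotator; each edge is a
    frozenset of two node ids. A plain majority vote over redundant
    tracings, illustrating how independent attention errors average out.
    """
    votes = Counter(edge for tracing in tracings for edge in set(tracing))
    return {edge for edge, n in votes.items() if n >= min_votes}

# Three annotators trace the same neurite; one misses a branch (B-D),
# and one adds a spurious edge (C-E) in a moment of inattention.
t1 = {frozenset("AB"), frozenset("BC"), frozenset("BD")}
t2 = {frozenset("AB"), frozenset("BC")}                      # missed branch
t3 = {frozenset("AB"), frozenset("BC"), frozenset("BD"), frozenset("CE")}

consensus = consensus_skeleton([t1, t2, t3], min_votes=2)
# A-B and B-C get 3 votes, B-D gets 2, and C-E gets only 1 and is dropped.
```

Because different annotators rarely lapse at the same spot, edges with broad support survive while one-off slips are filtered out.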

“A lot of pizza was involved,” says Moritz Helmstaedter. Credit: MPI Neurobiology

To encourage the crowds, every 2–3 months Helmstaedter met with all the members to discuss progress. “A lot of pizza was involved,” he says. They validated results in the group setting. “We wouldn't tell them 'you were wrong',” he says. The group of peers found errors together and addressed them together to avoid embarrassing individual members.

“What Moritz did is a highly significant innovation, which is to demonstrate you can use a lot of people's effort,” says Seung. And now Helmstaedter is scaling his crowd and tools further by embedding the science and the tracing activity into a gaming approach that is intended to be unlike EyeWire. But it takes testing, testing and more testing to turn science into a game.

Testing, gaming

Dominic Bräunlein is testing several games, all in the prototype stage, that are intended to help annotate neuroscience data. His start-up, named scalable minds, is programming the games as part of a team Helmstaedter assembled. Helmstaedter encouraged the group to found a company, partly as a strategy to recruit software engineers to neuroscience.

Bräunlein founded the firm with four others in late 2011; it is part of a project called Brainflight. The other company founders—Tom Bocklisch, Tom Herold, Norman Rzepka, and Thomas Werkmeister—were Bräunlein's classmates and friends at the computer science–focused Hasso Plattner Institute, where they recently completed software engineering degrees.

The group is tight-lipped about the actual plots of the neuroscience games under development. But Bräunlein shares that they are not shoot-'em-up or kill-the-alien games and that they appeal to different age groups. “One idea is more quiet and more for older persons, more like a puzzle,” Bräunlein says. “The other is more like a typical 3D game that appeals to guys in my age group or teenage boys,” he says. “Or girls,” he adds.

With trepidation, Bräunlein recently showed a middle-aged woman a newly designed computer game. She didn't like it. And so the software engineer had to go back to the drawing board, and his team had to rethink the game—as he has done plenty of times before with other volunteer testers.

Building software is tough, but turning a scientific problem into game play is perhaps harder, says Helmstaedter. The goal is to imagine an app that people play while waiting at a bus stop and in that time “help us massively solve our big reconstruction problems.”

Bräunlein says that Brainflight wants to take a more playful approach than EyeWire to crowdsource tracing. “People need to play it without feeling like they are doing good work for scientists,” he says. “My personal goal is to have it feel like you would want to play it irrespective of the science-related task.”

It is not easy to recruit game developers to work on neuroscience. Credit: Thinkstock

Seung looks forward to the Brainflight games and acknowledges the friendly competition. But he also teases the project developers about the fact that their games have not yet been released. “There is plenty of room for scalable minds or for many other people to come in,” he says. No group has yet scored a chart-breaking hit.

As a contractor for Helmstaedter's lab, the start-up is also helping build a scalable version of the annotation tool KNOSSOS, called Oxalis, so that the software can be used by crowds working through a Web browser. With this step, the crowdsourcing project takes a tool that was formerly used by scientists and makes it possible for nonscientists to use it. And in this next phase the crowds will no longer need to have the EM image data on a hard disk.

The plan is to make Oxalis freely available for researchers, and it might be open source. The group is still making those decisions.

One software challenge facing the team is that the data are big, terabyte-sized images, so the software cannot place too much data into the browser at any one time. Bräunlein and his colleagues engineered the software to anticipate users' movements through the images as they annotate. “We load small cubes of data,” he says. That way no one stares at a blank screen while data are still loading. The software performs a kind of local forecasting, using the direction and speed of a user's movement to predict where the analysis is likely to head next.
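The forecasting idea can be sketched as extrapolating the annotator's recent motion and queuing the data cubes along the predicted path. The function below is a minimal illustration under assumed names, cube size and parameters; it is not Oxalis's actual API.

```python
def cubes_to_prefetch(position, velocity, cube_size=32, lookahead_steps=3):
    """Predict which data cubes to load next from the user's heading and speed.

    Extrapolates the annotator's position a few steps ahead along the
    current velocity and returns the (deduplicated) cube indices on that
    path. All names and defaults here are hypothetical.
    """
    cubes = []
    for step in range(1, lookahead_steps + 1):
        # Extrapolate the position `step` time steps into the future.
        future = tuple(p + v * step for p, v in zip(position, velocity))
        # Map the voxel coordinate to the index of its enclosing cube.
        cube = tuple(int(c) // cube_size for c in future)
        if cube not in cubes:
            cubes.append(cube)
    return cubes

# Tracing rightward through the volume at 20 voxels per step:
queue = cubes_to_prefetch(position=(100, 64, 64), velocity=(20, 0, 0))
# future x positions 120, 140, 160 → cubes (3,2,2), (4,2,2), (5,2,2)
```

Fetching those cubes in the background while the user annotates the current one is what keeps the screen from ever going blank.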

Folding as play

Neuroscientists interested in crowdsourcing can look to the success of similar projects in other research disciplines. Galaxy Zoo lets participants help classify galaxies in telescope images from the Sloan Digital Sky Survey. Folding@home began at Stanford University and has allowed researchers to tap into over 260,000 volunteers' computers to explore how proteins fold. A separate crowdsourced protein-folding game, Foldit, has participants solve puzzles and fold proteins for points, thereby helping scientists to predict protein folding or design proteins3.

Folding@home's initiator is Stanford University chemist Vijay Pande, who is happy to see crowdsourcing ventures in neuroscience. “I think there will be more and more successes like these,” he says. It surprised him to see “just how much the general public really wants scientists to succeed and would like to help us, either for the advancement of science or just the joy and thrill of participating in it.”

Many technical challenges have to be tackled to crowdsource projects, often in ways that are specific to a given scientific field. But scientists generally need to become more comfortable with crowdsourced solutions, says Pande. In the long term that approach will make it easier for other researchers to use these methods, too.

Internally, he says, reengineering the scientific problem and the software at the heart of Folding@home in order to scale it has been a challenge. But he and many of his colleagues like to “think as big as we can, so scaling up is always the most exciting part of all of this.”

The software development for this project has all been performed at Stanford, but “we've moved to employing more and more professional software engineering practices,” he says. “Company connections have been very helpful to get Folding@home into people's hands.”

His project has many nonprofit, government and commercial supporters, and he sees a number of ways for crowdsourced science projects to grow. It might begin in a lab and involve companies later. “Bringing in companies is a bit more unorthodox, but they can help scale up, too,” says Pande.

Similarly to the neuroscience projects, Folding@home also originally faced skepticism because crowdsourcing is an unusual approach in the often conservative world of science. And there is good reason for the conservatism, he says, because “the integrity of the results is paramount.” That standard means more complexities for crowdsourced projects, such as the need to address reproducibility. “The burden,” he says, is on the scientists using these methods “to ensure these are not concerns.”

Crowdsourcing in neuroscience might become more common as data sets grow in size and as the neuroscience community launches large-scale projects. In a lab, current image data sets can be hundreds of terabytes in size—but they are bound to get larger. Harvard University neurobiologist Jeff Lichtman likes the idea of crowdsourcing image analysis.

He and Winfried Denk are collaborating with microscope manufacturer Zeiss as the company develops a prototype scanning electron microscope that uses 61 electron beams on a sample, as opposed to the typical single beam.

Currently, the Lichtman lab generates around a terabyte of image data per day. “That device will allow us to speed that up 60-fold, roughly, which will be more like three terabytes an hour,” he says. But, he adds with a laugh, as data sets of these dimensions become common, he wonders whether “there are enough people out there” to crowdsource the analysis. Researchers in neuroscience and the crowds they recruit will have their work cut out for them.