Over-reliance on automated tools is hurting science, says David W. Piston.
As head of Vanderbilt University's core microscopy labs, I recently met a colleague and his student to discuss their confusing results from an experiment studying protein interactions in cells. After applying a treatment that should have disrupted the interaction of two particular proteins inside mitochondria, they still saw the proteins interacting. The student said that to measure the interaction he had used a commercial automated image-analysis system. He didn't understand how it worked, so he just used a colleague's settings from a different experiment. But, without him realizing, this had masked all of the cell except for the mitochondria. If he had modified the settings to leave the entire cell unmasked, he would have seen that the proteins were now present within the mitochondria in relatively small amounts compared with the rest of the cell, and so their interaction had been disrupted — the treatment was, in fact, working.
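The pitfall in this anecdote can be illustrated with a minimal sketch. The code below is hypothetical and uses made-up numbers, not real data; it simply shows how an analysis confined to a mitochondrial mask can report a "signal present" result that looks unchanged, while a whole-cell view reveals that only a small fraction of the protein remains in the mitochondria.

```python
import numpy as np

# Hypothetical 1-D "image": protein signal inside vs. outside mitochondria.
# Values are illustrative only.
cell = np.zeros(100)
cell[:10] = 5.0            # small residual signal inside the mitochondria
cell[10:] = 50.0           # after treatment, most protein is elsewhere in the cell

mito_mask = np.zeros(100, dtype=bool)
mito_mask[:10] = True      # a mask that keeps ONLY the mitochondria

# Analysis restricted to the mask: signal is nonzero, so nothing seems to have changed.
mito_mean = cell[mito_mask].mean()

# Analysis over the whole cell: the mitochondria hold only a sliver of the total signal.
fraction_in_mito = cell[mito_mask].sum() / cell.sum()

print(mito_mean)                    # 5.0 — looks like the protein is still there
print(round(fraction_in_mito, 3))   # 0.011 — about 1% of the total signal
```

Run with the mask in place, the measurement answers a different question from the one the student thought he was asking — which is exactly why the settings mattered.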
In this case, it wasn't inspiration that was lacking — it was instruction. The researchers had used a proven and validated tool, but in a way inappropriate for the problem at hand. A hard-working and dedicated student had wasted around two months at the microscope trying to make the treatment 'work'. Between us, we figured out the problem in just a few minutes' discussion.
Unfortunately, this scenario is becoming all too common in many fields of science: researchers, particularly those in training, use commercial or even lab-built automated tools inappropriately because they have never been taught how those tools work. Twenty years ago, a scientist wanting to computerize a procedure had to write their own program, which forced them to understand every detail. If using a microscope, they had to know how to make every adjustment. Today, however, biological science is replete with tools that allow young scientists simply to press a button, send off samples or plug in data — and have a result pop out. There are even high-throughput plate-readers that e-mail the results to the researcher.
Teaching style has not adapted to address this cultural change — we rarely explain to our students how the new automated tools work, how to use them effectively and how to troubleshoot when it seems that things have gone wrong. And young scientists often don't realize that they need to ask questions. As a result, they waste time by using a technique improperly or, equally tragically, miss something exciting when they assume that a strange result means that they did something wrong and they never follow it up.
Of course, the researcher can talk to experts about what might be going wrong, but with 600 scientists (mostly young) using the central microscopy facility at Vanderbilt University, for example, this is not an efficient way to resolve the problem. We need to do a better job of teaching students how techniques work before they start using them.
Automation has its good points, of course. Biomedical discovery has been accelerated by automated computational analysis, expert core facilities and laboratory kits, which give investigators access to technical approaches that go beyond their own training. The interdisciplinarity of modern biomedical research makes it almost impossible for one person to understand the subtleties of all the procedures on which they rely [1].
What is missing are the time and resources for students to learn enough about how their equipment and techniques work to be able to use them to best advantage [2]. This situation fails both the students and the broader scientific community by leading to uneven training and levels of competence. As educators, it is incumbent on us to teach our students not only the 'hows' of a particular technique, but also the underlying 'whys'. In the past, much of this practical training was conducted by a lab's principal investigator, who is now spending increasing time chasing funding. To fill this gap, many students seek out training themselves, for example through summer courses such as those offered by the Marine Biological Laboratory in Woods Hole, Massachusetts. More and more of these intensive short courses are being offered worldwide, but they are always oversubscribed.
We should make this kind of instruction available to all students by refocusing graduate education to emphasize better the fundamental concepts and practice of experimental techniques. This would necessarily include hands-on labs involving state-of-the-art equipment and instruction from experts with proven success in using the techniques. Many labs and institutions (including Vanderbilt) have begun to develop this type of course. The extra instruction will initially slow students' progress in the lab, but it will better serve them (and their labs and their research fields) in the long run.
The biggest obstacle to hands-on laboratory courses is their cost and the effort required to stage them effectively. Institutional support is therefore crucial, especially given that the appropriate instructors are often well-funded researchers who need to be recompensed for their time. In addition, all graduate students should be supported by training grants [3], which fund the needed instruction and also relieve some of the pressure on students to deliver research results in their first years of graduate school. And schools should consider admitting only as many graduate students as they have the resources to provide with the necessary laboratory instruction.
The research community must take more responsibility for teaching the coming generations not only how to formulate hypotheses, design research approaches and write manuscripts, but also how to build, implement and troubleshoot their experiments at the most basic level.
References
1. Nature Methods 8, 983 (2011).
2. Mervis, J. Science 328, 678 (2010).
3. Stephan, P. Nature 484, 29–31 (2012).
Piston, D. Understand how it works. Nature 484, 440–441 (2012). https://doi.org/10.1038/484440a