ilastik: interactive machine learning for (bio)image analysis


We present ilastik, an easy-to-use interactive tool that brings machine-learning-based (bio)image analysis to end users without substantial computational expertise. It contains pre-defined workflows for image segmentation, object classification, counting and tracking. Users adapt the workflows to the problem at hand by interactively providing sparse training annotations for a nonlinear classifier. ilastik can process data in up to five dimensions (3D, time and number of channels). Its computational back end runs operations on demand wherever possible, allowing for interactive prediction on data larger than RAM. Once the classifiers are trained, ilastik workflows can be applied to new data from the command line without further user interaction. We describe all ilastik workflows in detail, including three case studies and a discussion of the expected performance.
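The core idea of pixel classification — training a nonlinear classifier on sparse user annotations and then predicting a dense segmentation — can be sketched in a few lines. The sketch below uses scikit-learn's RandomForestClassifier and a toy two-feature stack as stand-ins; it illustrates the principle only and is not ilastik's implementation (ilastik uses its own random-forest back end and a richer, multi-scale feature set).

```python
# Sketch of ilastik-style interactive pixel classification.
# Assumptions: scikit-learn's RandomForestClassifier as the nonlinear
# classifier, and a minimal two-feature stack (intensity + local mean);
# ilastik itself uses many more filter responses at several scales.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic 64x64 image: a bright blob (foreground) on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
image = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 200.0)
image += 0.05 * rng.standard_normal(image.shape)

def features(img):
    """Per-pixel feature stack: raw intensity plus a crude 3x3 local mean."""
    pad = np.pad(img, 1, mode="edge")
    local_mean = sum(
        pad[dy:dy + 64, dx:dx + 64] for dy in range(3) for dx in range(3)
    ) / 9.0
    return np.stack([img, local_mean], axis=-1).reshape(-1, 2)

X = features(image)

# Sparse user "brush strokes": a few labeled pixels per class, rest unlabeled.
labels = np.zeros(image.shape, dtype=int)  # 0 = unlabeled
labels[30:35, 30:35] = 1                   # foreground scribble
labels[2:7, 2:7] = 2                       # background scribble
mask = labels.reshape(-1) > 0

# Train only on the annotated pixels...
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[mask], labels.reshape(-1)[mask])

# ...then predict every pixel: dense segmentation from sparse annotations.
segmentation = clf.predict(X).reshape(image.shape)
```

In ilastik this train/predict loop runs interactively: each new brush stroke retrains the forest and updates the on-screen prediction, and the lazy back end computes features only for the currently viewed region.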


Fig. 1: User labels provided to various ilastik workflows and the corresponding ilastik output.
Fig. 2: Nuclei segmentation.
Fig. 3: A combination of pixel and object classification workflows.
Fig. 4: Segmentation of the peripheral endoplasmic reticulum from FIB–SEM image stacks by the carving workflow.
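The carving workflow shown in Fig. 4 grows user-placed object and background seeds over a boundary-indicator map, a seeded-watershed idea. Below is a minimal 2D sketch of that principle in pure Python with a priority queue; the function name and toy data are illustrative, and this is not ilastik's implementation (which works on 3D data with precomputed edge maps and uncertainty guidance).

```python
# Minimal seeded-watershed sketch of the carving idea: flood seed labels
# over a boundary map (high values = likely membrane), lowest values first.
# Illustrative only; `seeded_watershed` is a hypothetical helper, not an
# ilastik API.
import heapq
import numpy as np

def seeded_watershed(boundary, seeds):
    """Grow nonzero seed labels over `boundary`, visiting cheap pixels first."""
    out = seeds.copy()
    h, w = boundary.shape
    heap = []
    for y, x in zip(*np.nonzero(seeds)):
        heapq.heappush(heap, (boundary[y, x], y, x))
    while heap:
        _, y, x = heapq.heappop(heap)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and out[ny, nx] == 0:
                out[ny, nx] = out[y, x]  # inherit label of first arrival
                heapq.heappush(heap, (boundary[ny, nx], ny, nx))
    return out

# Toy boundary map: a vertical "membrane" of high values down the middle.
boundary = np.zeros((8, 8))
boundary[:, 4] = 1.0
seeds = np.zeros((8, 8), dtype=int)
seeds[4, 1] = 1  # object seed, left of the membrane
seeds[4, 7] = 2  # background seed, right of the membrane
labels = seeded_watershed(boundary, seeds)
```

Because low-boundary pixels are flooded first, each seed fills its own side of the membrane before the high-valued membrane pixels are reached, so the label boundary ends up on the membrane.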




We gratefully acknowledge support from the HHMI Janelia Visiting Scientist Program, the European Union via the Human Brain Project SGA2, the Deutsche Forschungsgemeinschaft (DFG) under grants HA-4364/11-1 (F.A.H., A.K.), HA 4364 9-1 (F.A.H.), HA 4364 10-1 (F.A.H.), KR-4496/1-1 (A.K.), SFB1129 (F.A.H.) and FOR 2581 (F.A.H.), and the Heidelberg Graduate School MathComp. We are also extremely grateful to other contributors to ilastik: N. Buwen, C. Decker, B. Erocal, L. Fiaschi, T. Fogaca Vieira, P. Hanslovsky, B. Heuer, P. Hilt, G. Holst, F. Isensee, K. Karius, J. Kleesiek, E. Melnikov, M. Novikov, M. Nullmeier, L. Parcalabescu, O. Petra and S. Wolf, and to B. Werner for vital assistance to the project. Finally, we thank the authors of the three case studies for sharing their images with us.

Author information




S.B., D.K., T.K., C.N.S., B.X.K., C.H., M.S., J.A., T.B., M.R., K.E., J.I.C., B.X., F.B., A.W., C.Z., U.K., F.A.H. and A.K. all contributed to the software code and documentation. A.K. and F.A.H. drafted the manuscript, to which all authors contributed.

Corresponding authors

Correspondence to Fred A. Hamprecht or Anna Kreshuk.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Rita Strack was the primary editor on this article and managed its editorial process and peer review in collaboration with the rest of the editorial team.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Berg, S., Kutra, D., Kroeger, T. et al. ilastik: interactive machine learning for (bio)image analysis. Nat Methods 16, 1226–1232 (2019).
