Modern microscopy produces massive image datasets that enable detailed multi-scale analysis and can combine several modalities. Visualizing, exploring and sharing such data are challenges both during the execution of a research project and after publication to enable open access. To this end we have developed MoBIE, a Fiji (ref. 1) plugin for multi-modal big image data sharing and exploration. It supports visualization of multi-scale data of heterogeneous dimensionality (that is, combined 2D, 3D or 4D data) and several-terabyte image data, as well as the exploration of image segmentations, corresponding measurements and annotations. MoBIE uses next-generation image file formats, such as OME-Zarr (ref. 2), that enable access to multi-scale data on local or cloud storage, permitting the transparent sharing and publication of data without the need to run a web service. In addition, MoBIE allows users to easily configure and share fully reproducible ‘views’ of their data. MoBIE has enabled integration of multiple modalities and open access for data from different domains of the life sciences. This includes data from studies in developmental biology (ref. 3) (Fig. 1a), correlative microscopy, high-throughput screening microscopy (ref. 4), plant biology and spatial transcriptomics (ref. 5) (all Fig. 1c). Further applications can be found in Supplementary Note 8 and Supplementary Figs. 1–4. Video tutorials for MoBIE are available at https://www.youtube.com/@MoBIE-Viewer and documentation at https://mobie.github.io/.
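The key property of next-generation formats such as OME-Zarr is that every resolution level is stored as a grid of small chunks, so a viewer only needs to fetch the chunks that intersect the current field of view rather than the whole volume. The sketch below illustrates that chunk-selection principle in plain Python; it is illustrative only (real readers such as the `zarr` library perform this bookkeeping internally):

```python
from itertools import product

def chunks_for_region(start, stop, chunk_shape):
    """Return the indices of all chunks intersecting the half-open
    region [start, stop) of an n-dimensional chunked array."""
    ranges = [
        range(s // c, (e - 1) // c + 1)
        for s, e, c in zip(start, stop, chunk_shape)
    ]
    return list(product(*ranges))

# A 300x300 viewport at offset (100, 50) over a grid of 256x256 chunks
# touches four chunks, so only those four need to be downloaded.
print(chunks_for_region((100, 50), (400, 350), (256, 256)))
# → [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Because the same logic applies at every level of the resolution pyramid, a viewer can always stream the coarsest level that fills the screen and refine on demand, which is what makes interactive browsing of terabyte-scale data over object storage feasible.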
MoBIE is implemented using existing tools wherever possible, in particular BigDataViewer (BDV; ref. 6) and ImgLib2 (ref. 7). It extends BDV’s functionality to efficiently stream data from remote storage, conveniently manage an arbitrary number of images and add a menu to select and control sources, views and navigation. It inherits and extends BDV’s functionality for on-the-fly transformations, which enables registering multi-modal data without the need to resample it. Transformations can be obtained externally or within MoBIE itself. MoBIE also supports joint visualization of 2D and 3D (time-lapse) data and different image-blending modes. We have further added interactive display of segmentation results with support for different color maps, 3D object rendering, and display of tabular data associated with the individual segments, which enables fast visual inspection of derived measurements — for example, cell sizes or gene expression levels. Tables can also be associated with regions — for example, high-throughput screening microscopy data arranged on a plate (Fig. 1c). We support spot data with associated tables — for example, gene detections in spatial transcriptomics (Fig. 1c). The tables can be searched, sorted and used for coloring or to generate scatter plots. The objects in the image, the corresponding table row and the point in the scatter plot are linked so that they can be navigated in synchrony. We have also implemented an annotation mode that can be used to add table columns with user observations. Users can import and export table data from tab-separated value files — for example, to visualize analysis results within MoBIE. The full MoBIE state can be captured as a view to share it with collaborators or to create an interactive figure panel. Fig. 1a and Fig. 1c were created with MoBIE and are available as interactive views. See Supplementary Note 3 for a detailed description of all MoBIE features.
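The segment tables described above are plain tab-separated value files keyed by a label id, which is what links a pixel value in the segmentation image to a row of measurements. The snippet below sketches this linkage using only the standard library; the column names other than `label_id` are made up for illustration:

```python
import csv
import io

# A toy segment table; in MoBIE such tables are stored as .tsv files
# (column names besides label_id are illustrative, not part of any spec).
tsv = """label_id\tn_pixels\tmean_intensity
1\t1500\t0.82
2\t230\t0.41
3\t9800\t0.67
"""

def read_segment_table(stream):
    """Map each segment's label id to its row of measurements."""
    reader = csv.DictReader(stream, delimiter="\t")
    return {int(row["label_id"]): row for row in reader}

table = read_segment_table(io.StringIO(tsv))
# Clicking segment 3 in the viewer would look up its measurements here.
print(table[3]["mean_intensity"])  # → 0.67
```

The same keyed structure is what allows the viewer, the table row and the scatter-plot point to stay in sync: all three are addressed by the same label id.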
To access data with MoBIE, it must be organized according to the MoBIE project specification (see also Supplementary Note 5). A ‘project’ consists of ‘datasets’ that contain all data that can be loaded in a single MoBIE instance: image data, tabular data and a JSON file specifying the dataset layout. The project can be stored either locally or on object storage; metadata and tables can also be hosted on GitHub for version control. The Fiji plugin can access projects from these configurations, which enables the use of MoBIE to access and share data throughout the full life cycle of a project: when the data is accessible only to a single researcher or institution, accessible to a research collaboration, or published and freely accessible (see also Fig. 1b). MoBIE projects can be created either with the Fiji plugin, which provides a convenient graphical user interface, or with a Python library, which provides full flexibility and support for large data.
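To give a flavor of the dataset layout file, the snippet below builds a heavily simplified, illustrative JSON structure with named image sources and shareable views; the field names here are abridged and the authoritative schema is the MoBIE project specification (Supplementary Note 5):

```python
import json

# Illustrative, heavily simplified dataset layout: images are registered
# as named sources, and a view records a reproducible viewer state.
# Field names are an abridged sketch, not the normative schema.
dataset = {
    "sources": {
        "em-raw": {
            "image": {
                "imageData": {
                    "ome.zarr": {"relativePath": "images/em-raw.ome.zarr"}
                }
            }
        }
    },
    "views": {
        "default": {"uiSelectionGroup": "bookmark", "isExclusive": True}
    },
}

print(json.dumps(dataset, indent=2))
```

Because the layout is declarative and the image data is referenced by relative path, the same JSON file works whether the project sits on a local disk, on S3-compatible object storage or under version control on GitHub.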
Overall, MoBIE enables scientists to seamlessly access, explore and share their massive microscopy data through all stages of a project in a convenient Fiji plugin. We are working with online archives to support on-demand access of public image data through MoBIE and have provided some proofs of concept (Supplementary Note 9). Furthermore, we are convinced that many of MoBIE’s features are of general use and aim to contribute its functionality to core components such as BDV. To promote cross-tool accessibility, we plan to standardize select MoBIE specifications — for example, transformations, views and tables — within OME-Zarr. This integration will enable other tools, such as Viv (ref. 8), webKnossos (ref. 9) or Neuroglancer (ref. 10) (see Supplementary Note 1 for an overview of related tools), to access the same data and views as MoBIE.
Schindelin, J. et al. Nat. Methods 9, 676–682 (2012).
Moore, J. et al. Nat. Methods 18, 1496–1498 (2021).
Vergara, H. M. et al. Cell 184, 4819–4837.e22 (2021).
Pape, C. et al. Bioessays 43, e2000257 (2021).
Lohoff, T. et al. Nat. Biotechnol. 40, 74–85 (2022).
Pietzsch, T., Saalfeld, S., Preibisch, S. & Tomancak, P. Nat. Methods 12, 481–483 (2015).
Pietzsch, T., Preibisch, S., Tomancák, P. & Saalfeld, S. Bioinformatics 28, 3009–3011 (2012).
Manz, T. et al. Nat. Methods 19, 515–516 (2022).
Boergens, K. M. et al. Nat. Methods 14, 691–694 (2017).
Maitin-Shepard, J. et al. google/neuroglancer https://doi.org/10.5281/zenodo.5573294 (2021).
We would like to thank T. Pietzsch, J. Bogovic and S. Saalfeld for invaluable help with ImgLib2 and BigDataViewer development; the European Molecular Biology Laboratory IT department, in particular J. Moscardo, for supporting the S3 infrastructure; S. Culley for suggesting the name MoBIE; G. E. Girona for designing the MoBIE icon; J. Hartmann for providing the zebrafish data; A. Wolny for providing the plant root data; S. Ghazanfar for providing the spatial transcriptomics data; and C. Uwizeye and J. Decelle for providing the plankton data. C.P. has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement number 824087. C.T. acknowledges support by grant number 2020-225265 from the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation. J.M. also received funding from the Chan Zuckerberg Initiative DAF for work on OME-NGFF through grant number 2019-207272 and on Zarr through grant numbers 2019-207338 and 2021-237467. D.A. is supported by a grant from the European Research Council (NeuralCellTypeEvo 788921).
The authors declare no competing interests.
Peer review information
Nature Methods thanks Ozgun Gokce and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Pape, C., Meechan, K., Moreva, E. et al. MoBIE: a Fiji plugin for sharing and exploration of multi-modal cloud-hosted big image data. Nat Methods 20, 475–476 (2023). https://doi.org/10.1038/s41592-023-01776-4