A versatile and customizable low-cost 3D-printed open standard for microscopic imaging

Modern microscopes used for biological imaging often present themselves as black boxes whose precise operating principle remains unknown and whose optical resolution and price seem to be inversely proportional to each other. With UC2 (You. See. Too.) we present a low-cost, 3D-printed, open-source, modular microscopy toolbox and demonstrate its versatility by realizing a complete microscope development cycle from concept to experimental phase. The self-contained incubator-enclosed brightfield microscope monitors monocyte-to-macrophage differentiation for seven days at cellular resolution (≈ 2 μm). Furthermore, with very few additional components, the geometry is converted into a 400 Euro light sheet fluorescence microscope for volumetric observations of a transgenic zebrafish expressing green fluorescent protein (GFP). With this, we aim to establish an open standard in optics to facilitate interfacing with various complementary platforms. By making the content and comprehensive documentation publicly available, the systems presented here lend themselves to easy and straightforward replication, modification, and extension.

Educational chart — Exemplary slides showing the basic properties of a simple projector or a smartphone-based microscope for use in schools. The optical layout acts as a printed template for the different cubes. Students can conveniently place the cubes in these placeholders to create the microscope, observe an image using their eyes or cellphones, and discuss the results.

Supplementary Video — Long-term (48 h) image series with the incubator microscope (10×, 0.32 NA objective) at a frame rate of 1 frame/minute. Incubator-contained measurement of isolated human blood monocytes. The aim was to document the differentiation of monocytes to macrophages and analyse their movement pattern without stimulation.

The video shows long-term imaging of the differentiation of blood-borne monocytes to macrophages. Within the time span of seven days the monocytes increase in size and "look" around. The filopodia surrounding the cells are clearly visible. Moving macrophages become fusiform, elongate and follow their protrusions with the cell body.

Reconstruction of the complex refractive index of unlabelled cheek cells using the annular intensity diffraction tomography algorithm (aIDT). A number of LEDs on an LED ring are placed at a distance of ≈ 74 mm from the sample such that the local illumination can be approximated as a series of plane waves varying in azimuth. The inverse filtering process reconstructs a 3D stack of the permittivity distribution. The acquisition was performed using a cellphone camera (Huawei P20 Pro, China), further described in Chapter 7.6.

Through-focus series of a Drosophila larva. Due to the large depth of field of the 4×, NA=0.17 objective, a data stack was acquired by moving the light sheet through a fixed sample, i.e. by changing the angle of the kinematic mirror in the illumination path. The GFP-expressing Drosophila larva was focussed by the detection path and the illumination plane was then moved through it by changing the tilt of the kinematic mirror. Although the whole three-dimensional sample is in focus, only the illuminated parts yield signal imaged onto the camera. The video was acquired with a cellphone camera (Huawei P20 Pro, China) and a 20× eyepiece.
Alternatively, one can move the whole sample through the fixed light sheet aligned to the focus plane using the sample stage equipped with a flexure bearing. This was done in the video of the GFP-expressing zebrafish larva. The video was acquired with a Raspberry Pi camera with a lens and a 20× eyepiece.

The conversion from a simple bright-field into a light sheet microscope can be accomplished within less than five minutes using TheBOX. The modules can easily be reused for different imaging modalities. The components are pre-aligned and retain their position when packed again, which is useful for transporting the whole system.

Supplementary Video 6 — Long-term measurements of MDCK cells at room temperature overnight (8 h) in a 35 mm petri dish at a UC2 workshop in Oslo, where participants were able to bring their own samples.

Time-series imaging at ≈ 1 fps of fixed but mobile E. coli bacteria using the infinity-corrected fluorescence microscope (see Supp. Section 7.4). The ATTO647-labelled E. coli, illuminated with a coherent entertainment laser (λ_red = 635/637 nm, P_laser = 200 mW), move in aqueous solution due to Brownian motion and can nicely be observed with the low-SNR RGB camera of the Raspberry Pi (v2.1). During the ca. 10-minute experiment, some bacteria start adhering to the cover glass. When the laser intensity is increased via the UC2 GUI, pronounced bleaching of the bacteria can be observed.

One aspect missing in many open-source and open-science projects is the ability to interact with the project in order to introduce one's own modifications for individual needs. During our study we found that one major requirement for giving users easy access to the resources and making it attractive to start developing on an open project, like the proposed UC2 system, is easy-to-understand documentation.
It should provide an intuitive way into the project to reduce the inhibition threshold to become engaged. Inspired by the recently discontinued modular cellphone project ARA by Google Inc. [1], we created a comprehensive document called the Module Developer Kit (MDK, GitHub repository) which describes good practice for cube design and customized inserts. This includes CAD files for common CAD software such as Autodesk Inventor 2019 (Autodesk® Inventor LT™) and OpenSCAD (www.openscad.org, v2019.05), as well as schematics to port the design to other software tools. It emphasizes the idea of the UC2 system as a supporting base structure or skeleton that can become a common standard for a large variety of components from different manufacturers. Having a "zoo" or library of modules developed by an active community which are useful for many people guarantees a long lifetime of the project. All files can be found in our hardware and software repositories [2, 3].

At first we introduce the naming convention of the UC2 system to give a better understanding of the module hierarchy. These terms are defined in Table 1 below and illustrated in Fig. 4. Based on these modules and inserts, a complex optical system can be created.

… imprecision of the printer. The centro-symmetric cube is designed so that the beam is guided vertically and through the centre of the cube's sides. The free space in the cube's interior is large enough to accommodate common optical lab-ware (e.g. 1" cage systems from Thorlabs, Edmund Optics, Qioptiq, etc.) and other components using customized adapters. Ferromagnetic worm and flat-head screws (DIN ISO 912, M3×18 mm; DIN ISO 906, M3×5 mm; galvanized), also used to hold the cube together, connect magnetically to 5 mm ball magnets sitting in the baseplate.

Figure 5. Basic empty cube 1×1 — The basic cube consists of two parts, the frame and the lid, which are held together by a set of ferromagnetic M3 screws. These screws attach to the ball magnets inside the baseplate. The inner structure of the cube allows inserting a customized hardware plugin in all directions.
3 The Baseplate

The baseplate (unit size 50 × 50 mm, magnet-to-magnet distance 40 × 40 mm, Fig. 6) is the "skeleton" of the UC2 framework; it holds the different modules in place and provides a straight optical axis. The neodymium ball magnets are press-fit into the 3D-printed baseplate, thus creating a stable mechanical connection to the 3D-printed cubes. Although the design is mechanically over-defined with its 4-point interface, it represents a compromise between a simple design process for optical assemblies and mechanical stability and versatility. The cubes allow convenient orthogonal alignment along an optical axis and are easier to stack than triangular-pyramid or hexagonal units. Mechanical imprecisions, e.g. due to faulty 3D printing, can be compensated by adjusting the screws.

To provide electric components with power, wires added to the screws sitting in the cubes and to the conducting chromium ball magnets can establish an electrical connection with a minimum number of visible cables, since they are hidden inside the cube. To extend the grid in all spatial directions, the baseplate has holes on all faces so that multiple plates can be joined together via screws. Additional M6 holes enable adaptation to optical tables, support boards or breadboards to ensure a stable and long-lasting mount.

The cube inserts can be fully customized to adapt external elements, thus underlining the idea of creating an open standard. The online repository provides all relevant dimensions and CAD design templates for Autodesk Inventor and OpenSCAD to quick-start development with UC2. Additionally, a number of video tutorials can be found on online video platforms. With this we invite people to develop their own modules and contribute their designs to the UC2 system.

Inserts are slid into the cube, which allows adjusting their position along the optical axis. Dedicated rulers and spacers ensure that the insert is parallel to the cube's face and that optical dimensions can be reproduced. The two CAD files below show examples of inserts at angles of 0° and 45° w.r.t. the optical axis, which fit into the standard centro-symmetric 1×1 cube.

Figure 7. Generic design for a cube insert — Since the cube is centro-symmetric, an insert can be rotated in all directions. The figures show exemplary insert designs for a 0° and a 45° version, meant for a Thorlabs cage module and a mirror, respectively. The smooth but slightly layer-structured 3D-printed surfaces allow an easy sliding mechanism while keeping components in a fixed position at the same time. Newly designed cube adapters or inserts simply need to follow the dimensions visualized in the simplified version of the CAD drawing.

The core idea of the modularity inside the UC2 system is based on the Fourier-optical principle: adjacent lenses are placed so that the focal planes of adjacent optical parts coincide, in order to minimize effects like aberration and vignetting. This requires focal-to-focal distances in multiples of 50 mm. Defining Fourier and image planes as optical interfaces enables sub-grouping of the whole system into modules and optical building blocks. The optical axis always passes through the centre of, and perpendicular to, an open cube facet. Beam folding by 90° in all directions (i.e. X, Y, Z) can be achieved using mirrors.

In the case of more complicated assemblies like the openISM module, it is advisable to design a monolithically printed block to ensure higher precision and robustness. The outgoing plane (i.e. image or Fourier plane) should again adapt to the following plane of the next cube/module.

A simple example is given by a Kepler telescope, illustrated in Fig. 8, which can be built using two lenses (f1 = 50 mm, f2 = 100 mm) with a distance of d12 = 150 mm between their principal planes. The cellphone microscope shown in Fig. 8 gives another example of how simple it is to create an imaging system: the tube length of typical finite-corrected objective lenses (e.g. d_tl = 160 mm) is reproduced by the two folding mirrors and a spacer before the intermediate image is relayed by the ocular and imaged by the cellphone camera.
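The lens spacing and magnification of such a telescope follow directly from the focal lengths; a minimal sketch (not part of the UC2 repositories) that checks both against the 50 mm grid:

```python
# Sketch: verify that a two-lens Keplerian telescope fits the UC2 50 mm grid
# and compute its angular magnification.
def kepler_telescope(f1_mm: float, f2_mm: float, grid_mm: float = 50.0):
    separation = f1_mm + f2_mm           # focal-to-focal spacing between the lenses
    magnification = f2_mm / f1_mm        # angular magnification of the telescope
    on_grid = separation % grid_mm == 0  # UC2 modules sit on a 50 mm grid
    return separation, magnification, on_grid

# Lenses from the example above: f1 = 50 mm, f2 = 100 mm
sep, mag, ok = kepler_telescope(50, 100)
print(sep, mag, ok)  # 150 2.0 True
```

With these focal lengths the 150 mm separation occupies exactly three grid units, which is why the pair assembles directly on the baseplate without spacers.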
Figure 8. Good practice for UC2 assemblies — The core units are the UC2 optical building blocks on a grid of integer multiples of 50 mm (top). By combining two lenses L1 (f1 = 100 mm) and L2 (f2 = 50 mm), one can create a Keplerian telescope (middle). A more complex assembly can be created using objective lenses, LED matrices and oculars to create a smartphone microscope (bottom).

To create reproducible long-term measurements with the incubator-enclosed microscope, we created a Python-based GUI which runs on a Raspberry Pi equipped with a 7-inch touch-screen. A detailed description of how the system needs to be installed can be found in the dedicated software repository. The user interface, based on the Kivy framework (v1.11.0, [4]), is visualized in Fig. 9 and allows the control of several hardware elements, such as individual addressing of LEDs in the LED matrix, movement of motors connected to the system (e.g. along X, Y, Z) and intensity control of the fluorescence illumination. In addition, the software allows the scheduling of long-term experiments. This includes the choice of the illumination modality (e.g. DPC, fluorescence, bright-field, dark-field, etc.), the timing of image capture and the overall duration of the experiment. The images, captured using automatic settings such as auto-exposure and auto white balance (AWB), are saved as JPEG-compressed photos on the internal SD card in order to save memory.

Figure 9. Basic settings for the GUI — The GUI is divided into the hardware-control section a) and the experiment-configuration panel, in which the user defines a long-term time lapse, e.g. for the incubator-enclosed or light sheet microscope. c) An exemplary workflow of a typical biological experiment over multiple days is visualized.
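The scheduling logic described above reduces to generating capture times from an interval and a total duration; the following is a hypothetical sketch of that step, not the actual UC2 GUI code:

```python
# Hypothetical sketch of the experiment-scheduling step: given a capture
# interval and total duration, list all capture timestamps of the time lapse.
from datetime import datetime, timedelta

def capture_schedule(start: datetime, interval_s: float, duration_s: float):
    """Return all capture timestamps, including t = 0, within the experiment."""
    n_frames = int(duration_s // interval_s) + 1
    return [start + timedelta(seconds=i * interval_s) for i in range(n_frames)]

# A 48 h experiment at 1 frame/minute, as in the long-term video above
t0 = datetime(2020, 1, 1, 12, 0, 0)
frames = capture_schedule(t0, interval_s=60, duration_s=48 * 3600)
print(len(frames))  # 2881
```

The actual GUI additionally attaches the illumination modality and camera settings to each scheduled capture.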
The software can be used to control wired as well as wireless components connected to the Raspberry Pi, either via I2C or WiFi. A dedicated UC2 I2C device adapter written in Python, as shown in Fig. 10, preserves the modular nature of the UC2 system, since the command set sent by the Raspberry Pi to any I2C or MQTT device in the same network follows a modular instruction set.

Figure 10. Schematics of the I2C device adapter — The Raspberry Pi acts as the I2C master device which sends control commands to all slaves in the same network created by the four-wire signal (5V: power, GND: ground, SDA: signal data, SCL: signal clock). UC2 relies on low-cost Arduino Nanos which convert the I2C commands into hardware-control operations for motors, LEDs or anything else controllable through microcontrollers.
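The master/slave pattern can be illustrated by packing a command into bytes on the Pi side; the address and field layout below are hypothetical (the real instruction set lives in the UC2 repositories):

```python
# Illustrative sketch only: pack a command that an I2C master (Raspberry Pi)
# could send to an Arduino Nano slave. Address and byte layout are hypothetical.
import struct

MOTOR_Z_ADDR = 0x08  # hypothetical I2C address of the Z-stage Arduino

def pack_command(action: int, value: int) -> bytes:
    """Pack a 1-byte action code and a signed 16-bit argument (e.g. motor steps)."""
    return struct.pack(">Bh", action, value)

cmd = pack_command(action=0x01, value=-200)  # e.g. "move Z down 200 steps"
print(cmd.hex())  # 01ff38

# On the Raspberry Pi the bytes would then be written with the smbus2 library:
#   from smbus2 import SMBus, i2c_msg
#   with SMBus(1) as bus:
#       bus.i2c_rdwr(i2c_msg.write(MOTOR_Z_ADDR, cmd))
```

Keeping the command encoding in one small function is what makes the adapter modular: the same packet format can be routed over I2C or wrapped in an MQTT payload.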
The MQTT-based wireless communication system visualized in Fig. 11 has the advantage that each device can control any other device. This is advantageous, for example, when one mobile phone is used as an image-capture device and another mobile phone is used as a remote control for a setup. Since the devices can be reached remotely through the internet, adjusting or reading out parameters could in principle be done from any place with internet access.
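A hierarchical topic scheme is what lets any device address any other; the topic names below are hypothetical, not the actual UC2 MQTT instruction set:

```python
# Sketch of a modular MQTT command-topic scheme (device and setup names
# are hypothetical illustrations).
def command_topic(setup: str, device: str, action: str) -> str:
    """Build a hierarchical MQTT topic, e.g. uc2/microscope1/motor_z/move."""
    return f"uc2/{setup}/{device}/{action}"

topic = command_topic("microscope1", "motor_z", "move")
print(topic)  # uc2/microscope1/motor_z/move

# Publishing with the paho-mqtt client would then look like:
#   import paho.mqtt.publish as publish
#   publish.single(topic, payload="100", hostname="raspberrypi.local")
```

Because every client subscribes by topic pattern, a cellphone acting as remote control and an ESP32 driving a motor need only agree on the topic hierarchy, not on each other's identity.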

Good practice for the TCP/IP-based MQTT network connection is to set up a dedicated WiFi router (Netgear Nighthawk R7000) which handles the different connections. An MQTT broker (i.e. server) can be created on either a cellphone or the Raspberry Pi using open-source software such as Moquette [5] or Mosquitto [6]. We also developed a stand-alone Android application (app) which incorporates the MQTT broker as well as the MQTT client, in order to use the system independently of any external devices (e.g. in the field). The source code can also be found in our software repository.

Figure 11. Schematics of the MQTT connection — All devices are connected to the same network (e.g. WiFi hotspot) and MQTT broker (e.g. server), which can be represented by a Raspberry Pi. The MQTT-based network protocol allows multiple devices to be controlled remotely. Each MQTT client (e.g. ESP32) reacts to a sent command.

The aim of this biological study was the long-term observation of macrophages in vitro under different environmental effects. Rather than putting an incubator enclosure on a microscope stage, we decided to place the whole microscope in a bench-top incubator (Heraeus Instruments, Germany) which ensures suitable conditions for living organisms (e.g. temperature, acidity control via CO2 level). We formulate the requirements for long-term biological imaging as follows:

The resulting prototype, created based on these requirements using the UC2 system, is shown in Fig. 12 and in more detail in the online repository. It results in a simple optical path derived from an inverted compound microscope with a finite-corrected objective lens (generic brand, 10×, NA = 0.3), yielding a theoretical resolution of 1.8 µm with a coherently illuminated sample (e.g. only one LED on the optical axis). In order to reduce the overall size of this device, we reduced the tube length (e.g.
distance between the objective lens and the intermediate image plane/camera sensor) from 160 mm to ≈ 100 mm, which results in a longer working distance and a reduced effective magnification. The optical resolution using the Raspberry Pi camera (V2.1, Sony IMX219, d_pixel = 1.12 µm, Bayer pattern, t_exp = 100 ms, UK), d_min < 2.3 µm, and an effective magnification of ≈ 7× were quantified by imaging a USAF chart, as visualized in Fig. 13. To achieve multi-modal imaging, we used an 8 × 8 RGB LED array (Adafruit #1487), where only a subset of the available LEDs lies within the NA of the detecting objective lens. The selection of the LEDs was done through a customized GUI on the Raspberry Pi while the visible contrast was maximized. For fluorescence illumination we decided to use a dark-field-like epi-illumination. A module sandwiched between the objective lens (i.e. the Z-stage) and the sample hosts a number of high-power LEDs sitting on a star LED board, while the resulting dark-field illumination blocks the zeroth order. This makes selecting the emission filter more cost-efficient, since only the residual, thus weaker, stray light has to be filtered out [7].
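The quoted resolution figures can be checked with a quick back-of-the-envelope calculation, assuming a green LED with λ ≈ 550 nm (the illumination wavelength is not stated explicitly here):

```python
# Back-of-the-envelope check of the quoted numbers, assuming λ ≈ 550 nm.
wavelength_um = 0.55
na_detection = 0.3

# Coherent (single on-axis LED) Abbe limit: d = λ / NA
d_coherent_um = wavelength_um / na_detection
print(f"{d_coherent_um:.2f} µm")  # 1.83 µm, matching the quoted 1.8 µm

# Sample-referred pixel size at the measured ≈ 7× effective magnification
pixel_um = 1.12
pixel_at_sample_um = pixel_um / 7
print(f"{pixel_at_sample_um:.2f} µm")  # 0.16 µm, far below the Nyquist limit for 2.3 µm
```

Since the sample-referred pixel pitch (≈ 0.16 µm) is much smaller than half the measured resolution (2.3 µm / 2), the camera sampling is not the limiting factor in this configuration.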

All electric components are connected to a microcontroller: an Arduino Nano in the wired (i.e. I2C) and an ESP32 in the wireless (i.e. MQTT) control mode. The magnetically fixed LED matrix can easily be removed to gain space during the exchange of cell-culture media or in case malfunctioning hardware needs to be replaced. Being free in the choice of the distance between the sample and the illumination unit provides additional space for wires and tubes for applications like flow cytometry or lab-on-a-chip.

Figure 12. Scheme of the inverted microscope used in the incubator — An LED array allows the selection of the illumination angle and enables quantitative phase imaging (QPI). The objective lens inside the Z-stage can be moved up and down by a stepper motor controlled by an ESP32, while the optical path is folded using a simple mirror to form an image on the Raspberry Pi camera. The camera is connected to a Raspberry Pi equipped with a 7-inch touch-screen, at an overall price tag of ≈ 300 Euro.

To be able to focus the sample during the acquisition series, we designed a customized monolithic Z-stage (see our GitHub repository) inspired by the OpenFlexure design of Bowman et al. [8]. It is based on a spiral bearing in which a lever arm pushes the objective up and down using a stepper motor (China, 28BYJ-48). This way, the Z-stage produces no radial shift while it is moving. To ensure that the sample stays in place, it is fixed by a magnetic clamp, which simplifies its removal to replace the cell-culture medium in the culture dish. The sample stage and Z-stage were printed using ABS to ensure sufficient thermal stability. Other components were made of PLA. An additional module which hosts a pair of low-cost stages allows XY positioning of the sample (see xyz-assembly) with a precision of ≈ 20 µm, which can be further optimized with micro-stepping.
Yet, to keep things simple, we did not use this 252 motorized XY sample stage in our experiments.

For in-vitro measurements, we placed the microscope into a standard S2 biological laboratory. We disinfected the microscope by spraying it with 70% ethanol. After setting up the microscope, the imaging parameters were selected and the microscope ran for several days before the data was transferred from the Raspberry Pi to an external storage medium for further processing. For details on cell preparation, see Section 10.1.

The 8 × 8 RGB LED array also enables quantitative phase imaging based on the work by Tian & Waller [9], which is achieved by capturing a series of obliquely illuminated phase objects and performing a deconvolution with the corresponding point spread functions. This feature was not used during the long-term acquisition, since the contrast was sufficient and the additional effort to compute each result image was rather high. To monitor long-term changes in the in-vitro experiments with low-contrast cells (e.g. unlabelled macrophages), we relied on oblique illumination to exploit the phase gradient caused by the transparent cells.
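The first step of the Tian & Waller approach can be illustrated with the standard differential phase contrast (DPC) combination of two opposite oblique illuminations; this minimal sketch omits the transfer-function deconvolution that the full method performs:

```python
# Minimal illustration of the DPC idea: two images under opposite oblique
# (half-pupil) illumination are combined into a normalized difference that is,
# to first order, proportional to the phase gradient of the sample. The full
# Tian & Waller method additionally deconvolves this with the corresponding
# transfer functions, which is omitted here.
import numpy as np

def dpc_contrast(i_left: np.ndarray, i_right: np.ndarray) -> np.ndarray:
    total = i_left + i_right
    return (i_left - i_right) / np.where(total == 0, 1, total)

# Synthetic check: symmetric illumination of a flat sample gives zero contrast,
# any left/right asymmetry (a phase gradient) does not.
flat = np.full((4, 4), 100.0)
print(dpc_contrast(flat, flat).max())  # 0.0
```

The normalization by the total intensity makes the contrast independent of the absolute LED brightness, which is useful on a low-cost LED matrix with brightness variations.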

All design files, including the bill of materials (≈ 300 Euro), and an illustrated step-by-step assembly tutorial can be found here.

The iterative development process of the bright-field (BF) microscope required multiple design changes and exchanges of modules, which had to be transported from the optical laboratory to the UKJ every time. The approx. 6 km distance was covered by bicycle, whereby strong vibrations did not affect the imaging quality and stability of the microscope in the field. We directly used the transportation as a stress test of system robustness by only roughly packing the light-weight systems into a bag and carrying them by bicycle. Even after many transports, the systems still take images of identical quality.

Due to the use of 3D-printed thermoplastic material (PLA, ABS), certain parts tend to bend during long-term experiments. By choosing ABS over PLA in places where components experience larger tension, like the Z-stage and the baseplate, the problem can successfully be compensated. We found that, once the Z-stage has settled, it experiences almost no deformation over long times. One stage equipped with a 10×, NA = 0.3 objective lens was still in focus after 3 months, which can be appreciated in the 7-day measurement in Supp. Videos 1 and 2, where no automatic or manual refocusing was performed.

Even though we decided to use ABS for printing stages, which proved to yield sufficient long-term stability in an incubator after the initial thermal equilibration period, we decided to test the thermal stability of PLA, which is easier to print, especially on low-budget printers. However, over time a deformation of the material can be observed, mainly caused by the heavy microscope objective lens. Even after adding additional supporting material, the objective pulled the Z-stage mechanism downwards. In Fig. 13d) we show an exemplary drift plot (X/Y/combined; green, orange, purple) over several days with the autofocus switched on (see the next section for more information) in a bench-top incubator at 37 °C and 100% humidity. A drift of about 200 µm can be observed over a measurement period of about 2 days. The slope is almost linear after the warm-up phase within the first 2 hours. A comparison with the same measurement with autofocus turned off shows a constant drop in focus quality, as indicated in Fig. 13e).

Although 3D printing gives the opportunity to build the system anywhere in the world, it is not yet at an optimal level of reproducibility. Two printers of the same model will always give slightly different results, and the variation between the many brands of 3D printers can be significant. In order to compensate for this, one needs to iterate over many versions of the same design, producing a lot of plastic waste. We found the method to be extremely useful for development but less beneficial in the production phase.

For long-term measurements in biological laboratories, it was of great importance to obtain sharp images over long periods of time. This led to the development of a software-based autofocus algorithm that regularly refocuses the objective lens during experiments. The simple algorithm, as shown in Fig. 13b), performs a full scan along the Z-axis between a minimum and maximum position, maximizing the image sharpness: argmax_z var(I(z) ⊗ g), where var(·) indicates the variance of the intensity image I(z) at Z-position z. Low-pass filtering using a convolution with a Gaussian kernel g helps to remove noise that may otherwise result in unwanted high frequencies dominating over in-focus structures. Alternatively, a direct spatial filter (i.e. Tenengrad) [10] can be used as the image-sharpness metric.
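The variance-of-filtered-image metric can be sketched in a few lines; here the motorized Z-scan and camera capture are replaced by a precomputed stack of frames:

```python
# Sketch of the focus metric argmax_z var(I(z) ⊗ g) described above; the
# motor scan and image capture are stood in for by a list of frames.
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpness(image, sigma=2.0):
    """Variance of the low-pass-filtered image: high in focus, low out of focus."""
    return float(np.var(gaussian_filter(image, sigma)))

def autofocus(stack):
    """Return the index of the sharpest slice in a Z-scan."""
    return int(np.argmax([sharpness(img) for img in stack]))

# Synthetic demo: slice 2 contains structure, the others are nearly flat noise
rng = np.random.default_rng(0)
stack = [rng.normal(100, 1, (64, 64)) for _ in range(5)]
stack[2] += 50 * (np.indices((64, 64)).sum(axis=0) % 8 < 4)  # stripes = "in focus"
print(autofocus(stack))  # 2
```

The Gaussian pre-filter is what makes the metric robust on noisy low-cost sensors: without it, sensor noise in out-of-focus frames can produce variance comparable to real structure.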

The red path in Fig. 13d) shows the focus quality over time. Every hour the microscope refocussed, as seen by the periodic spike structure. From minute 1750 onwards, the autofocus lost its focus completely, which is also visible in the drift plot (violet). This was most probably caused by opening the incubator door. The system restored the focus after about two hours. Even though the autofocus routine worked to our satisfaction, it was ultimately not needed, as the ABS material in combination with the relatively low NA provided sufficient stability for our long-term experiments.

Figure 14. Setting up the incubator-enclosed microscope — a) The microscope fits inside a small box and can conveniently be printed, assembled and transported to the bio-lab using a commuter bag on a bike. b) A customized application, where a microfluidic Ibidi µ-chip with an endothelial/macrophage perfused co-culture was placed on the incubator-enclosed microscope before all cables were connected. c) The next step requires setting up experimental details such as duration and interval, as well as illumination settings, using the touch-screen GUI. d) Due to their small footprint, multiple devices fit inside a single incubator for multiplexed experiments.

Especially in long-term experiments it is of great importance that environmental vibrations are minimized. The bench-top incubator (Heraeus Instruments, Germany) was placed on an ordinary lab bench which experiences low-frequency vibrations resulting from footsteps. The fluctuations of the FOV in all long-term experiments (see Supplement 1) were reduced to an acceptable level by using a heavy metal plate as a base for the microscope during the experiments.

To provide additional information about the optical resolution of fluorescence imaging, we provide a benchmark between a cutting-edge research microscope (Zeiss Axiovert TV, Germany) with an emCCD camera (Andor iXon3 DU-897) equipped with an oil-immersion objective (Zeiss, 100×, NA 1.46, TIRF, Germany) and our UC2 fluorescence microscope based on infinity optics, as indicated in Fig. 15.

Figure 15. Fluorescence microscope based on infinity optics — The setup shows the arrangement of modules according to a typical inverted microscope equipped with infinity optics. A laser (635/637 nm) is expanded and focused into the BFP of the objective lens. The resulting plane wave excites the fluorescently (mCLING-ATTO647N) labelled sample (E. coli). Using the UC2-based setup, the Raspberry Pi a) and cellphone camera b) are compared to a research-grade microscope c) (Zeiss Axiovert TV) equipped with an emCCD camera (Andor iXon3 DU-897). The improved SNR in case of the monochromatic cellphone camera sensor clearly resolves the bacterial membrane, which cannot be seen with the Raspberry Pi camera.
As a test sample we rely on ATTO647N (SYSY, Germany) labelled E. coli fixed on a coverslip (for the protocol see Supp. Notes 10.6). The qualitative comparison in Fig. 15a)-c) between the inverted research microscope and the UC2 setup shows an increased noise level in case of the Raspberry Pi camera, Fig. 15a), which results in a loss of fine structures like the bacterial membrane, which is clearly visible in case of the cellphone camera in Fig. 15b). We relied on RAW-frame acquisition to avoid unwanted artifacts due to denoising, background-level subtraction or compression. In case of the Raspberry Pi, we used a custom-written Python program based on the picamera library (v1.13), which saves the RAW Bayer pattern, from which we extracted the red channel. Similarly, we used a custom-written Android application, which captures unprocessed RAW frames from the Huawei P20 monochromatic camera, available under https://GitHub.com/bionanoimaging/cellSTORM-ANDROID. We set similar acquisition parameters in both experiments, namely t_exp = 6.6 ms and ISO = 800 in case of the Raspberry Pi and ISO = 1000 in case of the cellphone camera, since ISO = 1000 is not available on the Raspberry Pi.

Figure 16 shows an exemplary plot generated by the cal_readnoise routine of the NanoimagingPack applied to a) cellphone and b) Raspberry Pi camera images. The monochrome sensor showed a significantly lower read-noise level. In both cameras, a linear dependence of the variance on the mean was observed, as one would expect from a scientific-grade sensor.

To exemplify that our modular optical system can be used with a variety of different open-source image-processing algorithms, we chose the freely available code for annular intensity diffraction tomography (aIDT) from Li et al. [15].
The algorithm is especially interesting since it only requires a series of images with varying illumination direction k_illu, and it can self-calibrate the illumination direction, ideal for a system which may experience slight misalignment over time. In contrast to methods like Fourier ptychographic microscopy (FPM), the detection requires only illumination angles close to the edge of the detection pupil (i.e. close to dark-field illumination). We added an RGB LED ring (Adafruit #1643), where each LED can be addressed individually using a microcontroller (e.g. Arduino Nano, Espressif ESP32). We used only the green channel to produce quasi-monochromatic light and acquired a set of images of fixed endothelial cells using a cellphone (Huawei P20 Pro, Sony IMX286 BI-CMOS, China). It was of great importance to acquire the data in RAW mode, since the automatic calibration routine of the aIDT algorithm failed when the images were compressed (e.g. JPEG).
Figure 17. Scheme of the aIDT assembly using an LED ring and cellphone camera — The LED ring illuminates the phase sample from 16 different angles, which produces a series of images feeding the inverse model. The algorithm recovers a focus stack of the quantitative phase. The cellphone can send MQTT commands to the LED ring to synchronize illumination and frame acquisition.
The optical setup matched the incubator-enclosed microscope described in Section 7.1, where we used a 10×, NA = 0.3 objective lens. The illumination NA had to be slightly less than the detection NA to position all illuminating plane waves inside the detection pupil, leading to the requirement NA_illu ≤ NA_det. Since the LED ring has a radius of r_ring = 16 mm, NA_illu is governed by the distance between the LED and the sample, d_sample:

NA_illu = r_ring / sqrt(r_ring² + d_sample²) ≈ r_ring / d_sample,

which requires a distance of d_sample ≥ 54 mm; it was adjusted experimentally to about 74 mm, giving a smaller effective illumination NA. To this end, an additional layer in the baseplate (not shown in Fig. 17) was used.

Light sheet fluorescence microscopy, in which a light sheet illuminates a (fluorescently labelled) sample perpendicularly to the detection direction, has gained a lot of attention during the last decade. It provides gentle 3D imaging of volumetric in-vivo and ex-vivo samples [17, 18]. Though this concept of optical sectioning in order to increase the optical resolution along the detection axis is straightforward, it becomes even more tangible when experienced in a hands-on experiment. Therefore, we started a series of workshops to demonstrate the working principle of these microscopes to their users, available with a comprehensive alignment tutorial in the online repository. The overall openSPIM-inspired setup visualized in Fig. 18 is kept simple in order to give users the chance to build these setups on their own. This simple configuration proved itself to be optimal for workshops. To improve the imaging quality, an eyepiece and a smartphone can be used for image acquisition.

The setup hosts a blue laser pointer (λ_c = 445 nm) as the illumination source, which gets expanded by a telescope.
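The LED-ring geometry quoted above can be checked numerically (the exact expression and its small-angle approximation bracket the quoted distances):

```python
# Numerical check of the illumination-NA geometry of the LED ring.
import math

def na_illu(r_ring_mm, d_sample_mm):
    """Exact illumination NA of an LED at ring radius r, height d above the sample."""
    return r_ring_mm / math.hypot(r_ring_mm, d_sample_mm)

r = 16.0      # LED-ring radius in mm
na_det = 0.3  # detection NA of the 10x objective

# Minimum distance in the small-angle approximation NA ≈ r/d
d_min = r / na_det
print(round(d_min, 1))  # 53.3 mm, i.e. the quoted >= 54 mm

# At the experimentally chosen 74 mm, the effective NA is comfortably smaller
print(round(na_illu(r, 74.0), 2))  # 0.21
```

The experimentally chosen 74 mm thus keeps all illumination angles safely inside the NA = 0.3 detection pupil, with margin for the finite LED size and alignment tolerances.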
This telescope first focusses the incoming parallel light using a cellphone lens (Apple, iPhone 5, NA = 0.24, f = 3.2 mm, ≈ 5 Euro) before it is collimated by a second lens (f = 20 mm) to achieve a magnification of ≈ 6×. This beam gets shaped by a cylindrical lens (Comar Optics, f = 63 mm) to create the 1D line profile before it passes a kinematic mirror mount cube featuring ball magnets sitting on 3 ferromagnetic M3 screws, followed by a magnetic plate (e.g. galvanized steel, 30 × 40 mm). The light sheet is further focussed by the illumination objective (e.g. 4×, NA = 0.14) into the sample. The resulting light sheet inside the sample plane has a theoretical thickness of 200 µm based on rather pessimistic assumptions about the profile of the laser diode, while the actually measured thickness is around 50 µm and thus slightly better than the depth of field (DOF) of the objective lens, d_z ≈ 60 µm. When using a static light sheet aligned onto the in-focus plane of the detection path, the Z-stack is obtained by moving the stage that carries the sample and acquiring an image at each step. The 3-dimensional image is then reconstructed.
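A quick numerical check of the figures quoted above, using only the stated focal lengths and thicknesses:

```python
# Beam-expander magnification of the illumination telescope
f_cellphone = 3.2    # mm, iPhone 5 cellphone lens
f_collimator = 20.0  # mm, second (collimating) lens
M = f_collimator / f_cellphone
print(f"telescope magnification: {M:.2f}x")  # 6.25x, i.e. the quoted ~6x

# Measured sheet thickness vs. stated DOF of the 4x, NA = 0.14 objective
sheet_um, dof_um = 50.0, 60.0
print(f"sheet thinner than DOF: {sheet_um < dof_um}")
```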

The sheet illuminates the fluorescent sample sitting on a movable sample stage. The sample stage is equipped with a stepper motor (28BYJ-48, China), which pushes a magnetic plate sitting on a flexure bearing. The step size is governed by the pitch of the screw and the smallest step size of the motor, which leads to a reproducible step size of ≈ 25 µm. The sample holder, which can accommodate syringes with samples embedded in agarose, is equipped with 3 ball magnets to position the sample coarsely such that the sample is in focus of the imaging objective lens. Additionally, a 3D-printed water chamber can be placed on the moving sample stage to reduce scattering and aberrations of the illumination as well as the detection beam path.
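The relation between screw pitch, motor resolution and the ≈ 25 µm slice spacing can be estimated as follows. The screw pitch (0.5 mm, typical for M3) and the ≈ 2048 full steps per output revolution of the 28BYJ-48 (32 steps × ~64:1 gearbox) are assumptions here, and the quoted 25 µm is then interpreted as a block of many motor steps rather than a single step.

```python
# Assumed hardware numbers (not from the manuscript):
pitch_um = 500.0      # M3 screw pitch, 0.5 mm per revolution
steps_per_rev = 2048  # 28BYJ-48: 32 steps x ~64:1 internal gearbox

um_per_step = pitch_um / steps_per_rev
print(f"{um_per_step:.3f} um of travel per motor step")  # ≈ 0.244 um

# A reproducible ~25 um Z-slice then corresponds to a block of steps:
steps_per_slice = round(25.0 / um_per_step)
print(f"{steps_per_slice} motor steps per 25 um slice")
```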

The detection path (Fig. 18, green) follows a typical finite-corrected microscope scheme as also illustrated in Supplementary 7.6, where either a 4×, NA = 0.14 or a 10×, NA = 0.3 objective lens was used. As detector we chose either the Raspberry Pi camera module (V2.1) without a lens (Configuration 1 in Fig. 18) or a Raspberry Pi/cellphone camera with a lens but combined with an eyepiece (no name, 20×, China) (Configuration 2 in Fig. 18). For all volumetric images presented in this manuscript, a Raspberry Pi camera equipped with the objective lens together with the eyepiece was chosen. As an emission filter we relied on a gel filter (Lee, #010, medium yellow). Z-stacks can be conveniently acquired using the GUI running on the Raspberry Pi. It automatically moves the sample step by step and acquires an image for each of N Z-positions.
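The stack routine behind this GUI can be sketched as a simple move-then-grab loop. `move_stage` and `grab_frame` are stand-ins for the actual stepper-motor and Raspberry Pi camera calls, which are hardware-specific and not reproduced here.

```python
def acquire_z_stack(n_slices, move_stage, grab_frame):
    """Sketch of the Z-stack routine the Raspberry Pi GUI performs:
    advance the sample by one slice, grab a frame, repeat N times.

    `move_stage` / `grab_frame` are injected callbacks standing in
    for the real stepper-driver and camera code.
    """
    stack = []
    for z in range(n_slices):
        move_stage(z)              # advance the flexure stage one slice
        stack.append(grab_frame()) # expose and store one image
    return stack

# Example with dummy hardware callbacks:
frames = acquire_z_stack(5, move_stage=lambda z: None,
                         grab_frame=lambda: "frame")
print(len(frames))  # 5
```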
Figure 19. Light sheet microscope - a) Complete light sheet setup with an eyepiece in front of the Raspberry Pi camera to increase the FOV. b) 3D reconstruction of a zebrafish embryo head and c) 3D reconstruction of a Drosophila larva from a Z-stack obtained with this setup. The rendering was performed in ClearVolume [10].

Alignment of the setup
We provide a detailed description of the alignment procedure in our GitHub repository. Additionally, the Video Supplement 5 gives an introduction on how to convert the incubator-enclosed microscope into a light sheet microscope within 5 minutes. We assume all parts inside the FullBOX are already assembled and work properly. For each part we give detailed build and assembly instructions in our GitHub repository. In the following paragraphs we will call the incubator microscope IM and the light sheet microscope without the eyepiece (e.g. Raspberry Pi camera with removed objective lens) LM.

Figure 20. Conversion Process - In order to build the light sheet microscope (right) using UC2 components, one can reuse several parts from the incubator microscope (left). All other parts can be found in the "FullBOX".

Many consumer-grade electronics such as video projectors (e.g. digital mirror devices, DMD) or movie screens enabled entertainment "on-the-go" at a very low price compared to scientific instruments, thanks to mass production. Besides wide-field projection systems based on liquid crystal on silicon (LCoS), liquid crystal display (LCD) or DMD panels, more exotic laser-scanning-based systems (e.g. Sony MP-CL1A, Japan, ≈ 300 Euro) enabled us to create a UC2-ready image scanning microscope (openISM) for around 300 Euro.

The laser scanner, equipped with a small Micro-Electro-Mechanical System (MEMS) mirror, scans a set of RGB (λ_blue = 450 nm, λ_green = 530 nm and λ_red = 650 nm) laser beams over the 2D (e.g. X/Y) plane with a frame rate of 60 fps at a spatial resolution of 1920 × 720 pixels to create an aerial image. A customized UC2 module enables the integration into our 50 × 50 mm² standard.

In post-processing, all illumination spots (per frame) are treated in parallel. For each spot, a tile - meaning a pinhole of multi-pixel size - is placed around its centre and gets extracted. Pixels at a distance to the nearest illumination spot then get displaced towards it by half their distance, to account for the most probable fluorophore position considering the current illumination and its detection position. Finally, the signal is integrated and written into the final image at the position where the tile centre was placed. This procedure leads to a resolution improvement of up to a factor of √2 [21, 22] compared to standard confocal microscopy. The optical sectioning of this processing scheme is determined by the size of the virtual pinhole, i.e. the extracted region around each illumination position.

Figure 21. Scheme of the image scanning microscope (openISM) - The light path shown in the schematic starts with the laser-scanning projector, where the beam gets collimated using the lens L1 and re-imaged into the pupil of the objective lens using lens L2. This telescope magnifies the mirror by a factor of 5. The detection path (green) follows a typical infinity-corrected microscope, where either a CMOS (e.g. IDS, Basler) or a cellphone camera combined with an eyepiece is used.
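The pixel-reassignment step described above can be sketched in a few lines of NumPy. This is a deliberately simplified version: integer coordinates only (no sub-pixel interpolation), one spot at a time, and a square virtual pinhole of half-width `r`.

```python
import numpy as np

def reassign_spot(frame, out, cy, cx, r=3):
    """Extract a (2r+1)x(2r+1) tile ('virtual pinhole') around one
    illumination spot at (cy, cx) and add each pixel back into `out`
    at half its distance from the spot centre - the basic pixel
    reassignment of ISM. Simplified sketch: integer halving, no
    interpolation, no normalization.
    """
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            y, x = cy + dy, cx + dx
            if 0 <= y < frame.shape[0] and 0 <= x < frame.shape[1]:
                out[cy + dy // 2, cx + dx // 2] += frame[y, x]

# Toy example: spot at (8, 8); signal detected 2 px to its right
frame = np.zeros((17, 17))
frame[8, 10] = 1.0
out = np.zeros_like(frame)
reassign_spot(frame, out, 8, 8)
print(np.argwhere(out > 0))  # signal reassigned to (8, 9), halfway back
```

In a full reconstruction this routine would run for every illumination spot of every frame, accumulating into one high-resolution output image.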
All measurements in this manuscript were acquired using the monochromatic cellphone camera inside the Huawei P20 Pro. The design files and additional explanations can be found here. The optical setup shown in Fig. 21 is straightforward. The resonating MEMS mirror in the projector needs to be imaged into the pupil plane of the microscope objective lens. In order to obtain high-resolution images, the pupil is ideally over-filled by the image of the scan mirror. We assumed a diameter of the aluminium mirror of d_mirror = 1.5 mm and a pupil diameter of around d_pupil = 5.5 mm, which requires a telescope created by a lens f_1 = 30 mm and a following tube lens f_2 = 180 mm. The low-cost infinity-corrected microscope objective (Optika, 20×, NA = 0.4, N-plan) was placed in a motorized Z-stage to allow focussing the objective relative to the sample. A set of different dichromatic-mirror cubes with suitable filters (excitation/dichromatic/emission filter: Comar Optics, 465 IK/510 IY/526 IB) allows switching between different fluorophores and excitation wavelengths. The detection path was formed by an f_TL = 180 mm tube lens before a 20× monocular eyepiece images to infinity. This way a cellphone camera can create a sharp image if its focus is set to infinity. The effective pixel size depends on the choice of cellphone and results in d_pix ≈ 150 nm when using the Huawei P20 Pro.
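The over-filling condition can be checked directly from the quoted numbers:

```python
# Does the image of the scan mirror over-fill the objective pupil?
d_mirror = 1.5        # mm, assumed MEMS mirror diameter
d_pupil = 5.5         # mm, assumed objective pupil diameter
f1, f2 = 30.0, 180.0  # mm, relay lens and tube lens

M = f2 / f1
print(f"mirror image: {d_mirror * M:.1f} mm vs pupil {d_pupil} mm")
# 9.0 mm > 5.5 mm, so the pupil is over-filled as required
```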

Since the laser scanner was not meant to be used for scientific instrumentation, technical details have not been provided, making interaction with it cumbersome. Also, the uncommon pixel number of 1920 × 720 pixels leads to an unknown interpolation of the image provided by graphics cards, thus not resulting in "true" pixel information (i.e. a one-to-one pixel relationship). We solved this by using a MacBook (Apple, 13-inch, MPXQ2D/A, USA) with a USB-C to HDMI adapter at a display resolution of 720p, in combination with a customized Python script which generates and displays the ISM patterns. The monochromatic cellphone camera (Huawei P20 Pro, China) was driven using the open-source software FreeDCam [23], where the exposure time t_exp = 1/60 s matches the frame rate of the laser-scanning projector in order to reduce temporal beating effects between the frame rate and the laser round trip.

7.10 Quantitative Imaging using openKOEHLER

An alternative to incoherent imaging methods, where fluorescently labelled cells are captured, is given by quantitative phase imaging (QPI). This modality is very attractive for biological samples because it is a label-free method, thus obviating the time-consuming labelling procedures which sometimes also alter the behaviour or appearance of the subject of observation. Based on the adaptive illumination scheme described in [24], we incorporated a low-cost HDMI video projector (40 Euro, generic brand, China), which adds a fully adaptive illumination source to the system (see Fig. 22). The LCD panel inside the projector was placed in a customized UC2 module, which includes the high-power LED for the illumination, the controlling PCB which translates the incoming HDMI video signal for the 320 × 240 RGB 2.4-inch TFT screen (ILI9341, China) and a set of lenses to ensure correct Koehler illumination.

The LCD plane is conjugate to the pupil plane of the microscope objective lens and can create different illumination schemes such as oblique illumination, (quantitative) differential phase contrast (qDPC), Fourier ptychographic microscopy and dark-field by addressing a specific bitmap pattern on the 2D panel. Each pixel in the "on"-state produces a plane wave in the sample plane and can transfer a specific range of object frequencies. The (approximately) incoherent superposition of all (coherent) camera-plane images of the object describes the detected image according to the "Abbe" method (see [25]).

Figure 22. Ready-to-print openKOEHLER module - The openKOEHLER module accommodates an LCD optically conjugate to the objective pupil plane to create an adaptive illumination source. The module can be controlled using a standard computer equipped with an HDMI port. By varying the pattern in the LCD plane, the contrast of transparent cells captured by a cellphone camera can be optimized.
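Such bitmap patterns are straightforward to generate for the 320 × 240 panel. The sketch below builds three example masks (full disc for brightfield, half disc for DPC, annulus for dark-field); the radii are illustrative assumptions, since in practice they must be matched to the objective's pupil size in the conjugate plane.

```python
import numpy as np

H, W = 240, 320  # resolution of the ILI9341 TFT panel

def pupil_masks(radius=60, ann_width=10):
    """Generate example LCD bitmaps for adaptive illumination.
    `radius` and `ann_width` (in pixels) are illustrative; they must
    be calibrated to the pupil of the detection objective.
    """
    y, x = np.mgrid[:H, :W]
    r = np.hypot(y - H / 2, x - W / 2)
    brightfield = r < radius                              # full disc
    dpc_top = brightfield & (y < H / 2)                   # half disc for (q)DPC
    darkfield = (r >= radius) & (r < radius + ann_width)  # annulus
    return brightfield, dpc_top, darkfield

bf, dpc, df = pupil_masks()
print(bf.sum(), dpc.sum(), df.sum())  # number of "on" pixels per pattern
```

Displaying a mask on the LCD (driven as a secondary HDMI display) then selects the illumination geometry; for qDPC, two complementary half-disc images are typically acquired and combined.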

The optical system shown in Fig. 22 follows Koehler illumination [25], where the condenser aperture plane is imaged into the pupil plane of the detecting objective and a field stop is imaged into the sample plane to reduce stray light from out-of-focus regions. Taking the high-power white LED equipped with a collimating lens and adding two injection-moulded aspherical lenses (Thorlabs ACL3026U, f = 26 mm, NA = 0.55), L1 and L2 image the LCD plane representing the aperture plane into the pupil plane of the objective and further image a field stop into the sample plane.

A variation of the pattern displayed on the LCD, controlled as a secondary display (e.g. via HDMI connection), directly influences the visible contrast. Depending on the displayed pattern (e.g. small point, disc, annulus, etc.), the degree of coherence can be chosen freely. All files to replicate this experiment can be found here.

The modular concept of building optical setups has proven to be very useful in demonstrations of various principles in microscopy and image formation in general. To exploit this potential, we designed TheBOX. The comprehensive toolbox provides components for explaining the basics of ray optics, diffraction and different microscopy modalities. It comes in two versions, Simple and Full. The SimpleBOX contains only passive components and covers optical experiments at secondary and high school level. The FullBOX is equipped with electronics like microcontrollers (ESP32) and a Raspberry Pi microcomputer including a 7-inch touch screen, keyboard and a camera module. In addition to basic experiments, this advanced box can create a compound microscope, a light sheet setup and other setups which are suitable for everyday life in the biological lab. The complete list of setups can be found in the online repository. The target groups for TheBOX are schools and other institutions that provide courses on optics and microscopy. Thanks to its low price (600 Euro), a school, institution or course organiser can acquire or build multiple boxes so that each participant can have access to a hands-on experience, which is nowadays typically not the case. With this we try to provide the "Arduino for optics", meaning that the setup time for getting started is heavily reduced by the plug-and-play nature of the optical building blocks.
We have successfully tested the concept of TheBOX in a series of workshops, which are exemplarily documented for the inline holographic microscope and the light sheet hackathon for the "International Day of Light" (IDOL) in the "Lichtwerkstatt Jena" and the HHMI Research Institute, on building a light sheet setup based on the UC2 toolbox from scratch. TheBOX has proven to be a useful tool for microscopists, physicists, biologists and people generally interested in optics and microscopy at all skill levels to learn e.g. the concept of Fourier optics or to study organisms at a cellular level.

A set of trial runs in Thuringian high schools (Carl-Zeiss-Gymnasium Jena, Montessori Schule Jena, Königin-Luise-Gymnasium Erfurt) to learn how the system can be used for "STEAM" education (Science, Technology, Engineering, Art and Mathematics) yielded very positive feedback. This led to interdisciplinary projects where students study the application of the UC2 system to e.g. track micro-plastic in drinking water using special fluorescent markers. A series of tutorials on how to print, order and assemble can be found here.

The FullBOX is an extended version of the SimpleBOX which adds active components like motors, LEDs and electronics to the cubes, making them "smart". Using microcomputers like the Raspberry Pi and microcontrollers like the ESP32 or Arduino, it can create more complex and fully autonomous setups ready for everyday measurements in the optical lab or for use in high schools and universities. The overall material cost is in the range of 600 Euro from known online retailers. A list of achievable experiments is given in Table 4.

We provide a comprehensive bill of materials for all assemblies, applications and boxes in an interactive spreadsheet. The prices heavily depend on the choice of retailer and distributor and can be seen in Table 6.

The core idea of the UC2 system is to be open, so that it can be used by a large number of people. In the best case, users do not only use the system, but participate actively in the iterative design process by suggesting new applications and finding errors. This can conveniently be done using, for example, the issue-tracking feature embedded in the GitHub repository. Alternatively, private messages, feedback rounds at workshops or discussions through social media channels such as Twitter can be used as feedback mechanisms.
After promoting the principle of the UC2 system in a number of talks and workshops, many people started replicating the system. Since we cannot keep track of the number of downloads and actually printed systems, it is hard to tell how many people apart from us actually built and used it. Nevertheless, we found the scientific community on Twitter, where we created a dedicated UC2 Twitter account (@openUC2), to be a helpful measure and feedback mechanism to track issues, ideas and improvements and to give a rough estimate of how many systems are in actual use (exemplarily shown in Fig. 23). With the MDK provided through our GitHub repository, we invite people to start developing their own modules and to post their own designs, e.g. by forking the repository or sharing them through different channels like Twitter.

Figure 23. Publicly announced UC2 workshops and use-cases - Even though it is hard to track how many UC2 systems are in actual use, we collected an exemplary overview of some user feedback and successfully assembled UC2 setups.
From the workshops we found that it is of great importance that the entrance threshold is set very low to attract new users to start developing with the UC2 system. Accordingly, the documentation should be self-explanatory and thus act as a decentralized multiplier.